
Risk

From Wikipedia, the free encyclopedia


Risk is a concept that denotes a potential negative impact to an asset or some characteristic of value that may arise from some present process or future event. In everyday usage, "risk" is often used synonymously with the probability of a known loss. Paradoxically, a probable loss can be uncertain for any individual event while being nearly certain in the aggregate of many events (see risk vs. uncertainty below).

Risk is also defined as the uncertainty of an event occurring that could have an impact on the achievement of objectives, following the IIA glossary's definition of risk (see http://www.seattle.gov/audit/training_files/259,4,Risk).

Risk communication and risk perception are essential factors in all human decision making.

Definitions of risk

There are many definitions of risk, of varying precision, which depend on the specific application and situational context. Risk can be assessed qualitatively or quantitatively.

Qualitatively, risk is considered proportional to the expected losses which can be caused by an event and to the probability of this event. The greater the potential loss and the more likely the event, the greater the overall risk.

Frequently in the literature, risk is defined in pseudo-formal terms whose components are vague and ill-defined; for example, risk is considered an indicator of threat, or is said to depend on threats, vulnerability, impact and uncertainty.[citation needed]


In engineering, the quantitative engineering definition of risk is:

Risk = (probability of an accident) × (losses per accident)

Despite the wide use of this definition, for example in nuclear energy and other potentially dangerous industries, measuring engineering risk is often difficult; the probability is assessed from the frequency of similar past events (or by event-tree methods), but rare failures are hard to estimate if an event tree cannot be formulated, and loss of human life is generally considered beyond estimation[citation needed]; however, radiological release (e.g. GBq of radio-iodine) is usually used as a surrogate. There are many formal methods used to assess or to "measure" risk, which is considered one of the critical indicators for human decision making.
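
As an illustrative sketch of this quantitative definition, the following fragment computes engineering risk as the product of an assumed annual event frequency and an assumed loss per event; the frequency, loss figure and function name are hypothetical examples, not taken from any of the sources above.

    # Sketch of the quantitative engineering definition of risk:
    # risk = probability (or frequency) of an event × magnitude of the loss.
    # The event frequency and loss figure below are hypothetical examples.

    def engineering_risk(event_frequency_per_year: float, loss_per_event: float) -> float:
        """Expected loss per year for a single class of event."""
        return event_frequency_per_year * loss_per_event

    # Example: a failure assumed to occur once every 10,000 years on average,
    # with an assumed loss of 5,000,000 (arbitrary currency units).
    print(f"{engineering_risk(1.0 / 10_000, 5_000_000):.1f}")  # 500.0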

Financial risk is often defined as the unexpected variability or volatility of returns, and thus includes both worse-than-expected and better-than-expected outcomes. References to negative risk below should be read as also applying to positive impacts or opportunity (e.g. for "loss" read "loss or gain") unless the context precludes this.

In statistics, risk is often mapped to the probability of some event which is seen as undesirable. Usually the probability of that event and some assessment of its expected harm must be combined into a believable scenario (an outcome) which combines the set of risk, regret and reward probabilities into an expected value for that outcome. (See also Expected utility)

Thus in statistical decision theory, the risk function of an estimator δ(x) for a parameter θ, calculated from some observables x, is defined as the expectation value of the loss function L:

R(θ, δ) = E_θ[ L(θ, δ(X)) ] = ∫ L(θ, δ(x)) f(x | θ) dx

where:

  • δ(x) = the estimator
  • θ = the parameter of the estimator
  • f(x | θ) = the probability density of the observables x given the parameter θ
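
As a minimal illustration of this risk function, the following sketch approximates the expected squared-error loss of the sample mean, used as an estimator of the mean of a normal population, by Monte Carlo simulation; the choice of loss function, estimator and population are assumptions made only for this example.

    # Sketch: the risk of an estimator as the expected loss E[L(θ, δ(X))],
    # approximated here by Monte Carlo simulation. Squared-error loss and the
    # sample mean of a Normal(θ, 1) sample are purely illustrative assumptions.
    import random

    def risk_of_estimator(theta: float, n_obs: int, n_trials: int = 100_000) -> float:
        """Approximate E[(θ - δ(X))²], where δ is the sample mean of n_obs draws."""
        total = 0.0
        for _ in range(n_trials):
            sample = [random.gauss(theta, 1.0) for _ in range(n_obs)]
            delta = sum(sample) / n_obs      # the estimator δ(x)
            total += (theta - delta) ** 2    # squared-error loss L(θ, δ(x))
        return total / n_trials

    print(risk_of_estimator(theta=2.0, n_obs=10))  # close to 0.1, the variance of the sample mean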

In information security [citation needed], a "risk" is defined as a function of three variables:

  1. the probability that a threat will occur,
  2. the probability that vulnerabilities exist, and
  3. the potential impact.

If any of these variables approaches zero, the overall risk approaches zero.
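
The statement that risk approaches zero whenever any variable approaches zero suggests a multiplicative combination; the sketch below assumes that form (the product of threat probability, vulnerability probability and impact), which is one common but not universal way of combining the three variables.

    # Hedged sketch of information-security risk as a function of three variables.
    # Combining them as a product is an assumption made here; it matches the
    # property that the risk approaches zero when any factor approaches zero.

    def security_risk(p_threat: float, p_vulnerability: float, impact: float) -> float:
        """Risk score: threat probability × vulnerability probability × impact."""
        return p_threat * p_vulnerability * impact

    # Hypothetical figures for illustration only.
    print(f"{security_risk(p_threat=0.3, p_vulnerability=0.5, impact=100_000):.1f}")  # 15000.0
    print(f"{security_risk(p_threat=0.3, p_vulnerability=0.0, impact=100_000):.1f}")  # 0.0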

The management of actuarial risk is called risk management.

Historical background

Scenario analysis matured during Cold War confrontations between major powers, notably the USA and USSR. It became widespread in insurance circles in the 1970s when major oil tanker disasters forced more comprehensive foresight.[citation needed] The scientific approach to risk entered finance in the 1980s when financial derivatives proliferated. It reached general professions in the 1990s when the power of personal computing allowed for widespread data collection and number crunching.

Governments are apparently only now learning to use sophisticated risk methods, most obviously to set standards for environmental regulation, e.g. "pathway analysis" as practiced by the United States Environmental Protection Agency.

Risk vs. uncertainty

In his seminal work "Risk, Uncertainty, and Profit", Frank Knight (1921) established the distinction between risk and uncertainty.

... Uncertainty must be taken in a sense radically distinct from the familiar notion of Risk, from which it has never been properly separated. The term "risk," as loosely used in everyday speech and in economic discussion, really covers two things which, functionally at least, in their causal relations to the phenomena of economic organization, are categorically different. ... The essential fact is that "risk" means in some cases a quantity susceptible of measurement, while at other times it is something distinctly not of this character; and there are far-reaching and crucial differences in the bearings of the phenomenon depending on which of the two is really present and operating. ... It will appear that a measurable uncertainty, or "risk" proper, as we shall use the term, is so far different from an unmeasurable one that it is not in effect an uncertainty at all. We ... accordingly restrict the term "uncertainty" to cases of the non-quantitive type.

Insurance and health risk

Insurance is a risk-reducing investment in which the buyer pays a small fixed amount to be protected from a potential large loss. Gambling is a risk-increasing investment, wherein money on hand is risked for a possible large return, but with the possibility of losing it all. Purchasing a lottery ticket is a very risky investment with a high chance of no return and a small chance of a very high return. In contrast, putting money in a bank at a defined rate of interest is a risk-averse action that gives a guaranteed return of a small gain and precludes other investments with possibly higher gain.

Risks in personal health may be reduced by primary prevention actions that decrease early causes of illness or by secondary prevention actions after a person has clearly measured clinical signs or symptoms recognized as risk factors. Tertiary prevention (medical) reduces the negative impact of an already established disease by restoring function and reducing disease-related complications. Ethical medical practice requires careful discussion of risk factors with individual patients to obtain informed consent for secondary and tertiary prevention efforts, whereas public health efforts in primary prevention require education of the entire population at risk. In each case, careful communication about risk factors, likely outcomes and certainty must distinguish between causal events that must be decreased and associated events that may be merely consequences rather than causes.

Economic risk

In business

Means of assessing risk vary widely between professions. Indeed, they may define these professions; for example, a doctor manages medical risk, while a civil engineer manages risk of structural failure. A professional code of ethics is usually focused on risk assessment and mitigation (by the professional on behalf of client, public, society or life in general).

Both incidental and inherent risks exist in the workplace. Incidental risks are those which occur naturally in the business but are not part of its core. Inherent risks are part of the core of the business and have a negative effect on its operating profit.

Risk-sensitive industries

Some industries manage risk in a highly quantified and numerate way. These include the nuclear power and aircraft industries, where the possible failure of a complex series of engineered systems could result in highly undesirable outcomes. The usual measure of risk for a class of events is then

R = P × C

where P is probability and C is consequence. The total risk is then the sum of the individual class-risks.

In the nuclear industry, 'consequence' is often measured in terms of off-site radiological release, and this is often banded into five or six decade-wide bands.
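
As an illustrative sketch of how class-risks combine, the following fragment sums hypothetical class-risks across decade-wide consequence bands in the spirit of the banding described above; the band values and annual frequencies are invented for the example.

    # Hedged sketch: total risk as the sum of class-risks, with 'consequence'
    # banded into decade-wide (factor-of-ten) bands, as described above.
    # The band midpoints and annual frequencies are hypothetical examples.

    # (frequency per year, representative consequence of the band, e.g. GBq released)
    classes = [
        (1e-2, 1e1),   # frequent, very small release
        (1e-3, 1e2),
        (1e-4, 1e3),
        (1e-5, 1e4),
        (1e-6, 1e5),   # very rare, very large release
    ]

    total_risk = sum(p * c for p, c in classes)  # sum of individual class-risks P_i × C_i
    print(f"Total risk (expected release per year): {total_risk:.3f}")  # 0.500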

The risks are evaluated using Fault Tree/Event Tree techniques (see safety engineering). Where these risks are low they are normally considered to be 'Broadly Acceptable'. A higher level of risk (typically up to 10 to 100 times what is considered broadly acceptable) has to be justified against the costs of reducing it further and the possible benefits that make it tolerable - these risks are described as 'Tolerable if ALARP'. Risks beyond this level are classified as 'Intolerable'.

The level of risk deemed 'Broadly Acceptable' has been considered by regulatory bodies in various countries. An early attempt by the UK government regulator and academic F. R. Farmer used the example of hill-walking and similar activities, which have definable risks that people appear to find acceptable. This resulted in the so-called Farmer Curve of acceptable probability of an event versus its consequence.

The technique as a whole is usually referred to as Probabilistic Risk Assessment (PRA), (or Probabilistic Safety Assessment, PSA). See WASH-1400 for an example of this approach.

In finance

In finance, risk is the probability that an investment's actual return will be different than expected. This includes the possibility of losing some or all of the original investment. It is usually measured by calculating the standard deviation of the historical returns or average returns of a specific investment. [citation needed]
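
As an illustrative sketch of this measurement, the following fragment computes the sample standard deviation of a series of periodic returns; the return figures are hypothetical, and the use of Python's statistics module is an implementation choice rather than something prescribed by the sources.

    # Hedged sketch: financial risk measured as the standard deviation of
    # historical returns. The return series below is a hypothetical example.
    import statistics

    historical_returns = [0.05, -0.02, 0.07, 0.01, -0.04, 0.03]  # periodic returns

    mean_return = statistics.mean(historical_returns)
    risk = statistics.stdev(historical_returns)  # sample standard deviation (volatility)

    print(f"Average return: {mean_return:.4f}")
    print(f"Risk (volatility): {risk:.4f}")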

In finance "risk" has no one definition, but some theorists, notably Ron Dembo, have defined quite general methods to assess risk as an expected after-the-fact level of regret. Such methods have been uniquely successful in limiting interest rate risk in financial markets. Financial markets are considered to be a proving ground for general methods of risk assessment.

However, these methods are also hard to understand. The mathematical difficulties interfere with other social goods such as disclosure, valuation and transparency. In particular, it is often difficult to tell if such financial instruments are "hedging" (purchasing/selling a financial instrument specifically to reduce or cancel out the risk in another investment) or "gambling" (increasing measurable risk and exposing the investor to catastrophic loss in pursuit of very high windfalls that increase expected value).

As regret measures rarely reflect actual human risk-aversion, it is difficult to determine if the outcomes of such transactions will be satisfactory. Risk seeking describes an individual whose utility function's second derivative is positive. Such an individual would willingly (actually pay a premium to) assume all risk in the economy and is hence not likely to exist.
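
For concreteness, the sketch below illustrates the convexity condition just mentioned: with the purely illustrative choice u(w) = w², an individual whose utility function has a positive second derivative prefers a fair gamble to receiving its expected value for certain.

    # Hedged sketch: a risk-seeking individual has a convex utility function
    # (positive second derivative). With the illustrative choice u(w) = w**2,
    # the expected utility of a fair 50/50 gamble exceeds the utility of its
    # expected value, so the gamble is preferred.

    def u(w: float) -> float:
        return w ** 2  # convex: u''(w) = 2 > 0

    # Fair gamble: win or lose 10 around a wealth of 100, each with probability 0.5.
    expected_utility = 0.5 * u(110) + 0.5 * u(90)   # 10100.0
    utility_of_expectation = u(100)                  # 10000.0

    print(expected_utility > utility_of_expectation)  # True: the gamble is preferred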

In financial markets one may need to measure credit risk, information timing and source risk, probability model risk, and legal risk if there are regulatory or civil actions taken as a result of some "investor's regret".

"A fundamental idea in finance is the relationship between risk and return. The greater the amount of risk that an investor is willing to take on, the greater the potential return. The reason for this is that investors need to be compensated for taking on additional risk".

"For example, a US Treasury bond is considered to be one of the safest investments and, when compared to a corporate bond, provides a lower rate of return. The reason for this is that a corporation is much more likely to go bankrupt than the U.S. government. Because the risk of investing in a corporate bond is higher, investors are offered a higher rate of return".

In public works

In a peer reviewed study of risk in public works projects located in 20 nations on five continents, Flyvbjerg, Holm, and Buhl (2002, 2005) documented high risks for such ventures for both costs [1] and demand [2]. Actual costs of projects were typically higher than estimated costs; cost overruns of 50% were common, overruns above 100% not uncommon. Actual demand was often lower than estimated; demand shortfalls of 25% were common, of 50% not uncommon.

Due to such cost and demand risks, cost-benefit analyses of public works projects have proved to be highly uncertain.

The main causes of cost and demand risks were found to be optimism bias and strategic misrepresentation. Measures identified to mitigate this type of risk are better governance through incentive alignment and the use of reference class forecasting [3].

Risk in psychology

Regret

In decision theory, regret (and anticipation of regret) can play a significant part in decision-making, distinct from risk aversion (preferring the status quo in case one becomes worse off).

Framing

Framing[citation needed] is a fundamental problem with all forms of risk assessment. In particular, because of bounded rationality (our brains get overloaded, so we take mental shortcuts) the risk of extreme events is discounted because the probability is too low to evaluate intuitively. As an example, one of the leading causes of death is road accidents caused by drunk driving - partly because any given driver frames the problem by largely or totally ignoring the risk of a serious or fatal accident.

The above examples (body, threat, price of life, professional ethics and regret) show that the risk adjustor or assessor often faces a serious conflict of interest. The assessor also faces cognitive bias and cultural bias, and cannot always be trusted to avoid all moral hazards. This represents a risk in itself, which grows as the assessor is less like the client.

For instance, an extremely disturbing event that all participants wish not to happen again may be ignored in analysis despite the fact it has occurred and has a nonzero probability. Or, an event that everyone agrees is inevitable may be ruled out of analysis due to greed or an unwillingness to admit that it is believed to be inevitable. These human tendencies to error and wishful thinking often affect even the most rigorous applications of the scientific method and are a major concern of the philosophy of science. But all decision-making under uncertainty must consider cognitive bias, cultural bias, and notational bias: No group of people assessing risk is immune to "groupthink": acceptance of obviously-wrong answers simply because it is socially painful to disagree.

One effective way to solve framing problems in risk assessment or measurement (although some argue that risk cannot be measured, only assessed) is to ensure that scenarios, as a strict rule, must include unpopular and perhaps unbelievable (to the group) high-impact low-probability "threat" and/or "vision" events. This permits participants in risk assessment to raise others' fears or personal ideals by way of completeness, without others concluding that they have done so for any reason other than satisfying this formal requirement.

For example, an intelligence analyst with a scenario for an attack by hijacking might have been able to insert mitigation for this threat into the U.S. budget. It would be admitted as a formal risk with a nominal low probability. This would permit coping with threats even though the threats were dismissed by the analyst's superiors. Even small investments in diligence on this matter might have disrupted or prevented the attack, or at least "hedged" against the risk that an Administration might be mistaken.

Fear as intuitive risk assessment

For the time being, people rely on their fear and hesitation to keep them out of the most profoundly unknown circumstances.

In "The Gift of Fear", Gavin de Becker argues that "True fear is a gift. It is a survival signal that sounds only in the presence of danger. Yet unwarranted fear has assumed a power over us that it holds over no other creature on Earth. It need not be this way."

Risk could be said to be the way we collectively measure and share this "true fear" - a fusion of rational doubt, irrational fear, and a set of unquantified biases from our own experience.

The field of behavioral finance focuses on human risk-aversion, asymmetric regret, and other ways that human financial behavior varies from what analysts call "rational". Risk in that case is the degree of uncertainty associated with a return on an asset.

Recognizing and respecting the irrational influences on human decision making may do much to reduce disasters caused by naive risk assessments that pretend to rationality but in fact merely fuse many shared biases together.

Risk in auditing

The audit risk model can be analytically expressed as:

AR = IR × CR × DR

where AR is audit risk, IR is inherent risk, CR is control risk, and DR is detection risk.
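
As an illustrative sketch of the model, the following fragment evaluates audit risk as the product of its three components; the component values are hypothetical.

    # Hedged sketch of the audit risk model AR = IR × CR × DR.
    # The component values below are hypothetical examples.

    def audit_risk(inherent_risk: float, control_risk: float, detection_risk: float) -> float:
        """Audit risk as the product of inherent, control, and detection risk."""
        return inherent_risk * control_risk * detection_risk

    print(f"{audit_risk(inherent_risk=0.8, control_risk=0.5, detection_risk=0.2):.2f}")  # 0.08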

References

Articles & Papers

  • Clark, L., Manes, F., Antoun, N., Sahakian, B. J., & Robbins, T. W. (2003). The contributions of lesion laterality and lesion volume to decision-making impairment following frontal lobe damage. Neuropsychologia, 41, 1474-1483.
  • Drake, R. A. (1985). Decision making and risk taking: Neurological manipulation with a proposed consistency mediation. Contemporary Social Psychology, 11, 149-152.
  • Drake, R. A. (1985). Lateral asymmetry of risky recommendations. Personality and Social Psychology Bulletin, 11, 409-417.
  • Hansson, Sven Ove. (2007). Risk, "The Stanford Encyclopedia of Philosophy" (Summer 2007 Edition), Edward N. Zalta (ed.), forthcoming URL = <http://plato.stanford.edu/archives/sum2007/entries/risk/>.
  • Holton, Glyn A. (2004). Defining Risk, Financial Analysts Journal, 60 (6), 19–25. A paper exploring the foundations of risk. (PDF file)
  • Knight, F. H. (1921) Risk, Uncertainty and Profit, Chicago: Houghton Mifflin Company. (Cited at: [4], § I.I.26.)
  • Miller, L. (1985). Cognitive risk taking after frontal or temporal lobectomy I. The synthesis of fragmented visual information. Neuropsychologia, 23, 359–369.
  • Miller, L., & Milner, B. (1985). Cognitive risk taking after frontal or temporal lobectomy II. The synthesis of phonemic and semantic information. Neuropsychologia, 23, 371–379.

Books

  • Moss, David A. When All Else Fails. Explains the U.S. government's historical role as risk manager of last resort.
  • Bernstein, Peter L. Against the Gods: The Remarkable Story of Risk. ISBN 0-471-29563-9. Traces the understanding and appreciation of risk from earliest times through the major mathematical figures of each age.
  • Porteous, Bruce T. (2005). Economic Capital and Financial Risk Management for Financial Services Firms and Conglomerates. Palgrave Macmillan. ISBN 1-4039-3608-0.

Magazines

Journals

Societies

See also
