List of cognitive biases
A cognitive bias is a pattern of deviation in judgment that occurs in particular situations (see also cognitive distortion and the lists of thinking-related topics). Implicit in the concept of a "pattern of deviation" is a standard of comparison; this may be the judgment of people outside those particular situations, or it may be a set of independently verifiable facts. The existence of some of these cognitive biases has been verified empirically in the field of psychology.
Cognitive biases are instances of evolved mental behavior. Some are presumably adaptive, for example, because they lead to more effective actions or enable faster decisions. Others presumably result from a lack of appropriate mental mechanisms, or from the misapplication of a mechanism that is adaptive under different circumstances.
Decision-making and behavioral biases
Many of these biases are studied for how they affect belief formation, business decisions, and scientific research.
- Bandwagon effect – the tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink and herd behaviour.
- Base rate fallacy – the tendency to ignore general statistical information (base rates) in favor of specific, case-based particulars.
- Bias blind spot – the tendency not to compensate for one's own cognitive biases.[1]
- Choice-supportive bias – the tendency to remember one's choices as better than they actually were.
- Confirmation bias – the tendency to search for or interpret information in a way that confirms one's preconceptions.
- Congruence bias – the tendency to test hypotheses exclusively through direct testing, in contrast to tests of possible alternative hypotheses.
- Contrast effect – the enhancement or diminishing of a weight or other measurement when compared with a recently observed contrasting object.
- Déformation professionnelle – the tendency to look at things according to the conventions of one's own profession, forgetting any broader point of view.
- Denomination effect – the tendency to spend more money when it is denominated in small amounts (e.g. coins) rather than large amounts (e.g. bills).[2]
- Distinction bias – the tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately.[3]
- Endowment effect – "the fact that people often demand much more to give up an object than they would be willing to pay to acquire it".[4]
- Experimenter's or Expectation bias – the tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations.[5]
- Extraordinarity bias – the tendency to value an object more than others in the same category as a result of some extraordinary aspect of that object that does not, in itself, change its value.[citation needed]
- Focusing effect – prediction bias occurring when people place too much importance on one aspect of an event; causes error in accurately predicting the utility of a future outcome.
- Framing – using too narrow an approach or description of a situation or issue. Also framing effect – drawing different conclusions from the same information depending on how it is presented.
- Hyperbolic discounting – the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs, a preference that increases the closer to the present both payoffs are (a worked sketch follows this list).
- Illusion of control – the tendency for human beings to believe they can control or at least influence outcomes that they clearly cannot.
- Impact bias – the tendency for people to overestimate the length or the intensity of the impact of future feeling states.
- Information bias – the tendency to seek information even when it cannot affect action.
- Interloper effect – the tendency to value third-party consultation as objective, confirming, and without motive. Also consultation paradox, the conclusion that solutions proposed by existing personnel within an organization are less likely to receive support than solutions proposed by outsiders recruited for that purpose.
- Irrational escalation – the tendency to make irrational decisions based upon rational decisions in the past or to justify actions already taken.
- Just-world phenomenon – the tendency of witnesses to an inexplicable injustice to rationalize it by searching for things that the victim might have done to deserve it.
- Loss aversion – "the disutility of giving up an object is greater than the utility associated with acquiring it".[6] (see also sunk cost effects and endowment effect).
- Mere exposure effect – the tendency for people to express undue liking for things merely because they are familiar with them.
- Money illusion – the tendency of people to concentrate on the nominal (face value) of money rather than its value in terms of purchasing power.
- Moral credential effect – the tendency of a track record of non-prejudice to increase subsequent prejudice.
- Need for closure – the need to reach a verdict in important matters; to have an answer and to escape the feeling of doubt and uncertainty. Time pressure or social pressure can increase this bias.[7]
- Negativity bias – phenomenon by which humans pay more attention to and give more weight to negative than positive experiences or other kinds of information.
- Neglect of probability – the tendency to completely disregard probability when making a decision under uncertainty.
- Normalcy bias – the refusal to plan for, or react to, a disaster which has never happened before.
- Not Invented Here – the tendency to ignore that a product or solution already exists, because its source is seen as an "enemy" or as "inferior".
- Omission bias – the tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).
- Outcome bias – the tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.
- Planning fallacy – the tendency to underestimate task-completion times.
- Post-purchase rationalization – the tendency to persuade oneself through rational argument that a purchase was a good value.
- Pseudocertainty effect – the tendency to make risk-averse choices if the expected outcome is positive, but make risk-seeking choices to avoid negative outcomes.
- Reactance – the urge to do the opposite of what someone wants you to do out of a need to resist a perceived attempt to constrain your freedom of choice.
- Restraint bias – the tendency to overestimate one's ability to show restraint in the face of temptation.
- Selective perception – the tendency for expectations to affect perception.
- Semmelweis reflex – the tendency to reject new evidence that contradicts an established paradigm.[8]
- Status quo bias – the tendency for people to like things to stay relatively the same (see also loss aversion, endowment effect, and system justification).[9]
- Von Restorff effect – the tendency for an item that "stands out like a sore thumb" to be more likely to be remembered than other items.
- Wishful thinking – the formation of beliefs and the making of decisions according to what is pleasing to imagine instead of by appeal to evidence or rationality.
- Zero-risk bias – preference for reducing a small risk to zero over a greater reduction in a larger risk.
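Hyperbolic discounting is one of the few entries above with an explicit functional form, and the preference reversal it predicts can be shown with a few lines of arithmetic. The following Python sketch is illustrative only: the discount parameter K and the payoff amounts are invented for the example, not empirical estimates of human discounting.

```python
import math

# Minimal sketch contrasting hyperbolic and exponential discounting.
# K and the payoff amounts are invented for illustration.

K = 0.1  # assumed per-day discount parameter

def hyperbolic(amount, delay_days):
    """Present value under hyperbolic discounting: V = A / (1 + K*D)."""
    return amount / (1 + K * delay_days)

def exponential(amount, delay_days):
    """Present value under exponential discounting: V = A * exp(-K*D)."""
    return amount * math.exp(-K * delay_days)

# Evaluate the same pair of payoffs now, and again with both pushed 90 days out.
for shift in (0, 90):
    small = hyperbolic(60, 0 + shift)    # $60, sooner
    large = hyperbolic(100, 10 + shift)  # $100, ten days later
    choice = "$60 sooner" if small > large else "$100 later"
    print(f"+{shift:2d} days: V($60)={small:6.2f}  V($100)={large:6.2f}  -> {choice}")
```

At +0 days the immediate $60 wins (60.00 vs 50.00); at +90 days the ranking flips (6.00 vs 9.09). Under exponential discounting the ranking can never flip, because a common delay rescales both values by the same factor exp(-K*shift).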
Biases in probability and belief
Many of these biases are studied for how they affect business and economic decisions and how they affect experimental research.
- Ambiguity effect – the avoidance of options for which missing information makes the probability seem "unknown".
- Anchoring effect – the tendency to rely too heavily, or "anchor," on a past reference or on one trait or piece of information when making decisions (also called "insufficient adjustment").
- Attentional bias – neglect of relevant data when making judgments of a correlation or association.
- Authority bias – the tendency to value an ambiguous stimulus (e.g., an art performance) according to the opinion of someone who is seen as an authority on the topic.
- Availability heuristic – estimating what is more likely by what is more available in memory, which is biased toward vivid, unusual, or emotionally charged examples.
- Availability cascade – a self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or "repeat something long enough and it will become true").
- Belief bias – an effect where someone's evaluation of the logical strength of an argument is biased by the believability of the conclusion.
- Clustering illusion – the tendency to see patterns where actually none exist.
- Capability bias – the tendency to believe that the closer average performance is to a target, the tighter the distribution of the data set.
- Conjunction fallacy – the tendency to assume that specific conditions are more probable than general ones.
- Disposition effect – the tendency to sell assets that have increased in value but hold assets that have decreased in value.
- Gambler's fallacy – the tendency to think that future probabilities are altered by past events, when in reality they are unchanged. Results from an erroneous conceptualization of the law of large numbers. For example, "I've flipped heads with this coin five times consecutively, so the chance of tails coming out on the sixth flip is much greater than heads" (a simulation sketch follows this list).
- Hawthorne effect – the tendency of people to perform or perceive differently when they know that they are being observed.
- Hindsight bias – sometimes called the "I-knew-it-all-along" effect, the inclination to see past events as being predictable.
- Illusory correlation – beliefs that inaccurately suppose a relationship between a certain type of action and an effect.[10]
- Last illusion – the belief that someone must know what is going on. Coined by Brian Eno.
- Neglect of prior base rates effect – the tendency to neglect known odds when reevaluating odds in light of weak evidence (a worked Bayes example follows this list).
- Observer-expectancy effect – when a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it (see also subject-expectancy effect).
- Optimism bias – the systematic tendency to be over-optimistic about the outcome of planned actions.
- Ostrich effect – ignoring an obvious (negative) situation.
- Overconfidence effect – excessive confidence in one's own answers to questions. For example, for certain types of question, answers that people rate as "99% certain" turn out to be wrong 40% of the time.
- Positive outcome bias – a tendency in prediction to overestimate the probability of good things happening to oneself (see also wishful thinking, optimism bias, and valence effect).
- Pareidolia – a vague and random stimulus (often an image or sound) is perceived as significant, e.g., seeing images of animals or faces in clouds, the man in the moon, and hearing hidden messages on records played in reverse.
- Primacy effect – the tendency to weigh initial events more than subsequent events.
- Recency effect – the tendency to weigh recent events more than earlier events (see also peak-end rule).
- Disregard of regression toward the mean – the tendency to expect extreme performance to continue.
- Selection bias – a distortion of evidence or data that arises from the way that the data are collected.
- Stereotyping – expecting a member of a group to have certain characteristics without having actual information about that individual.
- Subadditivity effect – the tendency to judge probability of the whole to be less than the probabilities of the parts.
- Subjective validation – perception that something is true if a subject's belief demands it to be true. Also assigns perceived connections between coincidences.
- Survivorship bias – concentrating on the people or things that "survived" some process and ignoring those that didn't, or arguing that a strategy is effective given the winners, while ignoring the large number of losers.
- Telescoping effect – the effect that recent events appear to have occurred more remotely and remote events appear to have occurred more recently.
- Texas sharpshooter fallacy – the fallacy of selecting or adjusting a hypothesis after the data is collected, making it impossible to test the hypothesis fairly. Refers to the concept of firing shots at a barn door, drawing a circle around the best group, and declaring that to be the target.
- Well travelled road effect – the tendency to underestimate the time taken to traverse oft-travelled routes and to overestimate the time taken to traverse less familiar routes.
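The independence claim in the gambler's fallacy entry can be checked directly by simulation. A minimal Monte Carlo sketch follows; the sample size and seed are arbitrary choices.

```python
import random

# Monte Carlo sketch for the gambler's fallacy: after five consecutive
# heads, the sixth flip of a fair coin is still 50/50.
random.seed(0)                # arbitrary seed for reproducibility
streaks = 0                   # sequences that start with five heads
heads_next = 0                # of those, how often the sixth flip is heads

for _ in range(200_000):
    flips = [random.random() < 0.5 for _ in range(6)]
    if all(flips[:5]):          # first five flips all came up heads
        streaks += 1
        heads_next += flips[5]  # True counts as 1

print(f"P(heads on 6th | 5 heads) ~= {heads_next / streaks:.3f}")
```

The estimate comes out near 0.5: the streak carries no information about the next flip, so "tails is due" has no basis for an independent coin.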
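Base-rate neglect (see "Neglect of prior base rates effect" above, and "Base rate fallacy" in the previous section) also yields to a short worked example. The figures below, a 1% prevalence, a 99% hit rate, and a 5% false-alarm rate, are invented for illustration; only the Bayes arithmetic is the point.

```python
# Worked example of base-rate neglect via Bayes' rule (invented figures).
prior = 0.01        # base rate: 1% of the population has the condition
hit_rate = 0.99     # P(positive | condition)
false_alarm = 0.05  # P(positive | no condition)

# Total probability of a positive result.
p_positive = hit_rate * prior + false_alarm * (1 - prior)   # 0.0594

# Bayes' rule: P(condition | positive).
posterior = hit_rate * prior / p_positive                   # ~0.167

print(f"P(condition | positive test) = {posterior:.3f}")
```

Despite the test's 99% hit rate, a positive result implies only about a 17% chance of having the condition: the few true positives (0.0099) are swamped by false positives from the much larger unaffected group (0.0495). Neglecting the 1% prior is exactly the error these entries describe.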
Social biases
Most of these biases are labeled as attributional biases.
- Actor-observer bias – the tendency for explanations of other individuals' behaviors to overemphasize the influence of their personality and underemphasize the influence of their situation (see also fundamental attribution error). However, this is coupled with the opposite tendency for the self in that explanations for our own behaviors overemphasize the influence of our situation and underemphasize the influence of our own personality.
- Egocentric bias – occurs when people claim more responsibility for themselves for the results of a joint action than an outside observer would.
- Forer effect (aka Barnum effect) – the tendency for people to give high accuracy ratings to descriptions of their personality that supposedly are tailored specifically for them, but are in fact vague and general enough to apply to a wide range of people (for example, horoscopes).
- False consensus effect – the tendency for people to overestimate the degree to which others agree with them.
- Fundamental attribution error – the tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior (see also actor-observer bias, group attribution error, positivity effect, and negativity effect).
- Halo effect – the tendency for a person's positive or negative traits to "spill over" from one area of their personality to another in others' perceptions of them (see also physical attractiveness stereotype).
- Herd instinct – the common tendency to adopt the opinions and follow the behaviors of the majority to feel safer and to avoid conflict.
- Illusion of asymmetric insight – people perceive their knowledge of their peers to surpass their peers' knowledge of them.
- Illusion of transparency – people overestimate others' ability to know them, and they also overestimate their ability to know others.
- Illusory superiority – overestimating one's desirable qualities, and underestimating undesirable qualities, relative to other people. Also known as the superiority bias, the "Lake Wobegon effect", or the "better-than-average effect" (compare the Dunning–Kruger effect).
- Ingroup bias – the tendency for people to give preferential treatment to others they perceive to be members of their own groups.
- Just-world phenomenon – the tendency for people to believe that the world is just and therefore people "get what they deserve."
- Notational bias – a form of cultural bias in which the notational conventions of recording data bias the appearance of that data toward (or away from) the system upon which the notational schema is based.
- Outgroup homogeneity bias – individuals see members of their own group as being relatively more varied than members of other groups.
- Projection bias – the tendency to unconsciously assume that others share the same or similar thoughts, beliefs, values, or positions.
- Self-serving bias (also called "behavioral confirmation effect") – the tendency to claim more responsibility for successes than failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests (see also group-serving bias).
- Self-fulfilling prophecy – the tendency to engage in behaviors that elicit results which will (consciously or not) confirm existing attitudes.[11]
- System justification – the tendency to defend and bolster the status quo. Existing social, economic, and political arrangements tend to be preferred, and alternatives disparaged, sometimes even at the expense of individual and collective self-interest. (See also status quo bias.)
- Trait ascription bias – the tendency for people to view themselves as relatively variable in terms of personality, behavior and mood while viewing others as much more predictable.
- Ultimate attribution error – similar to the fundamental attribution error, in this error a person is likely to make an internal attribution to an entire group instead of the individuals within the group.
Memory errors
- Consistency bias – incorrectly remembering one's past attitudes and behaviour as resembling present attitudes and behaviour.
- Cryptomnesia – a form of misattribution where a memory is mistaken for imagination.
- Egocentric bias – recalling the past in a self-serving manner, e.g. remembering one's exam grades as being better than they were, or remembering a caught fish as being bigger than it was.
- False memory – confusion of imagination with memory, or the confusion of true memories with false memories.
- Hindsight bias – filtering memory of past events through present knowledge, so that those events look more predictable than they actually were; also known as the 'I-knew-it-all-along effect'.
- Reminiscence bump – the effect that people tend to recall more personal events from adolescence and early adulthood than from other lifetime periods.
- Rosy retrospection – the tendency to rate past events more positively than one rated them when the events occurred.
- Self-serving bias – perceiving oneself responsible for desirable outcomes but not responsible for undesirable ones.
- Suggestibility – a form of misattribution where ideas suggested by a questioner are mistaken for memory.
Common theoretical causes of some cognitive biases
- Attribute substitution – making a complex, difficult judgment by unconsciously substituting an easier one.[12]
- Attribution theory
- Cognitive dissonance
- Heuristics, including:
- Availability heuristic – estimating what is more likely by what is more available in memory, which is biased toward vivid, unusual, or emotionally charged examples[10]
- Representativeness heuristic – judging probabilities on the basis of resemblance[10]
- Affect heuristic – basing a decision on an emotional reaction rather than a calculation of risks and benefits.[13]
- Adaptive bias
- Misinterpretations or misuse of statistics.
See also
- Attribution theory
- Black swan theory
- Groupthink
- List of common misconceptions
- List of fallacies
- List of memory biases
- List of topics related to public relations and propaganda
- Logical fallacy
- Media bias
- Psychological immune system
- Self-deception
- System justification
- Systematic bias
Notes
- ^ Pronin, Emily (July 2007). "Valuing thoughts, ignoring behavior: The introspection illusion as a source of the bias blind spot". Journal of Experimental Social Psychology 43 (4): 565–578. doi:10.1016/j.jesp.2006.05.011. ISSN 0022-1031.
- ^ Joffe-Walt, Chana. "Why We Spend Coins Faster Than Bills". All Things Considered, 12 May 2009.
- ^ (Hsee & Zhang, 2004)
- ^ (Kahneman, Knetsch, and Thaler 1991: 193). Richard Thaler coined the term "endowment effect."
- ^ Jeng, M. (2006). "A selected history of expectation bias in physics". American Journal of Physics 74: 578–583.
- ^ (Kahneman, Knetsch, and Thaler 1991: 193). Daniel Kahneman, together with Amos Tversky, coined the term "loss aversion."
- ^ Kruglanski, 1989; Kruglanski & Webster, 1996.
- ^ Edwards, W. (1968). "Conservatism in human information processing". In B. Kleinmutz (ed.), Formal Representation of Human Judgment (pp. 17–52). New York: John Wiley and Sons.
- ^ (Kahneman, Knetsch, and Thaler 1991: 193)
- ^ a b c Tversky, Amos; Kahneman, Daniel (September 27, 1974). "Judgment under Uncertainty: Heuristics and Biases". Science 185 (4157): 1124–1131.
- ^ Darley, John M. (2000). "A Hypothesis-Confirming Bias in Labelling Effects". In Charles Stangor (ed.), Stereotypes and Prejudice: Essential Readings. Psychology Press. p. 212. ISBN 9780863775895.
- ^ Kahneman, Daniel (2002). "Representativeness Revisited: Attribute Substitution in Intuitive Judgment". In Thomas Gilovich, Dale Griffin, and Daniel Kahneman (eds.), Heuristics and Biases: The Psychology of Intuitive Judgment (pp. 49–81). Cambridge: Cambridge University Press. ISBN 9780521796798. OCLC 47364085.
- ^ Slovic, Paul (2002). "The Affect Heuristic". In Thomas Gilovich, Dale Griffin, and Daniel Kahneman (eds.), Heuristics and Biases: The Psychology of Intuitive Judgment (pp. 397–420). Cambridge University Press. ISBN 0521796792.
References
- Baron, Jonathan (2000), Thinking and Deciding (3rd ed.), New York: Cambridge University Press, ISBN 0-521-65030-5
- Bishop, Michael A.; Trout, J. D. (2004), Epistemology and the Psychology of Human Judgment, New York: Oxford University Press, ISBN 0-19-516229-3
- Gilovich, Thomas (1993), How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life, New York: The Free Press, ISBN 0-02-911706-2
- Gilovich, Thomas; Griffin, Dale; Kahneman, Daniel, eds. (2002), Heuristics and Biases: The Psychology of Intuitive Judgment, Cambridge, UK: Cambridge University Press, ISBN 0-521-79679-2
- Greenwald, A. (1980), "The Totalitarian Ego: Fabrication and Revision of Personal History", American Psychologist, 35 (7), American Psychological Association, ISSN 0003-066X
- Kahneman, Daniel; Slovic, Paul; Tversky, Amos, eds. (1982), Judgment under Uncertainty: Heuristics and Biases, Cambridge, UK: Cambridge University Press, ISBN 0-521-28414-7
- Kahneman, Daniel; Knetsch, Jack L.; Thaler, Richard H. (1991), "Anomalies: The Endowment Effect, Loss Aversion, and Status Quo Bias", The Journal of Economic Perspectives, 5 (1), American Economic Association: 193–206
- Plous, Scott (1993), The Psychology of Judgment and Decision Making, New York: McGraw-Hill, ISBN 0-07-050477-6
- Schacter, Daniel L. (1999), "The Seven Sins of Memory: Insights From Psychology and Cognitive Neuroscience", American Psychologist, 54 (3), American Psychological Association: 182–203, ISSN 0003-066X
- Tetlock, Philip E. (2005), Expert Political Judgment: How Good Is It? How Can We Know?, Princeton: Princeton University Press, ISBN 978-0-691-12302-8
- Virine, L.; Trumper, M. (2007), Project Decisions: The Art and Science, Vienna, VA: Management Concepts, ISBN 978-1567262179