{{Short description|Systematic pattern of deviation from norm or rationality in judgment}}
{{psychology sidebar}}
[[File:Cognitive Bias Codex - 180+ biases, designed by John Manoogian III (jm3).jpg|thumb|The Cognitive Bias Codex]]
A '''cognitive bias''' is a systematic pattern of deviation from [[norm (philosophy)|norm]] or rationality in judgment.<ref name = "Haselton_2005">{{cite book| vauthors = Haselton MG, Nettle D, Andrews PW | chapter = The evolution of cognitive bias.|year=2005|location = Hoboken, NJ, US | publisher = John Wiley & Sons Inc| veditors = Buss DM | title = The Handbook of Evolutionary Psychology|pages=724–746 }}</ref> Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the [[Objectivity (philosophy)|objective]] input, may dictate their [[behavior]] in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and [[irrationality]].<ref>{{cite journal |vauthors=Kahneman D, Tversky A |year=1972 |title=Subjective probability: A judgment of representativeness |journal=Cognitive Psychology |volume=3 |issue=3 |pages=430–454 |doi=10.1016/0010-0285(72)90016-3 |url=http://datacolada.org/wp-content/uploads/2014/08/Kahneman-Tversky-1972.pdf |access-date=2017-04-01 |archive-url=https://web.archive.org/web/20191214120047/http://datacolada.org/wp-content/uploads/2014/08/Kahneman-Tversky-1972.pdf |archive-date=2019-12-14 |url-status=dead }}</ref><ref>{{cite book| vauthors = Baron J | date = 2007 | title = Thinking and Deciding | edition = 4th | location = New York, NY | publisher = Cambridge University Press }}</ref><ref name="Ariely.2008">{{cite book| last=Ariely |first=Dan | name-list-style = vanc |author-link=Dan Ariely| year=2008| title=Predictably Irrational: The Hidden Forces That Shape Our Decisions| location=New York, NY| publisher=[[HarperCollins]] |isbn=978-0-06-135323-9|title-link=Predictably Irrational }}</ref>

While cognitive biases may initially appear to be negative, some are adaptive. They may lead to more effective actions in a given context.<ref>For instance: {{cite journal | vauthors = Gigerenzer G, Goldstein DG | title = Reasoning the fast and frugal way: models of bounded rationality | journal = Psychological Review | volume = 103 | issue = 4 | pages = 650–69 | date = October 1996 | pmid = 8888650 | doi = 10.1037/0033-295X.103.4.650 | hdl = 21.11116/0000-0000-B771-2 | url = http://library.mpib-berlin.mpg.de/ft/gg/gg_reasoning_1996.pdf | citeseerx = 10.1.1.174.4404 }}</ref> Furthermore, allowing for cognitive biases enables faster decisions, which can be desirable when timeliness is more valuable than accuracy, as illustrated in [[Heuristic (psychology)|heuristics]].<ref name="h_and_b">{{cite journal | vauthors = Tversky A, Kahneman D | title = Judgment under Uncertainty: Heuristics and Biases | journal = Science | volume = 185 | issue = 4157 | pages = 1124–31 | date = September 1974 | pmid = 17835457 | doi = 10.1126/science.185.4157.1124 | bibcode = 1974Sci...185.1124T | s2cid = 143452957 }}</ref> Other cognitive biases are a "by-product" of human processing limitations,<ref name="Haselton_2005" /> resulting from a lack of appropriate mental mechanisms ([[bounded rationality]]), the impact of an individual's constitution and biological state (see [[embodied cognition]]), or simply from a limited capacity for information processing.<ref>{{cite book| vauthors = Bless H, Fiedler K, Strack F |title=Social cognition: How individuals construct social reality.|year=2004|publisher=Hove and New York: Psychology Press.}}</ref><ref>{{cite journal | vauthors = Morewedge CK, Kahneman D | title = Associative processes in intuitive judgment | journal = Trends in Cognitive Sciences | volume = 14 | issue = 10 | pages = 435–40 | date = October 2010 | pmid = 20696611 | pmc = 5378157 | doi = 10.1016/j.tics.2010.07.004 }}</ref> Research suggests that cognitive biases can make individuals more inclined to endorse pseudoscientific beliefs by requiring less evidence for claims that confirm their preconceptions. This can potentially distort their perceptions and lead to inaccurate judgments.<ref>{{Cite journal |last1=Rodríguez-Ferreiro |first1=Javier |last2=Barberia |first2=Itxaso |date=2021-12-21 |title=Believers in pseudoscience present lower evidential criteria |journal=Scientific Reports |language=en |volume=11 |issue=1 |pages=24352 |doi=10.1038/s41598-021-03816-5 |issn=2045-2322 |pmc=8692588 |pmid=34934119|bibcode=2021NatSR..1124352R }}</ref>

A continually evolving [[list of cognitive biases]] has been identified over the last six decades of research on human judgment and decision-making in [[cognitive science]], [[social psychology]], and [[behavioral economics]]. The study of cognitive biases has practical implications for areas including clinical judgment, entrepreneurship, finance, and management.<ref>{{cite journal | vauthors = Kahneman D, Tversky A | title = On the reality of cognitive illusions | journal = Psychological Review | volume = 103 | issue = 3 | pages = 582–91; discussion 592–6 | date = July 1996 | pmid = 8759048 | doi = 10.1037/0033-295X.103.3.582 | url = http://psy.ucsd.edu/%7Emckenzie/KahnemanTversky1996PsychRev.pdf | citeseerx = 10.1.1.174.5117 }}</ref><ref name="S.X. Zhang and J. Cueto 2015">{{cite journal | vauthors = Zhang SX, Cueto J |title=The Study of Bias in Entrepreneurship |journal= Entrepreneurship Theory and Practice |volume=41 |issue=3 |pages=419–454 |doi= 10.1111/etap.12212 |year=2015 |s2cid=146617323 |url=http://psyarxiv.com/76rkv/ }}</ref>
== Overview ==

[[File:Daniel KAHNEMAN.jpg|thumb|180px|[[Daniel Kahneman]]]]

The notion of cognitive biases was introduced by [[Amos Tversky]] and [[Daniel Kahneman]] in 1972<ref>{{cite book | vauthors = Kahneman D, Frederick S |chapter=Representativeness Revisited: Attribute Substitution in Intuitive Judgment | veditors = Gilovich T, Griffin DW, Kahneman D |title=Heuristics and Biases: The Psychology of Intuitive Judgment |publisher=Cambridge University Press |location=Cambridge |year=2002 |pages=51–52 |isbn=978-0-521-79679-8}}</ref> and grew out of their experience of people's ''[[Numeracy#Innumeracy and dyscalculia|innumeracy]]'', or inability to reason intuitively with the greater [[orders of magnitude]]. Tversky, Kahneman, and colleagues demonstrated several [[reproducibility|replicable]] ways in which human judgments and decisions differ from [[rational choice theory]]. Tversky and Kahneman explained human differences in judgment and decision-making in terms of heuristics. Heuristics involve mental shortcuts which provide swift estimates about the possibility of uncertain occurrences.<ref name=Baumeister2>{{cite book| vauthors = Baumeister RF, Bushman BJ |title=Social psychology and human nature: International Edition|year=2010|publisher=Wadsworth|location=Belmont, US|pages=141}}</ref> Heuristics are simple for the brain to compute but sometimes introduce "severe and systematic errors."<ref name="h_and_b" /> For example, the representativeness heuristic is defined as "The tendency to judge the frequency or likelihood" of an occurrence by the extent to which the event "resembles the typical case."<ref name="Baumeister2" />
The "Linda Problem" illustrates the representativeness heuristic (Tversky & Kahneman, 1983<ref>{{cite journal| vauthors = Tversky A, Kahneman D |title=Extensional versus intuitive reasoning: The conjunction fallacy in probability judgement|journal=Psychological Review|year=1983|volume=90|issue=4 |pages=293–315|doi=10.1037/0033-295X.90.4.293 |url=http://psy.ucsd.edu/%7Emckenzie/TverskyKahneman1983PsychRev.pdf |archive-url=https://web.archive.org/web/20070928091331/http://psy.ucsd.edu/~mckenzie/TverskyKahneman1983PsychRev.pdf |archive-date=2007-09-28 |url-status=live}}</ref>). Participants were given a description of "Linda" that suggests Linda might well be a feminist (e.g., she is said to be concerned about discrimination and social justice issues). They were then asked whether they thought Linda was more likely to be (a) a "bank teller" or (b) a "bank teller and active in the feminist movement." A majority chose answer (b). Regardless of the information given about Linda, though, the more restrictive answer (b) can never be more probable than answer (a). This is an example of the "[[conjunction fallacy]]". Tversky and Kahneman argued that respondents chose (b) because it seemed more "representative" or typical of persons who might fit the description of Linda. The representativeness heuristic may lead to errors such as activating stereotypes and inaccurate judgments of others (Haselton et al., 2005, p. 726).
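The reason answer (b) can never be the better bet follows directly from the conjunction rule of probability: for any two events <math>A</math> and <math>B</math>,
<math display="block">P(A \land B) = P(A)\,P(B \mid A) \le P(A),</math>
since <math>P(B \mid A) \le 1</math>. Judging the conjunction "bank teller and active in the feminist movement" as more probable than "bank teller" alone therefore violates the probability calculus, no matter how representative the description makes the conjunction appear.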
Critics of Kahneman and Tversky, such as [[Gerd Gigerenzer]], have argued instead that heuristics should not lead us to conceive of human thinking as riddled with irrational cognitive biases; rather, [[rationality]] should be conceived as an adaptive tool, not identical to the rules of [[formal logic]] or the [[probability calculus]].<ref>{{cite book | vauthors = Gigerenzer G |chapter=Bounded and Rational | veditors = Stainton RJ |title=Contemporary Debates in Cognitive Science |publisher=Blackwell |year=2006 |page=129 |isbn=978-1-4051-1304-5 }}</ref> Nevertheless, experiments such as the "Linda problem" grew into the heuristics and biases research program, which spread beyond academic psychology into other disciplines including medicine and [[political science]].
===Definitions===

{| class="wikitable"
|-
! Definition !! Source
|-
| ''"bias ... that occurs when humans are processing and interpreting information"''
|| ISO/IEC TR 24027:2021(en), 3.2.4,<ref>{{cite ISO standard |csnumber=77607 |title=ISO/IEC TR 24027:2021 Information technology — Artificial intelligence (AI) — Bias in AI systems and AI aided decision making ||section=3.2.4 |date=2021 |publisher=[[International Organization for Standardization|ISO]] |access-date=21 June 2023}}</ref> ISO/IEC TR 24368:2022(en), 3.8<ref>{{cite ISO standard |csnumber=78507 |title=ISO/IEC TR 24368:2022 Information technology — Artificial intelligence — Overview of ethical and societal concerns ||section=3.8 |date=2022 |publisher=[[International Organization for Standardization|ISO]] |access-date=21 June 2023}}</ref>
|}

== Types ==
Biases can be distinguished on a number of dimensions. Examples of cognitive biases include:

* Biases specific to ''groups'' (such as the [[Group polarization#Risky shift|risky shift]]) versus biases at the individual level.
* Biases that affect [[decision-making]], where the ''desirability'' of options has to be considered (e.g., [[sunk costs]] fallacy).
* Biases, such as [[illusory correlation]], that affect ''judgment'' of how likely something is or whether one thing is the cause of another.
* Biases that affect ''memory'',<ref name="Schacter.1999">{{cite journal | vauthors = Schacter DL | title = The seven sins of memory. Insights from psychology and cognitive neuroscience | journal = The American Psychologist | volume = 54 | issue = 3 | pages = 182–203 | date = March 1999 | pmid = 10199218 | doi = 10.1037/0003-066X.54.3.182 | s2cid = 14882268 }}</ref> such as consistency bias (remembering one's past attitudes and behavior as more similar to one's present attitudes).
* Biases that reflect a subject's ''motivation'',<ref name="Kunda1990">{{cite journal | vauthors = Kunda Z | title = The case for motivated reasoning | journal = Psychological Bulletin | volume = 108 | issue = 3 | pages = 480–98 | date = November 1990 | pmid = 2270237 | doi = 10.1037/0033-2909.108.3.480 | s2cid = 9703661 | url = http://synapse.princeton.edu/~sam/kunda90_psychol_bulletin_the-case-for-motivated-reasoning.pdf | access-date = 2017-10-27 | archive-url = https://web.archive.org/web/20170706055600/http://synapse.princeton.edu/~sam/kunda90_psychol_bulletin_the-case-for-motivated-reasoning.pdf | archive-date = 2017-07-06 }}</ref> for example, the desire for a positive self-image leading to [[egocentric bias]] and the avoidance of unpleasant [[cognitive dissonance]].<ref name="Hoorens1993">{{cite book | vauthors = Hoorens V |year=1993 |contribution=Self-enhancement and Superiority Biases in Social Comparison |title=European Review of Social Psychology 4 |editor=Stroebe, W. |editor-link=Wolfgang Stroebe |editor2=Hewstone, Miles |publisher=Wiley }}</ref>

Other biases are due to the particular way the brain perceives, forms memories and makes judgments. This distinction is sometimes described as "[[hot cognition]]" versus "cold cognition", as [[motivated reasoning]] can involve a state of [[arousal]]. Among the "cold" biases,
* some are due to ''ignoring relevant information'' (e.g., [[neglect of probability]]),
* some involve a decision or judgment being ''affected by irrelevant information'' (for example the [[Framing (social sciences)|framing effect]] where the same problem receives different responses depending on how it is described; or the [[distinction bias]] where choices presented together have different outcomes than those presented separately), and
* others give ''excessive weight'' to an unimportant but salient feature of the problem (e.g., [[Anchoring (cognitive bias)|anchoring]]).

The fact that some biases reflect motivation, specifically the motivation to have positive attitudes to oneself,<ref name="Hoorens1993" /> accounts for many biases being self-motivated or self-directed (e.g., [[illusion of asymmetric insight]], [[self-serving bias]]). There are also biases in how subjects evaluate in-groups or out-groups: in-groups are evaluated as more diverse and "better" in many respects, even when those groups are arbitrarily defined ([[ingroup bias]], [[outgroup homogeneity bias]]).

Some cognitive biases belong to the subgroup of [[attentional bias]]es, which refers to paying increased attention to certain stimuli. It has been shown, for example, that people addicted to alcohol and other drugs pay more attention to drug-related stimuli. Common psychological tests to measure those biases are the [[Stroop effect|Stroop task]]<ref name="pmid5328883">{{cite journal | vauthors = Jensen AR, Rohwer WD | title = The Stroop color-word test: a review | journal = Acta Psychologica | volume = 25 | issue = 1 | pages = 36–93 | year = 1966 | pmid = 5328883 | doi = 10.1016/0001-6918(66)90004-7 }}</ref><ref name="pmid2034749">{{cite journal | vauthors = MacLeod CM | title = Half a century of research on the Stroop effect: an integrative review | journal = Psychological Bulletin | volume = 109 | issue = 2 | pages = 163–203 | date = March 1991 | pmid = 2034749 | doi = 10.1037/0033-2909.109.2.163 | hdl = 11858/00-001M-0000-002C-5646-A | url = http://content.apa.org/journals/bul/109/2/163 | citeseerx = 10.1.1.475.2563 }}</ref> and the [[Dot-probe paradigm|dot probe task]].

Individuals' susceptibility to some types of cognitive biases can be measured by the [[Cognitive reflection test|Cognitive Reflection Test]] (CRT) developed by Shane Frederick (2005).<ref>{{cite journal |last=Frederick |first=Shane | name-list-style = vanc |date=2005|title=Cognitive Reflection and Decision Making|journal=Journal of Economic Perspectives|language=en|volume=19|issue=4|pages=25–42|doi=10.1257/089533005775196732|issn=0895-3309|doi-access=free}}</ref><ref>{{cite journal|last1=Oechssler|first1=Jörg|last2=Roider|first2=Andreas|last3=Schmitz|first3=Patrick W. | name-list-style = vanc |date=2009|title=Cognitive abilities and behavioral biases|journal=Journal of Economic Behavior & Organization|volume=72|issue=1|pages=147–152|doi=10.1016/j.jebo.2009.04.018|issn=0167-2681|url=https://epub.uni-regensburg.de/21701/2/roder2.pdf |archive-url=https://web.archive.org/web/20160803225638/http://epub.uni-regensburg.de/21701/2/roder2.pdf |archive-date=2016-08-03 |url-status=live}}</ref>
=== List of biases ===

{{main|List of cognitive biases}}

The following is a list of the more commonly studied cognitive biases:

{{For|other noted biases|List of cognitive biases}}

{| class="wikitable"
|-
! Name
! Description
|-
| [[Fundamental attribution error]] (FAE, aka correspondence bias<ref name="Baumeister" />)
| Tendency to overemphasize personality-based explanations for behaviors observed in others. At the same time, individuals under-emphasize the role and power of situational influences on the same behavior. Edward E. Jones and Victor A. Harris' (1967)<ref>{{cite journal | vauthors = Jones EE, Harris VA |title=The attribution of attitudes|journal=Journal of Experimental Social Psychology|year=1967|volume=3|pages=1–24|doi=10.1016/0022-1031(67)90034-0}}</ref> classic study illustrates the FAE. Despite being made aware that the target's speech direction (pro-Castro/anti-Castro) was assigned to the writer, participants ignored the situational pressures and attributed pro-Castro attitudes to the writer when the speech represented such attitudes.
|-
|[[Implicit stereotype|Implicit bias]] (aka implicit stereotype, unconscious bias)
|Tendency to attribute positive or negative qualities to a group of individuals. It can be fully non-factual or be an abusive generalization of a frequent trait in a group to all individuals of that group.
|-
|[[Priming (psychology)|Priming bias]]
|Tendency to be influenced by the first presentation of an issue to create our preconceived idea of it, which we then can adjust with later information.
|-
| [[Confirmation bias]]
| Tendency to search for or interpret information in a way that confirms one's preconceptions, and discredit information that does not support the initial opinion.<ref>{{cite journal | vauthors = Mahoney MJ |title=Publication prejudices: An experimental study of confirmatory bias in the peer review system|journal=Cognitive Therapy and Research |year=1977 |volume=1 |issue=2 |pages=161–175 |doi=10.1007/bf01173636 |s2cid=7350256}}</ref> Related to the concept of [[cognitive dissonance]], in that individuals may reduce inconsistency by searching for information which reconfirms their views (Jermias, 2001, p. 146).<ref>{{cite journal| vauthors = Jermias J |title=Cognitive dissonance and resistance to change: The influence of commitment confirmation and feedback on judgement usefulness of accounting systems|journal=Accounting, Organizations and Society|year=2001|volume=26|issue=2|pages=141–160|doi=10.1016/s0361-3682(00)00008-8}}</ref>
|-
| [[Affinity bias]]
|Tendency to be favorably biased toward people most like ourselves.<ref>{{Cite web|url=https://www.forbes.com/sites/forbescoachescouncil/2018/11/19/unconscious-bias-and-three-ways-to-overcome-it/|title=Council Post: Unconscious Bias And Three Ways To Overcome It|first=Monica|last=Thakrar|website=Forbes}}</ref>
|-
| [[Self-serving bias]]
| Tendency to claim more responsibility for successes than for failures. It may also manifest itself as a tendency for people to evaluate ambiguous information in a way beneficial to their interests.
|-
| [[Belief bias]]
| Tendency to evaluate the logical strength of an argument based on current belief and perceived plausibility of the statement's conclusion.
|-
| [[Framing effect (psychology)|Framing]]
| Tendency to narrow the description of a situation in order to guide the audience toward a selected conclusion. The same primer can be framed differently and therefore lead to different conclusions.
|-
| [[Hindsight bias]]
| Tendency to view past events as being predictable. Also called the "I-knew-it-all-along" effect.
|-
| [[Embodied cognition]]
| Tendency to have selectivity in perception, attention, decision making, and motivation based on the biological state of the body.
|-
|[[Anchoring (cognitive bias)|Anchoring bias]]
|The tendency to make insufficient adjustments from an initial starting point (the anchor) when arriving at a final answer, which can lead people to make sub-optimal decisions. Anchoring affects decision making in [[negotiation]]s, [[Medical diagnosis|medical diagnoses]], and [[Sentence (law)|judicial sentencing]].<ref>Cho, I. et al. (2018) 'The Anchoring Effect in Decision-Making with Visual Analytics', 2017 IEEE Conference on Visual Analytics Science and Technology, VAST 2017 - Proceedings. IEEE, pp. 116–126. {{doi|10.1109/VAST.2017.8585665}}.</ref>
|-
|[[Status quo bias]]
|Tendency to hold to the current situation rather than an alternative situation, to avoid risk and loss ([[loss aversion]]).<ref>Kahneman, D., Knetsch, J. L. and Thaler, R. H. (1991). Anomalies: The Endowment Effect, Loss Aversion, and Status Quo Bias. Journal of Economic Perspectives.</ref> In status quo bias, a decision-maker has the increased propensity to choose an option because it is the default option or [[status quo]]. It has been shown to affect various important economic decisions, for example, a choice of [[Vehicle insurance|car insurance]] or [[Electric utility|electrical service]].<ref>Dean, M. (2008) 'Status quo bias in large and small choice sets', New York, p. 52. Available at: http://www.yorkshire-exile.co.uk/Dean_SQ.pdf {{Webarchive|url=https://web.archive.org/web/20101225094920/http://www.yorkshire-exile.co.uk/Dean_SQ.pdf |date=2010-12-25 }}.</ref>
|-
|[[Overconfidence effect]]
|Tendency to overly trust one's own capability to make correct decisions. People tend to overrate their abilities and skills as decision makers.<ref>{{Citation|last=Gimpel|first=Henner|title=Cognitive Biases in Negotiation Processes|date=2008|url=http://link.springer.com/10.1007/978-3-540-77554-6_16|work=Negotiation, Auctions, and Market Engineering|series=Lecture Notes in Business Information Processing|volume=2|pages=213–226|editor-last=Gimpel|editor-first=Henner|place=Berlin, Heidelberg|publisher=Springer Berlin Heidelberg|language=en|doi=10.1007/978-3-540-77554-6_16|isbn=978-3-540-77553-9|access-date=2020-11-25|editor2-last=Jennings|editor2-first=Nicholas R.|editor3-last=Kersten|editor3-first=Gregory E.|editor4-last=Ockenfels|editor4-first=Axel}}</ref> See also the [[Dunning–Kruger effect]].
|-
|[[Physical attractiveness stereotype]]
|The tendency to assume people who are [[physical attractiveness|physically attractive]] also possess other desirable personality traits.<ref>Lorenz, Kate. (2005). "[http://www.cnn.com/2005/US/Careers/07/08/looks/ Do Pretty People Earn More?]" CNN.com.</ref>
|}
== Practical significance ==

{{Further|Confirmation bias#Consequences}}

Many social institutions rely on individuals to make rational judgments. The securities regulation regime largely assumes that all investors act as perfectly rational persons. In truth, actual investors face cognitive limitations from biases, heuristics, and framing effects.

A fair [[jury trial]], for example, requires that the jury ignore irrelevant features of the case, weigh the relevant features appropriately, consider different possibilities open-mindedly and resist [[fallacies]] such as [[appeal to emotion]]. The various biases demonstrated in these psychological experiments suggest that people will frequently fail to do all these things.<ref>{{cite book | last = Sutherland | first = Stuart | name-list-style = vanc | date = 2007 | title = Irrationality: The Enemy Within | edition = Second | publisher = Pinter & Martin | isbn = 978-1-905177-07-3 }}</ref> However, they fail to do so in systematic, directional ways that are predictable.<ref name="Ariely.2008"/>
In some academic disciplines, the study of bias is very popular. For instance, bias is a widespread and well-studied phenomenon in entrepreneurship, because most of the decisions that entrepreneurs face are computationally intractable.<ref name="S.X. Zhang and J. Cueto 2015"/>

Cognitive biases can also create issues in everyday life. One study examined how cognitive bias, specifically approach bias, and inhibitory control together affect how much unhealthy snack food a person eats.<ref>{{cite journal | vauthors = Kakoschke N, Kemps E, Tiggemann M | title = Combined effects of cognitive bias for food cues and poor inhibitory control on unhealthy food intake | journal = Appetite | volume = 87 | pages = 358–64 | date = April 2015 | pmid = 25592403 | doi = 10.1016/j.appet.2015.01.004 | hdl = 2328/35717 | s2cid = 31561602 | hdl-access = free }}</ref> It found that the participants who ate more of the unhealthy snack food tended to have less inhibitory control and more reliance on approach bias. Others have also hypothesized that cognitive biases could be linked to various eating disorders and to how people view their bodies and their body image.<ref>{{cite journal | vauthors = Williamson DA, Muller SL, Reas DL, Thaw JM | title = Cognitive bias in eating disorders: implications for theory and treatment | journal = Behavior Modification | volume = 23 | issue = 4 | pages = 556–77 | date = October 1999 | pmid = 10533440 | doi = 10.1177/0145445599234003 | s2cid = 36189809 }}</ref><ref>{{cite journal|last=Williamson|first=Donald A.| name-list-style = vanc |date=1996|title=Body image disturbance in eating disorders: A form of cognitive bias? |journal=Eating Disorders|language=en|volume=4|issue=1|pages=47–58|doi=10.1080/10640269608250075|issn=1064-0266 }}</ref>

It has also been argued that cognitive biases can be used in destructive ways.<ref>{{cite journal| vauthors = Trout J |date=2005|title=Paternalism and Cognitive Bias|journal=Law and Philosophy|language=en|volume=24|issue=4|pages=393–434|doi=10.1007/s10982-004-8197-3|s2cid=143783638|issn=0167-5249}}</ref> Some believe that people in authority use cognitive biases and heuristics to manipulate others in order to reach their own goals. The marketing of some medications and other health care treatments relies on cognitive biases to persuade susceptible people to use the products. Many see this as taking advantage of people's natural difficulties with judgement and decision-making, and believe that it is the government's responsibility to regulate such misleading advertisements.

Cognitive biases also seem to play a role in property sale prices and valuations. In one experiment, participants were shown a residential property.<ref>{{cite journal |last1=Levy |first1=Deborah S. |last2=Frethey-Bentham |first2=Catherine | name-list-style = vanc |date=2010|title=The effect of context and the level of decision maker training on the perception of a property's probable sale price |journal=Journal of Property Research|language=en|volume=27|issue=3|pages=247–267|doi=10.1080/09599916.2010.518406|s2cid=154866472 |issn=0959-9916}}</ref> Afterwards, they were shown a second property that was completely unrelated to the first and were asked to say what they believed its value and sale price would be. The researchers found that viewing the first, unrelated property affected how participants valued the second.

Cognitive biases can be used in non-destructive ways. In team science and collective problem-solving, the [[superiority bias]] can be beneficial. It leads to a diversity of solutions within a group, especially in complex problems, by preventing premature consensus on suboptimal solutions. This example demonstrates how a cognitive bias, typically seen as a hindrance, can enhance collective decision-making by encouraging a wider exploration of possibilities.<ref>{{cite journal |last1=Boroomand |first1=Amin |last2=Smaldino |first2=Paul E. |title=Superiority bias and communication noise can enhance collective problem-solving. |journal=Journal of Artificial Societies and Social Simulation |date=2023 |volume=26 |issue=3 |doi=10.18564/jasss.5154|doi-access=free }}</ref>
== Reducing ==

{{main|Cognitive bias mitigation| Cognitive bias modification}}

Because they cause [[systematic error]]s, cognitive biases cannot be compensated for using a [[wisdom of the crowd]] technique of averaging answers from several people.<ref>{{cite magazine |url=https://hbr.org/2019/03/the-feedback-fallacy |title=The Feedback Fallacy | first1 = Marcus | last1 = Buckingham | first2 = Ashley | last2 = Goodall | name-list-style = vanc |issue=March–April 2019 |magazine=[[Harvard Business Review]]}}</ref> [[Debiasing]] is the reduction of biases in judgment and decision-making through incentives, nudges, and training. [[Cognitive bias mitigation]] and [[cognitive bias modification]] are forms of debiasing specifically applicable to cognitive biases and their effects. [[Reference class forecasting]] is a method for systematically debiasing estimates and decisions, based on what [[Daniel Kahneman]] has dubbed the [[outside view]].
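The point about averaging can be made concrete with a small simulation (an illustrative sketch with made-up numbers, not taken from any cited study): averaging many independent estimates cancels their random error, but an error shared systematically by every estimator survives the averaging.

<syntaxhighlight lang="python">
import random
import statistics

def crowd_mean(true_value=100.0, shared_bias=20.0, noise_sd=15.0,
               n_people=1000, seed=1):
    """Average many estimates that combine one shared systematic bias
    with independent random error (all values are illustrative)."""
    rng = random.Random(seed)
    estimates = [true_value + shared_bias + rng.gauss(0, noise_sd)
                 for _ in range(n_people)]
    return statistics.mean(estimates)

print(round(crowd_mean(), 1))
# Prints a value close to 120: the independent noise has averaged away,
# but the shared systematic bias of +20 remains in the crowd's answer.
</syntaxhighlight>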
Similar to Gigerenzer (1996),<ref>{{cite journal| vauthors = Gigerenzer G |title=On narrow norms and vague heuristics: A reply to Kahneman and Tversky (1996)|journal=Psychological Review|year=1996|volume=103|issue=3|pages=592–596|doi=10.1037/0033-295x.103.3.592|citeseerx=10.1.1.314.996}}</ref> Haselton et al. (2005) state that the content and direction of cognitive biases are not "arbitrary" (p. 730).<ref name="Haselton_2005"/> Moreover, cognitive biases can be controlled. One debiasing technique aims to decrease biases by encouraging individuals to use controlled processing rather than automatic processing.<ref name="Baumeister">{{cite book| vauthors = Baumeister RF, Bushman BJ |title=Social psychology and human nature: International Edition|year=2010|publisher=Belmont, USA: Wadsworth.}}</ref> In relation to reducing the [[Fundamental attribution error|FAE]], monetary incentives<ref>{{cite journal| vauthors = Vonk R |title=Effects of outcome dependency on correspondence bias|journal=Personality and Social Psychology Bulletin|year=1999|volume=25|issue=3|pages=382–389|doi=10.1177/0146167299025003009|s2cid=145752877}}</ref> and informing participants they will be held accountable for their attributions<ref>{{cite journal| vauthors = Tetlock PE |title=Accountability: A social check on the fundamental attribution error|journal=Social Psychology Quarterly|year=1985|volume=48|issue=3|pages=227–236|doi=10.2307/3033683|jstor=3033683}}</ref> have been linked to an increase in accurate attributions. Training has also been shown to reduce cognitive bias. Carey K. Morewedge and colleagues (2015) found that research participants exposed to one-shot training interventions, such as educational videos and debiasing games that taught mitigating strategies, exhibited significant reductions in their commission of six cognitive biases immediately and up to 3 months later.<ref>{{cite journal|title = Debiasing Decisions Improved Decision Making With a Single Training Intervention|journal = Policy Insights from the Behavioral and Brain Sciences|date = 2015-08-13|issn = 2372-7322|pages = 129–140|doi = 10.1177/2372732215600886|first1 = Carey K.|last1 = Morewedge|first2 = Haewon|last2 = Yoon|first3 = Irene|last3 = Scopelliti|first4 = Carl W.|last4 = Symborski|first5 = James H.|last5 = Korris|first6 = Karim S.|last6 = Kassam | name-list-style = vanc |volume=2|s2cid = 4848978|url = http://openaccess.city.ac.uk/12324/1/Debiasing_Decisions_PIBBS.pdf}}</ref>

[[Cognitive bias modification]] refers to the process of modifying cognitive biases in healthy people and also refers to a growing area of psychological (non-pharmaceutical) therapies for anxiety, depression and addiction called cognitive bias modification therapy (CBMT). CBMT is a sub-group of therapies within a growing area of psychological therapies based on modifying cognitive processes with or without accompanying medication and talk therapy, sometimes referred to as applied cognitive processing therapies (ACPT). Although cognitive bias modification can refer to modifying cognitive processes in healthy individuals, CBMT is a growing area of evidence-based psychological therapy, in which cognitive processes are modified to relieve suffering<ref>{{cite journal | vauthors = MacLeod C, Mathews A, Tata P | title = Attentional bias in emotional disorders | journal = Journal of Abnormal Psychology | volume = 95 | issue = 1 | pages = 15–20 | date = February 1986 | pmid = 3700842 | doi = 10.1037/0021-843x.95.1.15 }}</ref><ref>{{cite journal | vauthors = Bar-Haim Y, Lamy D, Pergamin L, Bakermans-Kranenburg MJ, van IJzendoorn MH | title = Threat-related attentional bias in anxious and nonanxious individuals: a meta-analytic study | journal = Psychological Bulletin | volume = 133 | issue = 1 | pages = 1–24 | date = January 2007 | pmid = 17201568 | doi = 10.1037/0033-2909.133.1.1 | citeseerx = 10.1.1.324.4312 | s2cid = 2861872 }}</ref> from serious [[Major depressive disorder|depression]],<ref>{{cite journal | vauthors = Holmes EA, Lang TJ, Shah DM | title = Developing interpretation bias modification as a "cognitive vaccine" for depressed mood: imagining positive events makes you feel better than thinking about them verbally | journal = Journal of Abnormal Psychology | volume = 118 | issue = 1 | pages = 76–88 | date = February 2009 | pmid = 19222316 | doi = 10.1037/a0012590 }}</ref> [[Anxiety disorder|anxiety]],<ref>{{cite journal | vauthors = Hakamata Y, Lissek S, Bar-Haim Y, Britton JC, Fox NA, Leibenluft E, Ernst M, Pine DS | display-authors = 6 | title = Attention bias modification treatment: a meta-analysis toward the establishment of novel treatment for anxiety | journal = Biological Psychiatry | volume = 68 | issue = 11 | pages = 982–90 | date = December 2010 | pmid = 20887977 | pmc = 3296778 | doi = 10.1016/j.biopsych.2010.07.021 }}</ref> and addiction.<ref>{{cite journal | vauthors = Eberl C, Wiers RW, Pawelczack S, Rinck M, Becker ES, Lindenmeyer J | title = Approach bias modification in alcohol dependence: do clinical effects replicate and for whom does it work best? | journal = Developmental Cognitive Neuroscience | volume = 4 | pages = 38–51 | date = April 2013 | pmid = 23218805 | doi = 10.1016/j.dcn.2012.11.002 | pmc = 6987692 }}</ref> CBMT techniques are technology-assisted therapies that are delivered via a computer with or without clinician support. CBM combines evidence and theory from the cognitive model of anxiety,<ref>{{cite book | vauthors = Clark DA, Beck AT | date = 2009 | title = Cognitive Therapy of Anxiety Disorders: Science and Practice | location = London | publisher = Guildford }}</ref> cognitive neuroscience,<ref>{{cite journal | vauthors = Browning M, Holmes EA, Murphy SE, Goodwin GM, Harmer CJ | title = Lateral prefrontal cortex mediates the cognitive modification of attentional bias | journal = Biological Psychiatry | volume = 67 | issue = 10 | pages = 919–25 | date = May 2010 | pmid = 20034617 | pmc = 2866253 | doi = 10.1016/j.biopsych.2009.10.031 }}</ref> and attentional models.<ref>{{cite journal | vauthors = Eysenck MW, Derakshan N, Santos R, Calvo MG | title = Anxiety and cognitive performance: attentional control theory | journal = Emotion | volume = 7 | issue = 2 | pages = 336–53 | date = May 2007 | pmid = 17516812 | doi = 10.1037/1528-3542.7.2.336 | citeseerx = 10.1.1.453.3592 | s2cid = 33462708 }}</ref>

Cognitive bias modification has also been used to help those with obsessive-compulsive beliefs and obsessive-compulsive disorder.<ref>{{cite journal|last1=Beadel|first1=Jessica R.|last2=Smyth|first2=Frederick L.|last3=Teachman|first3=Bethany A.| name-list-style = vanc |date=2014|title=Change Processes During Cognitive Bias Modification for Obsessive Compulsive Beliefs|journal=Cognitive Therapy and Research|language=en|volume=38|issue=2|pages=103–119|doi=10.1007/s10608-013-9576-6|s2cid=32259433|issn=0147-5916}}</ref><ref>{{cite journal | vauthors = Williams AD, Grisham JR | title = Cognitive Bias Modification (CBM) of obsessive compulsive beliefs | journal = BMC Psychiatry | volume = 13 | issue = 1 | pages = 256 | date = October 2013 | pmid = 24106918 | pmc = 3851748 | doi = 10.1186/1471-244X-13-256 | doi-access = free }}</ref> This therapy has been shown to decrease obsessive-compulsive beliefs and behaviors.
== Common theoretical causes of some cognitive biases ==

Bias arises from various processes that are sometimes difficult to distinguish. These include:

*[[Bounded rationality]] — limits on optimization and rationality
**[[Prospect theory]]
*[[Evolutionary psychology]] — Remnants from evolutionary adaptive mental functions.<ref>{{cite journal| vauthors = Van Eyghen H |year=2022|title=Cognitive Bias. Philogenesis or Ontogenesis|journal= Frontiers in Psychology|volume=13|doi=10.3389/fpsyg.2022.892829 |pmid=35967732 |pmc=9364952 |doi-access=free}}</ref>
**[[Mental accounting]]
**[[Adaptive bias]] — basing decisions on limited information and biasing them based on the costs of being wrong
*[[Attribute substitution]] — making a complex, difficult judgment by unconsciously replacing it with an easier judgment<ref>{{cite book | vauthors = Kahneman D, Frederick S | chapter = Representativeness revisited: Attribute substitution in intuitive judgment | veditors = Gilovich T, Griffin DW, Kahneman D |title=Heuristics and Biases: The Psychology of Intuitive Judgment |publisher=Cambridge University Press |location=Cambridge |year=2002 |pages=49–81 |isbn=978-0-521-79679-8 |oclc=47364085}}</ref>
*[[Attribution theory]]
**[[Salience (neuroscience)|Salience]]
**[[Naïve realism (psychology)|Naïve realism]]
*[[Cognitive dissonance]], and related:
**[[Impression management]]
**[[Self-perception theory]]
* Information-processing shortcuts ([[Heuristics in judgment and decision making|heuristics]]),<ref>Kahneman, D., Slovic, P., & Tversky, A. (1982). Judgment under uncertainty: Heuristics and biases (1st ed.). Cambridge University Press.</ref> including:
**[[Availability heuristic]] — estimating what is more likely by what is more available in memory, which is biased toward vivid, unusual, or emotionally charged examples<ref name="h_and_b" />
**[[Representativeness heuristic]] — judging probabilities based on resemblance<ref name="h_and_b"/>
**[[Affect heuristic]] — basing a decision on an emotional reaction rather than a calculation of risks and benefits<ref>{{cite book | vauthors = Slovic P, Finucane M, Peters E, MacGregor DG |chapter=The Affect Heuristic |pages=397–420 | veditors = Gilovich T, Griffin D, Kahneman D |year=2002 |publisher=Cambridge University Press |title=Heuristics and Biases: The Psychology of Intuitive Judgment |isbn=978-0-521-79679-8}}</ref>
*[[Emotion]]al and moral motivations<ref>{{cite journal| vauthors = Pfister HR, Böhm G |year=2008|title=The multiplicity of emotions: A framework of emotional functions in decision making|journal=Judgment and Decision Making|volume=3|pages=5–17|doi=10.1017/S1930297500000127 |doi-access=free}}</ref> deriving, for example, from:
** The [[two-factor theory of emotion]]
** The [[somatic markers hypothesis]]
*[[Introspection illusion]]
* Misinterpretations or [[misuse of statistics]]; [[innumeracy]].
*[[Social influence]]<ref>{{cite journal| vauthors = Wang X, Simons F, Brédart S |year=2001|title=Social cues and verbal framing in risky choice|journal=Journal of Behavioral Decision Making|volume=14|issue=1|pages=1–15|doi=10.1002/1099-0771(200101)14:1<1::AID-BDM361>3.0.CO;2-N}}</ref>
*The brain's limited information processing capacity<ref>{{cite journal| vauthors = Simon HA |year=1955|title=A behavioral model of rational choice|journal=The Quarterly Journal of Economics|volume=69|issue=1|pages=99–118|doi=10.2307/1884852|jstor=1884852}}</ref>
*Noisy information processing (distortions during storage in and retrieval from memory).<ref name="HilbertPsychBull" /> For example, a 2012 ''[[Psychological Bulletin]]'' article suggests that at least eight seemingly unrelated biases can be produced by the same [[information-theoretic]] generative mechanism.<ref name="HilbertPsychBull">{{cite journal | vauthors = Hilbert M | title = Toward a synthesis of cognitive biases: how noisy information processing can bias human decision making | journal = Psychological Bulletin | volume = 138 | issue = 2 | pages = 211–37 | date = March 2012 | pmid = 22122235 | doi = 10.1037/a0025940 | url = http://www.martinhilbert.net/HilbertPsychBull.pdf | citeseerx = 10.1.1.432.8763}}</ref> The article shows that noisy deviations in the memory-based information processes that convert objective evidence (observations) into subjective estimates (decisions) can produce [[List of cognitive biases#Belief, decision-making and behavioral|regressive conservatism]], the [[Conservatism (belief revision)|belief revision]] (Bayesian conservatism), [[illusory correlation]]s, [[illusory superiority]] (better-than-average effect) and [[worse-than-average effect]], [[subadditivity effect]], [[List of cognitive biases#Belief, decision-making and behavioral|exaggerated expectation]], [[overconfidence]], and the [[hard–easy effect]].
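A toy simulation can illustrate the kind of mechanism meant here (a deliberately simplified sketch, not the cited article's actual model): if each stored yes/no observation is independently flipped with some small probability during storage or retrieval, recalled frequencies are pulled toward 50%, so high frequencies are underestimated and low ones overestimated, which is regressive conservatism arising from symmetric, unbiased noise alone.

<syntaxhighlight lang="python">
import random

def recalled_frequency(true_p, n_events=100000, flip_prob=0.1, seed=0):
    """Toy noisy-memory channel: each stored binary observation is
    flipped with probability flip_prob before being recalled."""
    rng = random.Random(seed)
    events = [rng.random() < true_p for _ in range(n_events)]
    recalled = [(not e) if rng.random() < flip_prob else e for e in events]
    return sum(recalled) / n_events

for p in (0.05, 0.25, 0.75, 0.95):
    print(f"true frequency {p:.2f} -> recalled estimate {recalled_frequency(p):.3f}")
# The expected recalled frequency is p*(1 - flip_prob) + (1 - p)*flip_prob,
# so every estimate is pulled toward 0.5: extreme frequencies are recalled
# as less extreme than they really are.
</syntaxhighlight>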
== Individual differences in cognitive biases == |
|||
A fair [[jury trial]], for example, requires that the jury ignore irrelevant features of the case, weigh the relevant features appropriately, consider different possibilities open-mindedly and resist [[fallacies]] such as [[appeal to emotion]]. The various biases demonstrated in these psychological experiments suggest that people will frequently fail to do all these things.<ref>Sutherland, Stuart (2007) ''Irrationality: The Enemy Within'' Second Edition (First Edition 1994) Pinter & Martin. ISBN 978-1-905177-07-3</ref> However, they fail to do so in systematic, directional ways that are predictable.<ref>{{cite book |ref=harv |last=Ariely |first=Dan |authorlink=Dan Ariely |year=2008 |title=[[Predictably Irrational|Predictably Irrational: The Hidden Forces That Shape Our Decisions]] |publisher=[[HarperCollins]] |page=304 |isbn=978-0-06-135323-9}}</ref> |
[[File:Relation between Bias, habit and convention.png|alt=Bias habit convention|thumb|The relationship between cognitive bias, habit, and social convention remains an open question.]]
People do appear to have stable individual differences in their susceptibility to decision biases such as [[Overconfidence effect|overconfidence]], [[temporal discounting]], and [[bias blind spot]].<ref>{{cite journal|title = Bias Blind Spot: Structure, Measurement, and Consequences|journal = Management Science|date = 2015-04-24|doi = 10.1287/mnsc.2014.2096|first1 = Irene|last1 = Scopelliti|first2 = Carey K.|last2 = Morewedge|first3 = Erin|last3 = McCormick|first4 = H. Lauren|last4 = Min|first5 = Sophie|last5 = Lebrecht|first6 = Karim S.|last6 = Kassam | name-list-style = vanc |volume=61|issue = 10|pages=2468–2486|doi-access = free}}</ref> These stable individual levels of bias can nevertheless be changed. Participants in experiments who watched training videos and played debiasing games showed medium to large reductions, both immediately and up to three months later, in their susceptibility to six cognitive biases: [[Anchoring (cognitive bias)|anchoring]], bias blind spot, [[confirmation bias]], [[fundamental attribution error]], [[projection bias]], and [[Representativeness heuristic|representativeness]].<ref>{{cite journal|title = Debiasing Decisions Improved Decision Making With a Single Training Intervention|journal = Policy Insights from the Behavioral and Brain Sciences|date = 2015-10-01|issn = 2372-7322|pages = 129–140|volume = 2|issue = 1|doi = 10.1177/2372732215600886|first1 = Carey K.|last1 = Morewedge|first2 = Haewon|last2 = Yoon|first3 = Irene|last3 = Scopelliti|first4 = Carl W.|last4 = Symborski|first5 = James H.|last5 = Korris|first6 = Karim S.|last6 = Kassam |s2cid = 4848978|url = http://openaccess.city.ac.uk/12324/1/Debiasing_Decisions_PIBBS.pdf| name-list-style = vanc }}</ref>
Individual differences in cognitive bias have also been linked to varying levels of cognitive abilities and functions.<ref>{{cite journal | vauthors = Vartanian O, Beatty EL, Smith I, Blackler K, Lam Q, Forbes S, De Neys W | title = The Reflective Mind: Examining Individual Differences in Susceptibility to Base Rate Neglect with fMRI | journal = Journal of Cognitive Neuroscience | volume = 30 | issue = 7 | pages = 1011–1022 | date = July 2018 | pmid = 29668391 | doi = 10.1162/jocn_a_01264 | s2cid = 4933030 | url = https://portal.findresearcher.sdu.dk/da/publications/dd1e7c5b-482d-4470-8a24-6ce013f1211a | doi-access = free }}</ref> The Cognitive Reflection Test (CRT) has been used to examine the connection between cognitive biases and cognitive ability. Although results have been mixed, higher CRT scores are associated with greater cognitive ability and rational-thinking skill, and they predict performance on heuristics-and-biases tasks: people with higher CRT scores tend to answer such test items correctly more often.<ref>{{cite journal | vauthors = Toplak ME, West RF, Stanovich KE | title = The Cognitive Reflection Test as a predictor of performance on heuristics-and-biases tasks | journal = Memory & Cognition | volume = 39 | issue = 7 | pages = 1275–89 | date = October 2011 | pmid = 21541821 | doi = 10.3758/s13421-011-0104-1 | doi-access = free }}</ref>
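A classic CRT item, the well-known "bat and ball" problem, illustrates the kind of intuitive error the test is designed to catch (quoted here for illustration): a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball; how much does the ball cost? The intuitive answer of 10 cents is wrong, as writing out the constraint shows:
:<math>x + (x + 1.00) = 1.10 \quad\Rightarrow\quad 2x = 0.10 \quad\Rightarrow\quad x = 0.05,</math>
so the ball costs 5 cents, an answer many respondents miss on their first attempt.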
Cognitive biases are also related to the persistence of superstition and to large social problems such as prejudice, and they hinder public acceptance of non-intuitive scientific knowledge.<ref name=mot>{{cite book|title=Motivation in language: studies in honor of Günter Radden|url=http://books.google.com/books?id=qzhJ3KpLpQUC&pg=PA275&dq=essentialism+definition&lr=&cd=3#v=onepage&q=essentialism%20definition&f=false|author=Günter Radden, H. Cuyckens|publisher=[[John Benjamins]]|year=2003|page=275 | isbn=978-1-58811-426-6}}</ref>
Age is another individual difference that affects susceptibility to cognitive bias. Older individuals tend to be more susceptible to cognitive biases and to have less [[cognitive flexibility]]. In experiments in which both younger and older adults completed a framing task, however, older individuals were able to decrease their susceptibility to cognitive biases over the course of repeated trials.<ref>{{cite journal | vauthors = Wilson CG, Nusbaum AT, Whitney P, Hinson JM | title = Age-differences in cognitive flexibility when overcoming a preexisting bias through feedback | journal = Journal of Clinical and Experimental Neuropsychology | volume = 40 | issue = 6 | pages = 586–594 | date = August 2018 | pmid = 29161963 | doi = 10.1080/13803395.2017.1398311 | s2cid = 13372385 }}</ref> Younger adults showed more cognitive flexibility than older adults, and cognitive flexibility is linked to overcoming pre-existing biases.

== Reducing cognitive bias ==
Similar to Gigerenzer (1996),<ref>{{cite journal|last=Gigerenzer, G.|title=On narrow norms and vague heuristics: A reply to Kahneman and Tversky (1996)|journal=Psychological Review|year=1996|volume=103|issue=3|pages=592–596|doi=10.1037/0033-295x.103.3.592}}</ref> Haselton et al. (2005) state that the content and direction of cognitive biases are not "arbitrary" (p. 730).<ref name="Haselton_2005"/> Moreover, cognitive biases can be controlled. Debiasing is a technique that aims to decrease biases by encouraging individuals to use controlled rather than automatic processing (Baumeister & Bushman, 2010, p. 155).<ref>{{cite book|last=Baumeister, R. F. & Bushman, B. J.|title=Social psychology and human nature: International Edition|year=2010|publisher=Belmont, USA: Wadsworth.}}</ref> In relation to reducing the fundamental attribution error (FAE), monetary incentives<ref>{{cite journal|last=Vonk, R.|title=Effects of outcome dependency on correspondence bias.|journal=Personality and Social Psychology Bulletin|year=1999|volume=25|pages=382–389|doi=10.1177/0146167299025003009}}</ref> and informing participants that they will be held accountable for their attributions<ref>{{cite journal|last=Tetlock, P. E.|title=Accountability: A social check on the fundamental attribution error|journal=Social Psychology Quarterly|year=1985|volume=48|pages=227–236|doi=10.2307/3033683}}</ref> have been linked to an increase in accurate attributions. Training has also been shown to reduce cognitive bias. Morewedge and colleagues (2015) found that research participants exposed to one-shot training interventions, such as educational videos and debiasing games that taught mitigating strategies, exhibited significant reductions in their commission of six cognitive biases immediately and up to three months later.<ref>{{Cite journal|title = Debiasing Decisions Improved Decision Making With a Single Training Intervention|url = http://bbs.sagepub.com/content/early/2015/08/12/2372732215600886|journal = Policy Insights from the Behavioral and Brain Sciences|date = 2015-08-13|issn = 2372-7322|pages = 2372732215600886|doi = 10.1177/2372732215600886|language = en|first = Carey K.|last = Morewedge|first2 = Haewon|last2 = Yoon|first3 = Irene|last3 = Scopelliti|first4 = Carl W.|last4 = Symborski|first5 = James H.|last5 = Korris|first6 = Karim S.|last6 = Kassam}}</ref>
[[Cognitive bias modification]] refers to the process of modifying cognitive biases in healthy people and also refers to a growing area of psychological (non-pharmaceutical) therapies for anxiety, depression and addiction called cognitive bias modification therapy (CBMT). CBMT is sub-group of therapies within a growing area of psychological therapies based on modifying cognitive processes with or without accompanying medication and talk therapy, sometimes referred to as applied cognitive processing therapies (ACPT). Although cognitive bias modification can refer to modifying cognitive processes in healthy individuals, CBMT is a growing area of evidence-based psychological therapy, in which cognitive processes are modified to relieve suffering<ref>{{cite journal | last1 = MacLeod | first1 = C. | last2 = Mathews | first2 = A. | last3 = Tata | first3 = P. | year = 1986 | title = Attentional Bias in Emotional Disorders | url = | journal = Journal of Abnormal Psychology | volume = 95 | issue = 1| pages = 15–20 | doi=10.1037/0021-843x.95.1.15}}</ref><ref>{{cite journal | last1 = Bar-Haim | first1 = Y. | last2 = Lamy | first2 = D. | last3 = Pergamin | first3 = L. | last4 = Bakermans-Kranenburg | first4 = M. J. | year = 2007 | title = Threat-related attentional bias in anxious and nonanxious individuals: a meta-analytic study | url = | journal = Psychol Bull | volume = 133 | issue = 1| pages = 1–24 | doi = 10.1037/0033-2909.133.1.1 }}</ref> from serious [[Major depressive disorder|depression]],<ref>{{cite journal | last1 = Holmes | first1 = E. A. | last2 = Lang | first2 = T. J. | last3 = Shah | first3 = D. M. | year = 2009 | title = Developing interpretation bias modification as a "cognitive vaccine" for depressed mood: imagining positive events makes you feel better than thinking about them verbally | url = | journal = J Abnorm Psychol | volume = 118 | issue = 1| pages = 76–88 | doi = 10.1037/a0012590 }}</ref> [[Anxiety disorder|anxiety]],<ref>{{cite journal | last1 = Hakamata | first1 = Y. | last2 = Lissek | first2 = S. | last3 = Bar-Haim | first3 = Y. | last4 = Britton | first4 = J. C. | last5 = Fox | first5 = N. A. | last6 = Leibenluft | first6 = E. | last7 = Pine | first7 = D. S. | year = 2010 | title = Attention bias modification treatment: a meta-analysis toward the establishment of novel treatment for anxiety | url = | journal = Biol Psychiatry | volume = 68 | issue = 11| pages = 982–990 | doi = 10.1016/j.biopsych.2010.07.021 }}</ref> and addiction.<ref>{{cite journal | last1 = Eberl | first1 = C. | last2 = Wiers | first2 = R. W. | last3 = Pawelczack | first3 = S. | last4 = Rinck | first4 = M. | last5 = Becker | first5 = E. S. | last6 = Lindenmeyer | first6 = J. | year = 2013 | title = Approach bias modification in alcohol dependence: Do clinical effects replicate and for whom does it work best? | url = | journal = Developmental Cognitive Neuroscience | volume = 4 | issue = 0| pages = 38–51 | doi = 10.1016/j.dcn.2012.11.002 }}</ref> CBMT techniques are technology assisted therapies that are delivered via a computer with or without clinician support. CBM combines evidence and theory from the cognitive model of anxiety,<ref>Clark, D. A., & Beck, A. T. (2009). Cognitive Therapy of Anxiety Disorders: Science and Practice. London: Guildford.</ref> cognitive neuroscience,<ref>{{cite journal | last1 = Browning | first1 = M. | last2 = Holmes | first2 = E. A. | last3 = Murphy | first3 = S. E. | last4 = Goodwin | first4 = G. M. | last5 = Harmer | first5 = C. J. 
| year = 2010 | title = Lateral prefrontal cortex mediates the cognitive modification of attentional bias | url = | journal = Biol Psychiatry | volume = 67 | issue = 10| pages = 919–925 | doi = 10.1016/j.biopsych.2009.10.031 }}</ref> and attentional models.<ref>{{cite journal | last1 = Eysenck | first1 = M. W. | last2 = Derakshan | first2 = N. | last3 = Santos | first3 = R. | last4 = Calvo | first4 = M. G. | year = 2007 | title = Anxiety and cognitive performance: Attentional control theory | url = | journal = Emotion | volume = 7 | issue = 2| pages = 336–353 | doi = 10.1037/1528-3542.7.2.336 }}</ref> |
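CBMT tasks are typically simple computerized procedures. As a schematic illustration only (not drawn from the cited studies, and with made-up stimuli and parameters), an attention-bias-modification schedule might repeatedly pair a threat-related and a neutral stimulus and place the response probe almost always at the neutral stimulus's location, so that fast responding gradually trains attention away from threat cues:

<syntaxhighlight lang="python">
import random

# Schematic sketch of an attention-bias-modification (ABM) trial schedule
# (illustrative only, not a clinical implementation; word lists and parameters
# are invented for the example). On each trial a threat-related and a neutral
# word appear side by side; in the training condition the response probe
# almost always replaces the neutral word.
THREAT_WORDS = ["danger", "failure", "attack"]
NEUTRAL_WORDS = ["table", "window", "carpet"]

def make_training_trials(n_trials=96, p_probe_at_neutral=0.95, seed=0):
    rng = random.Random(seed)
    trials = []
    for _ in range(n_trials):
        threat = rng.choice(THREAT_WORDS)
        neutral = rng.choice(NEUTRAL_WORDS)
        sides = ["left", "right"]
        rng.shuffle(sides)
        threat_side, neutral_side = sides
        # Probe location is biased toward the neutral stimulus during training.
        probe_side = neutral_side if rng.random() < p_probe_at_neutral else threat_side
        trials.append({"threat": (threat, threat_side),
                       "neutral": (neutral, neutral_side),
                       "probe_side": probe_side})
    return trials

# First two trials of a training schedule:
print(make_training_trials()[:2])
</syntaxhighlight>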
== Criticism ==
The list of cognitive biases has long been a topic of critique. In psychology a "rationality war"<ref>{{cite journal|last=Sturm|first=T.|title=The “Rationality Wars” in Psychology: Where they are and Where they Could Go|journal=Inquiry|date=2012|volume=55|issue=1|pages=66–81|doi=10.1080/0020174X.2012.643060}}</ref> unfolded between [[Gerd Gigerenzer]] and the Kahneman and Tversky school, which pivoted on whether biases are primarily defects of human cognition or the result of behavioural patterns that are actually adaptive or "[[ecological rationality|ecologically rational]]".<ref>{{cite book|last=Todd|first=P.M.|author2=Gigerenzer, G.|title=Ecological Rationality: Intelligence in the World|year=2012|publisher=OUP USA}}</ref> Gerd Gigerenzer has historically been one of the main opponents of the heuristics-and-biases research program.<ref>{{cite journal|last=Clavien|first=Christine | name-list-style = vanc |date=2010|title=Gerd Gigerenzer, Gut Feelings: Short Cuts to Better Decision Making: Penguin Books, 2008 (1st ed. 2007), £ 8.99 (paperback), {{text|ISBN-13}}: 978-0141015910|journal=Ethical Theory and Moral Practice|language=en|volume=13|issue=1|pages=113–115|doi=10.1007/s10677-009-9172-8|s2cid=8097667 |issn=1386-2820|url=https://serval.unil.ch/notice/serval:BIB_68DBD7560A77 }}</ref><ref>{{cite book|last=Gigerenzer | first = Gerd | name-list-style = vanc |title=Adaptive thinking : rationality in the real world|date=2000|publisher=Oxford Univ. Press|isbn=978-0-19-803117-8|location=Oxford|oclc=352897263}}</ref><ref>{{cite book|last=Gigerenzer | first = Gerd | name-list-style = vanc |title=Simple heuristics that make us smart|date=1999|publisher=Oxford University Press|others=Todd, Peter M., ABC Research Group.|isbn=0-585-35863-X|location=New York|oclc=47009468}}</ref>

This debate has recently reignited, with critics arguing that there has been an overemphasis on biases in human cognition.<ref>{{cite book|last=Page|first=Lionel|title=Optimally Irrational|year=2022|publisher=Cambridge University Press}}</ref>
Theories of cognitive bias have also been criticized on the grounds that each side in a [[debate]] often claims the other side's views to be products of [[human nature]] and cognitive bias, while presenting its own viewpoint as the correct way to "overcome" cognitive bias. This is not simply a matter of debate misconduct but a more fundamental problem: psychology has produced multiple opposed cognitive bias theories that can be used [[falsifiability|non-falsifiably]] to explain away any viewpoint.<ref>Popper, Karl, ''Conjectures and Refutations: The Growth of Scientific Knowledge''</ref><ref>Feynman, Richard (1985). ''"Surely You're Joking, Mr. Feynman!": Adventures of a Curious Character''</ref>
== See also ==<!-- PLEASE RESPECT ALPHABETICAL ORDER -->
{{Portal|Psychology|Philosophy}}
{{div col|colwidth=20em|small=yes}}
* {{annotated link|Baconian method#Idols of the mind (idola mentis)|Baconian method § Idols of the mind (''idola mentis'')}}
* {{annotated link|Cognitive bias in animals}}
* {{annotated link|Cognitive bias mitigation}}
* {{annotated link|Cognitive bias modification}}
* {{annotated link|Cognitive dissonance}}
* {{annotated link|Cognitive distortion}}
* {{annotated link|Cognitive inertia}}
* {{annotated link|Cognitive psychology}}
* [[Cognitive traps for intelligence analysis]]
* {{annotated link|Cognitive vulnerability}}
* {{annotated link|Critical thinking}}
* {{annotated link|Cultural cognition}}
* {{annotated link|Emotional bias}}
* {{annotated link|Epistemic injustice}}
* {{annotated link|Evolutionary psychology}}
* {{annotated link|Expectation bias}}
* {{annotated link|Fallacy}}
* {{annotated link|False consensus effect}}
* {{annotated link|Halo effect}}
* {{annotated link|Implicit stereotype}}
* {{annotated link|Jumping to conclusions}}
* {{annotated link|List of cognitive biases}}
* {{annotated link|Magical thinking}}
* {{annotated link|Prejudice}}
* {{annotated link|Presumption of guilt}}
* {{annotated link|Rationality}}
* [[Realism theory]]
* {{annotated link|Systemic bias}}
* {{annotated link|Theory-ladenness}}
{{div col end}}
== References ==
{{Reflist}}
== Further reading ==
{{refbegin|30em}}
* {{cite book |vauthors=Eiser JR, van der Pligt J |date=1988 |title=Attitudes and Decisions |location=London |publisher=Routledge |isbn=978-0-415-01112-9}}
* {{cite book |last=Fine |first=Cordelia |name-list-style=vanc |date=2006 |title=A Mind of its Own: How your brain distorts and deceives |location=Cambridge, UK |publisher=Icon Books |isbn=1-84046-678-2}}
* {{cite book |last=Gilovich |first=Thomas |name-list-style=vanc |date=1993 |title=How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life |location=New York |publisher=[[Free Press (publisher)|Free Press]] |isbn=0-02-911706-2}}
* {{cite book |vauthors=Haselton MG, Nettle D, Andrews PW |date=2005 |chapter=The evolution of cognitive bias |veditors=Buss DM |editor-link1=David Buss |title=Handbook of Evolutionary Psychology |pages=724–746 |location=Hoboken |publisher=Wiley |chapter-url=http://www.sscnet.ucla.edu/comm/haselton/webdocs/handbookevpsych.pdf}}
* {{cite web |last=Heuer |first=Richards J. Jr. |name-list-style=vanc |author-link=Richards Heuer |year=1999 |title=Psychology of Intelligence Analysis. Central Intelligence Agency |url=http://www.au.af.mil/au/awc/awcgate/psych-intel/art5.html|archive-url=https://web.archive.org/web/20010715133954/http://www.au.af.mil/au/awc/awcgate/psych-intel/art5.html|url-status=dead|archive-date=July 15, 2001}}
* {{cite book |veditors=Kahneman D, Slovic P, Tversky A |date=1982 |title=Judgment Under Uncertainty: Heuristics and Biases |location=New York |publisher=Cambridge University Press |isbn=978-0-521-28414-1}}
* {{cite book |last=Kahneman |first=Daniel |author-link=Daniel Kahneman |name-list-style=vanc |date=2011 |title=[[Thinking, Fast and Slow]] |location=New York |publisher=Farrar, Straus and Giroux |isbn=978-0-374-27563-1}}
* {{cite book |last=Kahneman |first=Daniel |author-link=Daniel Kahneman |name-list-style=vanc |title=Noise: A Flaw in Human Judgment |year=2022 |publisher=Little, Brown and Company |isbn=978-0316451390}}
* {{cite book |last=Kida |first=Thomas |name-list-style=vanc |date=2006 |title=Don't Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking |location=New York |publisher=Prometheus |isbn=978-1-59102-408-8}}
* {{cite journal |vauthors=Krueger JI, Funder DC |title=Towards a balanced social psychology: causes, consequences, and cures for the problem-seeking approach to social behavior and cognition |journal=The Behavioral and Brain Sciences |volume=27 |issue=3 |pages=313–27; discussion 328–76 |date=June 2004 |pmid=15736870 |doi=10.1017/s0140525x04000081 |s2cid=6260477}}
* {{cite book |vauthors=Nisbett R, Ross L |date=1980 |title=Human Inference: Strategies and shortcomings of human judgement |location=Englewood Cliffs, NJ |publisher=Prentice-Hall |isbn=978-0-13-445130-5}}
* {{cite book |last=Piatelli-Palmarini |first=Massimo |name-list-style=vanc |date=1994 |title=Inevitable Illusions: How Mistakes of Reason Rule Our Minds |location=New York |publisher=John Wiley & Sons |isbn=0-471-15962-X}}
* {{cite journal |last1=Soprano |first1=Michael |last2=Roitero |first2=Kevin |title=Cognitive Biases in Fact-Checking and Their Countermeasures: A Review |journal=Information Processing & Management |date=May 2024 |volume=61 |issue=3, 103672 |doi=10.1016/j.ipm.2024.103672 |doi-access=free}}
* {{cite book |last=Stanovich |first=Keith |name-list-style=vanc |title=What Intelligence Tests Miss: The Psychology of Rational Thought |year=2009 |publisher=Yale University Press |location=New Haven (CT) |isbn=978-0-300-12385-2 |url=https://archive.org/details/whatintelligence00stan |url-access=registration}}
* {{cite book |last=Sutherland |first=Stuart |name-list-style=vanc |date=2007 |title=Irrationality: The Enemy Within |edition=2nd |publisher=Pinter & Martin |isbn=978-1-905177-07-3}}
* {{cite book |last1=Tavris |first1=Carol |first2=Elliot |last2=Aronson |name-list-style=vanc |date=2007 |title=Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions and Hurtful Acts |location=Orlando, Florida |publisher=[[Harcourt Books]] |isbn=978-0-15-101098-1}}
* {{cite book |vauthors=Young S |date=2007 |title=Micromessaging - Why Great Leadership Is Beyond Words |location=New York |publisher=McGraw-Hill |isbn=978-0-07-146757-5}}
{{refend}}
== External links ==
* {{Commons category-inline}}
* {{Wikiquote-inline}}
* [http://www.williamjames.com/Science/ERR.htm The Roots of Consciousness: To Err Is human]
* [https://web.archive.org/web/20060620222654/http://www.cxoadvisory.com/gurus/Fisher/article/ Cognitive bias in the financial arena] (archived 20 June 2006)
* [https://www.scribd.com/doc/30548590/Cognitive-Biases-A-Visual-Study-Guide A Visual Study Guide To Cognitive Biases]
* [http://www.cognitivebiasparade.com Cognitive Bias Parade]

{{Biases}}
{{Disinformation}}
{{Digital media use and mental health}}
{{Media and human factors}}
{{Authority control}}

{{DEFAULTSORT:Cognitive Bias}}
[[Category:Decision theory]]
[[Category:Intelligence analysis]]
[[Category:Critical thinking]]
References
[edit]- ^ a b c Haselton MG, Nettle D, Andrews PW (2005). "The evolution of cognitive bias.". In Buss DM (ed.). The Handbook of Evolutionary Psychology. Hoboken, NJ, US: John Wiley & Sons Inc. pp. 724–746.
- ^ Kahneman D, Tversky A (1972). "Subjective probability: A judgment of representativeness" (PDF). Cognitive Psychology. 3 (3): 430–454. doi:10.1016/0010-0285(72)90016-3. Archived from the original (PDF) on 2019-12-14. Retrieved 2017-04-01.
- ^ Baron J (2007). Thinking and Deciding (4th ed.). New York, NY: Cambridge University Press.
- ^ a b Ariely D (2008). Predictably Irrational: The Hidden Forces That Shape Our Decisions. New York, NY: HarperCollins. ISBN 978-0-06-135323-9.
- ^ For instance: Gigerenzer G, Goldstein DG (October 1996). "Reasoning the fast and frugal way: models of bounded rationality" (PDF). Psychological Review. 103 (4): 650–69. CiteSeerX 10.1.1.174.4404. doi:10.1037/0033-295X.103.4.650. hdl:21.11116/0000-0000-B771-2. PMID 8888650.
- ^ a b c d Tversky A, Kahneman D (September 1974). "Judgment under Uncertainty: Heuristics and Biases". Science. 185 (4157): 1124–31. Bibcode:1974Sci...185.1124T. doi:10.1126/science.185.4157.1124. PMID 17835457. S2CID 143452957.
- ^ Bless H, Fiedler K, Strack F (2004). Social cognition: How individuals construct social reality. Hove and New York: Psychology Press.
- ^ Morewedge CK, Kahneman D (October 2010). "Associative processes in intuitive judgment". Trends in Cognitive Sciences. 14 (10): 435–40. doi:10.1016/j.tics.2010.07.004. PMC 5378157. PMID 20696611.
- ^ Rodríguez-Ferreiro, Javier; Barberia, Itxaso (2021-12-21). "Believers in pseudoscience present lower evidential criteria". Scientific Reports. 11 (1): 24352. Bibcode:2021NatSR..1124352R. doi:10.1038/s41598-021-03816-5. ISSN 2045-2322. PMC 8692588. PMID 34934119.
- ^ Kahneman D, Tversky A (July 1996). "On the reality of cognitive illusions" (PDF). Psychological Review. 103 (3): 582–91, discussion 592–6. CiteSeerX 10.1.1.174.5117. doi:10.1037/0033-295X.103.3.582. PMID 8759048.
- ^ a b Zhang SX, Cueto J (2015). "The Study of Bias in Entrepreneurship". Entrepreneurship Theory and Practice. 41 (3): 419–454. doi:10.1111/etap.12212. S2CID 146617323.
- ^ Kahneman D, Frederick S (2002). "Representativeness Revisited: Attribute Substitution in Intuitive Judgment". In Gilovich T, Griffin DW, Kahneman D (eds.). Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press. pp. 51–52. ISBN 978-0-521-79679-8.
- ^ a b Baumeister RF, Bushman BJ (2010). Social psychology and human nature: International Edition. Belmont, US: Wadsworth. p. 141.
- ^ Tversky A, Kahneman D (1983). "Extensional versus intuitive reasoning: The conjunction fallacy in probability judgement" (PDF). Psychological Review. 90 (4): 293–315. doi:10.1037/0033-295X.90.4.293. Archived (PDF) from the original on 2007-09-28.
- ^ Gigerenzer G (2006). "Bounded and Rational". In Stainton RJ (ed.). Contemporary Debates in Cognitive Science. Blackwell. p. 129. ISBN 978-1-4051-1304-5.
- ^ "3.2.4". ISO/IEC TR 24027:2021 Information technology — Artificial intelligence (AI) — Bias in AI systems and AI aided decision making. ISO. 2021. Retrieved 21 June 2023.
- ^ "3.8". ISO/IEC TR 24368:2022 Information technology — Artificial intelligence — Overview of ethical and societal concerns. ISO. 2022. Retrieved 21 June 2023.
- ^ Schacter DL (March 1999). "The seven sins of memory. Insights from psychology and cognitive neuroscience". The American Psychologist. 54 (3): 182–203. doi:10.1037/0003-066X.54.3.182. PMID 10199218. S2CID 14882268.
- ^ Kunda Z (November 1990). "The case for motivated reasoning" (PDF). Psychological Bulletin. 108 (3): 480–98. doi:10.1037/0033-2909.108.3.480. PMID 2270237. S2CID 9703661. Archived from the original (PDF) on 2017-07-06. Retrieved 2017-10-27.
- ^ a b Hoorens V (1993). "Self-enhancement and Superiority Biases in Social Comparison". In Stroebe, W., Hewstone, Miles (eds.). European Review of Social Psychology 4. Wiley.
- ^ Jensen AR, Rohwer WD (1966). "The Stroop color-word test: a review". Acta Psychologica. 25 (1): 36–93. doi:10.1016/0001-6918(66)90004-7. PMID 5328883.
- ^ MacLeod CM (March 1991). "Half a century of research on the Stroop effect: an integrative review". Psychological Bulletin. 109 (2): 163–203. CiteSeerX 10.1.1.475.2563. doi:10.1037/0033-2909.109.2.163. hdl:11858/00-001M-0000-002C-5646-A. PMID 2034749.
- ^ Frederick S (2005). "Cognitive Reflection and Decision Making". Journal of Economic Perspectives. 19 (4): 25–42. doi:10.1257/089533005775196732. ISSN 0895-3309.
- ^ Oechssler J, Roider A, Schmitz PW (2009). "Cognitive abilities and behavioral biases" (PDF). Journal of Economic Behavior & Organization. 72 (1): 147–152. doi:10.1016/j.jebo.2009.04.018. ISSN 0167-2681. Archived (PDF) from the original on 2016-08-03.
- ^ a b Baumeister RF, Bushman BJ (2010). Social psychology and human nature: International Edition. Belmont, USA: Wadsworth.
- ^ Jones EE, Harris VA (1967). "The attribution of attitudes". Journal of Experimental Social Psychology. 3: 1–24. doi:10.1016/0022-1031(67)90034-0.
- ^ Mahoney MJ (1977). "Publication prejudices: An experimental study of confirmatory bias in the peer review system". Cognitive Therapy and Research. 1 (2): 161–175. doi:10.1007/bf01173636. S2CID 7350256.
- ^ Jermias J (2001). "Cognitive dissonance and resistance to change: The influence of commitment confirmation and feedback on judgement usefulness of accounting systems". Accounting, Organizations and Society. 26 (2): 141–160. doi:10.1016/s0361-3682(00)00008-8.
- ^ Thakrar, Monica. "Council Post: Unconscious Bias And Three Ways To Overcome It". Forbes.
- ^ Cho, I. et al. (2018) 'The Anchoring Effect in Decision-Making with Visual Analytics', 2017 IEEE Conference on Visual Analytics Science and Technology, VAST 2017 - Proceedings. IEEE, pp. 116–126. doi:10.1109/VAST.2017.8585665.
- ^ Kahneman, D., Knetsch, J. L. and Thaler, R. H. (1991) Anomalies The Endowment Effect, Loss Aversion, and Status Quo Bias, Journal of Economic Perspectives.
- ^ Dean, M. (2008) 'Status quo bias in large and small choice sets', New York, p. 52. Available at: http://www.yorkshire-exile.co.uk/Dean_SQ.pdf Archived 2010-12-25 at the Wayback Machine.
- ^ Gimpel, Henner (2008), Gimpel, Henner; Jennings, Nicholas R.; Kersten, Gregory E.; Ockenfels, Axel (eds.), "Cognitive Biases in Negotiation Processes", Negotiation, Auctions, and Market Engineering, Lecture Notes in Business Information Processing, vol. 2, Berlin, Heidelberg: Springer Berlin Heidelberg, pp. 213–226, doi:10.1007/978-3-540-77554-6_16, ISBN 978-3-540-77553-9, retrieved 2020-11-25
- ^ Lorenz, Kate. (2005). "Do Pretty People Earn More?" http://www.CNN.com.
- ^ Sutherland S (2007). Irrationality: The Enemy Within (Second ed.). Pinter & Martin. ISBN 978-1-905177-07-3.
- ^ Kakoschke N, Kemps E, Tiggemann M (April 2015). "Combined effects of cognitive bias for food cues and poor inhibitory control on unhealthy food intake". Appetite. 87: 358–64. doi:10.1016/j.appet.2015.01.004. hdl:2328/35717. PMID 25592403. S2CID 31561602.
- ^ Williamson DA, Muller SL, Reas DL, Thaw JM (October 1999). "Cognitive bias in eating disorders: implications for theory and treatment". Behavior Modification. 23 (4): 556–77. doi:10.1177/0145445599234003. PMID 10533440. S2CID 36189809.
- ^ Williamson DA (1996). "Body image disturbance in eating disorders: A form of cognitive bias?". Eating Disorders. 4 (1): 47–58. doi:10.1080/10640269608250075. ISSN 1064-0266.
- ^ Trout J (2005). "Paternalism and Cognitive Bias". Law and Philosophy. 24 (4): 393–434. doi:10.1007/s10982-004-8197-3. ISSN 0167-5249. S2CID 143783638.
- ^ Levy DS, Frethey-Bentham C (2010). "The effect of context and the level of decision maker training on the perception of a property's probable sale price". Journal of Property Research. 27 (3): 247–267. doi:10.1080/09599916.2010.518406. ISSN 0959-9916. S2CID 154866472.
- ^ Boroomand, Amin; Smaldino, Paul E. (2023). "Superiority bias and communication noise can enhance collective problem-solving". Journal of Artificial Societies and Social Simulation. 26 (3). doi:10.18564/jasss.5154.
- ^ Buckingham M, Goodall A. "The Feedback Fallacy". Harvard Business Review. No. March–April 2019.
- ^ Gigerenzer G (1996). "On narrow norms and vague heuristics: A reply to Kahneman and Tversky (1996)". Psychological Review. 103 (3): 592–596. CiteSeerX 10.1.1.314.996. doi:10.1037/0033-295x.103.3.592.
- ^ Vonk R (1999). "Effects of outcome dependency on correspondence bias". Personality and Social Psychology Bulletin. 25 (3): 382–389. doi:10.1177/0146167299025003009. S2CID 145752877.
- ^ Tetlock PE (1985). "Accountability: A social check on the fundamental attribution error". Social Psychology Quarterly. 48 (3): 227–236. doi:10.2307/3033683. JSTOR 3033683.
- ^ Morewedge CK, Yoon H, Scopelliti I, Symborski CW, Korris JH, Kassam KS (2015-08-13). "Debiasing Decisions Improved Decision Making With a Single Training Intervention" (PDF). Policy Insights from the Behavioral and Brain Sciences. 2: 129–140. doi:10.1177/2372732215600886. ISSN 2372-7322. S2CID 4848978.
- ^ MacLeod C, Mathews A, Tata P (February 1986). "Attentional bias in emotional disorders". Journal of Abnormal Psychology. 95 (1): 15–20. doi:10.1037/0021-843x.95.1.15. PMID 3700842.
- ^ Bar-Haim Y, Lamy D, Pergamin L, Bakermans-Kranenburg MJ, van IJzendoorn MH (January 2007). "Threat-related attentional bias in anxious and nonanxious individuals: a meta-analytic study". Psychological Bulletin. 133 (1): 1–24. CiteSeerX 10.1.1.324.4312. doi:10.1037/0033-2909.133.1.1. PMID 17201568. S2CID 2861872.
- ^ Holmes EA, Lang TJ, Shah DM (February 2009). "Developing interpretation bias modification as a "cognitive vaccine" for depressed mood: imagining positive events makes you feel better than thinking about them verbally". Journal of Abnormal Psychology. 118 (1): 76–88. doi:10.1037/a0012590. PMID 19222316.
- ^ Hakamata Y, Lissek S, Bar-Haim Y, Britton JC, Fox NA, Leibenluft E, et al. (December 2010). "Attention bias modification treatment: a meta-analysis toward the establishment of novel treatment for anxiety". Biological Psychiatry. 68 (11): 982–90. doi:10.1016/j.biopsych.2010.07.021. PMC 3296778. PMID 20887977.
- ^ Eberl C, Wiers RW, Pawelczack S, Rinck M, Becker ES, Lindenmeyer J (April 2013). "Approach bias modification in alcohol dependence: do clinical effects replicate and for whom does it work best?". Developmental Cognitive Neuroscience. 4: 38–51. doi:10.1016/j.dcn.2012.11.002. PMC 6987692. PMID 23218805.
- ^ Clark DA, Beck AT (2009). Cognitive Therapy of Anxiety Disorders: Science and Practice. London: Guildford.
- ^ Browning M, Holmes EA, Murphy SE, Goodwin GM, Harmer CJ (May 2010). "Lateral prefrontal cortex mediates the cognitive modification of attentional bias". Biological Psychiatry. 67 (10): 919–25. doi:10.1016/j.biopsych.2009.10.031. PMC 2866253. PMID 20034617.
- ^ Eysenck MW, Derakshan N, Santos R, Calvo MG (May 2007). "Anxiety and cognitive performance: attentional control theory". Emotion. 7 (2): 336–53. CiteSeerX 10.1.1.453.3592. doi:10.1037/1528-3542.7.2.336. PMID 17516812. S2CID 33462708.
- ^ Beadel JR, Smyth FL, Teachman BA (2014). "Change Processes During Cognitive Bias Modification for Obsessive Compulsive Beliefs". Cognitive Therapy and Research. 38 (2): 103–119. doi:10.1007/s10608-013-9576-6. ISSN 0147-5916. S2CID 32259433.
- ^ Williams AD, Grisham JR (October 2013). "Cognitive Bias Modification (CBM) of obsessive compulsive beliefs". BMC Psychiatry. 13 (1): 256. doi:10.1186/1471-244X-13-256. PMC 3851748. PMID 24106918.
- ^ Van Eyghen H (2022). "Cognitive Bias. Philogenesis or Ontogenesis". Frontiers in Psychology. 13. doi:10.3389/fpsyg.2022.892829. PMC 9364952. PMID 35967732.
- ^ Kahneman D, Frederick S (2002). "Representativeness revisited: Attribute substitution in intuitive judgment". In Gilovich T, Griffin DW, Kahneman D (eds.). Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press. pp. 49–81. ISBN 978-0-521-79679-8. OCLC 47364085.
- ^ Kahneman D, Slovic P, Tversky A (1982). Judgment under Uncertainty: Heuristics and Biases (1st ed.). Cambridge University Press.
- ^ Slovic P, Finucane M, Peters E, MacGregor DG (2002). "The Affect Heuristic". In Gilovich T, Griffin D, Kahneman D (eds.). Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge University Press. pp. 397–420. ISBN 978-0-521-79679-8.
- ^ Pfister HR, Böhm G (2008). "The multiplicity of emotions: A framework of emotional functions in decision making". Judgment and Decision Making. 3: 5–17. doi:10.1017/S1930297500000127.
- ^ Wang X, Simons F, Brédart S (2001). "Social cues and verbal framing in risky choice". Journal of Behavioral Decision Making. 14 (1): 1–15. doi:10.1002/1099-0771(200101)14:1<1::AID-BDM361>3.0.CO;2-N.
- ^ Simon HA (1955). "A behavioral model of rational choice". The Quarterly Journal of Economics. 69 (1): 99–118. doi:10.2307/1884852. JSTOR 1884852.
- ^ a b Hilbert M (March 2012). "Toward a synthesis of cognitive biases: how noisy information processing can bias human decision making" (PDF). Psychological Bulletin. 138 (2): 211–37. CiteSeerX 10.1.1.432.8763. doi:10.1037/a0025940. PMID 22122235.
- ^ Scopelliti I, Morewedge CK, McCormick E, Min HL, Lebrecht S, Kassam KS (2015-04-24). "Bias Blind Spot: Structure, Measurement, and Consequences". Management Science. 61 (10): 2468–2486. doi:10.1287/mnsc.2014.2096.
- ^ Morewedge CK, Yoon H, Scopelliti I, Symborski CW, Korris JH, Kassam KS (2015-10-01). "Debiasing Decisions Improved Decision Making With a Single Training Intervention" (PDF). Policy Insights from the Behavioral and Brain Sciences. 2 (1): 129–140. doi:10.1177/2372732215600886. ISSN 2372-7322. S2CID 4848978.
- ^ Vartanian O, Beatty EL, Smith I, Blackler K, Lam Q, Forbes S, De Neys W (July 2018). "The Reflective Mind: Examining Individual Differences in Susceptibility to Base Rate Neglect with fMRI". Journal of Cognitive Neuroscience. 30 (7): 1011–1022. doi:10.1162/jocn_a_01264. PMID 29668391. S2CID 4933030.
- ^ Toplak ME, West RF, Stanovich KE (October 2011). "The Cognitive Reflection Test as a predictor of performance on heuristics-and-biases tasks". Memory & Cognition. 39 (7): 1275–89. doi:10.3758/s13421-011-0104-1. PMID 21541821.
- ^ Wilson CG, Nusbaum AT, Whitney P, Hinson JM (August 2018). "Age-differences in cognitive flexibility when overcoming a preexisting bias through feedback". Journal of Clinical and Experimental Neuropsychology. 40 (6): 586–594. doi:10.1080/13803395.2017.1398311. PMID 29161963. S2CID 13372385.
- ^ Sturm T (2012). "The 'Rationality Wars' in Psychology: Where They Are and Where They Could Go". Inquiry. 55 (1): 66–81. doi:10.1080/0020174X.2012.643060.
- ^ Todd PM, Gigerenzer G (2012). Ecological Rationality: Intelligence in the World. OUP USA.
- ^ Clavien C (2010). "Gerd Gigerenzer, Gut Feelings: Short Cuts to Better Decision Making: Penguin Books, 2008 (1st ed. 2007), £ 8.99 (paperback), ISBN-13: 978-0141015910". Ethical Theory and Moral Practice. 13 (1): 113–115. doi:10.1007/s10677-009-9172-8. ISSN 1386-2820. S2CID 8097667.
- ^ Gigerenzer G (2000). Adaptive thinking : rationality in the real world. Oxford: Oxford Univ. Press. ISBN 978-0-19-803117-8. OCLC 352897263.
- ^ Gigerenzer G, Todd PM, ABC Research Group (1999). Simple Heuristics That Make Us Smart. New York: Oxford University Press. ISBN 0-585-35863-X. OCLC 47009468.
- ^ Page L (2022). Optimally Irrational. Cambridge University Press.
Further reading
- Soprano M, Roitero K (May 2024). "Cognitive Biases in Fact-Checking and Their Countermeasures: A Review". Information Processing & Management. 61 (3): 103672. doi:10.1016/j.ipm.2024.103672.
- Eiser JR, van der Pligt J (1988). Attitudes and Decisions. London: Routledge. ISBN 978-0-415-01112-9.
- Fine C (2006). A Mind of its Own: How your brain distorts and deceives. Cambridge, UK: Icon Books. ISBN 1-84046-678-2.
- Gilovich T (1993). How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. New York: Free Press. ISBN 0-02-911706-2.
- Haselton MG, Nettle D, Andrews PW (2005). "The evolution of cognitive bias" (PDF). In Buss DM (ed.). Handbook of Evolutionary Psychology. Hoboken: Wiley. pp. 724–746.
- Heuer RJ Jr (1999). Psychology of Intelligence Analysis. Central Intelligence Agency. Archived from the original on July 15, 2001.
- Kahneman D (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux. ISBN 978-0-374-27563-1.
- Kahneman D, Sibony O, Sunstein CR (2022). Noise: A Flaw in Human Judgment. Little, Brown and Company. ISBN 978-0316451390.
- Kida T (2006). Don't Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking. New York: Prometheus. ISBN 978-1-59102-408-8.
- Krueger JI, Funder DC (June 2004). "Towards a balanced social psychology: causes, consequences, and cures for the problem-seeking approach to social behavior and cognition". The Behavioral and Brain Sciences. 27 (3): 313–27, discussion 328–76. doi:10.1017/s0140525x04000081. PMID 15736870. S2CID 6260477.
- Nisbett R, Ross L (1980). Human Inference: Strategies and shortcomings of human judgement. Englewood Cliffs, NJ: Prentice-Hall. ISBN 978-0-13-445130-5.
- Piattelli-Palmarini M (1994). Inevitable Illusions: How Mistakes of Reason Rule Our Minds. New York: John Wiley & Sons. ISBN 0-471-15962-X.
- Stanovich K (2009). What Intelligence Tests Miss: The Psychology of Rational Thought. New Haven (CT): Yale University Press. ISBN 978-0-300-12385-2.
- Tavris C, Aronson E (2007). Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions and Hurtful Acts. Orlando, Florida: Harcourt Books. ISBN 978-0-15-101098-1.
- Young S (2007). Micromessaging: Why Great Leadership Is Beyond Words. New York: McGraw-Hill. ISBN 978-0-07-146757-5.
External links
- Media related to Cognitive biases at Wikimedia Commons
- Quotations related to Cognitive bias at Wikiquote
- The Roots of Consciousness: To Err Is Human
- Cognitive bias in the financial arena (archived 20 June 2006)
- A Visual Study Guide To Cognitive Biases