A Case Study in Heuristic Decision Making

Alice Newkirk

Given the sheer number of decisions the average person makes on any given day, the brain's use of shortcuts to assess different choices makes perfect sense. It would be a waste of time and energy to run an exhaustive cost-benefit analysis to decide which brand of laundry detergent to buy, or which kind of pizza to order. Instead, people use a number of mental shortcuts, or heuristics, that provide general rules of thumb for decision making (Tversky & Kahneman, 1982). However, the same glossing over of factors that makes heuristics a quick and convenient solution to many smaller issues means that they can hinder decisions about more complicated ones (Tversky & Kahneman, 1982). Heuristics are simplifications, and while simplifications use fewer cognitive resources, they also, well, simplify. Furthermore, since people mostly apply these shortcuts automatically, they can preempt analytical thinking in situations where a more logical process might yield better results. Although heuristics are useful shortcuts for everyday judgment calls, they can lead people to make hasty, sometimes incorrect decisions about more complicated issues.

An excellent case study in the flaws and complications of heuristics is the hypothetical case of Audrey, a hypochondriac whose vitamin-taking regimen is challenged by a new study linking vitamins with an increased risk of death. Audrey attributes her good health to her vitamins, and her decision-making process is further complicated by the advice of her friend, who tells her that the study is worthless and she should ignore it completely. Whether or not Audrey later goes through a more thorough reasoning process, her initial judgment will be highly influenced by common decision-making heuristics. Audrey's case is an excellent lens through which to examine common heuristics and the problems they create because her hypochondria makes her perceive her decision as having potentially dire consequences; she has a strong emotional investment in the decision, which has the potential to override her reasoning self. Although her situation is unique, the way she uses heuristics will follow common patterns of thinking. In Audrey's case, heuristics will lead her to believe that vitamins can only be either completely toxic or utterly harmless; her emotional attachment to her vitamins will give her a strong bias in favor of the second conclusion, and as a result she will reject the study entirely. This extreme reaction will throw common heuristics and biases into sharp relief.

From the start, Audrey will be looking at her vitamin dilemma through the lens of her emotions. The affect heuristic suggests that strong emotional reactions often take the place of more careful reasoning (Sunstein, 2002), and Audrey has plenty of reason for strong emotional reactions. Hypochondria is a mental illness centered on an irrational fear of serious disease, and hypochondriacs are obsessed with staying healthy as a result of this fear (Medline, 2012). By challenging Audrey's beliefs, then, the study presents her with massive emotional turmoil. Her vitamin regimen, which provides her with a way to control her irrational fear of illness, is being called into question, and as a result her fear and anxiety levels are likely to be even greater than usual. Both giving up and continuing to take her vitamins are choices with massive emotional weight: giving up her vitamins means giving up a source of security, and continuing to take them means possibly continuing to expose herself to future harm.

Audrey's emotional complications will be further exacerbated by a whole category of mental shortcuts known as intuitive toxicology. Intuitive toxicology governs the ways people think about chemicals, compounds and toxins, and includes the false notion that chemical compounds are either entirely dangerous or entirely safe: in other words, that there is no such thing as moderately dangerous or dangerous only in excess (Sunstein, 2002). While not technically heuristics, these simplifications often erase the complexity associated with carcinogens and chemical health risks (Sunstein, 2002). By falling prey to the all-or-nothing model of risk, Audrey will not be able to think of the risk presented by the vitamins as a slight increase in the statistical probability of death. In her mind, her vitamins will either be completely harmless or dangerously toxic.

Furthermore, other effects of the affect heuristic will increase the stakes, and her emotional investment, even more. The affect heuristic links the perception of risks and the perception of benefits: when people perceive something to be high risk they perceive it to be low benefit, and vice versa (Sunstein, 2002). People have trouble believing that something is simultaneously risky and beneficial, especially where the risks are perceived to be very high (Sunstein, 2002). So as a result of the affect heuristic, if Audrey thinks that her vitamins are high risk, she will also think that they are low benefit. For Audrey, choosing to give up her vitamins as a result of the study would not only be admitting that she has been doing something actively harmful, but also that the regime on which she based her good health and safety had no benefits at all.

These high emotional stakes will bias Audrey toward what she wants to be true, even if her emotions play no further part in her reasoning process: through the lenses of the all-or-nothing and affect biases, accepting the study would mean that her main source of safety and support was both extremely dangerous and without benefit. As a result, she will be motivated to show that the study is completely wrong. Her emotional investment in this hypothesis will trigger a number of other biases that will further affect her reasoning, especially since she already strongly believes vitamins are healthy. Most notably, she will be subject to the belief-bias effect and confirmation bias.

The belief-bias effect, the first of these biases, has two parts: when a conclusion is unbelievable, it is much harder for people to accept, even when the logic is sound; and when a conclusion is believable, people are much less likely to question its logic (Evans & Feeney, 2004). There are two potential explanations for these effects, both with implications for Audrey's decision-making process. The first, the Selective Scrutiny Model, suggests that people are more likely to think critically about evidence when presented with a conclusion they disagree with (Evans & Feeney, 2004). In Audrey's case, she is more likely to be skeptical of the evidence provided by the study because she disagrees with its findings. The second, the Misinterpreted Necessity Model, suggests that people rely on prior beliefs to guide their judgments when the evidence is unclear (Evans & Feeney, 2004). This model has clear applications to Audrey's situation: when presented with the conflicting evidence provided by her friend and by the study, she is likely to fall back on her previous belief that vitamins are healthy and harmless. Both of these models predict that Audrey will be far more skeptical of the study's findings, and far more accepting of evidence supporting her original beliefs.

Not only will Audrey be far more accepting of evidence supporting her preferred hypothesis, but, as confirmation bias suggests, she will actively seek out evidence that validates her beliefs. Confirmation bias leads people to seek out information that confirms their hypotheses instead of refuting them (Evans & Feeney, 2004). Once Audrey has decided on a hypothesis—in this case, the one suggested by her previous beliefs and emotional reaction—she will look for pieces of evidence that support it, instead of searching for conflicting evidence and revising her theory accordingly. As a result of the belief-bias effect and confirmation bias, Audrey will actively search for information that supports her belief in vitamins, accept it more easily than she would other information, and scrutinize conflicting evidence more aggressively.

Audrey will be able to find plenty of support for her hypothesis through other heuristics and biases. A variety of heuristics and biases can take the place of empirical evidence in decision making (Tversky & Kahneman, 1982); these heuristics, and their resulting biases, will provide Audrey with 'evidence' in favor of her all-natural vitamin regimen. This evidence might not stand up to critical, unbiased analysis, but since she is looking for evidence that confirms her hypothesis and, as a result of belief bias and confirmation bias, not scrutinizing confirming evidence too carefully, her shortcuts will have a strong effect on her decision making. The first of these biases is another facet of intuitive toxicology. A number of specific biases come into play when people think about chemical risks, and one of these concerns the benevolence of nature (Sunstein, 2002). The chemicals produced in nature are not inherently safer than manufactured ones: arsenic, for example, is a natural chemical, and is definitely not harmless. But as a rule of thumb, people tend to instinctively assume that natural compounds are somehow healthier and more benevolent than compounds which are man-made (Sunstein, 2002). This has clear implications for Audrey's all-natural vitamin regimen: since nature is fundamentally benevolent according to intuitive toxicology, Audrey's natural vitamins cannot be dangerous.

Audrey will find further evidence for her hypothesis in her previous positive experience with her vitamins. The representative heuristic describes the different ways people often misattribute causes to various effects (Tversky & Kahneman, 1982). One example is the misconception that past experience is a good basis for future forecasting. Even when present experience has little to no bearing on what someone is trying to predict, they are likely to use it to support their hypotheses about the future (Tversky & Kahneman, 1982). In Audrey's case, she will base her expectations of her vitamins on her past experience with them, whether or not the two are actually connected and whether or not the effects of vitamins are supposed to be immediate. Since she attributes her good health to them, she presumably thinks of them very positively. Furthermore, the affect heuristic applies here as well; in this case, instead of high risks being associated with low benefits, high benefits are associated with low risk. Because she has previously seen vitamins as extremely beneficial, she will also see them as having been low risk. She will use this as confirming evidence that the study is wrong: because she has in the past experienced only the positive effects of vitamins, she will assume that vitamins have only positive effects.

Audrey's confidence in her vitamins will be further strengthened by her conversation with her friend, who provides direct evidence to confirm her hypothesis. Audrey will be subject to the effects of group polarization: when multiple people of similar beliefs talk about something they share an opinion on, the opinion of the entire group is likely to shift further to the extreme, since people both have their beliefs confirmed and may be exposed to the beliefs of more radical people (Sunstein, 2002). Audrey is already motivated to prove the study wrong, already believes in the healthiness of vitamins and already has 'evidence' supporting these claims as a result of intuitive toxicology and the representative heuristic; her friend's rejection of the study will support her beliefs and polarize them even further.  As a result, Audrey is likely to have her beliefs about vitamins confirmed and strengthened, and feel confident rejecting the results of the study completely.

Her previous positive associations with vitamins will help mitigate some of the potential negative effects of heuristics as well. Specifically, she will be less susceptible to alarmist bias, the increased fear and urgency surrounding alarmingly vivid threats (Sunstein, 2002). Although the 'risk of death' mentioned by the study sounds very dangerous, it is also extremely vague. Death by vitamin does not have the urgency or vivid imagery of a plane crash or a terrorist attack. The threat of death will also be lessened by the availability heuristic, a mental shortcut for estimating the size or probability of something by how many examples come to mind, such as estimating the number of five-letter words ending in -ing by thinking of a few examples (Tversky & Kahneman, 1982). Audrey will not be able to think of examples of people who have died of vitamin overdose, because that sort of thing doesn't make the news and is not particularly graphic, so her estimation of the threat will be severely diminished. Conversely, she will be able to think of a great many positive instances associated with vitamins, since she has used them for a long time and attributes her good health to them. As a result, she is likely to underestimate the severity of the negative consequences of her vitamin regimen and overestimate its positive effects. The fear and anxiety brought up by these heuristics will be mitigated, and the heuristics will therefore have a much smaller effect on her reasoning process.

One of the other biases of intuitive toxicology also seems to work against Audrey's hypothesis. Laypeople often assume that it is possible and desirable for a chemical to have absolutely no associated risk, which trained toxicologists know to be untrue (Sunstein, 2002). At first, this seems to be a strike against Audrey's vitamins: they cannot be healthy or worthwhile if they have any associated risk at all, and the study suggests that they do. However, this fallacy's interactions with a number of other biases negate its effect. First, since Audrey is more critical of things she finds unbelievable as a result of the belief-bias effect, she is more likely to subject the zero-risk fallacy to critical examination, and therefore more likely to think logically about it and dismiss it than she is any of her other assumptions. Second, if she does not examine it critically, its interaction with the all-or-nothing fallacy will actually strengthen her notions about the safety of her vitamins. If her vitamins have associated risk, then by the all-or-nothing fallacy they must be dangerously toxic, a hypothesis she is eager to reject. On the other hand, if they are completely healthy, the other option presented by the all-or-nothing fallacy, then they must have no associated risk, because the zero-risk fallacy suggests that zero risk is optimal and attainable for compounds. The zero-risk fallacy initially seems to counter Audrey's theories about risk, but as a result of her emotional investment combined with the biases driving her reasoning process, it will actually strengthen her argument.

Audrey's emotional reaction to the information presented by the study will dominate her initial thought process, and will guide her reasoning along with a number of general heuristics. Her mental polarization of the dilemma and her emotional investment in proving her original beliefs correct will lead her to instinctively reject the study in its entirety. However, her reasoning process does not have to end there, should she so choose. Heuristics are fundamentally shortcuts for reasoning, and people are perfectly capable of taking the long route to reach a better result. But whether or not Audrey decides to analyze the potential effects of her vitamins more critically, her beliefs and biases will play a role in the ways she initially thinks about her situation. Audrey's particular biases may be exacerbated by her intense situation, but they are the analogues of biases common to everyone. While our instincts can provide easy guidance in simple decisions where they accurately represent what's actually going on, in multifaceted issues like Audrey's vitamin dilemma, they can often lead us astray. By knowing when these heuristics may be working against us rather than for us, we can choose when to engage in deeper critical thinking and learn to overcome our own biases.


Evans, J., & Feeney, A. (2004). The role of prior belief in reasoning. In J. P. Leighton & R. J. Sternberg (Eds.), The nature of reasoning (pp. 78-102). Cambridge, UK: Cambridge University Press.

Sunstein, C. R. (2002). Risk and reason: Safety, law, and the environment (Ch. 2: Thinking about risks, pp. 28-58). Cambridge, UK: Cambridge University Press.

Tversky, A., & Kahneman, D. (1982). Judgment under uncertainty: Heuristics and biases. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 3-20). Cambridge, UK: Cambridge University Press.


A heuristic technique (Ancient Greek: εὑρίσκω, "find" or "discover"), often called simply a heuristic, is any approach to problem solving, learning, or discovery that employs a practical method not guaranteed to be optimal or perfect, but sufficient for the immediate goals. Where finding an optimal solution is impossible or impractical, heuristic methods can be used to speed up the process of finding a satisfactory solution. Heuristics can be mental shortcuts that ease the cognitive load of making a decision. Examples of this method include using a rule of thumb, an educated guess, an intuitive judgment, a guesstimate, stereotyping, profiling, or common sense.


Heuristics are strategies derived from previous experiences with similar problems. These strategies rely on using readily accessible, though loosely applicable, information to control problem solving in human beings, machines, and abstract issues.[1][2]

The most fundamental heuristic is trial and error, which can be used in everything from matching nuts and bolts to finding the values of variables in algebra problems.
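As a minimal sketch, trial and error can be written as a loop that tries candidates until one fits; the nut size and the equation below are invented examples, not from any particular source.

```python
def trial_and_error(candidates, fits):
    """Try each candidate in turn; return the first one that passes the test."""
    for candidate in candidates:
        if fits(candidate):
            return candidate
    return None  # no candidate fit

# "Matching nuts and bolts": find the first bolt size that fits a 6 mm nut.
bolt = trial_and_error([4, 5, 6, 8], lambda size: size == 6)

# Algebra: find an integer x with x**2 + 3*x - 10 == 0 by brute trial.
root = trial_and_error(range(-20, 21), lambda x: x**2 + 3*x - 10 == 0)
```

Exhaustive trial is only viable when the candidate space is small, which is exactly the regime where this most basic heuristic is good enough.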

Here are a few other commonly used heuristics, from George Pólya's 1945 book, How to Solve It:[3]

  • If you are having difficulty understanding a problem, try drawing a picture.
  • If you can't find a solution, try assuming that you have a solution and seeing what you can derive from that ("working backward").
  • If the problem is abstract, try examining a concrete example.
  • Try solving a more general problem first (the "inventor's paradox": the more ambitious plan may have more chances of success).
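Pólya's "working backward" tip can be made concrete with a small sketch. The number game below (reach a target from 1 using only the moves +3 and *2) is an invented example: instead of searching forward from 1, we undo operations from the target until we arrive back at the start.

```python
def path_to(target, start=1):
    """Work backward from the target: undo *2 when possible, otherwise undo +3."""
    steps = []
    n = target
    while n > start:
        if n % 2 == 0 and n // 2 >= start:
            steps.append("*2")   # n was reached by doubling n // 2
            n //= 2
        else:
            steps.append("+3")   # n was reached by adding 3 to n - 3
            n -= 3
        if n < start:
            return None          # overshot: no path found by this strategy
    steps.reverse()              # forward order: apply these moves from start
    return steps if n == start else None

# path_to(10) -> ['*2', '+3', '*2'], i.e. 1*2 = 2, 2+3 = 5, 5*2 = 10
```

Working backward pays off here because each even number has an unambiguous doubling predecessor, while the forward search would branch at every step.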

In psychology, heuristics are simple, efficient rules, learned or hard-coded by evolutionary processes, that have been proposed to explain how people make decisions, come to judgments, and solve problems, typically when facing complex problems or incomplete information. Researchers use various methods to test whether people actually apply these rules. The rules work well under most circumstances, but in certain cases they lead to systematic errors or cognitive biases.[4]


Main article: Heuristics in judgment and decision making

The study of heuristics in human decision-making was developed in the 1970s and 80s by psychologists Amos Tversky and Daniel Kahneman,[5] although the concept was originally introduced by Nobel laureate Herbert A. Simon. Simon's primary object of research was problem solving, and his work showed that we operate within what he called bounded rationality. He coined the term "satisficing", which denotes the situation where people seek solutions, or accept choices or judgments, that are "good enough" for their purposes but could be optimized.[6]

Rudolf Groner analyzed the history of heuristics from its roots in ancient Greece up to contemporary work in cognitive psychology and artificial intelligence,[7] and proposed a cognitive style "heuristic versus algorithmic thinking" which can be assessed by means of a validated questionnaire.[8]

Gerd Gigerenzer focused on the "fast and frugal" properties of heuristics, i.e., using heuristics in a way that is principally accurate and thus eliminates most cognitive bias.[9] Heuristics – like the recognition heuristic or the take-the-best heuristic – are viewed as special tools that tackle specific tasks (e.g., binary choice) under conditions of uncertainty[10] and are organized in an "adaptive toolbox".[11] In one line of research, Gigerenzer and Wolfgang Gaissmaier found that both individuals and organizations rely on heuristics in an adaptive way. They also found that ignoring part of the information when making a decision, rather than weighing all the options, can actually lead to more accurate decisions.[12][13]

Heuristics, through greater refinement and research, have begun to be applied to other theories, or to be explained by them. For example, the cognitive-experiential self-theory (CEST) is also an adaptive view of heuristic processing. CEST distinguishes between two systems that process information. At some times, roughly speaking, individuals consider issues rationally, systematically, logically, deliberately, effortfully, and verbally. On other occasions, individuals consider issues intuitively, effortlessly, globally, and emotionally.[14] From this perspective, heuristics are part of a larger experiential processing system that is often adaptive, but vulnerable to error in situations that require logical analysis.[15]

In 2002, Daniel Kahneman and Shane Frederick proposed that cognitive heuristics work by a process called attribute substitution, which happens without conscious awareness.[16] According to this theory, when somebody makes a judgment (of a "target attribute") that is computationally complex, a more easily computed "heuristic attribute" is substituted. In effect, a cognitively difficult problem is dealt with by answering a rather simpler problem, without the person being aware that this is happening.[16] This theory explains cases where judgments fail to show regression toward the mean.[17] Heuristics can also be seen as reducing the complexity of clinical judgments in healthcare.[18]

Theorized psychological heuristics

Well known

  • Anchoring and adjustment – Describes the common human tendency to rely more heavily on the first piece of information offered (the "anchor") when making decisions. For example, in a study done with children, the children were told to estimate the number of jellybeans in a jar. Groups of children were given either a high or low "base" number (anchor). Children estimated the number of jellybeans to be closer to the anchor number that they were given.[19]
  • Availability heuristic – A mental shortcut that occurs when people make judgments about the probability of events by the ease with which examples come to mind. For example, in a 1973 Tversky and Kahneman experiment, the majority of participants reported that there were more words in the English language that start with the letter K than words for which K is the third letter. There are actually twice as many words in the English language that have K as the third letter as words that start with K, but words that start with K are much easier to recall and bring to mind.[20]
  • Representativeness heuristic – A mental shortcut used when making judgments about the probability of an event under uncertainty: judging a situation based on how similar the prospects are to the prototypes the person holds in his or her mind. For example, in a 1982 Tversky and Kahneman experiment, participants were given a description of a woman named Linda. Based on the description, it was likely that Linda was a feminist. Eighty to ninety percent of participants, choosing from two options, said that it was more likely for Linda to be both a feminist and a bank teller than a bank teller only. But the likelihood of two events occurring together cannot be greater than that of either event individually; for this reason, the representativeness heuristic is a classic example of the conjunction fallacy.[20]
  • Naïve diversification – When asked to make several choices at once, people tend to diversify more than when making the same type of decision sequentially.
  • Escalation of commitment – Describes the phenomenon where people justify increased investment in a decision, based on the cumulative prior investment, despite new evidence suggesting that the cost, starting today, of continuing the decision outweighs the expected benefit. This is related to the sunk cost fallacy.
  • Familiarity heuristic – A mental shortcut applied to various situations in which individuals assume that the circumstances underlying the past behavior still hold true for the present situation and that the past behavior thus can be correctly applied to the new situation. Especially prevalent when the individual experiences a high cognitive load.
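The conjunction rule behind the Linda problem above (the probability of A and B can never exceed the probability of A alone) is easy to check numerically. The sketch below uses made-up trait rates, not figures from the original experiment.

```python
import random

random.seed(42)

# Toy model: independent "feminist" and "bank teller" traits with invented rates.
p_feminist, p_teller = 0.85, 0.05
trials = 100_000

teller_count = conjunction_count = 0
for _ in range(trials):
    feminist = random.random() < p_feminist
    teller = random.random() < p_teller
    teller_count += teller                     # "bank teller" alone
    conjunction_count += feminist and teller   # "feminist AND bank teller"

# However the traits are distributed, the conjunction can never occur
# more often than either conjunct on its own.
assert conjunction_count <= teller_count
```

Participants who rank "feminist bank teller" as more likely than "bank teller" are asserting exactly the ordering this inequality rules out.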

Lesser known

Cognitive maps

Heuristics were also found to be used in the manipulation and creation of cognitive maps. Cognitive maps are internal representations of our physical environment, particularly associated with spatial relationships. These internal representations are stored in memory and serve as a guide in our external environment. It was found that when questioned about images of maps, distances, and the like, people commonly introduced distortions. These distortions took the shape of a regularization of images (i.e., images are represented as more like pure abstract geometric figures, even though they are irregular in shape).

There are several ways that humans form and use cognitive maps, and visual intake is a key part of mapping. The first is by using landmarks, where a person uses a mental image to estimate a relationship, usually distance, between two objects. The second is route-road knowledge, generally developed after a person has performed a task and is relaying the information of that task to another person. The third is survey, in which a person estimates a distance based on a mental image that, to them, might appear like an actual map. This image is generally created when a person's brain begins making image corrections, which take five forms:

  • Right-angle bias: a person straightens out an image, like mapping an intersection, giving everything 90-degree angles when in reality it may not be that way.
  • Symmetry heuristic: people tend to think of shapes or buildings as being more symmetrical than they really are.
  • Rotation heuristic: a person takes a naturally (realistically) distorted image and straightens it out in their mental image.
  • Alignment heuristic: similar to the previous one; people mentally align objects to make them straighter than they really are.
  • Relative-position heuristic: people do not accurately distance landmarks in their mental image based on how well they remember that particular item.

Another method of creating cognitive maps is auditory intake based on verbal descriptions. Using a mapping based on one person's visual intake, another person can create a mental image, such as directions to a certain location.[21]


"Heuristic device" is used when an entity X exists to enable understanding of, or knowledge concerning, some other entity Y. A good example is a model that, as it is never identical with what it models, is a heuristic device to enable understanding of what it models. Stories, metaphors, etc., can also be termed heuristic in that sense. A classic example is the notion of utopia as described in Plato's best-known work, The Republic. This means that the "ideal city" as depicted in The Republic is not given as something to be pursued, or to present an orientation-point for development; rather, it shows how things would have to be connected, and how one thing would lead to another (often with highly problematic results), if one would opt for certain principles and carry them through rigorously.

"Heuristic" is also often used as a noun to describe a rule-of-thumb, procedure, or method.[22] Philosophers of science have emphasized the importance of heuristics in creative thought and constructing scientific theories.[23] (See The Logic of Scientific Discovery by Karl Popper; and philosophers such as Imre Lakatos,[24]Lindley Darden, William C. Wimsatt, and others.)


In legal theory, especially in the theory of law and economics, heuristics are used in the law when case-by-case analysis would be impractical, insofar as "practicality" is defined by the interests of a governing body.[25]

The present securities regulation regime largely assumes that all investors act as perfectly rational persons. In truth, actual investors face cognitive limitations from biases, heuristics, and framing effects.

For instance, in all states in the United States the legal drinking age for unsupervised persons is 21 years, because it is argued that people need to be mature enough to make decisions involving the risks of alcohol consumption. However, assuming people mature at different rates, the specific age of 21 would be too late for some and too early for others. In this case, the somewhat arbitrary deadline is used because it is impossible or impractical to tell whether an individual is sufficiently mature for society to trust them with that kind of responsibility. Some proposed changes, however, have included the completion of an alcohol education course rather than the attainment of 21 years of age as the criterion for legal alcohol possession. This would put youth alcohol policy more on a case-by-case basis and less on a heuristic one, since the completion of such a course would presumably be voluntary and not uniform across the population.

The same reasoning applies to patent law. Patents are justified on the grounds that inventors must be protected so they have incentive to invent. It is therefore argued that it is in society's best interest that inventors receive a temporary government-granted monopoly on their idea, so that they can recoup investment costs and make economic profit for a limited period. In the United States, the length of this temporary monopoly is 20 years from the date the application for patent was filed, though the monopoly does not actually begin until the application has matured into a patent. However, like the drinking-age problem above, the specific length of time would need to be different for every product to be efficient. A 20-year term is used because it is difficult to tell what the number should be for any individual patent. More recently, some, including University of North Dakota law professor Eric E. Johnson, have argued that patents in different kinds of industries – such as software patents – should be protected for different lengths of time.[26]


Stereotyping is a type of heuristic that all people use to form opinions or make judgments about things they have never seen or experienced.[27] Stereotypes work as a mental shortcut to assess everything from the social status of a person based on their actions, to the assumption that a plant that is tall, has a trunk, and has leaves is a tree, even though the person making the evaluation has never seen that particular type of tree before.

Stereotypes, as first described by journalist Walter Lippmann in his book Public Opinion (1922), are the pictures we have in our heads that are built around experiences as well as what we are told about the world.[28][29]

Artificial intelligence

A heuristic can be used in artificial intelligence systems while searching a solution space. The heuristic is derived either from a function supplied to the system by its designer, or by adjusting the weight of branches based on how likely each branch is to lead to a goal node.
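As a hedged sketch of the idea, the following implements A* search on a small made-up grid, where a designer-supplied heuristic (Manhattan distance to the goal) decides which branch of the solution space is explored next.

```python
import heapq

def a_star(start, goal, neighbors, h):
    """A* search: expand nodes in order of f(n) = g(n) + h(n)."""
    frontier = [(h(start), 0, start, [start])]  # (f, g, node, path)
    best_g = {}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if best_g.get(node, float("inf")) <= g:
            continue  # already reached this node at least as cheaply
        best_g[node] = g
        for nxt, step_cost in neighbors(node):
            ng = g + step_cost
            heapq.heappush(frontier, (ng + h(nxt), ng, nxt, path + [nxt]))
    return None  # goal unreachable

# A 5x5 grid with unit-cost moves in the four compass directions.
def grid_neighbors(p):
    x, y = p
    return [((x + dx, y + dy), 1)
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= x + dx < 5 and 0 <= y + dy < 5]

def manhattan(p):
    return abs(p[0] - 4) + abs(p[1] - 4)  # admissible: never overestimates

route = a_star((0, 0), (4, 4), grid_neighbors, manhattan)
```

With h always 0 this degenerates to uniform-cost search; a stronger admissible heuristic prunes more of the solution space, which is exactly the speed-up described above.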

Critiques and controversies

The concept of heuristics has attracted critiques and controversies. The "We Cannot Be That Dumb" critique, for example, argues that the heuristics-and-biases program overstates human irrationality: if the average person's judgment were as error-prone as laboratory demonstrations suggest, people could not navigate everyday life as successfully as they do.[30]

Further reading

  • Michalewicz, Zbigniew; Fogel, David B. (2000). How To Solve It: Modern Heuristics. Springer Verlag. ISBN 3-540-66061-5.
  • Russell, Stuart J.; Norvig, Peter (2003). Artificial Intelligence: A Modern Approach (2nd ed.). Upper Saddle River, NJ: Prentice Hall. ISBN 0-13-790395-2.
  • Diaconis, Persi (2002-12-11). The Problem of Thinking Too Much.
References

  1. Pearl, Judea (1983). Heuristics: Intelligent Search Strategies for Computer Problem Solving. New York: Addison-Wesley. p. vii. ISBN 978-0-201-05594-8.
  2. Ippoliti, Emiliano (2015). Heuristic Reasoning: Studies in Applied Philosophy, Epistemology and Rational Ethics. Switzerland: Springer International Publishing. pp. 1–2. ISBN 978-3-319-09159-4.
  3. Pólya, George (1945). How to Solve It: A New Aspect of Mathematical Method. Princeton, NJ: Princeton University Press. ISBN 0-691-02356-5; ISBN 0-691-08097-6.
  4. Gigerenzer, Gerd (1991). "How to Make Cognitive Illusions Disappear: Beyond 'Heuristics and Biases'" (PDF). European Review of Social Psychology. 2: 83–115. doi:10.1080/14792779143000033. Retrieved 14 October 2012.
  5. Kahneman, Daniel; Tversky, Amos; Slovic, Paul, eds. (1982). Judgment under Uncertainty: Heuristics and Biases. Cambridge, UK: Cambridge University Press. ISBN 0-521-28414-7.
  6. "Heuristics and heuristic evaluation". Retrieved 2013-09-01.
  7. Groner, Rudolf; Groner, Marina; Bischof, Walter F. (1983). Methods of Heuristics. Hillsdale, NJ: Lawrence Erlbaum.
  8. Groner, Rudolf; Groner, Marina (1991). "Heuristische versus algorithmische Orientierung als Dimension des individuellen kognitiven Stils" [Heuristic versus algorithmic orientation as a dimension of individual cognitive style]. In Grawe, K.; Semmer, N.; Hänni, R. (eds.), Über die richtige Art, Psychologie zu betreiben. Göttingen: Hogrefe.
  9. Gigerenzer, Gerd; Todd, Peter M.; the ABC Research Group (1999). Simple Heuristics That Make Us Smart. Oxford, UK: Oxford University Press. ISBN 0-19-514381-7.
  10. Neth, Hansjörg; Gigerenzer, Gerd (May 2015). "Heuristics: Tools for an uncertain world". Emerging Trends in the Social and Behavioral Sciences. Wiley & Sons, Inc. pp. 1–18. doi:10.1002/9781118900772.etrds0394. Retrieved 10 August 2017.
  11. Gigerenzer, Gerd; Selten, Reinhard, eds. (2002). Bounded Rationality: The Adaptive Toolbox. Cambridge, MA: MIT Press. ISBN 978-0262571647.
  12. Gigerenzer, Gerd; Gaissmaier, Wolfgang (January 2011). "Heuristic Decision Making". Annual Review of Psychology. pp. 451–482. SSRN 1722019.
  13. Gigerenzer, G.; Gaissmaier, W. (2011). "Heuristic Decision Making". Annual Review of Psychology. 62: 451–482. doi:10.1146/annurev-psych-120709-145346. PMID 21126183.
  14. "Cognitive experiential self theory – Psychlopedia". 2008-10-18. doi:10.1177/1745691611429354. Retrieved 2013-09-01.
  15. Epstein, S.; Pacini, R.; Denes-Raj, V.; Heier, H. (1996). "Individual differences in intuitive-experiential and analytical-rational thinking styles". Journal of Personality and Social Psychology. 71 (2): 390–405. doi:10.1037/0022-3514.71.2.390. PMID 8765488.
  16. Kahneman, Daniel; Frederick, Shane (2002). "Representativeness Revisited: Attribute Substitution in Intuitive Judgment". In Gilovich, Thomas; Griffin, Dale; Kahneman, Daniel (eds.), Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press. pp. 49–81. ISBN 978-0-521-79679-8. OCLC 47364085.
  17. Kahneman, Daniel (December 2003). "Maps of Bounded Rationality: Psychology for Behavioral Economics" (PDF). American Economic Review. 93 (5): 1449–1475. doi:10.1257/000282803322655392. ISSN 0002-8282.
  18. Cioffi, Jane (1997). "Heuristics, servants to intuition, in clinical decision making". Journal of Advanced Nursing. 26: 203–208. doi:10.1046/j.1365-2648.1997.1997026203.x.
  19. Smith, H. (1999). "Use of the anchoring and adjustment heuristic by children". Current Psychology. 18 (3): 294–300. doi:10.1007/s12144-999-1004-4.
  20. Harvey, N. (2007). "Use of heuristics: Insights from forecasting research". Thinking & Reasoning. 13 (1): 5–24. doi:10.1080/13546780600872502.
  21. Sternberg, Robert J.; Sternberg, Karin (2012). Cognitive Psychology (6th ed.). Belmont, CA: Wadsworth, Cengage Learning. pp. 310–315. ISBN 978-1-111-34476-4.
  22. Jaszczolt, K. M. (2006). "Defaults in Semantics and Pragmatics". The Stanford Encyclopedia of Philosophy. ISSN 1095-5054.
  23. Frigg, Roman; Hartmann, Stephan (2006). "Models in Science". The Stanford Encyclopedia of Philosophy. ISSN 1095-5054.
  24. Kiss, Olga (2006). "Heuristic, Methodology or Logic of Discovery? Lakatos on Patterns of Thinking". Perspectives on Science. 14 (3): 302–317. doi:10.1162/posc.2006.14.3.302.
  25. Gigerenzer, Gerd; Engel, Christoph, eds. (2007). Heuristics and the Law. Cambridge, MA: The MIT Press. ISBN 978-0-262-07275-5.
  26. Johnson, Eric E. (2006). "Calibrating Patent Lifetimes" (PDF). Santa Clara Computer & High Technology Law Journal. 22: 269–314.
  27. Bodenhausen, Galen V.; et al. (1999). "On the Dialectics of Discrimination: Dual Processes in Social Stereotyping". In Chaiken, Shelly; Trope, Yaacov (eds.), Dual-Process Theories in Social Psychology. New York: Guilford Press. pp. 271–292. ISBN 1572304219. Retrieved 24 March 2015.
  28. Kleg, Milton (1993). Hate Prejudice and Racism. Albany: State University of New York Press. p. 135. ISBN 0791415368. Retrieved 24 March 2015.
  29. Gökçen, Sinan. "Pictures in Our Heads". European Roma Rights Centre. Retrieved 24 March 2015.
  30. Gilovich, Thomas; Griffin, Dale; Kahneman, Daniel, eds. (2002). Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge: Cambridge University Press. pp. 8–9.