
Cognitive Biases

How are type I and type II thinking at work in the medical decision-making process?  In this session we explore type I and type II thinking more deeply: how does each process help, and how does each hinder, our decisions?  We find the concepts of cognitive and affective biases in medicine useful as constructs and examples.  It is not necessary for you to memorize this list.  It is important that you are able to recognize how type I and type II thinking are present in each example.  Many of our attempts to prevent cognitive errors require us to recognize when we are relying on type I thinking and encourage us to spend time in type II thought.  We will also discuss environmental and system factors that can disrupt or support these processes.

Watch the video Understanding and Preventing Cognitive Errors in Medicine through 10:05.

Read the vocabulary list below.


From: Croskerry P. 50 cognitive and affective biases in medicine. 2013.

The following three are included in the USMLE content list:

Anchoring: the tendency to perceptually lock on to salient features in the patient’s initial presentation too early in the diagnostic process, and failure to adjust this initial impression in the light of later information. This bias may be severely compounded by the confirmation bias.

Availability: the disposition to judge things as being more likely, or frequently occurring, if they readily come to mind. Thus, recent experience with a disease may inflate the likelihood of its being diagnosed. Conversely, if a disease has not been seen for a long time (is less available) it may be under-diagnosed. The availability cascade occurs when a collective belief becomes more plausible through increased repetition, e.g. ‘I’ve heard this from several sources so it must be true’.

Framing effect: how diagnosticians see things may be strongly influenced by the way in which the problem is framed, e.g., physicians’ perceptions of risk to the patient may be strongly influenced by whether the outcome is expressed in terms of the possibility that the patient may die or that they might live. In terms of diagnosis, physicians should be aware of how patients, nurses and other physicians frame potential outcomes and contingencies of the clinical problem to them.


The following cognitive biases may also contribute to faulty decision making and medical errors.

Aggregate bias: when physicians believe that aggregated data, such as those used to develop clinical practice guidelines, do not apply to individual patients (especially their own), they are exhibiting the aggregate fallacy. The belief that their patients are atypical or somehow exceptional may lead to errors of commission, e.g. ordering x-rays or other tests when guidelines indicate none are required.

Base-rate neglect: the tendency to ignore the true prevalence of a disease, either inflating or reducing its base-rate, and distorting Bayesian reasoning. However, in some cases clinicians may (consciously or otherwise) deliberately inflate the likelihood of disease, such as in the strategy of ‘rule out worst case scenario’ to avoid missing a rare but significant diagnosis.
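To see why the base rate matters, here is a brief worked application of Bayes’ theorem using purely hypothetical numbers (a 1% prevalence and a test that is 90% sensitive and 90% specific); it is a teaching sketch only, not data from any study:

\[
P(D \mid +) \;=\; \frac{P(+ \mid D)\,P(D)}{P(+ \mid D)\,P(D) + P(+ \mid \neg D)\,P(\neg D)}
\;=\; \frac{0.90 \times 0.01}{(0.90 \times 0.01) + (0.10 \times 0.99)} \;\approx\; 0.08
\]

Even with this seemingly accurate test, a positive result leaves only about an 8% post-test probability of disease; neglecting the 1% base rate is what makes the intuitive estimate of roughly 90% so misleading.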

Blind Obedience: showing undue deference to authority or technology. This can occur when an individual or team defers to the opinion of the consultant or to the findings of a radiologic study, even when it doesn’t make sense with the clinical picture.

Blind spot bias: the general belief physicians may have that they are less susceptible to bias than others due, mostly, to the faith they place in their own introspections. This bias appears to be universal across all cultures.

Commission bias: results from the obligation towards beneficence, in that harm to the patient can only be prevented by active intervention. It is the tendency towards action rather than inaction. It is more likely in over-confident physicians. Commission bias is less common than omission bias.

Confirmation bias: the tendency to look for confirming evidence to support a diagnosis rather than look for disconfirming evidence to refute it, despite the latter often being more persuasive and definitive.

Diagnosis Momentum: once diagnostic labels are attached to patients they tend to become stickier and stickier. Through intermediaries (patients, paramedics, nurses, physicians), what might have started as a possibility gathers increasing momentum until it becomes definite and all other possibilities are excluded.

Fundamental attribution error: the tendency to be judgmental and blame patients for their illnesses (dispositional causes) rather than examine the circumstances (situational factors) that might have been responsible. In particular, psychiatric patients, minorities and other marginalized groups tend to suffer from this bias. Cultural differences exist in terms of the respective weights attributed to dispositional and situational causes.

Information bias: the tendency to believe that the more evidence one can accumulate to support a decision the better. While gathering sufficient information is always important, it is also important to anticipate the value of information and whether it will be useful or not in making the decision, rather than collecting information because we can, or for its own sake, or out of curiosity.

Mere exposure effect: the development of a preference for something simply because you are familiar with it. Also known as the familiarity principle, it can have widespread effects in medicine, e.g., merely seeing a pharmaceutical product or being told about it may increase the likelihood of choosing it over other products.

Need for closure: the bias towards drawing a conclusion or making a verdict about something when it is still not definite. It often occurs in the context of making a diagnosis where the clinician may feel obliged to make a specific diagnosis under conditions of time or social pressure, or to escape feelings of doubt or uncertainty. It might be preferable to say instead that the patient’s complaint is ‘not yet diagnosed’ (NYD).

Omission bias: the tendency towards inaction; rooted in the principle of non-maleficence. In hindsight, events that have occurred through the natural progression of a disease are more acceptable than those that may be attributed directly to the action of the physician. The bias may be sustained by the reinforcement often associated with not doing anything, but may prove disastrous. Omission biases typically outnumber commission biases.

Posterior probability error: occurs when a physician’s estimate for the likelihood of disease is unduly influenced by what has gone before for a particular patient. It is the opposite of the Gambler’s fallacy in that the physician is gambling on the sequence continuing, e.g., if a patient presents to the office five times with a headache and is correctly diagnosed as migraine on each visit, it is the tendency to diagnose migraine on the sixth visit.

Premature closure: is a powerful bias accounting for a high proportion of missed diagnoses. It is the tendency to apply premature closure to the decision making process, accepting a diagnosis before it has been fully verified. The consequences of the bias are reflected in the maxim ‘when the diagnosis is made, the thinking stops’.

Search satisficing: reflects the universal tendency to call off a search once something is found. It is pervasive and considered one of the most important sources of error in radiology. Comorbidities, second foreign bodies, other fractures, and co-ingestants in poisoning may all be missed.

Sunk costs: the more clinicians invest in a particular diagnosis, the less likely they may be to release it and consider alternatives. This is an entrapment form of bias more associated with investment and financial considerations. However, for the diagnostician, the investment of time, mental energy and, for some, ego may be a precious investment. Confirmation bias may be a manifestation of such an unwillingness to let go of a failing diagnosis.

Visceral bias: the influence of affective sources of error on decision-making has been widely underestimated. Visceral arousal leads to poor decisions. Countertransference, involving both negative and positive feelings towards patients, may result in diagnoses being missed.

Zebra retreat: occurs when a rare diagnosis (zebra) figures prominently on the differential diagnosis but the physician retreats from it for various reasons:
• Perceived inertia in the system and barriers to obtaining special or costly tests;
• Self-consciousness and under-confidence about entertaining a remote and unusual diagnosis, and gaining a reputation for being esoteric;
• The fear of being seen as unrealistic and wasteful of resources;
• Underestimating or overestimating the base-rate for the diagnosis;
• The clinical environment may be very busy and the anticipated time and effort to pursue the diagnosis might dilute the physician’s conviction;
• Team members may exert coercive pressure to avoid wasting the team’s time;
• Inconvenience of the time of day or weekend and difficulty getting access to specialists;
• Unfamiliarity with the diagnosis might make the physician less likely to go down an unfamiliar road;
• Fatigue, sleep deprivation, or other distractions may tip the physician toward retreat.
Any one or a combination of these reasons may result in a failure to pursue the initial hypothesis.

Thinking Fast and Thinking Slow

This session begins an exploration of thinking about thinking.  Nobel laureate Daniel Kahneman deserves credit for the title: he published Thinking, Fast and Slow in 2011.  In our quest to become master clinicians it is paramount that we explore all aspects of clinical reasoning.  This session introduces the thought process behind conscious and unconscious bias.

A brief note on bias.  The Oxford English Dictionary (OED) defines bias as “cause to feel or show inclination or prejudice for or against someone or something.”  As you read the articles below and participate in the in-class discussion, consider how these theories of thought contribute to how fast or slow you arrive at conclusions.

Over the course of EHM we will continuously refer to Type I and Type II thinking.  An understanding of these fundamental thought processes can improve not only your clinical reasoning but your interactions with patients and peers.

  1. Watch “Think Fast! Critical Thinking and Dual Process Theories.”
  2. Read the article Croskerry P. A universal model of diagnostic reasoning. Acad Med. 2009;84:1022-8. Be sure to read the highlighted sections.