
Social Determinants of Health in US Populations

Required:

  • Link, Bruce, and Jo Phelan. 1995. “Social Conditions as Fundamental Causes of Disease.” Journal of Health and Social Behavior (Extra Issue): 80–94.
    • In this foundational article, social epidemiologists Bruce Link and Jo Phelan argue three main points:
      1.  social conditions (e.g., SES, inequality, racism, segregation) have a causal effect on health and well-being,
      2. to understand patterns of disease prevalence and incidence, we need to contextualize risk factors and understand what conditions put people “at risk of risks” (i.e., people in poor neighborhoods have an elevated risk of exposure to crime, which increases the risk of stress accumulation), and
      3. researchers need to acknowledge that social conditions are not just distal causes of disease, they are FUNDAMENTAL causes, meaning that the relationship between the social conditions (e.g., SES) and health is robust and will remain present even as the risk factors for disease and the leading causes of disease/death change.
    • This is because high SES individuals are afforded flexible resources that they can use to avoid risks and minimize the consequences of disease.
  • Friedman, Misha. 2016. “For Native Americans, Health Care Is a Long, Hard Road Away.”
    • NPR: https://www.npr.org/2016/04/13/473264076/for-native-americans-health-care-is-a-long-hard-road-away
  • Social Determinants of Health PPT

Barriers to Confronting Bias: Fragility

Required:

Optional Readings:

Race and Racism in Medicine

Required:

Optional:


Cognitive Biases

How are type I and type II thinking at work in medical decision making? In this session we explore type I and type II thinking more deeply: how does each advantage or disadvantage the decision-making process? We find the concepts of cognitive and affective bias in medicine useful as constructs and examples. It is not necessary for you to memorize this list, but it is important that you are able to recognize how type I and type II thinking are present in each example. Many of our attempts to prevent cognitive errors require us to recognize when we are relying on type I thinking and encourage us to spend time in type II thought. We will also spend time discussing environmental and system factors that can disrupt or support these processes.

Watch the video Understanding and Preventing Cognitive Errors in Medicine through 10:05.

Read the vocabulary list below.


From: Croskerry P. 50 Cognitive and Affective Biases in Medicine. 2013.

The following three are included in the USMLE content outline:

Anchoring: the tendency to perceptually lock on to salient features in the patient’s initial presentation too early in the diagnostic process, and failure to adjust this initial impression in the light of later information. This bias may be severely compounded by the confirmation bias.

Availability: the disposition to judge things as being more likely, or frequently occurring, if they readily come to mind. Thus, recent experience with a disease may inflate the likelihood of its being diagnosed. Conversely, if a disease has not been seen for a long time (is less available) it may be under-diagnosed. The availability cascade occurs when a collective belief becomes more plausible through increased repetition, e.g. ‘I’ve heard this from several sources so it must be true’.

Framing effect: how diagnosticians see things may be strongly influenced by the way in which the problem is framed, e.g., physicians’ perceptions of risk to the patient may be strongly influenced by whether the outcome is expressed in terms of the possibility that the patient may die or that they might live. In terms of diagnosis, physicians should be aware of how patients, nurses and other physicians frame potential outcomes and contingencies of the clinical problem to them.
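
A one-line check makes clear what the frame obscures: survival and death are complementary outcomes, so

P(\text{survive}) = 1 - P(\text{die}),

and, for example, “90% of patients survive” and “10% of patients die” describe exactly the same risk, yet the two framings are routinely judged differently.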


The following cognitive biases may also contribute to faulty decision making and medical errors.

Aggregate bias: when physicians believe that aggregated data, such as those used to develop clinical practice guidelines, do not apply to individual patients (especially their own), they are exhibiting the aggregate fallacy. The belief that their patients are atypical or somehow exceptional may lead to errors of commission, e.g., ordering x-rays or other tests when guidelines indicate none are required.

Base-rate neglect: the tendency to ignore the true prevalence of a disease, either inflating or reducing its base-rate, and distorting Bayesian reasoning. However, in some cases clinicians may (consciously or otherwise) deliberately inflate the likelihood of disease, such as in the strategy of ‘rule out worst case scenario’ to avoid missing a rare but significant diagnosis.
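
To see how base-rate neglect distorts Bayesian reasoning, here is a minimal worked example (the prevalence, sensitivity, and false-positive figures are hypothetical, chosen only for arithmetic clarity). Suppose a disease has a prevalence of 1% and a test for it has 90% sensitivity and a 5% false-positive rate. Bayes’ theorem gives the probability of disease given a positive result:

P(D \mid +) = \frac{P(+ \mid D)\,P(D)}{P(+ \mid D)\,P(D) + P(+ \mid \lnot D)\,P(\lnot D)}
            = \frac{0.90 \times 0.01}{0.90 \times 0.01 + 0.05 \times 0.99}
            = \frac{0.009}{0.0585} \approx 0.15

Even with a positive test, the probability of disease is only about 15%; neglecting the 1% base rate invites the intuitive but mistaken answer of roughly 90%.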

Blind obedience: showing undue deference to authority or technology. This can occur when an individual or team defers to the opinion of the consultant or to the findings of a radiologic study, even when these conflict with the clinical picture.

Blind spot bias: the general belief physicians may have that they are less susceptible to bias than others due, mostly, to the faith they place in their own introspections. This bias appears to be universal across all cultures.

Commission bias: results from the obligation towards beneficence, in that harm to the patient can only be prevented by active intervention. It is the tendency towards action rather than inaction. It is more likely in over-confident physicians. Commission bias is less common than omission bias.

Confirmation bias: the tendency to look for confirming evidence to support a diagnosis rather than look for disconfirming evidence to refute it, despite the latter often being more persuasive and definitive.

Diagnosis momentum: once diagnostic labels are attached to patients they tend to become stickier and stickier. Through intermediaries (patients, paramedics, nurses, physicians), what might have started as a possibility gathers increasing momentum until it becomes definite and all other possibilities are excluded.

Fundamental attribution error: the tendency to be judgmental and blame patients for their illnesses (dispositional causes) rather than examine the circumstances (situational factors) that might have been responsible. In particular, psychiatric patients, minorities and other marginalized groups tend to suffer from this bias. Cultural differences exist in terms of the respective weights attributed to dispositional and situational causes.

Information bias: the tendency to believe that the more evidence one can accumulate to support a decision the better. While gathering sufficient information is always important, it is also important to anticipate the value of information and whether it will be useful or not in making the decision, rather than collecting information because we can, or for its own sake, or out of curiosity.

Mere exposure effect: the development of a preference for something simply because you are familiar with it. Also known as the familiarity principle, it can have widespread effects in medicine, e.g., merely seeing a pharmaceutical product or being told about it may increase the likelihood of choosing it over other products.

Need for closure: the bias towards drawing a conclusion or making a verdict about something when it is still not definite. It often occurs in the context of making a diagnosis where the clinician may feel obliged to make a specific diagnosis under conditions of time or social pressure, or to escape feelings of doubt or uncertainty. It might be preferable to say instead that the patient’s complaint is ‘not yet diagnosed’ (NYD).

Omission bias: the tendency towards inaction; rooted in the principle of non-maleficence. In hindsight, events that have occurred through the natural progression of a disease are more acceptable than those that may be attributed directly to the action of the physician. The bias may be sustained by the reinforcement often associated with not doing anything, but may prove disastrous. Omission biases typically outnumber commission biases.

Posterior probability error: occurs when a physician’s estimate for the likelihood of disease is unduly influenced by what has gone before for a particular patient. It is the opposite of the Gambler’s fallacy in that the physician is gambling on the sequence continuing, e.g., if a patient presents to the office five times with a headache and is correctly diagnosed as migraine on each visit, it is the tendency to diagnose migraine on the sixth visit.
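
A brief formal contrast (a sketch under an idealized independence assumption that real clinical visits do not satisfy): if successive presentations were independent, the history would leave the next probability unchanged,

P(X_6 = \text{migraine} \mid X_1 = \dots = X_5 = \text{migraine}) = P(X_6 = \text{migraine}).

The gambler’s fallacy wrongly lowers this conditional probability after a run; the posterior probability error wrongly inflates it toward certainty.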

Premature closure: is a powerful bias accounting for a high proportion of missed diagnoses. It is the tendency to apply premature closure to the decision making process, accepting a diagnosis before it has been fully verified. The consequences of the bias are reflected in the maxim ‘when the diagnosis is made, the thinking stops’.

Search satisficing: reflects the universal tendency to call off a search once something is found. It is pervasive and considered one of the most important sources of error in radiology. Comorbidities, second foreign bodies, other fractures, and co-ingestants in poisoning may all be missed.

Sunk costs: the more clinicians invest in a particular diagnosis, the less likely they may be to release it and consider alternatives. This is an entrapment form of bias more associated with investment and financial considerations. However, for the diagnostician, the investment of time, mental energy and, for some, ego may be a precious investment. Confirmation bias may be a manifestation of such an unwillingness to let go of a failing diagnosis.

Visceral bias: the influence of affective sources of error on decision-making has been widely underestimated. Visceral arousal leads to poor decisions. Countertransference, involving both negative and positive feelings towards patients, may result in diagnoses being missed.

Zebra retreat: occurs when a rare diagnosis (zebra) figures prominently on the differential diagnosis but the physician retreats from it for various reasons:
• Perceived inertia in the system and barriers to obtaining special or costly tests;
• Self-consciousness and under-confidence about entertaining a remote and unusual diagnosis, and gaining a reputation for being esoteric;
• The fear of being seen as unrealistic and wasteful of resources;
• Underestimating or overestimating the base-rate for the diagnosis;
• The clinical environment may be very busy and the anticipated time and effort to pursue the diagnosis might dilute the physician’s conviction;
• Team members may exert coercive pressure to avoid wasting the team’s time;
• Inconvenience of the time of day or weekend and difficulty getting access to specialists;
• Unfamiliarity with the diagnosis might make the physician less likely to go down an unfamiliar road;
• Fatigue, sleep deprivation, or other distractions may tip the physician toward retreat.

Any one or a combination of these reasons may result in a failure to pursue the initial hypothesis.

Introduction to Systems Improvements (AIM Statements)

Optional:

IHI Online Open School: a supplementary activity that provides greater detail and context for the material covered in lecture. Recommended for students pursuing a certificate in quality and safety; the modules improve understanding of aim statements by providing additional examples.

IHI Online Open School Modules – http://app.ihi.org/lms/home.aspx

  • QI 101: Lessons 1–2
  • QI 102: Lessons 1–2

Interrupting Bias

Required:

Thinking Fast and Thinking Slow

This session begins an exploration of thinking about thinking. Nobel laureate Daniel Kahneman deserves credit for the title: he published Thinking, Fast and Slow in 2011. In our quest to become master clinicians, it is paramount that we explore all aspects of clinical reasoning. This session introduces the thought processes behind conscious and unconscious bias.

A brief note on bias. The Oxford English Dictionary (OED) defines bias as “Cause to feel or show inclination or prejudice for or against someone or something.” As you read the articles below and participate in the in-class discussion, consider how these theories of thought contribute to how fast or slow you arrive at conclusions.

Over the course of EHM we will continuously refer to Type I and Type II thinking. An understanding of these fundamental thought processes can improve not only your clinical reasoning but also your interactions with patients and peers.

  1. Watch “Think Fast! Critical Thinking and Dual Process Theories.”
  2. Read the article Croskerry P. A universal model of diagnostic reasoning. Acad Med. 2009;84:1022-8. Be sure to read the highlighted sections.

Key Ethics Term: Interdependency

The concept of interdependency is most prevalent in care ethics, feminist ethics, virtue ethics, and communitarianism. The idea is that we are not independent but rather interdependent beings in the world. We depend on others when we are young, old, or sick, but also in our everyday lives. We would not get along without schools, daycares, public transportation, electricity, and so on. In care ethics and virtue ethics, this is not a feature of humans that is to be overcome, but actually something valuable about our lives. It is not just that we need relationships to survive; they are also important to meaning and flourishing in the world.

This feature of our humanity can be easy to forget in the clinical setting, where patients and clinicians seem removed from the particular relationships that make them who they are and able to live in the world as they do. Sometimes it can make all the difference to recognize both the interdependency of the clinician-patient relationship and the other interdependent relationships that can support or impede care. Beginning from a perspective of interdependency might have a more positive effect than thinking in terms of individual independence and rights.


Key Ethics Concept: Communitarianism

The communitarian approach focuses on the values of the community over and above the values of individuals within that community. This approach looks to the particular context, community beliefs, and societal relations to formulate standards of justice and responsibility.

This approach can significantly affect how we make and respect decisions in clinical contexts. Some medical communities may decide not to offer particular therapies to individuals based on the needs of the community. Likewise, some families may decide not to accept particular interventions in light of their community’s resources and values. See the case of Baby Aaron below for a good example of a communitarian ethos.


Ethical Values, Obligations and Virtues in Communities

REVIEW these key ethics terms:

WATCH this video of Carol Gilligan on Moral Development and Care Ethics:

While watching, CONSIDER …

  • How does the approach that Gilligan is suggesting (i.e., a relational or Care Ethics based approach) differ from what you typically think of as your ethical obligations as physicians?

READ Baby Aaron and the Elders by Ellen Wright Clayton and Eric Kodish.

While reading, CONSIDER:

  • How is a communitarian approach being employed by Aaron’s family?
  • How could a communitarian approach guide the response of the medical provider?
  • What would be the primary goal(s) in a Care Ethics based approach to this case? Be specific.
  • How might a Care Ethics approach guide your particular response to Baby Aaron’s family? What would it avoid?