How cognitive biases impact healthcare decisions

https://www.linkedin.com/pulse/how-cognitive-biases-impact-healthcare-decisions-robert-pearl-m-d–ti5qc/?trackingId=eQnZ0um3TKSzV0NYFyrKXw%3D%3D

Day one of the healthcare strategy course I teach in the Stanford Graduate School of Business begins with this question: “Who here receives excellent medical care?”

Most of the students raise their hands confidently. I look around the room at some of the most brilliant young minds in business, finance and investing—all of them accustomed to making quick yet informed decisions. They can calculate billion-dollar deals to the second decimal point in their heads. They pride themselves on being data-driven and discerning.

Then I ask, “How do you know you receive excellent care?”

The hands slowly come down, and the room falls silent. In that moment, it’s clear these future business leaders have reached a conclusion without a shred of reliable data or evidence.

Not one of them knows how often their doctors make diagnostic or technical errors. They can’t say whether their health system’s rate of infection or medical error is high, average or low.

What’s happening is that they’re conflating service with clinical quality. They assume a doctor’s bedside manner correlates with excellent outcomes.

These often false assumptions are part of a multi-millennia-long relationship wherein patients are reluctant to ask doctors uncomfortable but important questions: “How many times have you performed this procedure over the past year and how many patients experienced complications?” “What’s the worst outcome a patient of yours had during and after surgery?”

The answers are objective predictors of clinical excellence. Without them, patients are likely to become victims of the halo effect—a cognitive bias where positive traits in one area (like friendliness) are assumed to carry over to another (medical expertise).

This is just one example of the many subconscious biases that distort our perceptions and decision-making.

From the waiting room to the operating table, these biases impact both patients and healthcare professionals with negative consequences. Acknowledging these biases isn’t just an academic exercise. It’s a crucial step toward improving healthcare outcomes.

Here are four more cognitive errors that cause harm in healthcare today, along with my thoughts on what can be done to mitigate their effects:

Availability bias

You’ve probably heard of the “hot hand” in Vegas—a lucky streak at the craps table that draws big cheers from onlookers. But luck is an illusion, a product of our natural tendency to see patterns where none exist. Nothing about the dice changes based on the last throw or the individual shaking them.

This mental error, first described as “availability bias” by psychologists Amos Tversky and Daniel Kahneman, was part of groundbreaking research in the 1970s and ’80s in the fields of behavioral economics and cognitive psychology. The duo challenged the prevailing assumption that humans make rational choices.

Availability bias, despite being identified nearly 50 years ago, still plagues human decision-making today, even in what should be the most scientific of places: the doctor’s office.

Physicians frequently recommend a treatment plan based on the last patient they saw, rather than considering the overall probability that it will work. If a medication has a 10% complication rate, it means that 1 in 10 people will experience an adverse event. Yet, if a doctor’s most recent patient had a negative reaction, the physician is less likely to prescribe that medication to the next patient, even when it is the best option, statistically.
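The statistical point here can be made concrete with a short simulation (my illustration, not from the article): if adverse events occur independently at a fixed 10% rate, the previous patient’s outcome tells us nothing about the next patient’s risk.

```python
import random

random.seed(0)
COMPLICATION_RATE = 0.10  # the 10% adverse-event rate from the example


def treat_patient():
    """Simulate one patient; True means an adverse event occurred."""
    return random.random() < COMPLICATION_RATE


# Simulate many patients, then compare the overall complication rate
# with the rate among patients who immediately follow a complication.
outcomes = [treat_patient() for _ in range(100_000)]
after_bad = [cur for prev, cur in zip(outcomes, outcomes[1:]) if prev]

overall_rate = sum(outcomes) / len(outcomes)
rate_after_bad = sum(after_bad) / len(after_bad)

# Both rates hover near 0.10: a recent bad outcome does not
# change the statistics for the next patient.
print(round(overall_rate, 3), round(rate_after_bad, 3))
```

The two printed rates converge on the same 10% figure, which is exactly why prescribing based on the last patient seen, rather than the base rate, is an error.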

Confirmation bias

Have you ever had a “gut feeling” and stuck with it, even when confronted with evidence it was wrong? That’s confirmation bias. It skews our perceptions and interpretations, leading us to embrace information that aligns with our initial beliefs—and causing us to discount all indications to the contrary.

This tendency is heightened in a medical system where physicians face intense time pressures. Studies indicate that doctors, on average, interrupt patients within the first 11 seconds of being asked “What brings you here today?” With scant information to go on, doctors quickly form a hypothesis, using additional questions, diagnostic testing and medical-record information to support their first impression.

Doctors are well trained, and their assumptions are more often right than wrong. Nevertheless, hasty decisions can be dangerous. Each year in the United States, an estimated 371,000 patients die from misdiagnoses.

Patients aren’t immune to confirmation bias, either. People with a serious medical problem commonly seek a benign explanation and find evidence to justify it. When this happens, heart attacks are dismissed as indigestion, leading to delays in diagnosis and treatment.

Framing effect

In 1981, Tversky and Kahneman asked subjects to help the nation prepare for a hypothetical viral outbreak. They explained that if the disease was left untreated, it would kill 600 people. Participants in one group were told that an available treatment, although risky, would save 200 lives. The other group was told that, despite the treatment, 400 people would die. Although both descriptions amount to the same outcome—200 people surviving and 400 dying—the first group favored the treatment, whereas the second group largely opposed it.

The study illustrates how differently people can react to identical scenarios based on how the information is framed. Researchers have found that the human mind experiences losses far more powerfully than equivalent gains. So, patients will consent to a chemotherapy regimen that has a 20% chance of cure but decline the same treatment when told it has an 80% likelihood of failure.
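The equivalence driving both results is simple arithmetic, sketched below using the figures from the studies described above (the code itself is my illustration):

```python
# Tversky & Kahneman's outbreak scenario: 600 lives at stake.
total_at_risk = 600
saved = 200                   # gain frame: "200 lives will be saved"
die = total_at_risk - saved   # loss frame: "400 people will die"
print(f"saved: {saved}, die: {die}")  # same outcome, opposite framings

# The chemotherapy example works the same way.
cure_rate = 0.20              # "20% chance of cure"
failure_rate = 1 - cure_rate  # "80% likelihood of failure"
print(f"failure rate: {failure_rate:.0%}")
```

Nothing in the numbers differs between the two frames; only the wording changes, yet the wording alone flips people’s decisions.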

Self-serving bias

The best parts about being a doctor are saving and improving lives. But there are other perks, as well.

Pharmaceutical and medical-device companies aggressively reward physicians who prescribe and recommend their products. Whether it’s a sponsored dinner at a Michelin-starred restaurant or even a pizza delivered to the office staff, the intention of the reward is always the same: to sway the decisions of doctors.

And yet, physicians swear that no meal or gift will influence their prescribing habits. And they believe it because of “self-serving bias.”

In the end, it’s patients who pay the price. Rather than receiving a generic prescription for a fraction of the cost, patients end up paying more for a brand-name drug because their doctor—at a subconscious level—doesn’t want to lose out on the perks.

Thanks to the “Sunshine Act,” patients can check sites like ProPublica’s Dollars for Docs to find out whether their healthcare professional is receiving drug- or device-company money (and how much).

Reducing subconscious bias

These cognitive biases may not be the reason U.S. life expectancy has stagnated for the past 20 years, but they stand in the way of positive change. And they contribute to the medical errors that harm patients.

A study published this month in JAMA Internal Medicine found that 1 in 4 hospital patients who either died or were transferred to the ICU had been affected by a diagnostic mistake. Knowing this, you might think cognitive biases would be a leading subject at annual medical conferences and a topic of grave concern among healthcare professionals. You’d be wrong. Inside the culture of medicine, these failures are commonly ignored.

The recent story of an economics professor offers one possible solution. Upon experiencing abdominal pain, he went to a highly respected university hospital. After laboratory testing and observation, his attending doctor concluded the problem wasn’t serious—a gallstone at worst. He told the patient to go home and return for outpatient workup.

The professor wasn’t convinced. Fearing that the medical problem was severe, the professor logged onto ChatGPT (a generative AI technology) and entered his symptoms. The application concluded that there was a 40% chance of a ruptured appendix. The doctor reluctantly ordered an MRI, which confirmed ChatGPT’s diagnosis.

Future generations of generative AI, pretrained with data from people’s electronic health records and fed with information about cognitive biases, will be able to spot these types of errors when they occur.

Deviation from standard practice will result in alerts, bringing cognitive errors to consciousness, thus reducing the likelihood of misdiagnosis and medical error. Rather than resisting this kind of objective second opinion, I hope clinicians will embrace it. The opportunity to prevent harm would constitute a major advance in medical care.

Why we’re numb to 250,000 coronavirus deaths

https://www.axios.com/coronavirus-death-toll-psychological-reaction-f5aab275-1c93-444e-9914-5b0bf8fe07d9.html


The U.S. passed 250,000 confirmed deaths from COVID-19 this week, a figure that is truly vast — too vast, perhaps, for us to comprehend.

Why it matters: The psychic numbing that sets in around mass death saps us of our empathy for victims and discourages us from making the sacrifices needed to control the pandemic, while it hampers our ability to prepare for other rare but potentially catastrophic risks down the road.

By the numbers: The sheer scale of the U.S. death toll from COVID-19 can be felt in the lengths media organizations have gone to in order to put the numbers in perspective. 250,000 deaths is:

  • Ten times the number of American drivers and passengers who die in car crashes each year, according to CNN.
  • More than twice the number of American soldiers who died in World War I, according to NPR.
  • Enough to leave a vast hole in America’s heartland, if the deaths had all been concentrated in one area, according to the Washington Post.

Even if we try our best to grasp mass death, we inevitably come up against cognitive biases, says Paul Slovic, a psychologist at the University of Oregon who studies human judgment and decision-making.

  • The biggest bias is scope neglect: as the scale of deaths and tragedy grows, our own compassion and concern fail to keep pace. As the title of one of Slovic’s papers on the subject goes: “The more who die, the less we care.”

This is, of course, not rational — by any reasonable, moral calculation, we should find 250,000 deaths commensurately more horrifying than a smaller number. But in practice we don’t, almost as if we had a set capacity for empathy and concern that tops out well below the scale of a pandemic.

  • It doesn’t help that for most of us — save bereaved family members and health care workers on the front line — those deaths go unseen, hidden behind the walls of hospitals and funeral homes.
  • In a news culture driven by the visual — and equipped with a psychology moved by identifiable victims over mere numbers — that makes these deaths feel that much more unreal, and for some, that much easier to deny altogether.
  • Combined with the habituation to trauma that has set in after months of the pandemic, it shouldn’t be surprising that most of us are doing much less to fight the spread of COVID-19 now than we were in the spring, when the numbers of sick and dead were far lower.

How it works: In a study following the 1994 Rwandan genocide, in which 800,000 people were killed in a matter of months, Slovic and his colleagues asked a group of volunteers to imagine they were in charge of a refugee camp.

  • They had to decide whether or not to help 4,500 refugees get access to clean water. Half were told the camp held 250,000 refugees, and half were told it held 11,000.
  • The study subjects were much more willing to help if they thought they were assisting 4,500 people out of 11,000, and less willing if it was 4,500 out of 250,000 people. They were reacting to the proportion of those who would be helped, while neglecting the scope of the raw number.
  • Relatedly, in a 2014 study, Slovic found a decrease in empathy and a consequent drop in donations to save sick children as the number of victims rose, with effects being seen as soon as one child became two.
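The arithmetic behind proportion dominance is easy to state (a sketch built from the study’s figures; the code is my illustration): the absolute benefit is identical in both conditions, but the proportion of the camp helped differs dramatically, and it is the proportion that drives people’s judgments.

```python
helped = 4_500  # refugees who would get clean water in both conditions

for camp_size in (11_000, 250_000):
    proportion = helped / camp_size
    print(f"{helped:,} helped out of {camp_size:,}: {proportion:.1%} of the camp")

# 4,500 of 11,000 is roughly 41% of the camp; 4,500 of 250,000 is
# under 2%. The raw number of people helped is the same either way.
```

Rationally, 4,500 lives are worth the same regardless of camp size; the study shows that our intuitions track the percentage instead.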

What to watch: These same cognitive biases make it difficult for us to fully appreciate chronic threats like climate change, or to prepare for rare but catastrophic risks in the future — like a pandemic.

  • Given how hardwired these biases are, our best bet is to try to steer into them, and keep in mind that each of these 250,000 deaths tells an individual story.
  • As the survivor Abel Herzberg said of the Holocaust: “There were not six million Jews murdered; there was one murder, six million times.”

The bottom line: As the death toll rises, it will take willful effort not to become numb to what’s happening. But it is an effort that must be made.