
Experience October/November 2024

Hindsight and Outcome Bias: The Fallacy of Understanding

Gerald Joseph Todaro

Summary

  • Hindsight bias involves working backward from a known result, connecting the dots in a way that exaggerates how predictable the outcome was.
  • According to Nobel Prize winner Daniel Kahneman, it can be dangerous for people to rely on hindsight and outcome bias to form beliefs about how the world works.

“In the honest search for knowledge, you quite often have to abide by ignorance for an indefinite period.” This admonition is from Erwin Schrödinger, winner of the Nobel Prize in Physics.

Limits of human perception

One of the glaring flaws of human nature is the inability to embrace uncertainty when the path for decision-making is unclear or nonexistent. Medical doctors, especially emergency room physicians and radiologists, face one of life’s most frustrating versions of this challenge. Humans were not made to make error-free decisions in less than 10 to 15 minutes. In my experience, limited time for decision-making in high-anxiety settings is fertile ground for human error. And except for me, I don’t know anyone who is perfect!

Radiologists have studied the error rate in the interpretation of imaging studies since the 1950s. Radiologists spend, on average, between three and 10 minutes interpreting black-and-white radiological images, depending on the complexity of the films. Whether they interpret simple plain films of the chest and abdomen or MRIs containing 500 individual frames, the diagnostic error rate has remained in the 2 to 4 percent range, irrespective of the most recent technological advances. The first time I saw this statistic, I felt a pang of anxiety: I hope that doesn’t happen to me. But it’s the reality of the limitations of human perception, no matter how diligent and compulsive the physician may be.

The wisdom in the rearview mirror

There are lessons from the practice of radiology for all of us, whatever our occupation or circumstance, business or personal. In short, I’m talking about the wisdom of hindsight. I have found that the research on cognitive errors involving hindsight and outcome bias carries over to the deluge of daily events: media pundits and analysts offering opinions, usually in high dudgeon, on everything from faulty government to solutions for social and economic problems on partisan nightly news channels. Less common but more costly are the medical experts retained to testify in medical malpractice lawsuits.

In my corner of the world, hindsight and outcome bias are the staple arguments of malpractice lawsuits. Take, for example, a mammogram of a young woman that shows what appears to the trained eye to be a normal variation of breast tissue, such as microcalcifications, typically a benign finding unless worrisome patterns appear. Eighteen months later, a cancerous tumor is found in the same location where the radiologist had previously read normal findings. The radiologist saw a normal pattern of calcifications, but now that same pattern is perceived as abnormal. Why? Because a cancerous tumor was found in the exact spot, the calcifications must not have been benign after all. After-acquired knowledge changed our perception of the calcifications. Now that we know the calcifications were early signs of cancer, they seem more meaningful, and we think the radiologist should have been more suspicious and recommended more specialized X-rays.

Welcome to one of the major illusions of understanding the world we navigate: hindsight bias and its twin sister, outcome bias, which are similar but take different mental shortcuts. Hindsight bias involves working backward from a known result, connecting the dots in a way that exaggerates how predictable the outcome was. In our breast cancer example, hindsight assures us that once the radiologist saw the formation of calcifications, any reasonable radiologist would have anticipated the onset of cancer.

Outcome bias vs. hindsight

Outcome bias is the tendency to judge an individual’s decisions by whether the outcome is good or bad. A bad outcome means bad judgment. For example, a patient dies two days after leaving the hospital; clearly, his doctor missed something. Focusing on the bad result ignores the difficulty and uncertainty in the process of making decisions. My most recent example of outcome bias involves a patient who died after a routine gallbladder procedure. No one dies from a gallbladder procedure unless the doctor was negligent, the deceased patient’s lawyer told the jury. The plaintiffs ignored the fact that a severe bout of sepsis that nearly killed the patient six months before surgery had so distorted the anatomy of the vascular supply of the liver that, despite taking special precautions to remove the gallbladder from the bed of the liver, the surgeon inadvertently cut the portal vein. An injured portal vein bleeds profusely and is difficult to locate because it oozes instead of spurting, and portal vein injuries have a high mortality rate. We successfully defended the doctor, but how much was the luck of the jury composition versus our deconstruction of hindsight and outcome bias? I don’t truly know, but this jury understood more than a bad result. They understood the difficulty of the gallbladder procedure and the injury to a vessel that wasn’t where it was supposed to be.

Mental shortcuts lead to poor judgment

Relying on hindsight and outcome bias to form beliefs about how the world works “helps perpetuate a pernicious illusion,” says Daniel Kahneman, winner of the Nobel Prize in Economics and author of the bestselling book Thinking, Fast and Slow. His life’s work as a behavioral psychologist, and his book, focus on the two ways humans make decisions: “System 1” is fast, intuitive, and emotional; “System 2” is slower, more deliberative, and more logical. I have read parts of this book many times because, in general, the defense of doctors and medicine involves vulnerable doctors looking into an abyss of uncertainty and defending against hindsight analysis and outcome bias. Second-guessing doctors is an inherent part of medical malpractice litigation. Importantly, Professor Kahneman’s wisdom transcends the courtroom and applies to how we make judgments in our daily lives.

He explains that our fast-thinking side of the brain “sees the world as more tidy, simple, predictable and coherent than it really is.” Then, he goes on to explain why such erroneous beliefs do more harm than good: “The illusion that one has understood the past feeds the further illusion that one can predict and control the future.”

An assassination attempt: faulting the secret service director

The number of examples of outcome bias just from the political events in this country would arguably exhaust the archives of ChatBox. However, the public evisceration of Secret Service Director Kimberly Cheatle by Democratic and Republican congressmen (who took time off from their bitter, divisive stalemate over how to run our country) is the most recent example of outcome bias on the national stage. Although she took responsibility for the failure of the Secret Service to prevent the assassination attempt on former president Donald Trump, and it appears from the reports that mistakes were made, there was little evidence of Cheatle’s actual role in planning the Secret Service protection at the Trump rally. Yet House Speaker Mike Johnson called for her to resign after her congressional testimony. He didn’t like, he said, her “cover your butt” testimony.

In an interview, a former Secret Service agent analyzed the flaws in the approval of the site and the plan of protection. The shooter used a drone to survey the rally site. He found a vantage point 150 yards from the podium. He used an AR-style rifle, reportedly not as accurate as other long rifles with scopes on the market. Fortunately, he missed the former president; unfortunately, he killed a family man who was protecting his children behind the podium.

Fall guy: Kimberly Cheatle

On a recent NPR segment on national security, Scott Simon interviewed Carol Leonnig, an investigative reporter for the Washington Post and author of Zero Fail: The Rise and Fall of the Secret Service. There are layers of protection provided by the Secret Service, she explained, a “pecking order” depending on the status of the government official. “The President gets NFL treatment; the Vice President gets college ball treatment, and up until the assassination [attempt], Donald Trump was getting junior high to high school treatment.” The problem falls at the doorstep of Congress. “They have not made the Secret Service a priority in resources,” she added.

Calling for Kimberly Cheatle’s resignation seemed like the politically right thing to do based on the outcome: the failure to adequately protect the former president. However, the resignation of the Secret Service director did nothing to increase the agency’s ability to protect presidential candidates and people attending campaign rallies. She’s the fall guy or, more precisely, the bad guy. Ironically, our congressmen essentially blamed Cheatle for a lapse in protection for which they shared responsibility.

Daniel Kahneman warns that our shortcuts to understanding the past limit our ability to learn from our mistakes.
