Chair's Column: Think! (or the Whiteness of the Whale)
By David H. Johnson, Bannerman & Johnson, P.A., Albuquerque, NM
So what do Aretha Franklin, Herman Melville and a Nobel laureate in economics have in common? Read on, shipmates, read on.
I have long believed that a critical skill in the practice of law is the ability to understand the context of my clients’ problems. For me, this skill was largely learned during the years I spent working as a nurse practitioner. I was educated in the old school by clinicians who believed that the most important element of clinical decision-making was the patient history. As Sir William Osler, a founder of modern medicine, told his students, “Young doctor, listen to your patient. He is telling you his diagnosis.” From a careful history one might deduce the nature of the patient’s illness.
So too, the seasoned lawyer learns to examine the circumstances of his or her client’s problems and then to compare the findings with the facts, observations and outcomes of past experience, which, in turn, provide the basis for recommendations to the client. As lawyers, we are frequently asked by clients to assess the degree of risk inherent in a situation or to recommend a particular course of action, whether in litigation or in a business transaction. We usually are able to provide an answer, even if it is hedged by the usual qualifiers, and we are satisfied that we have given our clients a reasonably accurate assessment. Is our confidence justified? Perhaps not.
It was a typical Saturday night in the ER of the county hospital in Oakland where I moonlighted on weekends. In between the heart attacks and gunshot wounds, one of the residents told me about an article she had read in Science that led to her changing the way she thought about practicing medicine. It was titled “Judgment under Uncertainty: Heuristics and Biases” by Israeli cognitive psychologists Daniel Kahneman and Amos Tversky. Twenty-seven years after I first encountered it, it remains the academic article that has most influenced me.
Kahneman, who received the Nobel Prize in economics in 2002 (the only person to have done so without graduate work in economics), and his friend and long-time collaborator, Tversky, are considered to be among the founders of the discipline of behavioral economics, which is the basis for such recent bestsellers as Nudge and Blink. Kahneman has recently summarized the results of his life’s work in this area in an eminently readable new book: Thinking, Fast and Slow. The article and Kahneman’s book focus upon largely automatic mental mechanisms or shortcuts (heuristics) by which people draw conclusions about the meaning of information presented to them or the likelihood of certain outcomes. These heuristics often lead a person to useful conclusions, and are thus adaptive. However, they also often lead people, even experienced social scientists, to dramatically wrong conclusions.
The truth is that abstract reasoning is inherently more difficult than we realize, and that in the course of making complex decisions we unconsciously rely on mental shortcuts. Moreover, because these shortcuts are evolutionary in origin, we are generally unaware of their existence, not to mention how they shape our thinking.
Kahneman and Tversky’s insight was that the unconscious biases affecting our reasoning can be described as categories of common errors in thinking and processing information. Representativeness refers to the tendency to judge the probability that event or object A belongs to (or was caused by) category B by how closely A resembles our stereotype of B. Consider the following question. Susan is a 40-year-old professional who belongs to a women’s book group and is on the Board of Planned Parenthood. Which is more likely to be her occupation: professor of women’s studies or attorney? A large percentage of respondents arrive at the wrong answer. They pick professor because the description conforms to a stereotype. Attorney is the correct answer because attorneys greatly outnumber women’s studies faculty. This example demonstrates the failure to take into account the prior probability of a particular outcome, i.e., that attorneys are far more numerous than women’s studies professors.
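The base-rate arithmetic can be made concrete with a back-of-the-envelope calculation. Every number below (the counts of attorneys and professors, and the share of each group fitting Susan’s description) is invented for illustration, not a real statistic:

```python
# Illustrative base-rate calculation. All figures are assumptions made up
# for this sketch, not actual data.
n_attorneys = 1_300_000             # assumed: U.S. attorneys
n_ws_professors = 2_000             # assumed: women's-studies faculty

p_match_attorney = 0.02             # assumed: share of attorneys fitting the description
p_match_professor = 0.50            # assumed: share of professors fitting it

# Expected number of people in each group who match the description.
matching_attorneys = n_attorneys * p_match_attorney        # 26,000
matching_professors = n_ws_professors * p_match_professor  # 1,000

# Probability that a randomly chosen person matching the description
# is an attorney.
p_attorney = matching_attorneys / (matching_attorneys + matching_professors)
print(round(p_attorney, 3))  # → 0.963
```

Even granting the stereotype a twenty-five-fold advantage among professors, the base rates dominate: under these assumed numbers, a person fitting the description is still overwhelmingly likely to be an attorney.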
According to an old joke, for lawyers the plural of anecdote is data. This may lead to misattribution of causality where the sample size is small. Take, for example, an unusually high rate of cancer in a small town of 5,000 with a nearby petrochemical plant. While it is tempting to draw an inference of causality from the association of a high cancer rate with the chemical plant, an equally plausible explanation is that the association is simply a statistical artifact arising from a small sample size. In this case the base cancer rate is derived from a large sample (the U.S. population) as compared to the size of the town (5,000). It is well established in statistics that small samples are more likely to yield extreme values than larger ones. The error in assuming causation in this case comes from failing to consider the effect of a small sample size on the observed findings. (Imagine conducting a study of the average height of American men based upon a sample from a neighborhood with a high proportion of college and professional basketball players.)
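A minimal simulation makes the small-sample point visible. The base rate and town sizes below are assumptions chosen for illustration; the mechanism, not the numbers, is what matters:

```python
# Toy simulation: small samples yield extreme observed rates far more often
# than large samples do. BASE_RATE and town sizes are assumed, not real.
import random

random.seed(42)
BASE_RATE = 0.02   # assumed true rate of the illness in the population
TRIALS = 300       # number of simulated towns of each size

def observed_rates(population):
    """Observed illness rate in each of TRIALS simulated towns."""
    rates = []
    for _ in range(TRIALS):
        cases = sum(random.random() < BASE_RATE for _ in range(population))
        rates.append(cases / population)
    return rates

small_towns = observed_rates(50)      # small towns
large_towns = observed_rates(2_000)   # large towns

def share_extreme(rates):
    """Share of towns whose observed rate is at least double the true rate."""
    return sum(r >= 2 * BASE_RATE for r in rates) / len(rates)

print(share_extreme(small_towns), share_extreme(large_towns))
```

With these assumed numbers, a substantial fraction of the small towns show at least double the true rate by chance alone, while essentially none of the large towns do — no chemical plant required.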
A related problem arises when we fail to take into account the statistical principle of regression to the mean in making predictions and, as a result, invent inaccurate explanations for the effects observed. Regression to the mean is the statistical principle that if a value is extreme when first measured, it will tend toward the average on a second measurement. For example, if Tiger Woods scores a 63 on his first round at the Masters, he is likely to be closer to par on his second round.
A common human mistake is to impute causality for an outcome when the observed results can be simply explained as regression to the mean. An example is the well-known “Sports Illustrated jinx” in which an athlete who appears on the cover of Sports Illustrated is thought to be “jinxed” and thereby doomed to a less successful season the following year. This ignores the effect of regression on the outcome. The athletes on the cover are there because they are having an exceptional season (usually with a good deal of luck thrown in). The following season they are more likely to perform closer to their baseline performance. This example also illustrates the human tendency to impute causality in circumstances in which only correlation is present. Thus appearing on the cover of SI is correlated with a less successful season the following year, but there is no causal relationship.
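The “jinx” dissolves in a toy simulation. Under the assumed model below (a season score is a fixed skill plus independent per-season luck — an illustration, not a claim about real athletes), the top season-one performers decline in season two with no jinx at work:

```python
# Toy model of regression to the mean: performance = fixed skill + luck.
# Athletes selected for one extreme season fall back the next season.
# All parameters are assumed for illustration.
import random

random.seed(1)
N = 10_000

skills = [random.gauss(100, 10) for _ in range(N)]    # fixed ability
season1 = [s + random.gauss(0, 10) for s in skills]   # ability + luck
season2 = [s + random.gauss(0, 10) for s in skills]   # same ability, fresh luck

# The "cover athletes": top 1% of season-one scores.
cutoff = sorted(season1, reverse=True)[N // 100]
covers = [i for i in range(N) if season1[i] > cutoff]

avg1 = sum(season1[i] for i in covers) / len(covers)
avg2 = sum(season2[i] for i in covers) / len(covers)
print(round(avg1, 1), round(avg2, 1))  # season two comes in lower, no jinx needed
```

The selected athletes’ season-two average stays well above the population mean (their skill is real) but well below their season-one average (the luck that put them on the cover does not repeat).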
A second type of shortcut is the availability heuristic: the tendency to base estimates of likelihood or probability upon the ease with which the brain retrieves examples of similar facts or occurrences. Thus, one may estimate the probability of heart attack in middle-aged women by recalling specific instances of such occurrences. Notwithstanding the utility of such estimates, ease of recall reflects more than just frequency or probability, and the availability heuristic often leads to inaccurate estimates, in particular a tendency to overestimate the likelihood of certain outcomes.
Consider a situation in which a client asks you to estimate the range of potential jury awards. While attorneys may do research on the question, often they will “guesstimate” based on their knowledge of jury awards in similar cases. In particular, they may recall a recent huge award with dramatic facts, some of which may be similar to the case at hand. Research has repeatedly shown that people overweight recent dramatic events in predicting the likelihood of a similar outcome in another situation. Thus, following 9/11, people were much more likely to overestimate the likelihood of being killed in a terrorist attack.
A third heuristic, anchoring, describes the impact of starting an analysis with an initial value which is then adjusted to provide a final answer. Research has shown that providing subjects with an initial number, even when that number has no relevance to the situation at hand, has a substantial impact on their judgments. For example, shoppers who are exposed to a sign that states “limit of 12 cans per purchase” buy twice as many cans as do those who are not exposed to the sign. Litigators may be familiar with this concept through participation in the mediation of lawsuits in which the initial demand impacts the subsequent negotiation. Plaintiffs who make a high initial demand may intuitively be relying on the anchoring effect and subsequent adjustments to influence upwards the amount of the final settlement.
Kahneman and others have found that research subjects also tend to embrace narratives that they can use to make sense of events that have already occurred. These narratives, in turn, may be used as the basis for understanding or predicting future events. For example, the story of Bill Gates starting what became Microsoft in a garage in Albuquerque is a compelling narrative regarding intelligence, perseverance, determination and enormous luck. Its utility in describing how an enterprising entrepreneur should approach a start-up business is limited in that a simple narrative is incapable of providing the rich detail that an accurate assessment requires. In addition, even a “thick” narrative will not address what is often the most important factor of all: the role of luck (or, if you prefer, inherent random variation) in subsequent events. The result is that we tend to overestimate the degree to which skill determines outcomes and underestimate the role of luck. Thus, to return to Tiger: his 63 on day one of the Masters, followed by a 74 on day two, is more likely explained by luck or random variation than by a fluctuation in his skill. Trial attorneys seem to appreciate this tendency in jurors and attempt to craft narratives that simplify (or ignore) facts and impute causation in ways that benefit their clients.
What all this suggests to me is that as lawyers we need to be particularly careful when relying on past experience or intuition as guides for decisions. The shortcuts described by Kahneman are both useful and dangerous for the same reason—they operate below the level of conscious experience. We are not aware of their impact on our decision-making and they can easily lead us to incorrect or invalid recommendations. Caution is advised.
Oh yeah, what about Aretha and Melville? Where do they fit in? As Aretha tells us (and as she told Elwood and Jake Blues) – you better think about it. Then think about it again. Set your preconceptions aside. Weigh her words carefully and give her the time she deserves.
Melville’s Ahab is another story. His was the wrong narrative. He over-relied on his past experience, failed to read the signs (e.g., the whiteness of the whale, Queequeg’s construction of his coffin) and proceeded ahead recklessly, thereby dooming his crew. Ishmael, as he bids us call him, has the real narrative here: an exploration of mind and fate spread out across a vast sea, its narrator left to drift alone on the currents until, with the dawn at last, a sail appears on the horizon.
The ABA Health eSource is distributed automatically to members of the ABA Health Law Section. Please feel free to forward it! Non-members may also sign up to receive the ABA Health eSource.