
Litigation Journal

Spring 2022: Embracing Change

Truth and Common Sense

Kenneth R. Berman

Summary

  • It’s one thing for courts to narrow a statute by interpreting its words or looking at it under a constitutional microscope.
  • It’s quite another to narrow it by the simple stroke of a judicial pen.
  • The people have not elected federal judges or endowed them with authority to overwrite those statutes.

In Eugene Ionesco’s play The Bald Soprano, Mr. and Mrs. Smith confront a simple question: When a doorbell rings, does it mean that someone is there? “Never anyone,” says Mrs. Smith. “Always someone,” counters Mr. Smith.

To resolve the dispute, another character—the Fire Chief—says, “I am going to reconcile you. You both are partly right. When the doorbell rings, sometimes there is someone, other times there is no one.”

Another character—Mr. Martin—concurs. “This seems logical to me.” His wife agrees, “I think so too.”

An absurd conversation, to be sure. Ionesco, after all, introduced the world to the theater of the absurd. These days, sadly and frighteningly, his bizarre dialogue might be an allegory of how truth is determined in legal disputes.

Consider this: The doorbell question should not have been up for debate. Doorbells do not ring themselves. Someone must ring them. That truth is indisputable.

But that did not stop Mrs. Smith from putting forth an obvious falsehood, perhaps knowingly, perhaps out of ignorance. Nor did it stop the Fire Chief—the closest character in the play to a dominant figure in the jury room—from erroneously concluding that Mrs. Smith’s falsehood was “partly right.” And once the Fire Chief’s conclusion was on the table, the other two proxy jurors, Mr. and Mrs. Martin, were quick to fall in line. They provided the needed consensus to resolve the point, albeit incorrectly.

Is that really any different from what happens in life, when people reject an incontestable truth, embrace some false narrative, and get others to subscribe to it? One in 10 Americans believes that the 1969 moon landing was faked. In Britain, that ratio is higher: one in six. And despite irrefutable scientific evidence, two in 10 Americans believe that human activity has little or no role in climate change.

Truth rejection is even more robust in the political sphere. According to a Quinnipiac poll taken one month after the 2020 election, more than a third of registered voters believed that Joe Biden’s victory was illegitimate and that there was widespread voter fraud. Yet, no evidence supported either belief.

Truth in the Courtroom

Is it any wonder, then, that fact finders who may reject truth outside a courthouse can be just as prone to do so inside a courthouse? Why can different jurors, hearing the exact same evidence, draw polar opposite conclusions about what happened and hold profoundly strong opinions that their own landing spot is the true one? Non-unanimous verdicts, hung juries, and wrongful convictions stand as powerful evidence that not all jurors get it right. It’s a tad ironic that the word “verdict” comes from the Latin verus (true) and dictum (something said). Verdicts are not necessarily grounded in the truth.

And if some jurors can get it wrong, what about judges? One judge might think that the evidence points to the plaintiff’s version of the facts, while another, on the very same evidence, might think that it points to the defendant’s. Evidence, it seems, is not the only factor that influences how people draw conclusions. Even without evidence—as the moon landing, climate change, and Biden’s election show—some people are wired to convert their own imaginations into a bedrock, false truth.

Litigators like to think that the truth-finding process is almost formulaic, rooted in logic, and seasoned with common sense. Amass all the evidence that supports a factual contention, present it in an easily digestible way through well-prepared witnesses and indisputable documents, explain what it all means, and let the fact finders’ logic and common sense do their thing. This is how the truth emerges, or so we’re taught to believe.

But it doesn’t work that way. Even the strongest, most clear-cut evidence of actual truth degrades as it makes its way through the filters and biases that fact finders bring to the process. Evidence doesn’t really speak for itself. Rather, it gets sliced and diced, repainted and repackaged, by the education, life experiences, and worldviews of each fact finder. Fact finders not only get a vote in “finding” the truth; they also alter the evidentiary ingredients and add other things to the recipe, most times unknowingly, which can produce conclusions far afield from the truth.

Let’s look at some examples. Consider, for instance, stereotyping: the propensity to draw conclusions about a person from characteristics assumed to be shared by most or all members of an identifiable group. Stereotyping fills in blanks from limited data, often generating unfounded inferences where actual evidence on the point is missing. Unaware of the data’s inadequacy, those doing the stereotyping are prone to place great faith in the supposed correctness of their conclusions.

It takes very little for stereotyping to take hold, often just a few pieces of information. What might someone infer about the intelligence of a high school athlete who drops the g when pronouncing words that end in -ing? Or if the person is middle-aged and heavyset, wears a T-shirt and torn jeans, and has many tattoos? Fact finders might draw inferences from just those scant data points, but when they do, it is their imagination, influenced by stereotyping, that mixes with the evidence and can mislead them into believing a false version of the truth.

Here’s another example of how someone can miss the truth: motivated cognition, the tendency to credit the evidence that supports the conclusions we favor and to ignore or disbelieve evidence that points in the other direction. The more we favor a particular outcome, the more apt we are to embrace whatever evidence supports it while rejecting contrary evidence that might show how the truth lies elsewhere.

These aren’t the only cognitive phenomena that shuttle truth to the sidelines. Consider the “illusory truth effect,” reputedly discovered in 1977 at Temple and Villanova Universities. Studies there showed that repetition leads to believability. The more times we hear something, the greater the likelihood we will believe it to be true, even if it’s false.

Newer research, though, shows that repetition isn’t even necessary. Professor Timothy Levine of the University of Alabama at Birmingham coined the term “truth-default theory,” shorthand for our inclination to believe what we’re told without requiring supporting evidence and, apparently, without having to hear it multiple times. Mere plausibility is enough to trigger belief and suppress skepticism. Even well-educated people are easily deceived. Witness the wealthy investors who gave their money to Bernie Madoff or the families who believed Rick Singer’s story that his “side door” arrangements could get their kids into Yale and UCLA legally. How easy, then, must it be to get the moderately or poorly educated to hear a falsehood, believe it, and repeat it as if it were true?

What about the effect of confident delivery? When someone presents inaccurate information with great confidence, the delivery itself can serve as a stamp of veracity, whether or not the information deserves one.

Then there’s what Nobel laureate Daniel Kahneman calls the availability heuristic, our tendency to embrace only the evidence or explanations that are readily available to us. If only one explanation for an occurrence has been offered and no others are mentioned, that explanation is likely to assume more importance than it deserves and is more likely to be accepted as true, even if it is false.

Phenomena Influencing Judgment

While these and other cognitive truth derailers percolate, still more phenomena influence how we make judgments. Consider the compromise effect. Studies show that when faced with two opposing choices, decision makers tend to seek out and adopt an intermediate choice when one is available. That’s what Ionesco’s Fire Chief did. A compromise outcome rejects what each side has presented. Because one of those presentations more likely aligns with the truth, the compromise outcome, almost by definition, varies from the truth.

Conformity impulses also set in. When a group considers an issue, a dominant member, by expressing an early and strongly held opinion, can unintentionally deter others from expressing contrary opinions, especially if a second member lines up with the first. Group members who have yet to speak may well refrain from offering dissenting views, even if those views are factually correct. The impulse is to conform one’s opinion to what the dominant member puts forth, because of the ease of adoption, the desire to go along, the discomfort of disagreeing, the fear of being rejected, the phenomenon of mutual reinforcement, or any number of other peer-influencing factors. That’s what Mr. and Mrs. Martin did when they concurred with the Fire Chief.

Is it any wonder, then, that information mutates and becomes misinformation, and that misinformation becomes disinformation, until the chasm between what is true and what we may think is true becomes too wide to bridge? Truth may be stranger than fiction, but too many find it hard to tell the difference and lack the energy or tools to try.

And that brings us to common sense, the centerpiece of modern jury instructions, what someone must have thought would be the panacea to remove the weeds of falsehood from the garden of truth. If jurors would just apply their home-grown common sense to the divergent stories that flow from the witnesses and lawyers, surely those jurors would see the truth and render the correct verdict.

Judges frequently instruct jurors to use their common sense and personal experience in deciding whether testimony is true and accurate. In the Kyle Rittenhouse murder trial, after instructing about many things the jurors could consider in evaluating the evidence, the judge finished with this: “There is no magic way for you to evaluate the testimony; instead, you should use your common sense and experience.” Unusual? Not at all. Instructions about common sense proliferate throughout American courtrooms. It’s a justice system article of faith.

But do we really want the fate of litigants and the outcome of serious legal disputes to turn on a juror’s common sense? Do we even know what common sense looks like? In principle, it sounds noble and valuable. It implies rationality, which seems like a good thing. And for simple questions, most people don’t need anything more to figure it out. It doesn’t take much common sense to know that if a doorbell rings, someone rang it.

Moving beyond simple questions, though, to the kinds of questions at issue in courtrooms, the only common sense that has value is the common sense of rationalists. We like to think that everyone has common sense and that, if everyone used it, the world would be a better place, truth would prevail, and justice would be done. But the consensus among behavioral economists and social psychologists is that most people don’t think rationally. We respond more to emotion than to reason.

It is undoubtedly easy for someone unversed in rational or high-level thinking to make decisions based on guesswork if their primary decision tool is common sense. Isn’t common sense, at least as used by the average fact finder, infected with all the stereotyping, cognitive biases, and psychological booby traps underlying the very distortions of truth that sabotage correct outcomes? Doesn’t that overtake reasoning and lead to higher error rates in fact-finding and decision-making?

On the other hand, common sense, the good kind, lies in the domain of rational thinking. Telling non-rationalists to use common sense won’t endow them with it. Stripped of its window dressing, the “common sense” jury instruction is basically saying this:

“When you consider the evidence, you may have a gut reaction to it. Listen to your gut reaction. It has been honed by the totality of your own personal experience throughout the course of your life, whatever that may have been. That’s as good a barometer as any to evaluate the evidence.”

Is that truly how we want fact finders to decide cases? What might be common sense to one person, after all, might be nonsense to another. When politicians speak, we use fact-checkers to guard against the ill effects of voters applying their common sense. Why, then, do we cling to common sense in the courtroom, as if it were a beacon in the darkness, illuminating the pathway to truth?

Instead of telling jurors to use their common sense, judges should consider giving a different kind of “gut reaction” instruction, a warning really, sounding something like this:

“When you consider the evidence, you may have a gut reaction to it. To some of you, your gut reaction may seem like common sense. Resist the impulse to go with your gut reaction. Legal disputes are too important to be decided by impulse, gut reactions, or emotions. Instead, you should evaluate the evidence without any preconceived feelings or ideas about whether it is right or wrong or whom you might privately want to win this case. You should question whether there is any reason to disbelieve the evidence and whether you are looking at it fairly from the perspective of a neutral observer. Only after you’ve satisfied yourself that you gave each side a fair shake, evaluated the evidence impartially, and did so without any bias should you then form an opinion about what evidence to believe or disbelieve.”

It’s hard to say whether a gut reaction instruction like that would make a difference. Thinking systems aren’t easy to change. But if, at critical moments, we could remind decision makers to resist the impulse to see the evidence simplistically, then maybe truth will have a better chance of surfacing, justice will have a better chance of being done, and the world may well become a better place. And we wouldn’t have to wonder whether, when a doorbell rings, someone is there.
