Nate Silver’s The Signal and the Noise: Why So Many Predictions Fail but Some Don’t

Vol. 6 No. 5

By Donna Suchy

Donna Suchy is the principal patent attorney at Rockwell Collins, responsible for developing and implementing corporate patent strategy, managing litigation and due diligence, and advising corporate management on the legal and operational implications of patent law.

The survival of Homo sapiens has depended on our ability to analyze quickly, and then act upon, the plethora of data that continuously assaults our senses. Just as noticing a crouching tiger in a bamboo thicket can be a matter of life or death, in our modern world we try to make sense of a barrage of information and sensory stimulation by envisioning patterns that enable us to make predictions. Whether we infer by interpolating between data sets or extrapolate to predict a future event such as an earthquake, we need to see the relevant data—the signal—and discard the rest—the noise. The field of statistics provides many tools that help us make useful and likely predictions about modern problems and discard the fallacious inferences we may construct from intuition and apparent obviousness. Nate Silver’s book, The Signal and the Noise: Why So Many Predictions Fail but Some Don’t, applies one of these statistical tools, the Bayesian method,1 to phenomena ranging from gambling and sporting events to earthquakes and weather forecasts. He succeeds in bringing statistical analysis alive—and front and center in our thinking.

“Life must be lived and curiosity kept alive. One must never, for whatever reason, turn his back on life.”

—Eleanor Roosevelt

The book addresses all of us in this increasingly data-burdened society, and its observations apply especially well to the legal profession, which uses data to drive everything from litigation to legal theory. Silver reminds us of something often forgotten: there are challenges not only in discovering facts but also in interpreting them. The data sorting that pervades our lives is as important as mastery of the devices that deliver the data. Perhaps, in the future, statistics should be required study for every child. Silver’s book represents a mindset that might allow us all not only to survive but to succeed in a world of seemingly infinite data—and it advocates the use of a well-thought-out Bayesian method when predicting future events.

One of the key impressions left by this book is that lawyers should be far more interested in statistics than one might think. A curious mind that applies statistical tools correctly can better interpret the apparently chaotic natural events all around us. While legal professionals are curious by nature, curiosity alone cannot provide all the answers in a modern world of ceaseless data. There is a strong tension between the basic, indisputable need to ground the interpretation of myriad pieces of information in sound method and the contradictions that arise when a statistical tool is used inappropriately. And a host of problems occur when historical occurrences are used to predict future events without adequate statistical support.

Silver also describes some problems with the Bayesian modeling approaches that have made him known for successful prediction—for example, forecasting the performance of baseball players and the outcome of political events. The law is no stranger to these methods: even the implicit, intuitive interpretation of the multitude of legal data that lawyers sift through daily relies on a kind of statistical modeling. It is important to recognize and understand these data systems in order to avoid errors of false inference.

A formal statistical prediction can be thought of as having two elements: data analysis and human judgment. An especially interesting part of the book is Silver’s example of how weather forecasters combine judgment with model output in order not to disappoint the public. If there is a 25 percent chance of rain, forecasters will report a higher percentage: an unexpectedly sunny day is more easily forgiven than rain on a day forecast to be clear. This is a form of the “expert expertise” hypothesis governing the interaction between different elements of prediction—mathematical modeling cannot wholly replace human judgment. Silver notes that weather predictions combining judgment with computation are up to 25 percent more accurate than those that rely on computer programs alone.

Although Silver’s claims seem reasonable, his use of data modeling to simplify complex problems may not extend to litigation, especially intellectual property litigation. There, discovery is an iterative process, and the fact that juries interpret those facts further magnifies the task. Statistical methods like predictive Bayesian modeling may help, but they hardly guarantee consistent and predictable results. In such a setting, “expert expertise” becomes critical and takes precedence over the book’s data-driven methods of deriving understanding.

As in all areas of life and law, the type of data-driven predictions Silver advocates succeed or fail based on how they are applied, and the individual cannot deny his or her role in the process. As the author notes in his introduction, “Bayes’ theorem is nominally a mathematical formula. But it is really much more than that. It implies that we must think differently about our ideas.” This 250-year-old theorem, taught in the first weeks of introductory college statistics, may contain radical insights, but it is still only a partial solution to the very large problems Silver enumerates.

Bayesian predictive methods are especially useful when the prior probabilities of a given outcome are clear, such as the pervasive gambling examples we all saw in our math probability classes. Silver provides an excellent chapter on gambling prediction that includes a step-by-step description of the use of probabilistic reasoning in placing bets while playing a hand of Texas Hold’em. These predictions take into account: the probabilities on the cards that have been dealt and that will be dealt; the information about opponents’ hands that you can glean from the bets they have placed; and your general understanding of what kind of players are in front of you—whether aggressive, cautious, naïve, or clever. Such statistical methods clearly have a place in prediction of gambling outcomes.
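As a rough illustration of the kind of updating Silver describes at the poker table (the hand categories, prior, and bet likelihoods below are hypothetical numbers chosen for demonstration, not figures from the book), a Bayesian revision of one’s beliefs about an opponent’s hand after a large raise might be sketched as:

```python
# Hypothetical prior belief about the opponent's hand strength,
# based on one's read of the player before the bet.
prior = {"strong": 0.2, "medium": 0.5, "weak": 0.3}

# Hypothetical likelihood of observing a large raise given each hand type.
likelihood_of_raise = {"strong": 0.7, "medium": 0.3, "weak": 0.1}

def bayes_update(prior, likelihood):
    """Apply Bayes' theorem: posterior is proportional to likelihood x prior."""
    unnormalized = {h: likelihood[h] * prior[h] for h in prior}
    total = sum(unnormalized.values())  # probability of the observed raise
    return {h: p / total for h, p in unnormalized.items()}

# After seeing the raise, the probability of a strong hand rises
# from 0.20 to about 0.44.
posterior = bayes_update(prior, likelihood_of_raise)
```

Each new piece of evidence—another bet, a revealed card—can be folded in the same way, with the posterior from one round serving as the prior for the next.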

But the Bayesian approach is much less helpful when there is little or no knowledge about prior events, let alone their associated probabilities. Unfortunately, the book’s discussion of alternatives to the Bayesian approach in these circumstances is incomplete and misleading. In some cases, Silver attributes successful reasoning to the use of Bayesian methods without any evidence that those particular analyses were actually performed in Bayesian fashion.

Silver discusses at length an important and troubling paper by John Ioannidis, “Why Most Published Research Findings Are False.” It was concerning to this patent lawyer when Silver implied that the problems associated with the publication of inconclusive or poor research could be solved by using the Bayesian approach rather than conventional, frequentist methods such as Fisher’s test. Silver writes that conventional “methods discourage the researcher from considering the underlying context or plausibility of his hypothesis, something that the Bayesian method demands in the form of a prior probability. Thus, you will see apparently serious papers published on how toads can predict earthquakes” [page 253]. The author disregards, however, that NASA’s 2011 study of toads was a scientifically rigorous, thoughtful analysis of groundwater chemistry. It was prompted by a group of toads abandoning a lake in Italy a few days before an earthquake, and it resulted in a theory that ionospheric disturbance and water composition may be correlated with tectonic movement. Silver neglects the fundamental scientific principle that correlation does not imply causation: toads need not be conscious of an earthquake to be useful in its accurate prediction. If a given correlation is strong enough, and specific enough, its predictive value may persist irrespective of any causative factor—biologic, geologic, or other.

The real reason that too many published studies are false, and that we cannot rely on statistics without human critical thinking, is that—as Silver himself explains in the book—statistics can easily be misused. Whether from lack of understanding or lack of care, the underlying problem is mountains of data with varying degrees of suitability for Bayesian methods. Inappropriate use of any method, whether revolutionary or centuries old, will result in false inference, and ultimately in a loss of trust on the part of any audience: legal, scientific, or popular. Simply switching to Bayesian methods without careful consideration of their utility and proper implementation will only add to the problem. Instead, the problem must be resolved the old-fashioned way: by critically questioning what we are given, examining our assumptions, and applying our expertise to the issue at hand. Cleaning up the body of scientific publications and finding “the truth” requires constant reevaluation. Unthinking application of any mathematical method, even a very good one, will only compound the problem.

It is perfectly reasonable for Silver to prefer the Bayesian approach, and in his book he makes a good case for its utility in many areas. However, the case for preferring this approach over all others is far weaker than Silver claims, and there is no reason to think it amounts to a “think differently” revolution. The Signal and the Noise is a terrific and useful introduction to Bayesian methods, highlighting a number of interesting perspectives that may be overlooked by someone not classically trained in statistics. But, contrary to Silver’s apparent central premise, it will take much more than Bayesian methods to solve the many challenges of our world.

Endnotes

1. The Bayesian method is a statistical method that assigns probabilities or distributions to events or parameters based on experience or best guesses before experimentation and data collection. It is based on Bayes’ theorem of conditional probabilities: the probability that an event A occurs given that another event B has occurred equals the probability that B occurs given A, multiplied by the probability of A and divided by the probability of B. See Merriam-Webster’s Dictionary 99 (11th ed. 2003); see also International Society for Bayesian Analysis, http://bayesian.org/Bayes-Explained (last visited March 7, 2014).
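Stated symbolically (a direct rendering of the verbal definition in the endnote, not notation taken from the article):

```latex
P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}
```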
