The Signal and the Noise: Why So Many Predictions Fail–but Some Don’t
Penguin Books (2015)
This book was first published in 2012, at a time when Big Data (or, if you prefer, big data or BIG data) was only beginning to receive the attention it deserved as a better way to use analytics within and beyond the business world. One key point: big data must also be the right data, and in sufficient quantity. I recently re-read the book in its paperbound edition (2015). The quality and value of its insights have held up remarkably well.
In the years that followed publication of the first edition, as Nate Silver notes in the new Preface, the perception that statisticians are soothsayers was proven to be an exaggeration, at best, and a dangerous assumption, at worst. This new edition “makes some recommendations but they are philosophical as much as technical. Once we’re getting the big stuff right — coming to a better [i.e. more accurate and more reliable] understanding of probability and uncertainty; learning to recognize our biases; appreciating the value of diversity, incentives, and experimentation — we’ll have the luxury of worrying about the finer points of technique.”
In the Introduction to the First Edition, Silver observes, “If there is one thing that defines Americans — one thing that makes us exceptional — it is our belief in Cassius’ idea that we are in control of our own fates.” In this instance, Silver refers to a passage in Shakespeare’s play Julius Caesar, in which Cassius observes:
“Men at some time are masters of their fates.
The fault, dear Brutus, is not in our stars,
But in ourselves, that we are underlings.”
(Act 1, Scene 2, Lines 146-148)
Cassius’ assertion has serious implications and significant consequences. It is directly relevant to a theorem named after Reverend Thomas Bayes (1701–1761), who first provided an equation that allows new evidence to update beliefs in his “An Essay towards solving a Problem in the Doctrine of Chances” (1763). Silver suggests: “Bayes’s theorem is nominally a mathematical formula. But it is really much more than that. It implies that we must think differently about our ideas [predictions, for example] — and how to test them. We must become more comfortable with probability and uncertainty. We must think more carefully about the assumptions and beliefs that we bring to a problem.”
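The belief-updating Silver describes can be made concrete. Bayes’s theorem says the posterior probability of a hypothesis H given evidence E is P(H|E) = P(E|H)·P(H) / P(E). A minimal sketch in Python, with illustrative numbers of my own choosing (they do not come from the book):

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return the posterior P(H|E) via Bayes's theorem.

    prior            -- P(H), belief in the hypothesis before the evidence
    p_e_given_h      -- P(E|H), likelihood of the evidence if H is true
    p_e_given_not_h  -- P(E|~H), likelihood of the evidence if H is false
    """
    # Total probability of observing the evidence under either hypothesis.
    evidence = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / evidence

# Hypothetical example: a claim held at a 5% prior; the new evidence is
# 80% likely if the claim is true, 10% likely if it is false.
posterior = bayes_update(0.05, 0.80, 0.10)
```

Here the posterior rises to roughly 30% — the evidence strengthens the belief, but a skeptical prior keeps it well short of certainty, which is exactly the comfort with uncertainty Silver is urging.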
Silver cites another passage in Julius Caesar when Cicero warns Caesar: “Men may construe things, after their fashion / Clean from the purpose of things themselves.” According to Silver, man perceives information selectively, subjectively, “and without much self-regard for the distortions this causes. We think we want information when we want knowledge.” I take “want” to have a double meaning: lack and desire. Silver goes on to suggest, “the signal is the truth. The noise is what distracts us from the truth. This is a book about the signal and the noise…We may focus on those signals that advance our preferred theory about the world, or might imply a more optimistic outcome. Or we may simply focus on the ones that fit with bureaucratic protocol, like the doctrine that sabotage rather than an air attack was the more likely threat to Pearl Harbor.”
In their review of the book for The New Yorker (January 25, 2013), Gary Marcus and Ernest Davis observe: “Switching to a Bayesian method of evaluating statistics will not fix the underlying problems; cleaning up science requires changes to the way in which scientific research is done and evaluated, not just a new formula.” That is, we need to think about how we think so that we can make better decisions.
In Thinking, Fast and Slow, Daniel Kahneman explains how an easy question (“How coherent is the narrative of a given situation?”) is often substituted for a more difficult one (“How probable is it?”). And this, according to Kahneman, is the source of many of the biases that infect our thinking. Kahneman and Amos Tversky’s System 1 jumps to an intuitive conclusion based on a “heuristic” — an easy but imperfect way of answering hard questions — and System 2 lazily endorses this heuristic answer without bothering to scrutinize whether it is logical.
When an unprecedented disaster occurs, some people may feel at least some doubt that they are in control of their fate. Nate Silver offers this reminder: “But our bias is to think we are better at prediction than we really are. The first twelve years of the new millennium have been rough, with one unpredicted disaster after another. May we arise from the ashes of these events beaten but not bowed, a little more modest about our forecasting abilities, and a little less likely to repeat our mistakes.”
A Yiddish proverb suggests that man plans and then God laughs. The same could be said of man’s predictions.