Superforecasting: A book review by Bob Morris

Superforecasting: The Art and Science of Prediction
Philip Tetlock and Dan Gardner
Crown Publishers (2015)

How to blend computer-based forecasting and subjective judgment to gain a better sense of what will likely occur

Obviously, computers can process, organize, and access more data faster than human beings can. However, at least for now, human beings outperform computers on tasks that demand flexibility and integrative judgment. In an MIT Urban Planning Report, Dancing with Robots, for example, Frank Levy and Richard Murnane observe: “The human mind’s strength is its flexibility—the ability to process and integrate many kinds of information to perform a complex task. The computer’s strengths are speed and accuracy, not flexibility, and computers are best at performing tasks for which logical rules or a statistical model lay out a path to a solution. Much of computerized work involves complicated tasks that have been simplified by imposing structure.”

Note: This insight goes back at least as far as Herbert Simon’s 1960 essay, “The Corporation: Will It Be Managed by Machines?,” in Management and Corporations 1985, M. L. Anshen and G. L. Bach, eds. (New York: McGraw-Hill, 1960), pp. 17–55.

As I began to read Superforecasting, I was again reminded that so-called “experts” working with a computer tend to make better predictions than can either a computer alone or a human being (or group) working without one. As was the case with Philip Tetlock’s previously published book, Expert Political Judgment: How Good Is It? How Can We Know?, he and Dan Gardner have collaborated on a book that is evidence-driven rather than theory-driven. That’s a key point. (Please see pages 291-328.) I agree with another reviewer, Dr. Frank Stechon, who suggests that Tetlock conclusively demonstrates two key points. First, the best experts at making political estimates and forecasts are no more accurate than fairly simple mathematical models of their estimative processes, which is yet another confirmation of what Robyn Dawes termed “the robust beauty of simple linear models.” Second, the inability of human experts to outperform models based on their own expertise has been demonstrated in over one hundred fields of expertise across fifty years of research, making it one of the most robust findings in social science.
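To make the Dawes point concrete, here is a minimal, hypothetical sketch in Python of the “bootstrapping” technique behind such findings: fit an ordinary least-squares linear model to an expert’s own past judgments, then let the model do the judging. The cue ratings, judgments, and the new case below are invented purely for illustration; this is a sketch of the general idea, not anything from Tetlock’s studies.

```python
# Hypothetical sketch of Dawes-style "bootstrapping": fit a simple
# linear model to an expert's own past judgments, then use the model
# in place of the expert. All numbers below are invented.
import numpy as np

# Each row: an expert's numeric ratings of three cues for one past case.
cues = np.array([
    [7.0, 3.0, 5.0],
    [4.0, 6.0, 2.0],
    [8.0, 2.0, 6.0],
    [3.0, 7.0, 1.0],
    [6.0, 4.0, 4.0],
])
# The expert's overall judgment for each of those cases.
judgments = np.array([6.2, 4.1, 7.0, 3.3, 5.4])

# Fit judgments ~ cues by ordinary least squares (last column = intercept).
X = np.column_stack([cues, np.ones(len(cues))])
w, *_ = np.linalg.lstsq(X, judgments, rcond=None)

# Score a new case with the fitted model of the expert's own process;
# the trailing 1.0 is the intercept term.
new_case = np.array([5.0, 5.0, 3.0, 1.0])
print("model's prediction:", new_case @ w)
```

A striking follow-on result from Dawes’s research is that the precise weights matter little: even “improper” models that weight standardized cues equally tend to perform nearly as well as the fitted ones, which is part of why simple linear models are so hard for human experts to beat.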

Tetlock and Gardner are convinced — and I agree — that “we will need to blend computer-based forecasting and subjective judgment in the future. So it’s time to get serious about both.” Obviously, superior judgment by an individual or group, blended with superior technology, is the ideal combination. In one of Tom Davenport’s recent books, Judgment Calls, he and co-author Brooke Manville offer “an antidote for the Great Man theory of decision making and organizational performance”: organizational judgment. That is, “the collective capacity to make good calls and wise moves when the need for them exceeds the scope of any single leader’s direct control.”

These are among the several dozen passages of greatest interest and value to me in Chapters 1-7, also listed to suggest the scope of Tetlock and Gardner’s coverage:

o The Skeptic (Pages 6-10)
o The Optimist (10-20)
o Blind Men Arguing (25-30)
o Thinking About Thinking (33-39)
o Blinking and Thinking (41-45)
o Judging Judgments (52-65)
o Expert Political Judgment, and, And the Results… (66-72)
o Resisting Gravity — But for How Long? (96-104)
o Fermi-Ize (110-114)
o Outside First (117-120)
o Thesis, Antithesis, Synthesis (121-124)
o Where’s Osama? (130-134)
o Probability for the Stone Age (137-140)
o Probability for the Information Age (140-143)
o But What Does It All Mean? (147-152)
o The Over-Under (156-158)
o Under, and, Over (159-166)

As indicated, the information, insights, and counsel that Philip Tetlock and Dan Gardner provide in this volume are based on rigorous and extensive research into the art and science of forecasting. While re-reading the book prior to setting to work on this brief commentary, I first re-read the Appendix, “Ten Commandments for Aspiring Superforecasters,” and I presume to suggest that those about to read the book for the first time do the same (I wish I had). This material provides a superb framework and frame of reference for the lively and eloquent narrative developed within twelve substantial chapters. Here are the appendix’s concluding remarks: “Guidelines [not predictions] are the best we can do in a world where nothing is certain or exactly repeatable. Superforecasting requires constant mindfulness, even when — perhaps especially when — you are dutifully trying to follow these commandments.”

In this context, I am again reminded of these words of caution expressed by Nassim Nicholas Taleb in The Black Swan: The Impact of the Highly Improbable: “It has been more profitable for us to bind together in the wrong direction than to be alone in the right one. Those who have followed the assertive idiot rather than the introspective wise person have passed us some of their genes. This is apparent from a social pathology: psychopaths rally followers.”

* * *

Those who share my high regard for this book are urged to check out the aforementioned MIT Urban Planning Report, Dancing with Robots, as well as these additional sources: Daniel Kahneman’s Thinking, Fast and Slow; Taleb’s aforementioned work, The Black Swan; and Nate Silver’s The Signal and the Noise: Why So Many Predictions Fail – But Some Don’t.
