The Good Judgment Project (GJP) is a project “harnessing the wisdom of the crowd to forecast world events.” It was co-founded by Philip E. Tetlock (author, with Dan Gardner, of Superforecasting, as well as of Expert Political Judgment: How Good Is It? How Can We Know?), decision scientist Barbara Mellers, and Don Moore.
The project participated in the Aggregative Contingent Estimation (ACE) program of the Intelligence Advanced Research Projects Activity (IARPA) in the United States. Its workforce consists of volunteers: after a simple screening, 3,200 were invited to participate and began forecasting. Predictions are scored using Brier scores. The top forecasters in GJP are “reportedly 30% better than intelligence officers with access to actual classified information.” Moreover, forecasters placed on GJP’s small “superteams” became 50% more accurate in their individual predictions.
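For readers unfamiliar with it, the Brier score is essentially the squared error of a probabilistic forecast: lower is better. Here is a minimal Python sketch of the original multi-outcome form used in forecasting tournaments; the function name and example numbers are mine, for illustration:

```python
def brier_score(forecast, outcome_index):
    """Original multi-outcome Brier score (Brier, 1950): the sum of
    squared differences between forecast probabilities and the
    realized outcome. Ranges from 0 (perfect) to 2 (maximally wrong).

    forecast: probabilities over mutually exclusive outcomes
              (should sum to 1).
    outcome_index: index of the outcome that actually occurred.
    """
    return sum(
        (p - (1.0 if i == outcome_index else 0.0)) ** 2
        for i, p in enumerate(forecast)
    )

# A forecaster who put 70% on the event that occurred:
print(brier_score([0.7, 0.3], 0))   # 0.18
# A maximally confident forecast that was wrong:
print(brier_score([1.0, 0.0], 1))   # 2.0
```

Because the score punishes confident errors so heavily, it rewards forecasters who are both well calibrated and appropriately decisive.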
According to its website, the Good Judgment Project’s independent, non-partisan research has “discovered four keys to accurate forecasting: talent-spotting, training, teaming, and aggregation. Our scientifically validated process uses these four keys to unlock the secret to better forecasts – and better decisions.” More specifically:
- Talent-spotting: “Our evidence-based techniques reliably identify the most accurate forecasters.”
- Training: “Combining our training methods with deliberate practice increased accuracy by 10-12% in Good Judgment research.”
- Teaming: “Forecasters working in teams outperform those working alone or in prediction markets.”
- Aggregation: “Our state-of-the-art aggregation methods extract the maximum possible signal from the noise of crowd-sourced forecasts.”
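To make the aggregation idea concrete, here is a toy Python sketch of one simple baseline from the published forecasting literature: average the individual forecasts in log-odds space, then “extremize” the pooled estimate away from 0.5. This illustrates the general approach only; it is not Good Judgment’s proprietary algorithm, and the function name and extremizing constant are my own assumptions:

```python
import math

def aggregate_probabilities(probs, extremize=2.5):
    """Toy crowd aggregation: average individual probability forecasts
    in log-odds space, then sharpen ("extremize") the consensus.
    A common baseline in the forecasting literature, not GJP's
    actual method. Inputs must be strictly between 0 and 1.
    """
    logits = [math.log(p / (1 - p)) for p in probs]
    pooled = extremize * sum(logits) / len(logits)
    return 1 / (1 + math.exp(-pooled))

# Five forecasters leaning the same way yield a sharper consensus
# (~0.84) than their simple average (~0.66):
print(round(aggregate_probabilities([0.6, 0.65, 0.7, 0.6, 0.75]), 3))
```

The intuition behind extremizing is that independent forecasters each see only part of the evidence, so when they agree, the pooled forecast can justifiably be more confident than any individual one.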
Selected Good Judgment publications on the science of Superforecasting include:
- What makes foreign policy teams tick? Explaining variation in group performance in geopolitical forecasting. Journal of Politics, in press.
- Forecasting tournaments, epistemic humility and attitude depolarization. Cognition, in press.
- Small steps to prediction accuracy (2019). ResearchGate (online).
- The value of precision in probability assessment: Evidence from a large-scale geopolitical forecasting tournament (2018). International Studies Quarterly, 62(2), 410-422.
- Robust forecast aggregation: Fourier L2E regression (2018). Journal of Forecasting, 37(3), 259-268.
- Restructuring structured analytic techniques in intelligence (2018). Intelligence and National Security, 33(3), 337-356.
- Correcting judgment correctives in national security intelligence (2018). Frontiers in Psychology, 9, 2640.
- Partial information framework: Model-based aggregation of estimates from diverse information sources (2017). Electronic Journal of Statistics, 11(2), 3781-3814.
- An IRT forecasting model: Linking proper scoring rules to item response theory (2017). Judgment and Decision Making, 12(2), 90-103.
- Assessing objective recommendation quality through political forecasting (2017). Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, 2348-2357.
I highly recommend Philip Tetlock and Dan Gardner’s classic work, Superforecasting: The Art and Science of Prediction, whose second edition was published by Broadway Books in September 2016.