Superforecasting: The Art and Science of Prediction explores how some individuals consistently outperform others in making accurate predictions about complex world events. Philip E. Tetlock and Dan Gardner draw on research from the Good Judgment Project to highlight the traits and methods that distinguish top 'superforecasters' from amateurs and experts alike. The book delves into the psychology of forecasting, the value of updating beliefs, and the importance of probabilistic thinking, offering practical guidance for anyone aiming to improve their predictive skills.
Embrace probabilistic thinking: Rather than thinking in certainties, break problems down into probabilities and constantly update your assessments as new information emerges.
Cultivate humility and openness: Acknowledge what you don't know, seek diverse viewpoints, and be ready to change your mind when warranted by evidence.
Practice and reflection improve judgment: Like any skill, forecasting gets better with deliberate practice, thoughtful feedback, and learning from both successes and failures.
The book was published in: 2015
AI Rating (from 0 to 100): 93
Superforecasters consistently revise their estimates when presented with new data. For example, when geopolitical situations change—such as negotiations between nations—the best forecasters don’t stick to their initial predictions but iteratively refine them, boosting accuracy.
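To make that updating concrete, here is a minimal sketch in Python (not from the book) of revising a forecast with Bayes' rule as evidence arrives; the prior, the likelihoods, and the negotiation scenario are illustrative assumptions.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Revise the probability of an event after seeing a piece of evidence.

    prior: current probability the event happens (e.g., a deal is signed)
    p_evidence_if_true: chance of seeing this evidence if the event happens
    p_evidence_if_false: chance of seeing it if the event does not happen
    """
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

# Illustrative numbers only: start at 30%, then news breaks that negotiators
# have scheduled another round of talks (more likely if a deal is coming).
p = 0.30
p = bayes_update(p, p_evidence_if_true=0.8, p_evidence_if_false=0.4)
print(round(p, 2))  # 0.46 -- the forecast moves, but not all the way to certainty
```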
Instead of relying on gut feelings, superforecasters break large questions into smaller, more manageable parts. When predicting whether a country will default on its debt within a year, some superforecasters consider sub-questions like political stability, economic trends, and recent precedents.
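As a rough illustration of that kind of decomposition, the sketch below rebuilds a single default probability from hypothetical sub-estimates using the law of total probability; the scenario and the numbers are invented for illustration, not taken from the book.

```python
# Hypothetical decomposition of "Will country X default within a year?"
# Each sub-estimate is judged separately, then combined with the law of
# total probability instead of answering the big question by gut feel.
p_crisis = 0.25                 # chance of a serious political crisis
p_default_given_crisis = 0.50   # chance of default if a crisis happens
p_default_given_calm = 0.05     # chance of default if things stay calm

p_default = (p_crisis * p_default_given_crisis
             + (1 - p_crisis) * p_default_given_calm)
print(f"{p_default:.0%}")  # 16% -- a single number rebuilt from its parts
```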
Superforecasters use a variety of information sources, from news reports to academic papers, then synthesize this data to form a balanced outlook. One example in the book describes forecasters predicting the fate of Syria by parsing conflicting news stories and triangulating between multiple perspectives.
Rather than expressing absolute certainty, top forecasters assign specific probabilities and publicly track their accuracy. For instance, a forecaster might say there's a 65% chance of a policy change—enabling meaningful self-review and improvement based on outcomes.
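The standard yardstick for this kind of self-review, and the one the Good Judgment Project used, is the Brier score: the squared gap between stated probabilities and what actually happened, where lower is better. The sketch below uses the simple one-sided binary form (half of the two-outcome version Tetlock reports), and the track record in it is made up.

```python
def brier_score(forecasts):
    """Mean squared error between stated probabilities and outcomes (0 or 1).

    forecasts: list of (probability, outcome) pairs. Lower is better;
    always saying 50% scores 0.25 on this one-sided binary form.
    """
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# Made-up track record: the 65% policy-change call resolved "yes", an 80% call
# resolved "no", and a 70% call resolved "yes".
history = [(0.65, 1), (0.80, 0), (0.70, 1)]
print(round(brier_score(history), 3))  # 0.284 -- worse than always saying 50%,
                                       # so the 80% miss deserves a post-mortem
```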
The book emphasizes that superforecasters often work in teams, challenging each other's assumptions and debating forecasts. This collaborative approach leads to more robust predictions, as illustrated by the Good Judgment Project’s team-based forecasting exercises.
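A simple way to combine a team's individual estimates, sketched below on invented numbers, is to average them and then push the average away from 50%, a step often called extremizing. This mirrors the spirit of the aggregation the Good Judgment Project found effective, though the exponent used here is an assumption rather than its exact algorithm.

```python
def aggregate(probabilities, a=2.0):
    """Average several forecasts, then extremize the mean.

    Plain averaging tends to pull a group's forecast toward 50%; raising the
    odds to a power a > 1 pushes it back out. The exponent is illustrative.
    """
    mean = sum(probabilities) / len(probabilities)
    return mean ** a / (mean ** a + (1 - mean) ** a)

team = [0.70, 0.60, 0.80, 0.65]   # four teammates' independent estimates
print(round(aggregate(team), 2))  # 0.83 -- sharper than the raw 0.69 average
```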
Superforecasters carefully analyze their inaccurate predictions, identifying what went wrong. The book gives an example of forecasters reviewing why they missed an important geopolitical turn and using those insights to refine their future approaches.
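A concrete way to run that kind of review, sketched here on an invented track record rather than data from the book, is a simple calibration report: bucket past forecasts by the probability stated and check how often those events actually happened.

```python
from collections import defaultdict

def calibration_report(forecasts):
    """Bucket (probability, outcome) pairs by the stated probability rounded
    to the nearest 10%, then compare it with how often the events happened.
    Large gaps flag the forecasts most worth a post-mortem."""
    buckets = defaultdict(list)
    for p, outcome in forecasts:
        buckets[round(p, 1)].append(outcome)
    for stated in sorted(buckets):
        outcomes = buckets[stated]
        observed = sum(outcomes) / len(outcomes)
        print(f"said {stated:.0%}: happened {observed:.0%} of the time "
              f"({len(outcomes)} calls)")

# Invented track record: the ~80% calls came true only half the time
# (overconfidence), while the ~30% calls were roughly right.
calibration_report([(0.8, 1), (0.8, 0), (0.78, 0), (0.82, 1),
                    (0.3, 0), (0.31, 1), (0.3, 0)])
```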
Thinking, Fast and Slow by Daniel Kahneman
AI Rating: 96
AI Review: A foundational book on behavioral economics, this work explains the two systems of thought that drive our judgments and decisions. Kahneman’s insights into cognitive biases and illusions are invaluable for anyone interested in improving their critical thinking and forecasting skills.
The Signal and the Noise by Nate Silver
AI Rating: 92
AI Review: Silver examines why predictions about politics, economics, and other fields go wrong and how to make them better. Clear, engaging, and packed with examples from real-world forecasting, this book complements the ideas in Superforecasting.
How to Measure Anything by Douglas W. Hubbard
AI Rating: 89
AI Review: Hubbard demystifies measurement and the role it plays in decision-making, showing how even the most intangible elements can be quantified. Useful for those interested in improving their analytical and forecasting skills.
Nudge by Richard H. Thaler and Cass R. Sunstein
AI Rating: 90
AI Review: Nudge explores how subtle changes in environment can influence our decision-making for better outcomes. It’s a great read for anyone looking to understand human behavior and cognitive biases.
Expert Political Judgment by Philip E. Tetlock
AI Rating: 91
AI Review: Tetlock’s earlier book rigorously evaluates the track records of expert political forecasters, setting the stage for Superforecasting. It reveals the surprising limitations of expert predictions and provides insight into how to do better.
Decisive by Chip Heath and Dan Heath
AI Rating: 87
AI Review: The Heath brothers break down the psychology of choice, offering practical strategies for making better decisions. Their accessible framework dovetails nicely with the lessons of Superforecasting.
Range by David Epstein
AI Rating: 88
AI Review: Epstein argues for breadth of knowledge and eclectic careers, showing how generalists are often better at handling uncertainty—an idea that resonates with the skills of superforecasters.
Factfulness by Hans Rosling
AI Rating: 90
AI Review: Rosling dismantles misconceptions about global trends with data and clear thinking, equipping readers to understand the world more accurately—critical for forecasters.
Future Babble by Dan Gardner
AI Rating: 86
AI Review: Gardner dissects why we put faith in unreliable forecasts and what it takes to think about the future more realistically. It builds on the skepticism found in Superforecasting.
by Nassim Nicholas Taleb
AI Rating: 92
AI Review: Taleb's exploration of luck, uncertainty, and the limitations of human knowledge is foundational for anyone seeking to avoid forecasting traps.
Antifragile by Nassim Nicholas Taleb
AI Rating: 89
AI Review: Taleb argues that some systems thrive on volatility and uncertainty, offering a provocative perspective on risk-taking and planning for the unknown.
The Art of Thinking Clearly by Rolf Dobelli
AI Rating: 85
AI Review: Dobelli’s digestible exploration of cognitive biases provides quick lessons in spotting and avoiding errors in reasoning—important for any aspiring forecaster.
by David Epstein
AI Rating: 88
AI Review: Epstein demonstrates why breadth of experiences and interests leads to better predictions, reinforcing the need for diverse perspectives in forecasting.
Noise by Daniel Kahneman, Olivier Sibony, Cass R. Sunstein
AI Rating: 88
AI Review: The authors explore the random variability (“noise”) in human judgments, and provide practical advice for reducing error in forecasting and decision-making.
Risk Savvy by Gerd Gigerenzer
AI Rating: 87
AI Review: Gigerenzer explains how people can cope with risk and uncertainty by developing practical tools for better decision-making, reinforcing themes from Superforecasting.
Predictably Irrational by Dan Ariely
AI Rating: 89
AI Review: Ariely demonstrates the surprising ways people behave irrationally, suggesting methods for counteracting these effects—a crucial read for anyone seeking reliable predictions.
Edge by Laura Huang
AI Rating: 84
AI Review: Huang explores how unconventional thinking and adaptability can turn challenges into wins. Her practical approach complements the adaptive mindset at the heart of Superforecasting.