This is a summary review of Superforecasting containing key details about the book.
What is Superforecasting About?
Superforecasting offers a masterwork on prediction, drawing on decades of research and the results of a massive, government-funded forecasting tournament. The book offers the first demonstrably effective way to improve our ability to predict the future—whether in business, finance, politics, international affairs, or daily life—and is destined to become a modern classic.
Who is the Author of Superforecasting?
Dan Gardner is the New York Times best-selling author of books about psychology and decision-making. His work has been called "an invaluable resource for anyone who aspires to think clearly" by The Guardian and "required reading for journalists, politicians, academics, and anyone who listens to them" by Harvard psychologist Steven Pinker.
Philip E. Tetlock is a bestselling author. He is a Canadian-American political science writer, and is currently the Annenberg University Professor at the University of Pennsylvania, where he is cross-appointed at the Wharton School and the School of Arts and Sciences. He was elected a Member of the American Philosophical Society in 2019.
What are the main summary points of Superforecasting?
Here are some key summary points from the book:
- Prediction is a skill that can be developed. Anyone with enough dedication, interest, and domain expertise can improve their accuracy.
- Our complex world means that small events can lead to large unforeseen consequences making regular forecasting rather limited. This, however, does not mean that forecasting should be scrapped. Just because not everything is predictable, doesn’t mean everything is unpredictable.
- Superforecasting is driven by curiosity, the desire to learn, the ability to gather information, and the willingness to change and update our beliefs.
- Superforecasters are less ideologically (or professionally) biased. They seek data from a wide range of sources to examine future trends. They are open-minded and have less fear of making mistakes.
- Superforecasters embrace probabilistic thinking and understand how statistical probability works.
- As forecasters, we want to measure the accuracy of our forecasts in order to improve our forecasting skills. We also want to adjust our forecasts as new information comes to light.
- Forecasters should avoid vague language like “might”, “could”, and “likely” because different people attach different meanings to these words; it’s far better to use numbers, particularly percentages.
- Seemingly impossible forecast problems can be tackled by breaking them down into smaller, bite-size units to analyze.
- Every situation is unique so don’t judge a case too quickly. Approach it from the outside by finding the base rate first.
- It’s always wise to plan for adaptability and resilience.
- Many forecasters tend to give advice that is too certain because consumers have an inherent distaste for uncertainty.
- Declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true.
- In forecasting, often, a small edge can make a big difference.
- The more that is unknown, the greater the opportunity.
- Consensus is not always good; disagreement is not always bad. If you do happen to agree, don’t take that agreement, in itself, as proof that you are right. Never stop doubting.
- Beliefs are hypotheses to be tested, not treasures to be guarded.
- We humans have a knack for taking a series of past events and turning them into a linear narrative that makes the outcome seem all but inevitable.
- Using multiple lenses to view a subject can provide a more detailed picture and yield greater understanding and better forecasting.
- One of the keys for better forecasting is to embrace diverse thoughts, expose assumptions, catch mistakes, and correct biases.
- People who work in teams tend to make more accurate forecasts than individuals. However, it’s key that teams include critical discussion and that someone presents an alternative view.
- The goal of forecasting is not to see what’s coming. It’s to advance the interests of the forecaster and the forecaster’s community.
- Anchoring bias: “When we make estimates, we tend to start with some number and adjust. The number we start with is called the anchor. It is important because we typically underadjust, which means a bad anchor can easily produce a bad estimate.”
What are key takeaways from Superforecasting?
Takeaway #1. Know The Limitations of Forecasting
Our forecasts reflect our expectations for the future, but in our complex world small events can lead to large, unforeseen consequences, which makes regular forecasting rather limited.
Take the Arab Spring, for example: the uprisings started after one Tunisian street vendor set himself on fire after being humiliated by police officers. Forecasters cannot predict events like this because of chaos theory, also known as “the butterfly effect”. These limitations don’t mean that forecasting should be scrapped, though.
Takeaway #2. Measure and Update Your Forecasts
Weather forecasts are quite reliable when made a few days in advance because forecasters compare their forecasts with the weather that actually occurred. This feedback allows them to improve both their forecasts and their understanding of how the weather works. Unfortunately, forecasters in other fields do not usually measure the accuracy of their forecasts, so to improve forecasting we have to get serious not only about making predictions but about measuring and updating them.
You can also use the Brier score to see how accurate past forecasts were. The lower the score, the more accurate the forecast: a perfect forecast scores 0, random guessing scores 0.5, and a completely wrong forecast gets the maximum score of 2.0. Interpreting the score depends on the question being asked; a Brier score of 0.2 may seem great, yet the forecast could still be terrible.
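The numbers above follow the two-outcome convention used in the book, where the squared error is summed over both possible outcomes. A minimal sketch of that calculation (the function name is my own, not from the book):

```python
def brier_score(forecast_p, outcome):
    """Brier score over two outcomes (range 0 to 2; lower is better).

    forecast_p: probability assigned to the event happening (0 to 1).
    outcome: 1 if the event happened, 0 if it did not.
    """
    p_yes, p_no = forecast_p, 1 - forecast_p
    o_yes, o_no = outcome, 1 - outcome
    # Sum of squared errors across both outcomes.
    return (p_yes - o_yes) ** 2 + (p_no - o_no) ** 2

print(brier_score(1.0, 1))  # perfect forecast -> 0.0
print(brier_score(0.5, 1))  # coin-flip guess  -> 0.5
print(brier_score(0.0, 1))  # completely wrong -> 2.0
```

This reproduces the book's reference points: 0 for a perfect forecast, 0.5 for a pure guess, 2.0 for maximal error.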
Takeaway #3. Be Precise with Your Measurements
You might think that measuring forecasts is as simple as gathering the data and doing the calculations to judge their accuracy, but it’s not quite that simple: you also have to understand the meaning behind the original forecast.
For increased precision, forecasters should avoid vague language like “might”, “could”, and “likely” because different people attach different meanings to these words; it’s far better to use numbers, particularly percentages.
The false claim that Saddam Hussein had weapons of mass destruction came from U.S. intelligence agencies, and it is an example of where percentages rather than words should have been used. If the forecast had said there was a 60% chance that Iraq had weapons of mass destruction, it would also have conveyed a 40% chance that it didn’t, hardly enough certainty to justify the U.S. invading Iraq.
Takeaway #4. Break Down Problems
How do you eat an elephant? One bite at a time! Seemingly impossible forecast problems can be tackled the same way, not by eating them but by applying Fermi-style thinking: breaking problems down into smaller, bite-size units to analyze. Effective superforecasters separate the known from the unknown, getting to grips with the basics before examining their assumptions.
Let’s look at the forecast about the cause of death of Yasser Arafat, leader of the Palestine Liberation Organization. Initially his cause of death was unknown, but it was later discovered that lethally high levels of polonium-210 were present on his belongings. Conspiracy theories had long mentioned poison, and now it seemed a very likely cause of death. His body was exhumed and examined in France and Switzerland, and volunteer forecasters on the Good Judgment Project were asked to predict whether scientists would find elevated levels of polonium in Arafat’s remains. One volunteer concluded there was a 60% chance, thanks to the Fermi-style approach he used to break down the information he had. Polonium decays rapidly, so the chance of it still being detectable in Arafat’s remains eight years after his death was thought to be slim; the forecaster did his own research, however, and discovered that it could still be detected. He also took into account that Arafat had Palestinian enemies, and that there could have been foul play during the postmortem by those who wanted to blame Israel for Arafat’s death. Weighing all this, he came to his 60% conclusion.
Takeaway #5. Don’t Jump To Conclusions
Don’t judge a case too quickly, as every situation is unique. Approach it from the outside first by finding the base rate, the initial figure that serves as your anchor.
Let’s say you want to predict the chances of a particular Italian family owning a pet. A regular forecaster would look at the specific facts about the family first: they live in the U.S. in a modest house, the father is a bookkeeper, and the mother is a part-time childcare provider. They have one child, and the grandmother lives in the same house.
Superforecasters don’t start with the inside facts, the details; they go to the outside view first, discovering what percentage of American households own a pet (the base rate). Thanks to Google, they quickly learn the base rate is 62%. Only then does the superforecaster turn to the inside view, taking the family dynamics into consideration, which allows them to get more specific and adjust the base rate accordingly. Knowing that the family is Italian, their next step might be to check the percentage of Italian families in the U.S. that own a pet.
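The outside-then-inside process can be sketched as starting from the base rate and nudging it with inside-view considerations. The adjustment values below are invented for illustration; only the 62% base rate comes from the book's example:

```python
# Outside view first: the base rate anchors the estimate.
base_rate = 0.62  # share of U.S. households owning a pet (from the book's example)

# Inside view second: judgment-call adjustments for this family's specifics.
# These deltas are hypothetical, chosen only to illustrate the mechanics.
adjustments = {
    "modest house, limited space": -0.05,
    "one child who may want a pet": +0.08,
}

forecast = base_rate + sum(adjustments.values())
print(round(forecast, 2))  # 0.65
```

The key design point is the ordering: the anchor comes from population-level data, and the case-specific details only move the estimate away from that anchor.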
Takeaway #6. Stay Up To Date
Once your initial forecast is made, it’s no good sitting back to wait and see if you were right; you’ve got to adjust your forecast as new information comes to light.
This is exactly what Bill Flack, the volunteer who forecast a 60% chance that polonium would be detected in Yasser Arafat’s remains, did. By keeping an eye on the news surrounding the case (you can also set a Google Alert), he updated his forecast as new data arose. Long after Flack made his original forecast, the official Swiss research team announced that they needed to do additional testing. Because Flack had done a lot of research on polonium, he knew that the delay meant the Swiss team had found polonium and needed further tests to confirm its source. He adjusted his original forecast from 60% to 65%. The Swiss team did find polonium, and Flack’s final Brier score came out at an impressive 0.36.
This case study doesn’t mean that new information is always helpful, though; it can also mislead you. To update a forecast, you must use your skill to tease out the subtle, relevant details from the extraneous information. At the same time, you shouldn’t be afraid to change your mind and update your forecast.
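One common way to make such updates disciplined is Bayesian-style updating: convert the current forecast to odds, multiply by how much more likely the new evidence is under "yes" than under "no", and convert back. The book doesn't prescribe this exact procedure, and the 1.24 likelihood ratio below is a hypothetical value chosen to land near Flack's revised 65%:

```python
def bayes_update(prior_p, likelihood_ratio):
    """Update a probability given a likelihood ratio for new evidence.

    likelihood_ratio > 1 means the evidence favors the event;
    < 1 means it counts against it.
    """
    prior_odds = prior_p / (1 - prior_p)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

prior = 0.60
# The Swiss team's request for more testing is treated as weak evidence
# that polonium was found; 1.24 is an invented ratio for illustration.
posterior = bayes_update(prior, 1.24)
print(round(posterior, 2))  # 0.65
```

Note the modest size of the move: weak evidence should produce a small update, which matches the 5-point shift in the case above.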
Takeaway #7. Working In Teams Can Be Beneficial
The term groupthink was coined by psychologist Irving Janis, who hypothesized that people working in small groups build team spirit by unconsciously creating shared illusions; because members keep agreeing with one another, critical thinking is disrupted.
The research team on the Good Judgment Project decided to see whether teamwork would influence accuracy. After one year, they concluded that, on average, people working in teams made forecasts that were 23% more accurate than forecasts made by individuals. In the second year, the project put superforecasters into teams of their own and found that they outperformed the regular forecasters tremendously, but the group dynamics suffered: everyone was overly polite, there was very little critical discussion, and no one was willing to present an alternative view.
To combat this problem, teams can work more effectively by using precision questioning, a method that encourages people to rethink their position by spelling out its finer details, for example by asking them to define a particular term.
Remember, by not conforming, your ideas can be a tremendous asset to the group.
- Print length: 352 Pages
- Audiobook: 9 hrs and 45 mins
- Genre: Nonfiction, Business, Science
What are the chapters in Superforecasting?
1. An Optimistic Skeptic
2. Illusions of Knowledge
3. Keeping Score
4. Superforecasters
5. Supersmart?
6. Superquants?
7. Supernewsjunkies?
8. Perpetual Beta
9. Superteams
10. The Leader's Dilemma
11. Are They Really So Super?
12. What's Next?
What are good quotes from Superforecasting?
"The more that is unknown, the greater the opportunity."
“For superforecasters, beliefs are hypotheses to be tested, not treasures to be guarded.” (Meaning)
What do critics say?
Here's what one of the prominent reviewers had to say about the book: "One of Tetlock's key points is that these aren't innate skills: they can be both taught and learned... Tetlock's 'Ten Commandments For Aspiring Superforecasters' should probably have a place of honor in most business meeting rooms." — Forbes
* The summary points above have been drawn from the book and other public sources. The editor of this summary review made every effort to maintain information accuracy, including any published quotes, chapters, or takeaways.
Tal Gur is an author, founder, and impact-driven entrepreneur at heart. After trading his daily grind for a life of his own daring design, he spent a decade pursuing 100 major life goals around the globe. His journey, and his most recent book, The Art of Fully Living, led him to found Elevate Society.