The Actuary The magazine of the actuarial profession

Superforecasting: the art and science of prediction

Publisher: Random House Books

ISBN-10: 184794714X
RRP: £14.99

Superforecasting is one of those rare books that provides a substantial amount of useful advice in an engaging way, supported by many interesting case studies. The origin of the book lies in research by one of the authors (Philip Tetlock) into the accuracy of predictions. Disturbed at the generally poor predictive capabilities of the ‘experts’ involved, Tetlock considered how forecasts could be improved.

This led him to set up the Good Judgment Project, involving around 20,000 amateurs interested in trying their hand at forecasting. This group formed one of a number of teams in a forecasting tournament run by the US Intelligence body IARPA (Intelligence Advanced Research Projects Activity), which was trying to improve forecasting in the wake of the intelligence problems relating to 9/11, and Saddam Hussein’s (lack of) weapons of mass destruction in Iraq.

The large number of forecasters in the team meant Tetlock could experiment with various forecasting techniques and compare their results. This book is effectively a guide to the techniques that proved most effective – leading to his Good Judgment Project team beating rival teams (including professional intelligence analysts with access to classified data) by hefty margins. Space forbids a comprehensive account of all of those techniques – so do read the book. Instead, here’s a ‘representative sample’ of the main lessons.

Thinking about thinking: forecasters who had the benefit of Tetlock’s briefing instructions on better forecasting (circa one hour’s reading) scored around 10% higher than others – a modest improvement, but achieved at very little cost.

Break large problems down into components: probably a standard technique for most readers of The Actuary, and one the author refers to as ‘Fermi-izing’ – referring to Enrico Fermi’s challenge to his students to estimate the number of piano tuners in Chicago (best solved by considering components such as the number of pianos in Chicago and frequency of tuning).
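The piano-tuner estimate can be sketched in a few lines of Python; every figure below is an illustrative assumption of my own, not a number from the book:

```python
# Fermi estimation: break one big unknown into several guessable components.
# All inputs are illustrative guesses; the point is the decomposition.

def fermi_piano_tuners(
    population=2_500_000,          # rough population of Chicago
    people_per_household=2.5,
    pianos_per_household=0.05,     # say 1 in 20 households owns a piano
    tunings_per_piano_per_year=1,
    tunings_per_tuner_per_day=4,
    working_days_per_year=250,
):
    households = population / people_per_household
    pianos = households * pianos_per_household
    tunings_needed = pianos * tunings_per_piano_per_year
    tunings_per_tuner = tunings_per_tuner_per_day * working_days_per_year
    return tunings_needed / tunings_per_tuner

print(round(fermi_piano_tuners()))  # -> 50, an order-of-magnitude estimate
```

Each component guess may be wrong by a factor of two or so, but the errors tend partly to cancel, which is why the decomposed estimate usually beats a single gut-feel number.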

Perpetual beta: successful forecasters embody this in two respects – continual updating of their forecasts, when material new evidence arises (trying to strike a balance between over- and under-reacting to new information); and a growth mindset, of always wanting to be learning and improving personally.
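One common way to formalise "updating on material new evidence" is Bayes' rule; the book's forecasters rarely apply it as an explicit formula, so treat this as a minimal sketch with invented probabilities rather than the superforecasters' actual method:

```python
# Bayesian updating: revise a probability in proportion to how much more
# likely the new evidence is if the hypothesis is true than if it is false.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior P(hypothesis | evidence) via Bayes' rule."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# Start at 30%; observe evidence twice as likely under the hypothesis.
posterior = bayes_update(0.30, 0.8, 0.4)
print(round(posterior, 3))  # -> 0.462
```

The move from 30% to about 46% (rather than to 90%, or to no change at all) illustrates the balance the book describes between over- and under-reacting to new information.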

Teamwork: one interesting experiment split the forecasters into solitary workers and team workers. The concern that team forecasts might be jeopardised by groupthink and inefficient group dynamics (loudmouths and loafers) proved unfounded: the advantages of information sharing and mutual challenge led the team forecasters to outscore their non-team counterparts by a margin of over 20%.

Granularity: slightly surprisingly to my mind, forecasters who made very precise predictions did better than the ‘round numbers’ forecasters. This reflects not only the mindset of the individuals (those making more precise predictions seem to have been more assiduous in their thinking) but also genuine accuracy in their seemingly spurious precision: rounding the precise predictions before averaging them gave a worse answer than averaging the unadjusted predictions.
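The averaging point is easy to demonstrate with hypothetical numbers (these five forecasts are invented for illustration, not drawn from the book's data):

```python
# Five hypothetical forecasts of the same probability, clustered near 0.62.
forecasts = [0.61, 0.63, 0.62, 0.64, 0.58]

# Average of the precise forecasts retains the fine-grained information.
avg_precise = sum(forecasts) / len(forecasts)                        # 0.616

# Rounding each forecast to a 'round number' first collapses them all
# to 0.6, so the average loses the signal carried by the extra digit.
avg_rounded = sum(round(f, 1) for f in forecasts) / len(forecasts)   # 0.6

print(avg_precise, avg_rounded)
```

Here the rounded average discards the consensus lean towards 0.62, which is exactly the information loss Tetlock observed when he rounded his forecasters' precise predictions.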

In addition to the various sections relating to technique, the authors also consider the type of individual associated with good forecasting ability. Early on, the authors contend that: “What makes these superforecasters so good [is] not really who they are. It is what they do. Foresight isn’t a mysterious gift bestowed at birth. It is the product of particular ways of thinking, of gathering information, of updating beliefs. These habits of thought can be learned and cultivated by any intelligent, thoughtful, determined person.”

There is a slight sense of contradiction between this and later analysis of the optimal forecaster mindset, involving such attributes as humility (not self-doubt, but humility before an infinitely complex reality), active open-mindedness (“beliefs are hypotheses to be tested, not treasures to be protected”), and pragmatism (not to be wedded to any idea or agenda). Interestingly, advanced mathematical proficiency does not feature; good forecasters are numerate, and think probabilistically, but even those forecasters mentioned in the book with mathematical backgrounds use very simple methods to arrive at their results.

I recommend the book to anyone wishing to improve their predictive capabilities. It is particularly useful in the light of Solvency II’s focus on expert judgment, given how much more rigour insurers should be applying to this area than in the past. It is also a fascinating and entertaining read.

Matthew Edwards is a senior consultant at Towers Watson. He is a former editor of The Actuary