Whether they realise it or not, most actuaries are at risk of some form of bias in their modelling work. Michael Ortmann explores the most common pitfalls
"You have been biased!" - what an unpleasant rebuke for an actuary. Yet, this characterisation may not be too far-fetched. Exhibiting some form of bias is all too human, but hardly any quantitative analyst would admit to it.
Risk analysis and risk modelling have become a core competence of actuaries. In particular, actuaries make numerous predictions about the future with respect to pensioners' longevity, mortality and morbidity incidence rates, and solvency capital requirements, to name but a few. At the same time, the traps an actuary can easily fall into give rise to particular concern.
Perception versus reality
Perception is a hurdle in its own right.
A recent 'Reality Cheque' survey conducted by Hymans Robertson showed that individuals underestimate life expectancy at retirement by some five to eight years on average (Figure 2, below). This is possibly a function of personal anchor points, the inability to conceptualise living in old age, and the media reporting 'period' rather than 'cohort' life expectancies (the former not allowing for future mortality improvements).
As expected, the perception gap has been shown to close as individuals consider life expectancy at older ages. This requires us to take different approaches for different age groups.
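The period-versus-cohort distinction can be made concrete with a toy calculation. The sketch below uses an illustrative Gompertz-style mortality curve and an assumed 2% annual improvement rate - neither is a real table value - and shows that the cohort figure, which allows for future mortality improvements, exceeds the period figure.

```python
# Toy period vs cohort life expectancy at age 65.
# The Gompertz-style mortality curve and the 2% p.a. improvement
# rate are illustrative assumptions, not real table values.

def life_expectancy(age, base_qx, improvement=0.0):
    """Curtate life expectancy from one-year death probabilities.
    With improvement > 0, q_x falls each future calendar year,
    which is the 'cohort' view."""
    e, surv = 0.0, 1.0
    for t in range(121 - age):
        q = min(1.0, base_qx(age + t) * (1.0 - improvement) ** t)
        surv *= 1.0 - q
        e += surv
    return e

base_qx = lambda x: min(1.0, 0.00005 * 2.7 ** (x / 10))  # toy q_x curve

period = life_expectancy(65, base_qx)         # no future improvements
cohort = life_expectancy(65, base_qx, 0.02)   # 2% p.a. improvement
print(f"period: {period:.1f} years, cohort: {cohort:.1f} years")
```

Quoting a period figure to a 65-year-old understates how long today's retirees are actually expected to live, which is one mechanical source of the perception gap above.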
10 Effort matters
It is unheard of for any capital actuary to have been fired for inaccurate projections of the future. In general, there is no reward - and no reprimand either - for forecasts duly conducted, unless there is gross negligence at work. On the contrary, remuneration and bonuses are based on enhancing the risk framework by building internal Solvency II models, applying state-of-the-art statistical techniques and encompassing ever more risks. In other words, effort matters. The incentive bias refers to the misalignment of incentives and desired outcomes.
9 Who's in charge?
Problems can arise if the actuary has a superior who exerts undue pressure and influence over the actuary's work. Sadly, unquestioning acceptance of authority implies fatally simplified thinking. This bias refers to being overly submissive to authority, to the point that it affects the actuary's impartial modelling. What happens if the company is not able to afford the solvency capital that the risk actuary deems necessary? Not surprisingly, the dangers for the profession invoked by authority bias are part of the IFoA's professionalism courses.
8 Telling stories
Whatever the outcome of a forecast, the actuary's superiors need a story to buy into it. A convincing story that explains projected results and renders them plausible is vital. In fact, it is often seen as more important than the forecast itself. The actuary puts results into a commercial context. The story bias refers to the tendency to be more easily convinced by a narrative than by numbers. In an extreme scenario, management may buy into a capital requirement figure that is not robust, simply because there is a convincing story to support it.
7 Anchoring to the past
A seasoned Solvency II actuary carefully weighs up projected results against previous forecasts and market benchmarks. The anchor bias postulates that we always relate new results to past established results. In fact, an actuary has to come up with a good reason to overwrite anything key stakeholders have bought into previously. As a consequence, past mistakes that have become entrenched are very difficult to correct.
6 Does the answer look right?
In any stochastic forecast, for example, there is an abundance of parameters that the actuary in charge needs to set carefully. Most practical actuaries would concur that modelling is a mix of art and science. Quite often, in a process of tinkering and twiddling with the assumptions, the actuary arrives at a projection that looks reasonable. This tendency is called outcome bias: the actuary risks judging a model on the basis of its results. Not surprisingly, the best guess as well as likely outcomes will comply with mainstream forecasts.
5 Where's my data?
When data is scarce, capital model actuaries risk making incorrect assumptions about the applicability of the normal distribution, stochastic independence, correlation matrices, the Black-Scholes formula, and so on. Likewise, actuaries tend to use data that is readily available in order to evaluate risks for which data is scarce, such as operational risk. As a consequence, results have to be taken with a pinch of salt. Any insights gained from such a model exhibit the availability bias.
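To see why a casually assumed normal distribution can mislead in the tail, the Monte Carlo sketch below compares the 99.5th percentile - the Solvency II confidence level - of a normal distribution with that of a heavier-tailed Student-t with three degrees of freedom, rescaled to the same unit variance. The degrees of freedom and sample size are arbitrary assumptions chosen purely for illustration.

```python
import random

# Monte Carlo sketch: 99.5th percentile of a normal distribution vs a
# rescaled Student-t(3) with the same unit variance. The degrees of
# freedom and sample size are arbitrary illustrative choices.
random.seed(1)
N = 200_000

def quantile(xs, p):
    return sorted(xs)[int(p * len(xs))]

def t3():
    # Student-t(3) sampled as Z / sqrt(chi-square(3) / 3)
    z = random.gauss(0.0, 1.0)
    chi2 = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(3))
    return z / (chi2 / 3.0) ** 0.5

normal = [random.gauss(0.0, 1.0) for _ in range(N)]
heavy = [t3() / 3.0 ** 0.5 for _ in range(N)]  # rescale: Var(t_3) = 3

q_normal = quantile(normal, 0.995)  # close to the textbook 2.58
q_heavy = quantile(heavy, 0.995)    # noticeably larger
print(f"normal: {q_normal:.2f}, heavy-tailed: {q_heavy:.2f}")
```

Both distributions have the same mean and variance, yet the heavy-tailed quantile is materially larger - exactly the gap a capital requirement calibrated to normality would miss.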
4 Lots of effort, must be right!
Actuarial models are becoming ever more sophisticated. The complexity bias refers to the tendency to over-value the labour that has gone into a forecast model. Most of us put less faith in simple plausibility calculations than in complex computations that have required a good deal of mental effort, such as internal models for Solvency II. Interestingly, some academic research has revealed that during an acute crisis, financial markets behave like a single-factor model, as all assets become co-monotonic. Stress scenario models, then, do not need to be complex per se.
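The point about crisis co-monotonicity can be shown with simple two-asset portfolio arithmetic - the equal weights and 20% volatilities below are illustrative assumptions. As correlation approaches 1, the diversification benefit vanishes and portfolio volatility collapses to the weighted sum of the individual volatilities, which is why a simple stress calculation can suffice.

```python
# Two-asset portfolio volatility under different correlations.
# The 50/50 weights and 20% volatilities are illustrative assumptions.

def portfolio_vol(w1, w2, s1, s2, rho):
    var = (w1 * s1) ** 2 + (w2 * s2) ** 2 + 2.0 * w1 * w2 * s1 * s2 * rho
    return var ** 0.5

calm = portfolio_vol(0.5, 0.5, 0.20, 0.20, 0.2)    # ~0.155: diversification helps
crisis = portfolio_vol(0.5, 0.5, 0.20, 0.20, 1.0)  # 0.200: no benefit left
print(f"calm: {calm:.3f}, crisis: {crisis:.3f}")
```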
3 But the model's been back-tested!
Comprehensive mortality forecasting, in particular, entails a number of different back-testing exercises. These tests have in common that model parameters are calibrated in order to forecast the past based on previous experience. If the model works fine for a number of such ex-post evaluations, we deduce by induction that it will always work. Such reasoning is called induction bias. There is, however, no underlying law of cause and effect in actuarial science as there is in natural science. A forecast based on induction can only work if things stay as they are. The implied stationarity of the model has an easily overlooked but vital consequence - a stationary model tends towards its equilibrium and will always predict this stable equilibrium.
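The pull towards equilibrium can be seen in a minimal sketch of a stationary AR(1) process; the long-run mean, persistence parameter and starting value below are illustrative assumptions. Whatever the starting point, the iterated point forecasts decay geometrically back to the long-run mean - the model can never predict anything else in the long run.

```python
# Point forecasts from a stationary AR(1): x_{t+1} = mu + phi * (x_t - mu).
# The long-run mean, persistence and starting value are illustrative.

mu, phi, x = 100.0, 0.8, 140.0
path = []
for _ in range(20):
    x = mu + phi * (x - mu)   # one-step-ahead point forecast, iterated
    path.append(x)
# Deviations from mu shrink by a factor phi each step: 132, 125.6, ... -> 100
print(f"after 20 steps: {path[-1]:.2f}")
```

However well such a model back-tests, its forecasts will always revert to the calibrated equilibrium - so it cannot anticipate a genuine regime change.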
2 With hindsight
The most well-known forecasting pitfall relates to hindsight bias. Actuaries analyse and model historic data with a view to forecasting the future. A fundamental assumption is that past experience is a reliable basis for predicting future experience. But there is inherent uncertainty in so doing, since the future may be affected by events that have never before been witnessed, such as the financial crisis that unfolded in 2007. In retrospect, its development over the following years is compelling and logical. There is no shortage of experts who can fully explain the whole story from beginning to end - with the benefit of hindsight. However, at no point did anybody correctly forecast the subsequent course of the global economy. Thus, it must be a misperception to feel knowledgeable about what has gone on. Such an error in reasoning makes us believe that we are better at projecting the future than we actually are.
1 I'll take this dataset, please
The most important drawback of all is called confirmation bias. This phenomenon refers to the tendency to actively select and over-emphasise evidence that supports the desired results. Likewise, the actuary may ignore and under-emphasise any data that point to the contrary. To be more specific, a reasonable Solvency II actuary usually tries to render a capital forecast plausible by arriving at about the same figure in different ways. By so doing, the actuary actively searches for confirming evidence. On the other hand, an actuary does not usually take the trouble to falsify a satisfactory modelling outcome. As such, there is a risk of disconfirming evidence being discarded.
Know your unconscious!
In a nutshell, a bias inveigles an actuary into favouring a particular forecast over any alternative projected result. The model outcome may be no more than a common sense conclusion. Erroneous beliefs in a seemingly objective model may seriously affect an actuary's impartial judgement. Unconscious attitudes play an important role in this respect. Implicit fallacies may result in overconfidence and self-delusion. Therefore, it is crucially important to be aware of potential errors in reasoning.