Risk quantification techniques

Friday 21st September 2012

Mark Chaplin examines some of the effective risk-management options available to companies and regulators worldwide.

In the past, risk has usually been allowed for by taking prudent margins over best-estimate assumptions. These prudent margins are frequently set by one individual, often an actuary, and are based on a little historical data and a lot of judgement. In its most basic form, risk evaluation is little more than this, with, perhaps, a slightly more formal identification of the degree of prudence being targeted. There is increasing pressure, however, for more quantitative risk assessment using established techniques, for example in deriving market-value risk margins under new international accounting standards and for internal capital assessment in the UK.
As an illustration of the techniques available, in this article I want to show how a life insurer might quantify the risk of increases in mortality. For this example, we assume that the risk sensitivity is a 99.5% confidence level over a one-year time horizon; in other words, we are considering '1-in-200-year' events. However, the approaches outlined can easily be applied to different risks, time periods and confidence levels.

Expert opinion
Expert opinion is an extremely useful tool in risk assessment and is often overlooked as a separate technique in the quantitative actuarial world. It is particularly useful where relevant data are scarce, for example where conditions have changed materially (reducing the usefulness of past experience), or where the risks are very company-specific, as would often be the case for lapse rates. In essence, the prudent assumption-setter was providing one expert opinion on the risk. However, it will often be appropriate to seek input from a range of experts across different disciplines.
One common approach to gathering expert opinion is to set up risk-management workshops for senior managers within a firm to discuss the relevant risks. This can be quite effective, particularly if well facilitated, but there are potential problems:
- Small groups or single experts can suffer from significant bias.
- Results can be distorted by office politics.
- There is a tendency within a group to 'follow the leader', either the most respected or, worse still, the most dominant individual in the group.
- There will generally be a reluctance to abandon previously stated views.
The Delphi method was developed by the RAND Corporation to address these possible shortcomings, and came in response to a US military request to prepare a forecast of future technological capabilities. However, the forecasting techniques developed have since been applied in a much wider range of areas. The basic approach is to:
- select a panel of experts;
- develop a first-round questionnaire on the risks to be considered;
- test the questionnaire for problems such as ambiguity and bias, send the questionnaire to the panellists, then gather and analyse the responses;
- provide a statistical summary of the panel's responses back to the panel (a sketch of such a summary follows this list); and
- prepare a second-round questionnaire; and so on, until the results converge.
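
As a concrete illustration of the feedback step, the Python sketch below computes the kind of statistical summary that might be returned to the panel between rounds. The responses are hypothetical, and the use of the median and interquartile range is a common Delphi convention rather than anything prescribed by the method itself.

```python
import numpy as np

# Hypothetical round-one responses: each expert's estimate of a possible
# 1-in-200-year proportional increase in death rates (illustrative values only).
responses = np.array([0.15, 0.18, 0.20, 0.20, 0.22, 0.25, 0.40])

# Delphi feedback commonly reports the median and interquartile range,
# which dampen the influence of any single outlying panellist.
q1, median, q3 = np.percentile(responses, [25, 50, 75])
print(f"Panel median: {median:.0%}, interquartile range: {q1:.0%} to {q3:.0%}")
```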
The Delphi method still has weaknesses that are a function of human behaviour, and that are common to many of the other techniques available for collecting 'unbiased' expert opinion; for instance, the tendency to give greater prominence to more recent conditions or events.
As part of our research into mortality, we asked a number of medical experts and demographers for an indication of a possible 1-in-200-year deviation from expected mortality. We did not complete the full iterative process of the Delphi method, but even on the initial poll there was some agreement around a 20% variation in death rates for the year.

Historical simulation
A fairly straightforward approach to risk quantification is simply to gather as much past data as possible and use this history as a simulation of the future. For example, we can gather the daily price changes for the last 1,000 days for the shares we are currently holding in our portfolio. This generates 1,000 different scenarios for the performance of our portfolio over the coming day. If we take the 5th-worst performance of the portfolio then we will have generated the 99.5th percentile portfolio return over a one-day time horizon.
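
A minimal sketch of this calculation, assuming we already hold 1,000 daily portfolio returns (simulated here purely for illustration), might look as follows:

```python
import numpy as np

# Stand-in for 1,000 observed daily portfolio returns; in practice these
# would come from market data, not a random number generator.
rng = np.random.default_rng(0)
daily_returns = rng.normal(0.0003, 0.01, size=1000)

# Sorting ascending, the 5th-worst return marks the 99.5% confidence level,
# since 5/1,000 = 0.5% of the observed outcomes are worse.
worst_to_best = np.sort(daily_returns)
percentile_99_5 = worst_to_best[4]
print(f"99.5th percentile one-day return: {percentile_99_5:.2%}")
```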
Taking the mortality example, figure 1 shows the year-on-year movement in mortality for the UK population aged 20-59 between 1911 and 1995. As can be seen, the largest increase in mortality was a 50% jump in 1918, as a result of the Spanish influenza epidemic. This epidemic might prove a useful guide to a possible 1-in-200-year event. However, it might equally be argued that conditions have changed, and that state-led controls for reducing the impact of epidemics are more effective now.
Lack of data is often an additional constraint on carrying out historical simulation, as a time horizon of one year limits the number of independent observations that can be made from history. This problem is exacerbated when looking at the tail of a distribution. Reliance on historical simulation also introduces a further problem, sometimes known as pro-cyclicality, whereby the occurrence of a rare, significant risk event has a double impact. First, the capital available will be depleted by the adverse impact of the risk event and second, the required capital may increase because of the larger number of significant adverse events included in the past data set.

Normal distribution assumption
Another way of exploiting past data is simply to observe the mean and standard deviation of a particular factor, for instance, equity market returns, and assume that the factor is normally distributed. The basic properties of the normal distribution then allow us to generate the chosen confidence interval around the mean by taking particular multiples of the standard deviation. This approach generally gives far less weight to the outliers in the data than would a historical simulation.
Returning to the data given in figure 1, we find that the standard deviation of annual mortality rate changes from 1911 to 1995 was around 9%. With a normal distribution, we should be 99.5% confident that the observed value will not be more than 2.58 standard deviations above the mean. This would suggest that a 1-in-200-year event would be a 23% increase in mortality above our expected change. This result is similar to that provided by our 'expert opinion'.
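
The arithmetic behind that figure is short enough to show directly; the sketch below assumes only the 9% standard deviation quoted above:

```python
from scipy.stats import norm

sigma = 0.09                # standard deviation of annual mortality changes, as above
z = norm.ppf(0.995)         # one-sided 99.5% point of the standard normal, about 2.58
stress = z * sigma          # 1-in-200-year increase above the expected change
print(f"z = {z:.2f}, stressed mortality increase = {stress:.0%}")   # about 23%
```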
Naturally, more complicated distributions could be assumed and the confidence intervals derived in a similar, if more mathematically complex, way. There are numerous statistical techniques for helping us to decide which distribution might best describe the risk factor being considered, and then to fit the past observed values of the risk factor to that distribution.

Extreme value theory
Under extreme value theory (EVT), events below a particular threshold are excluded from the distribution-fitting process. In effect, this exclusion assumes that small variations in the risk factor are no help when trying to predict the occurrence of very large changes, and focuses the effort on replicating the observed large changes. The positive aspect of this approach is that attention is concentrated on the part of the distribution in which we are most interested. The generalised Pareto distribution (GPD) is commonly used for modelling events above the threshold. As can be seen from our example below, the problem with the EVT approach is that the answers produced can be very sensitive to the choice of threshold. It also suffers from the problem afflicting historical simulation: that the occurrence of an extreme event can have a very significant effect on the estimated risk.
In our mortality data we have only four year-on-year mortality increases in excess of 10%, and only two increases greater than 25%. The sensitivity of the results (fitted using a GPD) to the choice of threshold is apparent from table 1.
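
A hedged sketch of the fitting step is given below. The data series is simulated (the article's underlying data set is not reproduced here) and the thresholds are illustrative, but the structure (fit a GPD to exceedances over a threshold, then invert the tail probability) follows the standard EVT recipe.

```python
import numpy as np
from scipy.stats import genpareto

# Simulated year-on-year mortality changes standing in for the 1911-1995 series,
# with two large jumps planted to mimic the heavy tail in the real data.
rng = np.random.default_rng(1)
changes = rng.normal(0.0, 0.09, size=85)
changes[[7, 30]] = [0.50, 0.28]

def gpd_quantile(data, threshold, p=0.995):
    """Fit a GPD to exceedances over `threshold` and return the p-quantile of the data."""
    exceedances = data[data > threshold] - threshold
    shape, _, scale = genpareto.fit(exceedances, floc=0)   # location fixed at zero
    prob_exceed = len(exceedances) / len(data)             # empirical P(X > threshold)
    # Invert the tail: P(X > x) = prob_exceed * (1 - GPD cdf at x - threshold)
    return threshold + genpareto.ppf(1 - (1 - p) / prob_exceed, shape, scale=scale)

# The answer moves noticeably with the threshold, echoing table 1.
for u in (0.05, 0.10, 0.15):
    print(f"threshold {u:.0%}: 99.5th percentile change = {gpd_quantile(changes, u):+.0%}")
```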

Monte Carlo simulation
Monte Carlo simulation is a statistical sampling technique for solving complex differential equations. Basically, we assume that the evolution of a particular item of interest can be described by a probability density function; the Monte Carlo simulation is then carried out by sampling from this probability density function and tallying the results. This is a powerful technique, but may not strictly be required if 'closed-form' (that is, formula-based) solutions exist. However, Monte Carlo simulation is an approach frequently used in asset-liability modelling, and gives the user more flexibility in modelling the co-dependencies between multiple risk factors.
A common danger when deriving the distribution of the risk factor is adding in too many parameters, and so 'over-fitting' the distribution formula to the past data. This would lead to the distribution formula explaining the past particularly well, giving very small observed 'error' terms. As a consequence, little uncertainty is projected, and the variability of the risk factor may be understated. This is an example of model risk, and can be difficult to quantify.
Taking our mortality example further, we constructed a simple stochastic mortality model and used the historical data from figure 1 to parameterise it. The model chosen was a simple ARMA (autoregressive moving average) process. Running 1,000 simulations and taking the 99.5th percentile gave a mortality increase of 31%.
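
The sketch below mirrors that exercise in miniature. Rather than fit a full ARMA model, it hand-rolls an AR(1) process (a special case of ARMA) with made-up parameters, so the percentile it prints is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
phi, sigma = 0.3, 0.09        # hypothetical autoregression coefficient and shock volatility
n_sims = 1000                 # 1,000 simulations, as in the article

def next_year_change(last_change=0.0):
    """One draw of next year's mortality change from an AR(1) process."""
    return phi * last_change + rng.normal(0.0, sigma)

outcomes = np.array([next_year_change() for _ in range(n_sims)])
print(f"99.5th percentile mortality increase: {np.percentile(outcomes, 99.5):.0%}")
```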

Reasonable methods
As can be seen from the summary in table 2, a wide range of results can be derived using a selection of reasonable methods, all (bar the expert opinion) based on the same data. This presents a significant problem for the risk modeller.
The most robust approach is to employ as many methods as possible to help understand the risk better and to provide a reasonableness check on the results from the other methods. This range of methods will also give an indication of the model risk involved. From the options available, the most appropriate method should be chosen for use in quantifying risk. This selection itself requires no little judgement.
