
Automatic assistance

Thursday 29th January 2015

Insurers continue to experience pressure to produce audit-quality financial statements to ever-tighter timescales, and Solvency II reporting will not be immune. Gabi Baumgartner and Tejas Nandurdikar share some advice on speeding up the process.


With reasonable clarity now available on the reporting requirements under Solvency II, some firms have prepared by investing in technology, particularly parallel or cloud computing. As a result, machine runtime can be reduced to a small proportion of the usual elapsed time of a reporting process.
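
To illustrate the point, below is a minimal sketch of scenario-level parallelism in Python. The valuation function is a deliberately toy placeholder rather than any real model, and the return dynamics are invented for illustration; the structure simply shows that independent scenarios spread across cores, so elapsed time falls roughly in proportion to the number of workers.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def value_scenario(seed: int) -> float:
    """Toy per-scenario valuation: simulate one fund path and return the
    payoff of a hypothetical maturity guarantee (placeholder arithmetic)."""
    rng = np.random.default_rng(seed)
    monthly_returns = rng.normal(0.003, 0.04, size=360)   # assumed dynamics
    fund_value = 100.0 * np.prod(1.0 + monthly_returns)
    return max(0.0, 100.0 - fund_value)                   # guarantee shortfall

if __name__ == "__main__":
    # Scenarios are independent, so they parallelise trivially across
    # cores; elapsed time falls roughly with the number of workers.
    with ProcessPoolExecutor() as pool:
        payoffs = list(pool.map(value_scenario, range(10_000), chunksize=256))
    print(f"mean guarantee cost: {np.mean(payoffs):.4f}")
```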

The more challenging improvements in process involve people, particularly when judgment or problem-solving are involved. These steps can be hard to predict, and this is compounded when the calculations and subject matter are complex, such as in the use of economic scenario generators to value guarantees embedded in insurance contracts. 

However, a number of approaches exist that can mitigate the human bottlenecks in financial and solvency capital reporting.


Automate mechanical tasks

Solvency II reporting processes, although in their relative infancy, may grow organically, with many manual adjustments and workarounds accumulating over the years. Automating these processes - from data collection to results production, including test outcomes - removes process bottlenecks and reduces the scope for transcription errors. 

In other cases, reliance on the timing of key inputs can be reduced through automation. For example, it is worthwhile implementing the Smith-Wilson algorithm to derive the Solvency II yield curves directly from source data, reducing dependency on the timing of EIOPA's release of the curves.
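
As a sketch of what such automation might look like, the following Python fragment fits the Smith-Wilson discount curve to observed zero-coupon bond prices. The ultimate forward rate (ufr) and convergence speed (alpha) would be set in line with the published EIOPA parameters; the three bond prices in the usage lines are purely illustrative.

```python
import numpy as np

def wilson(t, u, alpha, f):
    """Wilson kernel W(t, u) with continuously compounded UFR f."""
    tmin, tmax = np.minimum(t, u), np.maximum(t, u)
    return np.exp(-f * (t + u)) * (alpha * tmin
                                   - np.exp(-alpha * tmax) * np.sinh(alpha * tmin))

def smith_wilson(maturities, zcb_prices, ufr, alpha, t_out):
    """Fit the Smith-Wilson discount curve to zero-coupon bond prices and
    return discount factors at the requested output maturities."""
    f = np.log(1.0 + ufr)                           # annual UFR -> continuous
    u = np.asarray(maturities, float)
    m = np.asarray(zcb_prices, float)
    W = wilson(u[:, None], u[None, :], alpha, f)    # kernel matrix
    zeta = np.linalg.solve(W, m - np.exp(-f * u))   # calibration weights
    t = np.asarray(t_out, float)
    return np.exp(-f * t) + wilson(t[:, None], u[None, :], alpha, f) @ zeta

# Illustrative only: three observed bonds, curve extrapolated to 60 years.
terms = np.arange(1, 61)
discounts = smith_wilson([1, 5, 10], [0.99, 0.94, 0.85],
                         ufr=0.042, alpha=0.12, t_out=terms)
zero_rates = discounts ** (-1.0 / terms) - 1.0
```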

Care is needed, however, as automation can introduce new errors. Good practice is to build in the informal checks that people would otherwise perform - even if they are only on the 'copy-pasting' of data - so that these intermediate checks are captured in the automated process too. Substantial processing time can be gained from more computing power, as long as enough checks are in place.
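
For instance, a manual check that figures have been copied across correctly can be replaced by an automated reconciliation along these lines (a minimal sketch; the function name and tolerance are illustrative):

```python
import numpy as np

def check_transfer(source, target, name, atol=1e-9):
    """Automated stand-in for a manual 'copy-paste' check: confirm the data
    arriving in the model equals the data leaving the source system."""
    source, target = np.asarray(source, float), np.asarray(target, float)
    if source.shape != target.shape:
        raise ValueError(f"{name}: shape mismatch {source.shape} vs {target.shape}")
    worst = float(np.max(np.abs(source - target)))
    if worst > atol:
        raise ValueError(f"{name}: max discrepancy {worst:.3e} exceeds {atol:.0e}")
```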


Be clear about tolerances

Tolerance bands are commonly used to monitor the outcomes of modelled variables against a specified range. For example, output yield curves and option prices should replicate market data within a given tolerance, and martingale tests for market consistency should not come out significantly different from unity. Tolerance bands can be derived in two ways: as statistical tolerances or as accounting tolerances.

The statistical approach measures test results relative to the sampling error expected from a given number of scenarios. Run more scenarios and the tolerance bands narrow; run too few and the bands are wide, so the test lacks power and a genuinely wrong model may go undetected.

In contrast, accounting tolerances relate to the size of any error relative to what is being measured, and the impact of any error on decisions. The accounting materiality threshold will include many possible sources of error besides statistical sampling. Acceptable tolerances are not affected by the number of scenarios, although the ability to comply with the tolerances should improve as the scenario count increases.
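
To make the distinction concrete, here is a minimal sketch of a martingale test assessed against both types of band. The 95% statistical level and the 0.5% accounting threshold are illustrative assumptions, not prescribed values.

```python
import numpy as np

def martingale_check(discounted_terminal, s0, z=1.96, accounting_tol=0.005):
    """Martingale test: the mean discounted terminal asset value across
    scenarios should equal the initial price s0, i.e. the ratio should be
    close to unity. Assess the result against both tolerance types."""
    x = np.asarray(discounted_terminal, float) / s0
    ratio = x.mean()
    stat_half_width = z * x.std(ddof=1) / np.sqrt(x.size)  # shrinks ~1/sqrt(n)
    return {
        "ratio": ratio,
        "statistical_pass": abs(ratio - 1.0) <= stat_half_width,
        "accounting_pass": abs(ratio - 1.0) <= accounting_tol,  # fixed band
    }
```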

It is good practice to derive tolerances based on the test purpose and the reporting framework, and to be clear as to why each tolerance is needed. Market movements will often affect the volatility of the variables being modelled. With more volatile market conditions, more scenarios need to be run in order to maintain the same level of tolerance as under benign market conditions. The relevant tolerance - statistical or accounting - may vary according to the number of scenarios run, as Figure 1 (below) shows.

Using inaccurate tolerance levels will often lead to false red flags being raised after a model run, which adds to processing time as models are re-run. To minimise these delays, tolerance levels must be set accurately: rather than blindly carrying forward last time's levels, or copying those used by peers, they should be linked to accounting materiality and to objective sampling error.

It is worth noting that some fails are expected from statistical tests. This follows from the construction of the tolerance bands, referred to in statistical textbooks as confidence intervals. Confidence intervals are typically stated at the 95% level: were we to repeat the calculation with new samples many times, the true answer would fall within the interval 95% of the time, so roughly one correct test in 20 will fail purely by chance. It is therefore important to consider whether a fail is likely to be the result of sampling error or a genuine problem with the model itself.
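
The implication for a reporting pack containing many such checks is easy to quantify. A back-of-the-envelope sketch, assuming 40 independent tests (a number chosen purely for illustration):

```python
# With k independent tests, each at the 95% level, a correct model still
# breaches some band surprisingly often.
k = 40                            # hypothetical number of tolerance checks
expected_false_fails = 0.05 * k   # 2.0 fails expected from sampling alone
p_at_least_one = 1 - 0.95 ** k    # ~0.87 chance of at least one red flag
print(expected_false_fails, round(p_at_least_one, 2))
```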

[Figure 1: How the relevant tolerance - statistical or accounting - varies with the number of scenarios run]

Use reliable algorithms

Complex algorithms are used to automate and industrialise the production of solvency capital numbers and financial reports, and their weak points need to be fully understood. The most difficult algorithms involve optimisation or solving simultaneous equations. For example, the economic scenarios used for the best-estimate liabilities under Solvency II may be calibrated to replicate interest rate and equity derivative prices. A solution may not exist and, where one does, it may not be unique.

Algorithms may fail even when a solution exists, or may report false solutions. This is more of a problem with complicated models, and it is often best to split a model into pieces and only try to solve equations in one or two variables at a time. Running legacy algorithms year on year without appropriate testing increases the chance of the algorithm failing. 
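
A minimal illustration of both principles - solving for one variable at a time, and verifying rather than trusting the solver - is the calibration of a single Black-Scholes implied volatility below. The example is illustrative rather than a prescribed method; the bracketing bounds and residual threshold are assumptions.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(s, k, t, r, sigma):
    """Black-Scholes call price (no dividends)."""
    d1 = (log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return s * norm_cdf(d1) - k * exp(-r * t) * norm_cdf(d2)

def implied_vol(price, s, k, t, r, lo=1e-4, hi=5.0, tol=1e-8):
    """One-variable bisection for implied volatility, with an explicit
    residual check so a false solution is reported, not silently used."""
    if not bs_call(s, k, t, r, lo) <= price <= bs_call(s, k, t, r, hi):
        raise ValueError("no solution in bracket: check the input price")
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if bs_call(s, k, t, r, mid) < price:
            lo = mid
        else:
            hi = mid
    sigma = 0.5 * (lo + hi)
    residual = bs_call(s, k, t, r, sigma) - price
    if abs(residual) > 1e-6:          # verify; do not trust the solver
        raise ValueError(f"calibration failed, residual {residual:.2e}")
    return sigma
```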

A robust testing environment must therefore be in place before the model is run. Investing in making the algorithms reliable and fit for purpose before the main production process will go a long way towards reducing delays when the model is run.


Anticipate social and commercial constraints

There are often social and commercial constraints associated with reporting financial information. Such constraints need to be recognised and legitimised within the working environment. Computer code can be sped up, but it is a different matter to accelerate human judgments and negotiations. Reviewers will consider not only technical matters but also whether a particular result is likely to be acceptable from commercial or regulatory perspectives, sometimes with an eye on anticipated peer behaviour. This is a sensitive area that needs to be addressed in order to make the reporting process more transparent and, as a result, faster.

There will always be constraints that cannot be automated. Appropriate tolerance levels and triggers may help, however, in reducing human intervention. The aim is to direct human effort towards areas where expert judgment is required and to automate other areas of the process.
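
One way to express such triggers is a simple triage rule: results inside every band are accepted automatically, and human effort is reserved for breaches. The categories and thresholds below are purely illustrative.

```python
def triage(ratio, stat_half_width, accounting_tol):
    """Route a martingale-style test result so that expert judgment is
    spent only where a tolerance is actually breached (illustrative)."""
    gap = abs(ratio - 1.0)
    if gap <= min(stat_half_width, accounting_tol):
        return "auto-accept"                 # inside every band
    if gap <= accounting_tol:
        return "note for review"             # statistically odd, immaterial
    return "escalate to expert judgment"     # material breach: human needed
```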


If it still goes wrong

Good preparation reduces the chance of something going awry. Some risk remains, however, and it is important to have a plan B. This must consider remote contingencies and practical and reasonable workarounds. Having a contingency framework agreed by management will help with reducing delays in responding if something goes wrong, recognising that in some, albeit extreme, circumstances delaying the publication of results may be the 'least-bad' option.

Overall, more investment is required upfront to set up systems capable of producing the accurate and timely reports the business requires. It is through such investment that problems can be anticipated early, and solutions to them automated where possible. It should be clear where human judgment is needed, and rigorous testing should be carried out so that the remaining areas of the process can run as independently as possible.

A realistic plan B can help reduce costs and delays in the event of something going wrong. Raising, and promptly addressing, specific and difficult questions such as the following will help:

  • Is the business comfortable with the way tolerances are set?
  • When was the last time the tolerances were refreshed?
  • Is the business driven by statistical or accounting considerations?
  • Where does the business feel controls are too weak, or, indeed, too strong?
  • Where are the known problems that everyone is postponing grasping?
  • If the business had the budget, what is the first process improvement it would make?

These could certainly be a starting point towards an even faster Solvency II close.

Gabi Baumgartner is a senior manager at Deloitte. She runs the team responsible for Deloitte's economic scenario generator software. 

Tejas Nandurdikar is a senior consultant in Deloitte's actuarial and advanced analytics practice.

This article appeared in our February 2015 issue of The Actuary.
