Gurpreet Johal and Graham Robertson consider how reserving might evolve in the current age of technological advancements
The role of the general insurance reserving actuary is becoming increasingly difficult. The introduction of Solvency II has added to the workload of reserving teams through technical provisions, the need to improve the documentation of expert judgment, and the measurement and articulation of reserve variability.
With IFRS 4 Phase II on the horizon, the pressure is only going to increase. New emerging risks, such as cyber, new sources of information and increased scrutiny on reserving adequacy are requiring the actuary to question how they approach reserving and the methods they use. Organisations are adding to the pressure by looking for faster results, greater accuracy and more involvement of the business in the reserving process. All of this is occurring in a soft market with pressures to cut cost base.
Something needs to change for reserving teams to meet the higher expectations. Reserving and finance processes have built up over an extended period of time, with limited investment in infrastructure. This creates processes that have legacy issues and inefficiencies contained within them. Leading insurers are investing in reserving and finance transformations, redesigning their processes and how they use their data. A successful transformation will provide a quicker, cheaper and scalable quarter-end process through automation, leaving the actuary with more time to engage with underwriters, finance and management on the key issues.
What is technology enabling?
Spreadsheets feature in many reserving tools, largely because actuaries are familiar with them. What often emerges is an inefficient reserving process in which time is lost moving data between tools, with many manual interventions; even switching between two or three tools is irritating when operations are frequent. The end-to-end reserving process, from source system cut-off to reserve estimates in the ledger, can be made much faster through a transformation involving a single data warehouse, fast interfaces and efficient code for large calculations such as roll-forwards or detailed allocations to ledger fields.
IT support is needed to achieve these advantages. However, it is important actuaries retain the flexibility to adapt and improve the modelling parts of the reserving process. An increasing number of database tools in the market provide business-orientated spreadsheet front-ends to allow that flexibility. This is putting enterprise model build, like reserving templates backed up by databases, in the hands of actuaries. In many cases, additional controls have been added to spreadsheets to remove many of the risks associated with them.
We expect this trend to continue so that more power is put in the hands of business users, not IT. This means improved access to the detailed data so that actuaries can analyse it better.
For example, actuaries could view models applied at different levels of granularity, including policy and claim level. Another area that has improved greatly over the past five years is visualisation; even the more traditional vendors are enhancing their products here. In our experience, visual representation and drill-down capability greatly improve the understanding and communication of reserving work, including explaining reserve uncertainty.
It is also important that the reserving tools can interface well with surrounding tools and databases, like the source systems and the ledger. The next-generation reserving tools will meet the needs of the organisation and not just individual users. These would be scalable and more robust than those grown organically in reserving functions, which are already creaking without additional demand for detailed data and additional methods. A coherent reserving architecture is fundamental to achieving reserving process efficiency.
Removing process delays caused by issues in data processing or reporting also frees up time for proper review of alternative reserving methods. Reserving actuaries tend to select from their favourite methods and adjust the assumptions within them, rather than selecting a more appropriate method. Fast access to detailed data is needed to support the reserving methods, but many insurers do not currently provide it, usually owing to legacy systems and processes that could be transformed to deliver the detailed data quickly. The technology is now available to support the demands of the reserving actuary, and the better reserving functions already have the data they need.
Better triangle-based methods
The methods of life actuaries were disrupted and improved with advanced technology in the 1980s and 1990s, allowing a move away from commutation factors to valuations on millions of individual policies. General insurance reserving methods are at a similar level of disruption now, owing to advanced technology and the opportunities arising from this.
Reserving actuaries predominantly use triangle-based methods and are very familiar with these techniques. Triangles help visualise the data and are useful for conversations with underwriting and claims teams. However, reserving methods should not be limited to the analysis of triangles alone.
The industry may be waking up to this: a number of research papers have emerged in recent years proposing alternatives to the traditional methods. Some of these estimate the reserve variability at the same time as producing the best estimate, which can remove duplication from processes.
The main limitation of using triangles is that they summarise the data, which means that reserving actuaries may not spot trends that should affect their reserving.
Examples of trends that may be lost are claims development patterns speeding up, owing to changes in claims processing, average limits changing on excess of loss policies or a change in mix of business by broker. This information is often captured by the reserving actuaries through conversations with other functions. The reserving actuary then uses this knowledge in their judgments.
Unfortunately, the judgments often get applied within the constraints of certain methods, for example, by changing development percentages in a chain-ladder model or prior loss ratios in a Bornhuetter-Ferguson model. What the actuaries should really be considering is whether the information invalidates their preferred methods.
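To make the contrast concrete, the sketch below shows how judgment enters the two methods named above. All figures, the toy triangle, premiums and the prior loss ratio, are invented for illustration; this is a minimal textbook-style implementation, not the authors' method.

```python
# Hypothetical example: chain-ladder vs Bornhuetter-Ferguson on a toy
# cumulative paid triangle (all figures invented for illustration).

# Cumulative paid claims; rows = accident years, columns = development years.
triangle = [
    [100, 180, 220, 235],
    [110, 200, 245],
    [120, 215],
    [130],
]

def dev_factors(tri):
    """Volume-weighted development (link) factors, column by column."""
    factors = []
    for j in range(len(tri[0]) - 1):
        num = sum(row[j + 1] for row in tri if len(row) > j + 1)
        den = sum(row[j] for row in tri if len(row) > j + 1)
        factors.append(num / den)
    return factors

f = dev_factors(triangle)

def chain_ladder_ultimate(row, factors):
    """Roll the latest diagonal to ultimate with the remaining factors."""
    ult = row[-1]
    for fac in factors[len(row) - 1:]:
        ult *= fac
    return ult

cl_ults = [chain_ladder_ultimate(row, f) for row in triangle]

# Bornhuetter-Ferguson: blend paid to date with a prior expectation.
# Premiums and the prior loss ratio are the judgment inputs.
premiums = [250, 260, 270, 280]
prior_lr = 0.95

def bf_ultimate(row, factors, premium, lr):
    # Proportion of ultimate assumed developed to date.
    dev_to_date = 1.0
    for fac in factors[len(row) - 1:]:
        dev_to_date /= fac
    return row[-1] + premium * lr * (1 - dev_to_date)

bf_ults = [bf_ultimate(row, f, p, prior_lr)
           for row, p in zip(triangle, premiums)]
```

The point of the article stands out in the code: changing `f` (chain-ladder) or `prior_lr` (Bornhuetter-Ferguson) only adjusts assumptions *within* a method, whereas a change in the underlying data may call for a different function altogether.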
To help make that decision, the actuary needs hard information from the detailed underlying data. Simple statistics showing trends and changes in mix can help the reserving actuary decide which method to use. For example, if recent claims are settling more quickly than older claims, this should deter the actuary from using triangle-based methods that project the longer settlement patterns into the future.
Dashboards can be constructed that summarise the detailed data and flag important features to the actuary. These may cause the actuary to choose a different method or to change the assumptions within a method. For example, simply colouring the large positive and negative link ratios in a chain-ladder model can enable an actuary to spot calendar year effects that should be adjusted for, while residual plots can alert actuaries to bias in their selected parameters.
It is important that the breadth of predictive factors is captured and summarised well. Dashboard indicators need to be effective and add insight, allowing actuaries to make informed judgments quickly. When constructed well, interactive dashboards can add much value by enhancing transparency and enabling drill-down investigations.
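A minimal sketch of the link-ratio flagging idea follows. The toy triangle, the 10% threshold and the diagonal-as-calendar-period convention are all illustrative assumptions, not a standard; a production dashboard would colour these cells rather than print them.

```python
# Hypothetical diagnostic: flag link ratios that sit far from their column
# mean, and report the diagonal (calendar period) they fall on, so an
# actuary can spot calendar year effects. All data are invented.

triangle = [
    [100, 180, 220, 235],
    [110, 205, 246],
    [120, 260],   # second development deliberately high for illustration
    [130],
]

flags = []
for j in range(len(triangle[0]) - 1):
    # Individual link ratios C[i][j+1] / C[i][j] for this development column.
    ratios = [(i, row[j + 1] / row[j])
              for i, row in enumerate(triangle) if len(row) > j + 1]
    mean = sum(r for _, r in ratios) / len(ratios)
    for i, r in ratios:
        # Flag ratios more than 10% from the column mean (arbitrary cut-off).
        if abs(r / mean - 1) > 0.10:
            calendar = i + j + 1   # diagonal index = calendar period
            flags.append((i, j, calendar, round(r, 3)))

for accident, dev, cal, ratio in flags:
    print(f"AY {accident}, dev {dev} (calendar period {cal}): ratio {ratio}")
```

On this toy data only the inflated second-row development is flagged; grouping several flags on the same diagonal is what would point to a calendar year effect.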
Reserving in the future
It is always bold making predictions about the future, but we envisage a technology-enabled evolution in reserving, rather than a revolution. In the next five years, we believe reserving will:
• Have a very lean quarter-end process. Time will be freed up so that actuaries can aid better decision making and governance, rather than crunching data
• Be capable of more frequent updates. Assets can be revalued on a daily basis and liabilities could be too, via a daily roll-forward mechanism
• Be database driven. The detailed data will be available for other reasons like claims analytics, as well as reserving
• Still involve spreadsheets. These will be pulling data from and to the databases instead of being in offline use
• Include triangle-based methods. These methods will still be relevant in many cases and the visualisation of development graphs is a useful tool for review and communication
• Include triangle-free methods. These will become more mainstream as insurers transform their legacy systems to provide the required detailed data
• Contain visualisation diagnostics to aid method selection and communication. These are key time-savers to help actuaries choose appropriate methods.
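The daily roll-forward in the list above can be sketched very simply. One possible mechanism, assumed here for illustration, holds the selected ultimate fixed between full reviews and restates the reserve as ultimate minus cumulative paid; all figures are invented.

```python
# Hedged sketch of a daily reserve roll-forward: between full quarterly
# reviews, hold the selected ultimate fixed and restate the reserve as
# ultimate minus cumulative paid to date. Figures are hypothetical.

selected_ultimate = 306.0   # from the last full reserving exercise

def roll_forward_reserve(ultimate, cumulative_paid):
    """Reserve restated at a new valuation date without re-running methods."""
    return ultimate - cumulative_paid

# Daily cumulative paid feed from the claims system (invented values).
daily_paid = [130.0, 131.2, 133.5, 133.5, 140.1]
reserves = [roll_forward_reserve(selected_ultimate, p) for p in daily_paid]
```

Real roll-forwards would also allow for new exposure and expected development between dates; the point is only that a daily restatement needs a data feed, not a full re-run of the reserving methods.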
Technology is advancing to make some of these things happen, and reserving actuaries should be the leading advocates of the change. We should focus on striking the right balance between process automation, which frees up time for valuable analysis, and the communication of that analysis. The ambition is not analysis for analysis' sake, but to provide the business with better information quickly.
The most successful actuaries of the future will be those with excellent communication skills who can deliver high-quality insight faster.
Gurpreet Johal is an insurance partner at Deloitte and leads the actuarial, reward and analytics practice
Graham Robertson is a senior manager in Deloitte's actuarial, reward and analytics practice, with focus on finance, risk and actuarial transformations