Yiannis Parizas and Phanis Ioannou consider how non-life organisations could gain a strategic advantage through various aspects of the pricing process
How can an insurer add value to its business through its pricing process? We examine the question from multiple angles: model choices and calibration, cost measurement and management, and pricing infrastructure and model deployment.
Model choice, infrastructure architecture and calibration accuracy
To understand the importance of model choice and calibration accuracy, we will consider the results of a motor insurance market simulation competition conducted by Imperial College London’s Computational Privacy Group (bit.ly/AIcrowd_pri_game), supported by the IFoA and other societies. In this competition, participants acted as insurance companies, building pricing models and competing against other companies to maximise profit.
A dataset of real historical motor policy claims data was initially provided, and candidates competed on claims prediction accuracy (root-mean-square error, or RMSE) and profitability. The teams submitted their models, which were ranked by their RMSE on a second dataset. A profit ranking (assuming the cheapest insurer sells the policy, over randomised competitor markets) was evaluated on a third dataset. Competitors did not have access to the second and third datasets.
The competition was an example of a perfectly competitive market in which the product is a commodity (no product differentiation is possible) – like motor products on an aggregator. It did not incorporate external data, competitor price modelling, demand modelling or product and operational differences.
By analysing the profit leaderboard against the model accuracy leaderboard and the corresponding market shares, we can draw some useful conclusions. We modelled average profit against market share and RMSE to test whether there was a statistically significant relationship between these factors and the profit participants generated (93% of participants were loss-making).
RMSE is a statistical metric that measures model accuracy based on infrastructure, model choice and calibration, while market share represents each participant's sales as a percentage of the market. These two factors play key roles in price optimisation, as the chosen price level ultimately defines your market share against that of nine other randomly selected participants.
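To make the accuracy metric concrete, the following is a minimal sketch of how RMSE is computed for claims predictions. The figures are illustrative only, not competition data:

```python
import math

def rmse(predicted, actual):
    """Root-mean-square error between predicted and observed claim costs."""
    assert len(predicted) == len(actual)
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(predicted))

# Hypothetical claim costs per policy (illustrative figures)
predicted = [120.0, 0.0, 450.0, 80.0]
actual = [100.0, 0.0, 500.0, 60.0]
print(round(rmse(predicted, actual), 2))  # 28.72
```

Because errors are squared before averaging, a few badly mispriced policies dominate the metric, which is why large-loss modelling mattered in the competition.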
Figure 1 and Figure 2 show the linear regression predictions between these factors and average profit. In Figure 1 we can see that the higher the market share, the lower the expected profit – and that above a 70% market share, losses increase much more steeply. The model includes an interaction between a market share of less than 70% and an RMSE of less than 500.3. Figure 2 shows that, in general, the higher the RMSE, the lower the profit, and models with lower RMSEs show less profitability sensitivity to market share increases.
We can draw this conclusion from the gradients of the low and high RMSEs in Figure 1.
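The regression described above can be sketched as follows. This is a hypothetical reconstruction under stated assumptions: the data values are invented, and the interaction is encoded as the indicator described in the text (market share below 70% and RMSE below 500.3):

```python
import numpy as np

# Hypothetical participant-level data: market share (%), model RMSE, average profit
share = np.array([10, 25, 40, 60, 75, 90], dtype=float)
model_rmse = np.array([495.0, 498.0, 502.0, 505.0, 510.0, 520.0])
profit = np.array([5.0, 2.0, -1.0, -6.0, -15.0, -30.0])

# Indicator interaction as described: share below 70% AND RMSE below 500.3
interaction = ((share < 70) & (model_rmse < 500.3)).astype(float)

# Design matrix: intercept, market share, RMSE, interaction term
X = np.column_stack([np.ones_like(share), share, model_rmse, interaction])
coef, *_ = np.linalg.lstsq(X, profit, rcond=None)
print(coef.shape)  # four fitted coefficients
```

The fitted gradients on market share, with and without the interaction active, are what separate the low-RMSE and high-RMSE lines in Figure 1.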
Our own submission achieved the second-lowest RMSE and blended different models, such as XGBoost, generalised linear models and adjusted averages. To show how advanced and complex a model architecture can be, we present the one produced for the competition:
Expected cost = h(f(XGBoost Freq, GLM Freq) * g(XGBoost Sev, GLM Sev), j(XGBoost burning cost, GLM burning cost))
+ l(XGBoost large propensity, GLM large propensity) * v(GLM large loss cost, adjusted average)
where h, f, g, j, l and v are functions representing ensemble methods implemented in the model architecture. On top of this expected cost, a 15% loading, subject to a minimum of 7.5 and a maximum of 75 currency units, yielded the competition's eighth most profitable position.
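The loading rule described above can be sketched as a simple capped percentage load. This is an illustration of the stated rule, not the competition code itself:

```python
def load_premium(expected_cost, loading_rate=0.15, min_load=7.5, max_load=75.0):
    """Apply a percentage loading to the expected cost, with the loading
    floored at min_load and capped at max_load currency units."""
    loading = min(max(expected_cost * loading_rate, min_load), max_load)
    return expected_cost + loading

print(load_premium(20.0))    # 15% = 3.0, floored at 7.5  -> 27.5
print(load_premium(200.0))   # 15% = 30.0, within bounds  -> 230.0
print(load_premium(1000.0))  # 15% = 150.0, capped at 75  -> 1075.0
```

The floor protects the margin on cheap policies, while the cap stops large expected costs from producing uncompetitive prices.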
To further support our analysis, and for the interested reader, we have published the dataset used, analysis code and visualisations in a GitHub repository (bit.ly/PriCompGame).
Another element of the pricing formula for non-life insurers is the cost of other operations that may be attributed to the insurance products, such as unallocated claims management costs or IT costs. Activity-based costing (ABC) is widely used in the industry and can assign overhead and indirect costs (such as salaries and utilities) to individual policies or claims. It is based on activities (units of work or tasks with a specific goal), which it treats as cost drivers.
The method assigns a portion of the total pool cost to a specific activity and then to a product or service, based on the usage of the activity. As a result, an insurer can use ABC to improve loading accuracy by better understanding its costs, enabling a more appropriate pricing methodology. For example, instead of assigning an unallocated claims management cost (such as claims payroll) to individual claims proportionately, ABC can allocate costs to individual claims in proportion to the amount of time spent managing them (the cost driver). Under this method, the insurer might allocate less cost to claims that are treated as business-as-usual than to claims requiring more intensive management.
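The claims payroll example above can be sketched as an allocation in proportion to the cost driver. The claim identifiers, pool cost and handler hours are hypothetical:

```python
def allocate_abc(total_pool_cost, driver_usage):
    """Allocate a cost pool across claims in proportion to each claim's
    usage of the cost driver (here, handler hours)."""
    total_usage = sum(driver_usage.values())
    return {claim: total_pool_cost * usage / total_usage
            for claim, usage in driver_usage.items()}

# Hypothetical: 10,000 of claims payroll split by handler hours per claim
hours = {"CLM-001": 2.0, "CLM-002": 2.0, "CLM-003": 16.0}  # CLM-003 is complex
print(allocate_abc(10_000, hours))  # CLM-003 absorbs 8,000 of the pool
```

A flat per-claim split would have charged each claim about 3,333; the driver-based split concentrates cost on the intensively managed claim, which then flows through to more accurate expense loadings by segment.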
Underwriting, claims, fraud, reinsurance, investments and other operations
An insurer should be able to manage costs as well as measure them accurately. This might include improving underwriting procedures and controls, adapting to consumer needs by providing more appropriate products, and improving terms and conditions, thereby optimising product performance. This can remove unprofitable risk segments and retain or attract profitable ones.
Effective claims management procedures can reduce claims costs. Negotiation is an important skill, and collaborations with external service providers (such as motor insurers collaborating with garages) can also reduce costs. It is also important to invest in and build appropriate fraud detection infrastructure – see our article ‘Road testing: Machine learning and the efficiency of fraud detection’, in the June 2021 issue of The Actuary at bit.ly/TheActuary_road_testing.
Optimising a company’s reinsurance programme can create strategic advantages through the lowering of costs and thus product prices. Prices can also be reduced by optimising for a higher investment return and by optimising other operations and IT infrastructure in order to gain efficiencies.
Insurers also need dynamic processes to make sure that any changes in actual costs are quickly reflected in pricing. This can be achieved through open communication between a company’s various units, and by setting up appropriate processes and infrastructure. Analytics and monitoring are vital in the pipeline because when performance diverges, they trigger corrective actions such as model recalibration and price optimisation changes to control business performance.
External data and competition prices
Using external data in the modelling process can also improve accuracy. Joining internal information with external sources (such as location and crime rates, customer profiles and previous claims history) has proven to be a statistically significant factor in boosting model performance.
Competitors’ price data is a special form of external data and can be particularly important, especially when used as a signal of divergence (for example, when money is being left on the table). Insurers model their competitors’ prices and use these predictions for price optimisation. Predicted competitor prices are the most important factor in new business demand models for price-elastic distribution channels. In more advanced deployment infrastructures, competitor price prediction can be a separate microservice that feeds into the optimisation algorithm.
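The role of predicted competitor prices in a demand model can be illustrated with a simple logistic conversion curve. This is a hypothetical sketch: the functional form and the parameters a and b are assumptions for illustration, not a calibrated model:

```python
import math

def conversion_probability(own_price, predicted_competitor_price, a=2.0, b=8.0):
    """Hypothetical logistic demand model: conversion probability falls as
    our price rises relative to the predicted cheapest competitor price.
    a sets the base level, b the price-elasticity of the channel."""
    relative_gap = own_price / predicted_competitor_price - 1.0
    return 1.0 / (1.0 + math.exp(-(a - b * relative_gap)))

# Undercutting the predicted competitor price raises expected conversion
print(round(conversion_probability(95.0, 100.0), 3))   # below market
print(round(conversion_probability(110.0, 100.0), 3))  # above market
```

In a price-elastic channel (large b), small errors in the competitor price prediction shift the whole demand curve, which is why it dominates the other factors in these models.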
Some consider price optimisation unethical. The issue is not the mathematical techniques, but the fact that motives have not always been as ethical as they should have been, and governance of the process not as stringent. Some jurisdictions prohibit or restrict these techniques through regulation that insists providers only charge close to the technical prices. But used in the right way, price optimisation can benefit customers and providers by helping insurers to achieve an ethical pricing strategy in an efficient way.
Insurers use demand, cost and competition models to forecast possible scenarios for the next three to five years, considering different strategic decisions such as new business discounts. Comparing the expected target outcomes (for example profitability) in each scenario allows the insurer to choose the optimal strategy.
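At its simplest, the scenario comparison reduces to evaluating the target outcome under each candidate strategy and selecting the best. The strategies, volumes and figures below are hypothetical:

```python
def expected_profit(volume, avg_premium, avg_cost):
    """Expected profit for a scenario, given forecast volume and margins."""
    return volume * (avg_premium - avg_cost)

# Hypothetical multi-year strategies: status quo vs a new-business discount
# that buys volume at a thinner margin
scenarios = {
    "status_quo":      expected_profit(volume=10_000, avg_premium=520.0, avg_cost=490.0),
    "nb_discount_5pc": expected_profit(volume=13_000, avg_premium=494.0, avg_cost=490.0),
}
best = max(scenarios, key=scenarios.get)
print(best, scenarios[best])  # status_quo 300000.0
```

In practice each scenario's volume and margin would come from the demand, cost and competition models rather than fixed inputs, and the target could be a blend of profitability, volume and retention rather than profit alone.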
To achieve price optimisation, insurers use different strategies for different distribution channels, usually by offering different brands. Intermediary channels need smoother prices, as the broker is seen as the customer and wants to serve all of their own customers. Aggregator channels are all about having the lowest price. In some cases, direct-channel customers have been more likely to suffer from price walking because of the weaker competition in this segment.
Pricing infrastructure and model deployment
Commercial solutions can provide automation, deployment speed and centralised control. The open-source route can allow organisations to customise and adapt to the latest technologies more quickly than the competition, although a variety of skills is needed to support the infrastructure. Deployment through the existing policy management system can be the least costly route, but not all models and complexities are supported, and deployment is slow and not centrally controlled. The deployment infrastructure should adapt to the insurer's needs and skills in order to maximise the value added. Given the importance of deployment infrastructure to an organisation's success, we will elaborate on this subject in a future article.
Yiannis Parizas is an actuarial pricing consultant
Phanis Ioannou is a quantitative risk manager at Grant Thornton