
Artificial intelligence and life assurance

Customers perceive and assess companies on the basis of how well they fulfil their requirements. With life products, actuaries and life insurers continually face a common business problem: turning the volumes of data that accumulate haphazardly into useful information. Actuaries generally employ standardised symbols to represent insurance quantities. Aided by the number-crunching ability of today’s computers, new and powerful applications have been developed, some of which employ artificial intelligence intertwined with knowledge engineering.
Artificial intelligence is defined here as a set of machine competencies that allow a system to use acquired knowledge and to solve new tasks ‘efficiently’ while performing under new conditions. This is made possible by expert rules and algorithmic representations, which vary in their properties according to the formulae they compose. The system becomes knowledge-based when actuarial formulae are written into artificial intelligence algorithms; combined, the two result in computational intelligence. Knowledge-based systems are essentially computational systems that recognise, and therefore swiftly manipulate, all forms of actuarial definition, creating a viable link between actuarial symbols and algorithmic syntax.

Computationally intelligent germs
Starting with a fundamental probability law relating to life insurance distributions, the germ serves as a nucleus for computational convenience. Within the germ’s context, moment-generating functions are applied to describe the core insurance probability law sufficiently. Various types of life product, such as temporary, deferred, linear and geometric, are then computed as hybrid instances of the germ function. Once these are established, the derivation of net and gross premiums can be carried out automatically.
In deriving insurance types, the germ is robust enough to allow the definition of different but related operations, ie endowment, annuity, n-year temporary, m-year deferred.
These operations are then mapped onto the core germ. In effect, the core probability distribution is the germ from which other insurances can be mapped. The operations are written into algorithms which, by analogy, specify the skeleton of the calculations the actuary would otherwise make by hand.
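The following minimal Python sketch illustrates the mapping idea. The article does not publish the germ’s internal formulae, so the mortality rates, interest rate and function names below are hypothetical assumptions chosen only to show how different insurance types can be expressed as weightings of one core probability distribution.

```python
# Illustrative sketch only: mortality rates, the 4% interest rate and all
# names below are hypothetical assumptions, not the article's actual germ.

V = 1 / 1.04  # annual discount factor at an assumed 4% interest rate

# Hypothetical curtate death probabilities for an assumed age x:
# deaths[k] = probability that death occurs in policy year k + 1.
deaths = [0.010, 0.012, 0.015, 0.018, 0.022, 0.923]  # sums to 1 over 6 years

def germ(weight):
    """Core 'germ': the expectation of a payment pattern over the death-year
    distribution. Each insurance type below is just a different weight function."""
    return sum(weight(k) * p for k, p in enumerate(deaths))

# Insurance types mapped onto the same germ by choosing a weight function.
whole_life     = germ(lambda k: V ** (k + 1))                        # 1 paid at end of year of death
temporary_n    = lambda n: germ(lambda k: V ** (k + 1) if k < n else 0.0)
deferred_m     = lambda m: germ(lambda k: V ** (k + 1) if k >= m else 0.0)
pure_endowment = lambda n: V ** n * sum(deaths[n:])                  # 1 paid on surviving n years
endowment_n    = lambda n: temporary_n(n) + pure_endowment(n)

print(f"whole life EPV      : {whole_life:.4f}")
print(f"5-year temporary EPV: {temporary_n(5):.4f}")
print(f"2-year deferred EPV : {deferred_m(2):.4f}")
print(f"5-year endowment EPV: {endowment_n(5):.4f}")
```

In this toy version the germ is simply an expectation operator over the curtate lifetime distribution; net premiums would then follow by equating such expected present values, in line with the derivation described above.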
In practice, ‘thinking pads’ have been developed on the basis of this concept. The computational performance of some formulae may slow down, but this is largely compensated for by the greater degree of formula flexibility attained. In some countries a small change in interest rates (from 4.0% to 3.25%) dictates the creation and embedding of 12,000 tables in the administrative software of the average life firm. Normally this Herculean task implies a time delay of nine months by law, but the advent of the thinking pad has dramatically reduced this timescale to only three days.
After design, actuarial formulae can, if requested, be automatically downloaded to the mainframe for batch computing. Research is currently under way into applying computational intelligence to optimise the downloading of formulae without human supervision. In this way the actuary’s knowledge is stored and reused without intrusion from computer programmers.

Dialogue management
By creating basic building blocks (germs) for defining algorithms, it is possible to develop an efficient flow of control for the design of actuarial formulae. The flow of control of the dialogue mirrors the likely pathways through which basic operations are employed to arrive at the intended formulae. In the same way that complex actuarial formulae are built up from basic symbols, the end formula is built up from the basic building germs.
A flow of control implies logical tracks similar to a network of nodes, which in turn consist of a series of chronological statements. These logical tracks, although written in the protocol of IT, are simply applications of the actuary’s professional knowledge.
As in the actuarial profession, fast and accurate calculations are facilitated by a well-defined flow of control, and vice versa. For optimal use, the intelligent and experienced dialogue manager must discern which operations facilitate, and which inhibit, the derivation flows.
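A small sketch of this flow-of-control idea is given below: derivation steps are treated as nodes, and the dialogue manager only permits transitions that can lead to a valid formula. The node names and allowed transitions are illustrative assumptions, not the actual rule base described in the article.

```python
# Hypothetical flow-of-control network: nodes are derivation steps and the
# dialogue manager only allows the transitions listed here.
ALLOWED_NEXT = {
    "select_germ":          {"apply_temporary", "apply_deferred", "apply_endowment"},
    "apply_temporary":      {"apply_deferred", "derive_net_premium"},
    "apply_deferred":       {"apply_temporary", "derive_net_premium"},
    "apply_endowment":      {"derive_net_premium"},
    "derive_net_premium":   {"derive_gross_premium"},
    "derive_gross_premium": set(),
}

def valid_path(steps):
    """Return True if the chronological sequence of steps follows the flow of control."""
    for current, nxt in zip(steps, steps[1:]):
        if nxt not in ALLOWED_NEXT.get(current, set()):
            return False
    return True

print(valid_path(["select_germ", "apply_temporary", "derive_net_premium"]))  # True
print(valid_path(["select_germ", "derive_gross_premium"]))                   # False: skips a step
```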
Efficiency and safety
Basically there are seven major types of life insurance in the massive store of potential germs:
– endowment
– annuity
– whole life
– n years temporary
– m years deferred
– geometric increases
– linear increases
all of which can be individually mapped onto the core germ. Additionally, there are more than 1,000 unique actuarial formulae that can be employed as building bases for generating other actuarial formulae. To avoid the creation of wrong and unnecessary formulae, a closure algorithm process is a very robust way of establishing the limits of insurance calculations under the germ principle.
It serves as a check for the programmer or engineer and prevents the generation of invalid insurance formulae. By applying a linear machine matrix approach, based on the transitive closure algorithm principle, it is possible to determine how many fundamental actuarial formulae exist before other variant forms can be created.
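The closure check can be sketched as a standard transitive closure (Warshall’s algorithm) over a ‘derivable-from’ relation between formulae. The matrix below is a toy relation among five hypothetical formulae, not the article’s full formula store.

```python
# Hedged sketch of the transitive closure check: the adjacency matrix is an
# assumed toy 'derivable-from' relation, used only to show the principle.

def transitive_closure(adj):
    """Warshall's algorithm: closure[i][j] is True if formula j can be built
    from formula i through any chain of derivation steps."""
    n = len(adj)
    closure = [row[:] for row in adj]
    for k in range(n):
        for i in range(n):
            if closure[i][k]:
                for j in range(n):
                    closure[i][j] = closure[i][j] or closure[k][j]
    return closure

# Toy relation: 0 = core germ, 1 = whole life, 2 = temporary, 3 = deferred, 4 = endowment.
derives = [
    [False, True,  True,  True,  False],  # germ -> whole life, temporary, deferred
    [False, False, False, False, False],
    [False, False, False, False, True ],  # temporary -> endowment
    [False, False, False, False, False],
    [False, False, False, False, False],
]

closure = transitive_closure(derives)
reachable_from_germ = [j for j, ok in enumerate(closure[0]) if ok]
print("Formulae derivable from the core germ:", reachable_from_germ)  # [1, 2, 3, 4]
```

Anything outside the closure cannot be reached from the fundamental germ mappings, which is what flags a proposed formula as wrong or unnecessary.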
Interestingly, the germ model developed can be effectively integrated into managing a dialogue with a computer on a website, and so becomes a very powerful tool for rendering insurance services. A major advantage of this is that insurance underwriters can employ systems with full computational intelligence, helping them to derive and deliver tailor-made unit-linked products to prospective clients at will.

Henk Koppelaar is a professor at Delft University of Technology, and Michael Olowe is a consultant at Practis Software in the Netherlands
