
Financial modelling tools: where next?

Financial models and PCs are the key tools for today’s actuaries. Whether they use a dedicated actuarial modelling system, a programming language, or a spreadsheet package, actuaries spend a great deal of their time messing about with numbers on computers.

Some thoughts on the past
Although it may seem difficult to believe, there was once a time when actuaries didn’t have access to computers. And no, it wasn’t because the IT department was being particularly difficult; we’re talking about ‘The Time Before Computers Existed’, a fabled time when dinosaurs also roamed the Home Counties.
Although computer historians will spend hours arguing the point, Cambridge mathematician Alan Turing is widely recognised as the father of modern computing: his Bombe machines were used at Bletchley Park to help crack the German Enigma codes. Bletchley Park was also home to Colossus (see April’s Actuary, p31), the electronic code-breaking machine built by Tommy Flowers’s team in 1943 to attack the Lorenz cipher. Colossus used 1,800 vacuum tubes and was soon superseded by ENIAC (electronic numerical integrator and computer). Built at the University of Pennsylvania between 1943 and 1946, ENIAC had 70,000 resistors, 10,000 capacitors, 6,000 switches and 18,000 vacuum tubes; it was ten feet tall, took up 1,000 square feet of space and weighed 30 tons.
The next 50 years saw rapid progress in computing power. IBM claimed its first ‘personal computer’ in 1957 with the 610 Auto-Point computer. This was not a personal computer in today’s sense of the word: what IBM meant was that it needed only one operator. It still cost $55,000. Personal computers as we know them started to emerge after the Intel 8008 microprocessor was launched in 1972. The following year saw microcomputers such as the Micral and the Mark 8 launched (in kit form, of course), and then in 1975 came the Altair 8800, based on the new 8080 processor. It cost $439, had 256 bytes of RAM, and had to be programmed by means of a switch panel.
The mid- to late-1970s were packed with what now seem seminal events: Bill Gates and Paul Allen founded Microsoft in 1975; Steve Wozniak and Steve Jobs founded Apple in 1976; the Commodore PET was launched in 1977, along with the Tandy TRS-80. Then in 1979 Dan Bricklin unleashed the first killer app on the world: VisiCalc, the first microcomputer-based spreadsheet.

Into the 80s: the emergence of open platforms
In 1981 IBM, which had once derided personal microcomputers as a passing fad, launched its own PC. A growing software firm, Microsoft, was contracted to supply the operating system; at the time Microsoft had 128 employees and a turnover of $16 million. The rest, as they say, is history. The IBM PC was phenomenally successful and quickly became the industry standard.
Two years later, in 1983, a new corporation called Lotus, formed by Mitch Kapor, who had previously worked for VisiCalc’s publisher, launched a new spreadsheet package called 1-2-3. It quickly became the new industry standard, and remained so until Microsoft launched version 3 of its Windows-based package, Excel, in 1990.
In all areas of technology there has been a tendency for consumers to favour the purchase of an open platform. Examples abound: VHS, IBM-compatible PCs, Microsoft operating systems, Word, Excel, and so on. An open platform provides comfort in the knowledge that multiple suppliers can service the market and that purchasing technology for today’s needs will not prevent meeting tomorrow’s requirements.

So what were actuaries doing?
Actuaries were quick to see the advantages of computers. Dedicated valuation systems were written for the mainframe computers on which insurance companies stored details of their policies. Actuaries also started to use spreadsheets to replace their calculators, slide-rules, and pieces of paper. Revenue accounts could now be produced on the computer! As profit-testing ideas were developed, mainframe systems were rewritten to calculate profit tests and embedded values. The first dedicated actuarial modelling packages also began to appear on the market, offering to calculate either statutory valuations or profit tests (but rarely both).
As time went on, actuarial software appeared on the PC platform rather than the mainframe, giving actuaries much more control over their embedded value and valuation runs. PCs became more powerful with the introduction of the 80286, 80386, 80486, and eventually the Intel Pentium family of chips. By now the specialised actuarial modelling system was an essential tool on the actuary’s desk, used to calculate profit tests, embedded values, asset shares and statutory valuations. Actuaries were also starting to create asset-liability models and to link to stochastic models. Actuaries still use spreadsheets such as Excel to present results, to create simple models (and in some cases complex ones, often involving substantial VBA coding), and to do further calculations that work around the limitations of their actuarial system.
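To make the underlying calculation concrete, the sketch below shows, in Python rather than VBA, the sort of arithmetic at the heart of a simple deterministic profit test or embedded value model: project the profit emerging from a policy each year and discount it back at a risk discount rate. All of the assumption values (premium, expenses, claims, lapses, discount rate) are hypothetical, and real systems of the kind described here handle far richer product and decrement structures.

```python
# Minimal, purely illustrative profit test: project annual profit on a single
# policy and discount it at a risk discount rate. All assumptions are made up.

premium = 1_000.0        # annual premium per policy
expense_rate = 0.10      # expenses as a proportion of premium
claim_cost = 850.0       # expected annual claim cost per policy in force
lapse_rate = 0.05        # annual lapse probability
discount_rate = 0.08     # risk discount rate
term_years = 10

in_force = 1.0           # proportion of the original policy still in force
pv_profit = 0.0          # present value of future profits

for year in range(1, term_years + 1):
    profit = in_force * (premium * (1 - expense_rate) - claim_cost)
    pv_profit += profit / (1 + discount_rate) ** year
    in_force *= 1 - lapse_rate   # survivorship into the next year

print(f"Present value of future profits per policy: {pv_profit:.2f}")
```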

Systematic or opportunistic calculations
We can identify uses such as embedded values, statutory valuations, asset share calculations, and so on, as systematic models. They usually involve a large-scale implementation project with well-defined goals and objectives. These projects are best considered and structured in the same way as any other large-scale IT implementation project within an organisation. The one key difference, however, is that the users (the actuarial department) are usually sufficiently skilled to take ownership of the project rather than outsourcing it to the general IT department. The traditional actuarial systems on the market are structured to enable such in-house development. However, these systems are often based on today’s products and legislation, and are not flexible or open enough to address tomorrow’s needs.
We can see from the past that actuaries have constantly evolving requirements for actuarial models and modelling tools. Already we see requirements to produce financial condition reports, complex asset-liability models, and risk-based capital calculations, and to restate figures on a US GAAP basis. Looking into the crystal ball reveals international accounting standards, customer lifetime value applications, a blurring of the distinction between traditional reinsurance and financial instruments, and much more yet to be conceived. As computing power increases (the usual estimate is that it doubles roughly every 18 months), actuarial requirements seem to triple!
If the requirements of the past are systematic, then the requirements of the future are opportunistic. In other words, actuaries want their actuarial software to free them from the mundane tasks of programming their models and ‘turning the handle’ to produce numbers. They want the modelling tool to empower them to adapt to their changing needs quickly.
In any growing and dynamic organisation there will always be a mix of systematic and opportunistic actuarial work. In many organisations such opportunistic work is done outside the main systematic actuarial tool, typically in spreadsheets and general programming languages. In many cases these grow from simple models into highly complex ones (a spreadsheet is a program!) with complex underlying code. Thus, in most organisations a multitude of programming languages is employed for actuarial modelling.

Is a new approach required?
It is often said that we can learn lessons from history (and as actuaries we should know better than most). Looking back to the introduction of the VisiCalc spreadsheet in 1979, we can see that it caused a sea change and put real power in the hands of end users, in a format they could easily use. Previously, models had to be built on mainframes or written in early microcomputer programming languages (Microsoft’s BASIC had been available since 1975), which required specialist knowledge and support from IT departments.
VisiCalc allowed the user to create financial models on a microcomputer without any complex programming. Instead of the user having to work out how to code a loop or how to print out results, VisiCalc did it all for them. The user was left to concentrate on the model being created.
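As a rough illustration of the difference (Python stands in for the general programming language here, and the spreadsheet formula is only sketched in a comment), compare what the programmer had to write with what the spreadsheet user typed:

```python
# What the programmer had to write: the model itself (5% annual growth on an
# opening balance) plus all the surrounding mechanics of looping and output.
balance = 100.0
for year in range(1, 11):
    balance *= 1.05                            # the model is this one line
    print(f"Year {year:2d}: {balance:8.2f}")   # plumbing the tool could supply

# What the spreadsheet user typed: roughly "+A1*1.05" in cell A2, replicated
# down the column; iteration, recalculation and display were all handled by
# the package.
```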
The same principles apply in actuarial modelling. Increasingly, because of the inflexibility of our legacy systematic modelling programs, we are becoming more like programmers and less like actuaries. A financial modelling platform that can be adapted for all future needs takes care of all non-actuarial issues and leaves the user to focus on being an actuary.
Figure 1 shows where a platform fits: between programming languages, which do not provide enough standard modelling functionality for actuaries, and traditional systematic tools, which do not provide enough flexibility. At the 1999 SOA annual conference an interesting session entitled ‘Actuarial software build vs buy’ was held, during which a straw poll was taken on who buys and who builds their software. Interestingly, both options drew around 80%. This suggests that most people buy software and then either extend it or build further software to meet needs the purchased package does not fulfil; that is, they either start at the programming-language layer of figure 1 and work outwards, or they start at the outer actuarial-systems layer and work within its confines. Does it make more sense to start between the two?

Is there a conclusion?
Life insurance actuaries, as part of their day-to-day job, require tools to calculate embedded values and statutory reserves. They are also using complex spreadsheets and database programs to do calculations outside their actuarial modelling system because traditional actuarial systems are not flexible enough for their emerging requirements.
Non-life insurance actuaries are often left working solely with spreadsheets: traditional actuarial systems cannot meet their needs at all. The convergence of financial products and the commonality of financial modelling across the different fields of actuarial specialisation call for a tool that can be used across traditional barriers. The next generation of actuarial software must be platform-based, allowing users to modify, customise, and integrate with their other modelling tools so as to meet both today’s and tomorrow’s needs.
