
IT: Changing the system

The relentless development of technology and information-processing power continues to revolutionise many aspects of modern life, with a number of advantages in the business arena.

In the first instance, a substantial reduction can be made in ongoing costs, albeit partially offset by costs for initial set-up and periodic upgrade or replacement. Service levels are also improving due to the rapid and reliable processing of complex or multiple transactions, and intellectual capital can achieve practical expression by being harnessed with the processing power required to make it useful.

The actuarial world is not immune to this trend, and the potential benefits for enterprise risk management (ERM) are evident. The ability to perform vast numbers of calculations rapidly and, as a consequence, to process more data and run more complex models gives managers and regulators greater insight into the dynamics of the business, both inside and out.

In reality, there are various practical and theoretical difficulties, but the use of increasingly complex actuarial models will continue to be mandated for financial management, reporting and solvency purposes.

As a result, there is a distinct opportunity for sophisticated actuarial modelling systems. The changes giving rise to this opportunity, however, have also drawn the attention of other disciplines and professionals, muddying the waters for providers of such systems.

The landscape
Until recently, a great deal of actuarial software has been developed and supported by actuarial consultancies as part of a broad range of symbiotic services, and treated as niche by IT departments. It has been marketed to actuaries and usually bought by actuaries, with little IT input to the whole process other than to ensure the necessary supporting infrastructure is in place. The focus of all parties has been on the actuarial requirements.

Such end-user requirements have not lost their importance, but the increasing emphasis upon capturing technical actuarial skills within a broad-based, robust ERM framework - such as that arising under Solvency II - means that actuaries now have less freedom to practise their black arts in isolation.

Instead, their activities are increasingly attracting attention from auditors - whose role in the valuation process was strengthened some time ago, at least in the UK, and whose position may be further bolstered by international financial reporting standards - and from IT professionals. In the latter case, this is driven by a number of factors:
>> A requirement to comply rigorously with corporate IT disciplines
>> IT companies offering integration services for systems, perhaps based on Basel II experience
>> Software operations showing more interest in developing financial modelling software.

General-purpose software houses have not always displayed a full understanding of what it takes to develop and support specialised software. This is partly because they may not provide a comfortable home for domain specialists used to a particular working environment and remuneration structure. However, actuarial consultancies can continue to be successful by providing innovative software packages with strong intellectual capital content - indeed, they probably have no choice.

Sarbanes-Oxley (SOX) has also increased focus on the robustness of financial management and reporting processes. Even where SOX does not apply formally, combined with other influences it has created a climate in which many businesses feel the necessity to enhance their processes and accompanying levels of scrutiny.

IT is the bedrock of much of an insurance company’s processes and it is therefore unsurprising if risk managers give considerable weight to the opinion of IT professionals and if, in turn, those individuals are cautious - even conservative - in meeting their responsibilities. This can be complicated in situations where system selection is outsourced to independent consultants, who often adopt an over-engineered approach.

Formally, there is no such thing as SOX-compliant software, in the sense that each enterprise’s processes must be audited for compliance in their own right. Such compliance cannot be delegated or bought in. Nonetheless, it is possible for software packages to vary significantly in their support of a business gaining such compliance, not least from their reputation with auditors, and hence to vary in their acceptability to IT departments.

System architecture
Most actuarial systems have adopted thick-client architecture [a ’thick client’ is a computer (client) in a client-server network that typically provides functionality independently of the central server; application-specific processing, and not just simple interfacing, is carried out locally on the user’s system], with most of the processing carried out on desktop PCs and little activity at the server other than file storage. Some risk managers regard this as a potential failure point, however, and are looking for reassurance that the actuarial system is not a weak link in the chain. Their expectation is for it to comply and integrate with the whole ERM implementation and, in short, function as an industrial-strength system.

There are no universal criteria for what makes an industrial-strength system. It is perhaps like a rhinoceros - hard to describe, but you know one when you see one - but in any case it is likely to be context-specific. While acknowledging that the concept is, to a certain extent, subjective and malleable, it is useful to understand what it can encompass. Strategic IT aims commonly include:
>> Reusability - putting code to multiple uses for efficiency and consistency
>> Maintainability - ease of identifying and rectifying problems
>> Redundancy - maintaining service by other means during failure of a part
>> Scalability - flexibility to increase capacity without artificial ceilings
>> Robustness - ensuring that systems are fit for purpose and reliable
>> Security - preventing unauthorised or unintended access or alteration.

These aims are usually of direct benefit to IT departments themselves, but they arise from, and are justified by, business needs of reliability, security and management control. The aims are not totally independent and conflicts will arise. In theory, the ultimate business needs should be the final arbiter in resolving those conflicts.

Attempts to control PCs directly can pose difficulties, and solutions often centre on some kind of client-server architecture that reduces the capability of PCs - or at least the reliance on them - and centralises much of the processing. All data is stored on servers, which generally have far greater security controls than most clients.

IT departments argue for server-based computing to improve system management and security, while end-users argue - and often with good reason - for more local processing power, control and flexibility. This can put end-users and IT departments at odds with each other, perhaps nowhere more so than in the actuarial world, where end-users have been accustomed to considerable freedom. In fact, it is usually possible to meet all legitimate concerns of IT departments and risk managers using thick-client architecture.

Another approach is service-oriented architecture (SOA), which repackages a relatively old and intuitive idea and is, to some extent, more a business philosophy than a technical specification. SOA breaks down applications into a number of self-contained functions called ’services’. Applications are then created by appropriate combination of the services, with a common language and protocol for transferring data and co-ordinating service activities.

Much of what SOA stands for may prove to be of little direct relevance to actuarial software, partly due to significant performance problems. It is currently a hot topic in parts of the IT world, however, and there is some temptation to adopt the acronym just to stick the label on the box.

Being slightly less cynical, its potential value lies in the light approach of ’wrapping’ the complete actuarial system - or possibly a few major sub-components - to present it as a service. This offers a quick and relatively cheap method of integrating the system into an ERM framework in a robust manner, highlighting some of the advantages of SOA.
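
As an outline of what such ’wrapping’ might look like, the sketch below exposes a valuation engine through a simple HTTP interface so that other components of an ERM framework can call it as a service. It is a minimal illustration only: the valuation function, the portfolio format and the endpoint are hypothetical, not features of any particular actuarial package.

# A minimal sketch of 'wrapping' an actuarial calculation as a service.
# The valuation logic and the JSON message format are hypothetical.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def value_portfolio(policies):
    # Placeholder for the real actuarial engine: here, simply sum a
    # hypothetical 'reserve' field supplied for each policy.
    return {"total_reserve": sum(p["reserve"] for p in policies)}

class ValuationService(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON request body, run the valuation, return JSON.
        length = int(self.headers["Content-Length"])
        policies = json.loads(self.rfile.read(length))
        result = json.dumps(value_portfolio(policies)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(result)

if __name__ == "__main__":
    # Other ERM components can now POST a portfolio to this endpoint
    # without any knowledge of how the engine works internally.
    HTTPServer(("localhost", 8080), ValuationService).serve_forever()

The point of the sketch is the interface, not the implementation: the calling system sees only a well-defined request and response, which is precisely what allows the actuarial engine to be integrated, monitored and replaced without disturbing the rest of the framework.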

High-performance computing (HPC)
As processing power has increased in its penetration and raw capability, it has encouraged the development of applications whose appetite for power has increased to match or even outstrip that available. Readers will be familiar with this from the actuarial discipline, but there are many other areas of application for complex models, and whole conferences are now dedicated to the subject of HPC.

Outright performance improvements from current technology have hit a wall, although the wall is not insurmountable as some of the issues are commercial rather than strict technical limits. New, experimental technologies also promise radical step-changes in performance, but they will not be much use for valuations at the end of 2008.

It has been clear for a long time, indeed well before Intel kept the PC market moving by putting two or more processing cores onto a single chip, that an important HPC solution is distributed computing. The concept is simple - divide a task among multiple computers - but the devil is in the detail. It requires tasks that can be broken down into parallel (independent) sub-tasks, and infrastructure with low performance overheads.
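
By way of illustration, the sketch below applies the same principle on a single machine, dividing a hypothetical portfolio of independent policy valuations across processor cores with Python’s multiprocessing module; a grid simply extends the same pattern across many machines. The valuation function and policy data are invented for the example.

# A minimal sketch of dividing a task into parallel (independent)
# sub-tasks. Each policy valuation depends only on its own inputs,
# which is what makes the work distributable.
from multiprocessing import Pool

def value_policy(policy):
    # Stand-in for projecting one policy: discount a stream of
    # hypothetical premiums at a single rate.
    premiums, discount = policy
    return sum(p * discount ** year for year, p in enumerate(premiums))

if __name__ == "__main__":
    portfolio = [([100.0] * 20, 0.96) for _ in range(10_000)]
    with Pool() as workers:
        # Each worker values its own slice of the portfolio; the
        # results are then gathered and aggregated centrally.
        reserves = workers.map(value_policy, portfolio)
    print(f"Total reserve: {sum(reserves):,.2f}")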

Many organisations have built bespoke distributed computing systems, either as add-ons to commercial packages or to assist with common-good projects such as medical investigations. Alongside this, a whole industry has built up around general-purpose distributed processing, which often goes under the generic title of ’grid computing’. There is now a substantial body of grid-computing providers, and it seems likely that some consolidation will take place over the next few years.

Other initiatives include GPGPU, which stands for General Purpose computation on Graphics Processing Units. This arises from the observation that, in order to manage images in real time, GPUs have multiple cells working on different parts of graphical objects in tandem. The idea is to apply this inherently parallel capability to other (non-graphical) applications, but at present the technology is rudimentary and its ultimate general usability uncertain.

The support of distributed processing is arguably no different to any other part of a server’s ’plumbing’, and it certainly has the most potential when it is able to work closely with, or be part of, the underlying operating system. It is therefore possible that, in time, Microsoft will be the de facto provider within a Windows-only environment (subject to the European Union and other anti-trust regulators). In that situation, the challenge for other providers would be relentless innovation to show advantage over Microsoft.

Despite some important differences, grid computing also has certain points of similarity with SOA, and some believe that the two will merge in time. The Global Grid Forum is trying to drive a common standard - the Open Grid Services Architecture - to bring them together in a formal manner.

Dangerous distractions
Although actuarial system providers now have to pay much more attention to shifting IT developments, they cannot afford to let this distract them from changing end-user requirements. As a result, they find themselves juggling several balls, including:
>> Development of intellectual capital to meet financial management challenges
>> Packaging such capital within innovative software in practical and user-friendly forms
>> Ensuring synergy with corporate IT standards
>> Ensuring compliance with corporate audit and control standards
>> Working with systems-integration specialists
>> Meeting IT-defined service standards, as well as providing actuarial support.

This requires a significant amount of effort but, in the process, providers are interfacing with more of an insurance company’s professionals and disciplines than previously. Not only do they obtain feedback of benefit to systems development, but they also gain an outsider’s overview of the whole ERM framework within a business and are able to advise accordingly.

Other advisers and suppliers may share a similar perspective, of course, but actuarial system providers are uniquely equipped to couple it with actuarial insight and communicate additional value to their clients.

Paul Hopkins works in the insurance and financial services practice of Watson Wyatt. He was chief business architect for the VIPitech system, and manages the practice’s software development strategy.