Modern risk strategy is hard, primarily because modern business is complex. Neil Cantle looks at the Own Risk and Solvency Assessment that firms will be facing

When we look at the risk strategy our business is trying to deliver we see a forest of multiple factors which depend on other factors, which in turn interact with others. It is hard to 'see the wood for the trees' and make sense of it all. Insurance companies are moving into a regulatory regime which requires an 'Own Risk and Solvency Assessment' (ORSA) exercise - a formal assessment of the risks they face, the resources available to meet them and clear communication about how they intend to manage them.
Strategy and risk
The achievement of strategic goals is rarely certain. The reality of the business world means that, while results should cluster somewhat around the planned outcome, they can turn out quite different from the plan. Some of these outcomes may be welcome out-performance, but many will not be - risk represents the possibility of these unwelcome outcomes.
Risk management is an evolving discipline - historically more about hazard avoidance and mitigation but increasingly about insights into business performance and resilience. So how should one go about trying to unearth the uncertainties inherent in a modern insurance company and make sense of them?
The articulation of risk appetite is at the heart of exercises like the ORSA, as it explains the types, and amounts, of uncertainty that you would like to be exposed to in pursuing your chosen strategy, those you will accept as necessary evils, and those you would like to avoid. To operationalise the concept, however, it is necessary to understand how those uncertainties arise, and to assign operational parameters which help the organisation know the boundaries of day-to-day activity. But this is where things get hard - surely there could be a million ways in which profit might not be the figure we had planned?
Complex phenomena have been studied by a number of disciplines outside the business world, and it turns out that their insights are helpful here. As described in Allan, Cantle et al. (2012): "For complex systems, like an economy or financial organisations, a new paradigm or philosophy is required to understand how the constituent parts interact to create behaviours not predictable from the 'sum of the parts'. Systems theory provides a more robust conceptual framework which views risk as an emerging property arising from the complex and adaptive interactions which occur within companies, sectors and economies."
People traditionally focus a lot of energy on the visible part of risk and uncertainty - the tip of the iceberg. They identify undesirable outcomes ('crises') and seek to identify their 'causes' - the events which lead to them. The information collected at this level is often categorised and stored in databases in the belief that it can help inform predictions of future trends - a promise it generally fails to deliver on. This failure arises because we are still some way from understanding 'why' these events took place - we simply know that they did, and what some of the potential consequences might be. We need to look deeper, at the part of the iceberg beneath the water. We have to seek out the patterns that help us make sense of how the events might be related, and ultimately an understanding of the underlying mechanisms which produce them. People are generally afraid to venture 'beneath the water', believing that the complex outcomes we see must be the result of impenetrably complex dynamics, and that describing them at this level would be impossible, or too complex to be useful.
Some important misconceptions and myths about the behaviours of complex systems mean that some of the techniques typically used can actually be dangerously misleading.
The first error is thinking that a complex systems problem is best solved by reducing it to a series of simpler parts. The outcomes of complex systems are emergent, arising from the interactions of many underlying parts, and the understanding of these interactions is crucial to understanding the system overall. So, unlike merely complicated systems, complex ones cannot be reduced and must be studied holistically first. The second major error is ignoring adaptation and basing statistical analyses on historical behaviours which are unlikely to repeat. We therefore need a way to understand what is actually going on before we try to simplify our information or models.

People at the heart
There is an inescapable link between people and risk, not least because risk itself is a social construct. Companies are essentially groups of people, all trying to follow processes and procedures to achieve the particular goals of their organisation, introducing myriad complexities as they go about their work. But people are not just passive parts of the system. They are often actively trying to anticipate outcomes and influence them, creating feedback and non-linearities. A lack of complete information and understanding means that human interventions nearly always have unintended consequences.
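A toy example makes the point about feedback and unintended consequences concrete. In the sketch below - pure illustration, with invented dynamics and coefficients - a manager steers an exposure towards a target using a perfectly sensible correction rule, but acts on data that is two periods old; the lag alone produces overshoot and oscillation, and a slightly more aggressive correction destabilises the system entirely:

```python
# Toy illustration: a sensible-looking control rule plus a reporting lag
# produces oscillation, and past a threshold gain it destabilises the
# system. All numbers are invented for this sketch.

def steer(gain, target=100.0, lag=2, steps=25):
    """Steer an exposure toward `target`, reacting to data `lag` periods old."""
    xs = [0.0] * (lag + 1)          # initial exposure history
    for _ in range(steps):
        stale = xs[-1 - lag]        # the manager only sees old data
        xs.append(xs[-1] + gain * (target - stale))
    return xs[lag:]

for gain in (0.5, 0.7):             # modest vs. aggressive correction
    tail = ", ".join(f"{x:6.1f}" for x in steer(gain)[-6:])
    print(f"gain={gain}: last six values -> {tail}")
```

With a gain of 0.5 the exposure overshoots the target and settles in a damped oscillation; at 0.7 the oscillations grow - the interaction of two individually innocuous features, a correction rule and a lag, creates behaviour neither exhibits alone.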
In trying to simplify the risk problems we face, we tend to make assumptions about the behaviours of others. In particular, it is often assumed that everyone behaves rationally and that their behaviour is consistent over time. Neither tends to be true.
People also exhibit a number of further cognitive shortcomings relevant to risk assessment. They rely on judgmental heuristics (which are influenced by recent experience) and are fundamentally poor at assessing probability - Fenton and Neil (2012) give a series of good examples of how people get this wrong - and yet we consistently rely upon expert opinion in our risk management activity. Even models calibrated 'factually' with historical data rely upon an expert's opinion that the observed trend will continue into the future.
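To give a flavour of the probability errors Fenton and Neil catalogue (the numbers below are illustrative, not taken from their book): suppose an early-warning indicator flags 90% of genuinely troubled portfolios but also 10% of healthy ones, and that only 1% of portfolios are genuinely troubled. Intuition says a flagged portfolio is probably in trouble; Bayes' theorem says the chance is under 10%:

```python
# Base-rate neglect: a seemingly accurate indicator, applied where the
# underlying problem is rare, still produces mostly false alarms.
# Illustrative numbers, not taken from Fenton and Neil (2012).

p_trouble = 0.01                 # prior: 1% of portfolios are troubled
p_flag_if_trouble = 0.90         # indicator sensitivity
p_flag_if_healthy = 0.10         # indicator false-alarm rate

p_flag = (p_flag_if_trouble * p_trouble
          + p_flag_if_healthy * (1 - p_trouble))
p_trouble_if_flag = p_flag_if_trouble * p_trouble / p_flag

print(f"P(troubled | flagged) = {p_trouble_if_flag:.3f}")   # ~0.083
```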
Another big challenge is that stable environments naturally select resources with skills optimised for that environment, reducing future flexibility. This process of specialisation and optimisation forms part of an adaptive cycle (Holling and Gunderson, 2002): as a company becomes increasingly optimised, and forgoes resources which assist flexibility, it becomes increasingly fragile and exposed to changes in its environment. In fields such as ecology it is increasingly accepted that resilience is a far more sensible target than optimisation when dealing with complex systems, although it does require accepting short-term inefficiency by investing in resources which preserve flexibility.
Culture, or rather organisational behaviour, plays a crucial role in risk management too. The prevailing behavioural environment can have profound impacts on the way in which risks arise and how they are identified, assessed and managed. In particular, there is no single 'mood' or culture at any point in time, but rather a dynamic and evolving blend of four risk attitudes as described in Ingram and Thompson (2011):
• Pragmatists, who believe that the world is uncertain and unpredictable
• Conservators, whose world belief is of peril and high risk
• Maximisers, who see the world as low risk and fundamentally self-correcting
• Managers, whose world is risky, but not too risky for firms that are guided properly.
We therefore need to understand which blend of risk attitudes we have at any point in time, and the drivers leading that blend to change.
It is also important to note that the overall culture, or that of a subgroup, is an emergent property of the group, and is therefore different from how any individual might behave alone. The aim is not to 'judge' culture against some perceived perfection, but to understand the interaction between it and risk management activity.
Cognitive mapping
In your head you form a view of the world that helps you make sense of the complexities around you. It is possible to largely recover these mental images by reformatting narratives about particular topics as cognitive maps. Each node on the map represents a 'concept' mentioned in the narrative, and the links between nodes represent the connections made between those concepts. For example, the sentence 'increasing life spans are causing a strain on retirement income' could be represented by the linked nodes 'increasing life spans' and 'strain on retirement income'. Such maps can contain hundreds of nodes, but their structure lends itself to rigorous analysis which can identify the most connected parts of the narrative, either locally or more globally. Identifying the nodes which most often lead to these important concepts also enables the detection of biases among respondents and of missing elements in the narrative. Narratives from multiple sources can be combined into a single coherent view of the problem.
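As a minimal sketch of the idea - using the open-source networkx library, with concepts and links invented purely for illustration - a cognitive map can be held as a directed graph and interrogated for its most connected concepts:

```python
# A toy cognitive map as a directed graph: nodes are concepts from a
# narrative, edges are the "leads to" links a respondent described.
# Concepts and links are invented for illustration.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("increasing life spans", "strain on retirement income"),
    ("strain on retirement income", "pressure to chase yield"),
    ("soft pricing cycle", "pressure to chase yield"),
    ("pressure to chase yield", "concentration in illiquid assets"),
    ("concentration in illiquid assets", "solvency strain"),
])

# Concepts that many links flow into are candidate key outcomes;
# concepts with many outgoing links are candidate key drivers.
for name, score in sorted(nx.in_degree_centrality(G).items(),
                          key=lambda kv: -kv[1])[:3]:
    print(f"most-led-to concept: {name} ({score:.2f})")
```

Centrality scores of this kind are only a starting point: the value lies in going back to the experts to discuss why particular concepts dominate the map, and what is conspicuously missing from it.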
Harnessing expert input
So people are at the heart both of generating complexity and of the failure to understand what it means for risk. All is not lost, however - there is a range of techniques that can be used to make sense of these things.
We have seen that people are not necessarily the best source of information about risk - but they are often the only source. This is particularly the case where events are rare, or where emerging trends must be imagined through to conclusions that have never been observed - historical data will have little or nothing to add to the analysis of such situations, and yet these are precisely the ones that risk managers face on a daily basis. It is possible to recover the collective insights of your experts using cognitive techniques, such as cognitive mapping (Eden, 1988; see the overview above), to distil robust and meaningful insight into what is happening. Using cognitive maps to capture and analyse the narratives of your experts provides a rigorous way to form a coherent single story. From this you can develop a deep understanding of the most important dynamics of your risk profile to feed into a wide range of risk management activity, including the ORSA.
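Continuing the earlier sketch (again with networkx, and again with invented content), maps elicited from different experts can be merged into a single graph before analysing its connectivity:

```python
# Merging cognitive maps from several experts into one coherent view.
# Each expert contributes a small directed graph; nx.compose_all unions
# their nodes and edges. Content is invented for illustration.
import networkx as nx

underwriter = nx.DiGraph([("soft pricing cycle", "under-reserving")])
cro = nx.DiGraph([("under-reserving", "solvency strain"),
                  ("pressure to chase yield", "solvency strain")])

combined = nx.compose_all([underwriter, cro])
print(sorted(combined.edges()))
```

Concepts that only appear in one expert's map, or links that experts disagree about, are often the most productive prompts for the next round of discussion.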
Forming an understanding of the underlying drivers of uncertainty is crucial if we are to make any progress in assessing the risks that can emerge. In the real world we nearly always face large gaps in our data for all but the most frequent observations, so a cognitive method for forming our first understanding of the system is invaluable.
Assessing complex risks
There are now a number of additional factors we can consider in trying to assess and understand our risks. First we can attempt to build models which replicate the interesting dynamics that our experts have explained.
The benefits of using a cognitive approach before proceeding to modelling are described, for example, in Cantle, Charmaille et al. (2012): "financial stresses are serious, but the political and reputational aspects of [the organisation's] critical success factors mean that failure could very well come from other directions [...] Actuarial models are very powerful; however, for reverse stress testing the challenge is to know which scenarios should be considered [...] The model simply cannot tell us which scenarios to look at. We must decide which scenarios to look at ourselves and then use the model to evaluate them."
Assuming we have sufficient data, statistical models may well be capable of mimicking the outcomes but they have little to say about the drivers of such outcomes. As described in Fenton and Neil (2012), it is far more productive to consider causal models, such as Bayesian Networks, which "help us to make sense of how risks emerge, are connected, and how we might represent our control and mitigation of them." In particular we would like to be consistent in the way that we handle uncertainty when we study our risks, meaning that we have to find a way to incorporate subjective judgments about uncertainty. We also need to be able to revise our views when new evidence is observed.
The Bayesian approach permits a subjective view of uncertainty which enables us to make much better progress with our risk studies than the classic frequentist approach that typical statistics requires.
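As a self-contained illustration of that updating step - plain Python with invented probabilities; a real exercise would use a dedicated Bayesian network tool - consider a toy causal model 'weak controls lead to loss events', and revise the belief in weak controls once a loss is observed:

```python
# Bayesian revision on a toy two-node causal model:
#   weak controls -> loss event
# Priors and likelihoods are illustrative expert judgments.

p_weak = 0.20                    # prior belief that controls are weak
p_loss_if_weak = 0.50            # chance of a loss event given weak controls
p_loss_if_sound = 0.05           # chance of a loss event given sound controls

# Evidence arrives: a loss event has occurred. Apply Bayes' theorem.
p_loss = p_loss_if_weak * p_weak + p_loss_if_sound * (1 - p_weak)
p_weak_given_loss = p_loss_if_weak * p_weak / p_loss

print(f"prior P(weak controls)            = {p_weak:.2f}")
print(f"posterior P(weak controls | loss) = {p_weak_given_loss:.2f}")  # ~0.71
```

The same mechanics scale up: in a larger network each observation propagates through the causal structure, revising beliefs about every connected driver rather than just the one directly observed.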
Processes like ORSA demand rigour in areas where risk management is traditionally weak, such as capturing judgment and expert knowledge about things the data doesn't know. Framing risk using insights from other sciences which embrace complexity, culture and psychology brings the opportunity to add that rigour and also improve the resulting insights obtained.
References
Allan, N., Cantle, N., Godfrey, P. and Yin, Y. 2012. A review of the use of complex systems applied to risk appetite and emerging risks in ERM practice. British Actuarial Journal, December 2012, pp. 1-72.
Cantle, N., Charmaille, J-P., Clarke, M. and Currie, L. 2012. An Application of Modern Social Sciences Techniques to Reverse Stress Testing at the UK Pension Protection Fund. Paper presented at the ERM Symposium, Chicago, 2013.
Eden, C. 1988. Cognitive Mapping. European Journal of Operational Research, 36(1), pp. 1-13.
Fenton, N. and Neil, M. 2012. Risk Assessment and Decision Analysis with Bayesian Networks. CRC Press.
Holling, C. and Gunderson, L. 2002. Panarchy: Understanding Transformations in Human and Natural Systems. Island Press.
Ingram, D. and Thompson, M. 2011. Changing Seasons of Risk Attitudes. The Actuary, February/March 2011, pp. 20-24.
Neil Cantle is a principal working in Milliman's London office.