Anthony Fitzsimmons and Derek Atkins ask whether there is an opportunity for actuaries to address hard-to-quantify risks

In his Harvard Business Review column 'Don't get blinded by the numbers', Professor Roger L Martin challenged the trend towards 'business-by-numbers' and the idea that a firm's success is driven by its data and its modelling capability.
He pointed out that models break down in the real world and proposed that strategy should be as much about interpreting as it is about analysing.
According to his observations, human concepts like trust cannot be reduced to numbers, yet they still need to be interpreted and understood; successful business strategists should therefore be "good communicators, comfortable with ambiguity and ready to abandon the quest for certain, single-point answers".
This should strike a chord. Actuaries are brilliant at dealing with certain classes of risk but such skills did not prevent the financial, banking and Libor crises; nor were they relevant to preventing the United Airlines passenger fiasco, the Volkswagen scandal or corruption at Rolls-Royce. These slipped through the nets of tens of thousands of competent, diligent risk managers, internal auditors and actuaries.
In The Actuary article 'When culture goes wrong', Paul Harwood told fellow actuaries that much can be learned from financial scandals; and, in an earlier piece, he described "management risk", a concept that we have researched for years.
So why do companies led by intelligent and honest leaders fail, despite large and diligent teams of risk professionals? Consider a hypothetical example: a large plc announces that profits have been overstated by £300 million. The board is stunned. Shareholders are furious. Following an inquiry, the cause emerges: the accounts team overstated receivables. They are fired. Case closed - yet the deeper question, of why the overstatement happened and went undetected, remains unasked.
Air accident inquiries regularly used to blame accidents on 'pilot error'. Stanley Roscoe, a leading aviation psychologist of the 1980s, pointed out that such conclusions were "the substitution of one mystery for another". He spurred aviation investigators on to do better. They did. And the result is that today we all enjoy a level of flight safety that was unimaginable in the past.
We too should dig deeper, and asking 'why?' is the best tool for the job.

Few, if any, risks such as these appear in any company's risk register. In our research, we set out to deal with this lacuna, beginning with a deceptively simple definition of reputational risk: 'The risk of failure to fulfil the expectations of your stakeholders in terms of performance and behaviour'.
Performance is what you do; behaviour is how you achieve it. Much performance risk is captured by classical enterprise risk management, but root-cause risks from behaviour are not. Worse still, these risks are double-acting: they increase an organisation's vulnerability to crises by causing systemic weaknesses; and if a crisis occurs, they tend to tip it into a reputational calamity, especially if they have manifested before.
This is the hole in classical risk management that explains why the financial and banking crises happened despite armies of risk professionals. And it explains the stream of new crises that continues unabated.
These risks and vulnerabilities are difficult for insiders to see in advance, because cognitive biases and other phenomena prevent us from seeing ourselves as clearly as equally well-informed outsiders can. Nor do risk professionals have the training to work with these risks, which we name 'behavioural', 'organisational' and 'board' risks.
Power problems
The range of these risks and their systemic consequences is limited only by the extent of human ingenuity. Other things being equal, their importance as root causes increases with power, so that their manifestation among top leaders matters most. Examples of risk areas include the following, which apply at all levels, including boards.
- Character weaknesses, such as insufficient self-confidence and humility to welcome challenge and contradiction
- Ineffective challenge
- Insufficient diversity of skill, knowledge, experience, background and perspective
- Ignorance of how heuristics and cognitive biases affect perception and decision-making
- Inadequate culture
- Undesirable incentives, behavioural as well as financial
- Defective communication
- Inability to learn from errors
- Complacency.
Risks such as these are hard to discuss internally, because our human 'tribes' operate under social conventions that include what anthropologists call 'social silences': subjects people won't discuss because, as FT journalist Gillian Tett put it, they are "dull, taboo, obvious or impolite". Since many of these risks have their origins in leaders, risk professionals will be reluctant to discuss them if they fear they may put their own career at risk.
When all is going well, people rarely question the role of luck in their success. Meanwhile, systemic risks manifest in multiple minor mishaps that are disguised by luck or fielded by fancy footwork. Harvested, these would provide valuable intelligence that could be analysed to find and fix deep vulnerabilities with systemic consequences before they cause harm. Missed, ignored or covered up, these opportunities are lost, increasing complacency and reputational risk.
Thus, unseen or unmentionable, and so unmanaged, these risks fester and incubate. In the meantime, leaders are lulled into complacency, believing all is well. The truth is that they are sitting on a ticking time bomb with a dodgy clock. When something blows up, they are stunned to discover that 'everyone' - but them - seemed to know what had been going on under their noses.
A programme to find and fix these risks ideally begins with education tailored for the board and executive team. Case studies are valuable in bringing the subject to life. From here, boards should require risk team training followed by clear authority and incentives to delve into these areas, even if their trail leads back to the board. And boards must, of course, find and fix their own weaknesses.
Cognitive biases will hide many of these risks from the insider's view, while social silences and fear may repress reporting what is known. It is therefore essential, in the first instance, to use a trusted outside specialist to help overcome these obstacles and ensure that any uncomfortable truths are both uncovered and explained to leaders in a way that enables them to absorb reality without loss of face.
Society needs a cohort of rigorous, robust risk professionals with the skill and strength of character to tackle these risks. The requisite skills could be developed through training existing risk professionals (such as risk managers, internal auditors and actuaries) as to how people 'work'; or they could be grown by teaching HR professionals, who already understand psychology and other social sciences, about risk.
While others hesitate, there is an opportunity for actuaries to seize the initiative. Success will require a well-organised training programme across the profession, as well as honest identification of individuals with inherent aptitude.
Anthony Fitzsimmons is chairman of Reputability LLP
Derek Atkins was a visiting professor at Cass Business School and a partner in Reputability LLP. Sadly, Derek passed away while this article was being prepared.