Tuesday 1st December 2015 (updated 5.50pm, Wednesday 29th April 2020)
It is difficult to assess all sources of enterprise risk, says Brenda Boultwood, but it is vital if a company wants to demonstrate sound decision-making

Qualitative risk: the new frontier
Regulators require banks to develop operational risk frameworks, just as they must for market risk and credit risk. The expected levels of operational risk should be established and kept consistent with strategic objectives. Organisations must specify the type of data they will collect and the frequency with which they will assess their risk events.
A great deal of work is being done now to make qualitative risk more quantitative, but it is still more art than science. For example, if a bank's deposit system is down, this leads to customer or business disruptions, and there is the financial loss of incomplete transactions on that particular day. There is also a more subtle loss of reputation when the customer chooses to negotiate their next loan with a bank whose systems are more reliable. Banks are starting to collect information about these kinds of risk events. They are doing more risk assessments and are asking people in the field: how do we assess the quality of our risk and control environment?
Companies are also thinking about risk metrics because they want more rigour as they establish a more quantitative basis for qualitative risks. Management should decide on a set of metrics, which will then be monitored for breach of pre-assigned levels.
Take an example
Consider the risk of employee attrition. Mitigating this operational risk requires a strong human resources function that knows how to attract, train and motivate employees with valuable skills. The risk of employee attrition can be quantified as the percentage of unintended employee turnover, transforming something once qualitative into something measurable.
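As a rough illustration only, here is a minimal Python sketch of that calculation; the figures and the function name are hypothetical rather than drawn from any particular company.

```python
# Minimal sketch: turning employee attrition into a measurable metric.
# The figures below are hypothetical, for illustration only.

def unintended_turnover_rate(unintended_leavers: int,
                             average_headcount: float) -> float:
    """Unintended employee turnover as a percentage of average headcount."""
    return 100.0 * unintended_leavers / average_headcount

# Example: 15 regretted departures against an average headcount of 600
rate = unintended_turnover_rate(unintended_leavers=15, average_headcount=600)
print(f"Unintended turnover: {rate:.1f}%")  # prints 2.5%
```

A figure like this becomes the data point that the dashboards discussed below can track against pre-assigned levels.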
The first stage of mitigating this risk is to establish a baseline against which observations can be normalised. This helps the company measure the impact of losses caused by a shortage of well-trained employees. Although employee turnover can be calculated and modelled, how can its actual impact be measured? If departing employees know the internal processes intimately and have forged connections with customers, the monetary loss is difficult to quantify. We must also keep in mind the cost of recruiting and training their replacements.
A quantitative risk such as market risk is understood in terms of value-at-risk (VaR). When a certain VaR level is breached during trading, this triggers a message to senior management, who decide what response is needed - accept the higher risk, liquidate the position, or buy a hedge - and then report that to the board. For a qualitative risk such as employee attrition, the company would choose as its baseline an expected level based on history - for example, 2.5% employee turnover.
Then things change. Perhaps a new competitor starts poaching staff, and turnover jumps to 10%. The dashboard that used to show green at 2.5% now shows red at 10%. When that level is breached, a message goes to senior management, who - just as in the market risk case - have a discussion. Important questions must be considered, such as: "Is this an anomaly?" and "Is this the new norm?" If 10% is the new norm, what is the correct response: more training, better policies and procedures, or more documentation of the skills required for that particular role? Perhaps more IT systems are the answer, because they would help automate much redundant work and reduce dependence on what people are expected to remember.
The discussion around measuring this qualitative risk - establishing an operational risk tolerance metric for employee attrition - has now begun, and some tactical decision-making occurs. The board will eventually examine the breach, and the response thereto, in order to understand the impact on the organisation's overall strategy.
A window into enterprise risk
As companies grow and amass data, they become better at anticipating risk events. As a result, they can focus on better controlling their exposure, thereby limiting the potential for loss - both financial and reputational. There's a silver lining to risk events: fresh data points. Information before, during, and after risk events can and must be captured for use by your organisation's risk management, modelling, and analytics teams.
The data aggregation currently carried out by organisations is largely quantitative. It looks at values around revenue, profits, products produced, expenditures and more. Data aggregation around new metrics - especially around qualitative risks - is more difficult to establish.
Quality data can permit operational risk-based capital modelling, which ties into determining risk and regulatory capital adequacy. In addition to economic risk capital, senior management is also interested in regulatory capital, balance-sheet capital, and rating agency capital. Stress testing and scenario analysis of key risks can also reveal any potential Achilles' heel in desired capital levels.
Risk appetite linked to strategy
Ultimately, the board is accountable to the company's shareholders. Senior management is well aware that if a risk metric exceeds the threshold regarded as normal in light of past observations, an explanation is needed. Is this a one-time event, or does it signal a trend?
With input from the risk management team, management will want to understand various risk implications that such data provides: does the organisation need a temporary plaster, or does it require a new blueprint? More controls might be needed, or the organisation might need to accept a new risk tolerance as part of its risk appetite. Risk data, and the insights that come from it, can provide management with the real-time knowledge needed to continuously fine-tune the organisation's strategy.
Use test analysis
Regulators require banks to perform use test analysis of their day-to-day risk appetite. A great way to assess your organisation's risk framework is to try it out on a new situation. Regulators want banks to ask themselves what new risks are emerging, and how much extra risk capital would be needed to absorb them.
Use test analysis should also be performed on potential mergers and acquisitions. Although both parties have a legal contract, do they know exactly how the partnership is going to work? Do they know exactly how each partner is going to deliver?
Use test analysis also feeds into stress testing. Analysts should ask themselves: "If we introduce a new product/deal/acquisition, what are all the new risks that we introduce?" The company can stress test different scenarios and estimate, at a given confidence level, whether its current risk capital could safely absorb the losses if things went wrong.
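To make the idea concrete, here is a minimal simulation sketch in Python; the loss distribution and the capital figure are assumptions chosen purely for illustration, not a regulatory methodology.

```python
# Minimal sketch: stress testing simulated losses against current risk capital.
# The loss distribution parameters and the capital figure are hypothetical.

import random

RISK_CAPITAL = 50.0      # current risk capital, in $m (hypothetical)
N_SIMULATIONS = 100_000

def stressed_annual_loss() -> float:
    """Draw one simulated annual loss (in $m) under a stressed scenario."""
    # Lognormal losses give a heavier tail than a normal distribution,
    # a common simplifying choice for operational-loss sketches.
    return random.lognormvariate(mu=2.5, sigma=1.0)

covered = sum(stressed_annual_loss() <= RISK_CAPITAL
              for _ in range(N_SIMULATIONS))
print(f"Capital absorbs simulated losses in {covered / N_SIMULATIONS:.1%} "
      "of stressed scenarios")
```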
A company has a risk appetite for both quantitative and qualitative risks, and both are needed to define an organisation's overall risk tolerances and boundaries. Metrics such as employee attrition, the number of network security penetrations, the length of system down-time or the number of process failures are critical data points. These can help a company calculate its operational risk capital and compare it to its risk appetite, as ultimately reflected in its capital and its ability to absorb large losses.
No matter the industry, every organisation wants to invest in opportunities where the risk-adjusted return on capital is the highest. Every organisation also wants to avoid those rare, once-in-a-decade, black swan risk events that have the potential to destroy their business. It is only by having access to quality risk data that organisations can be in a position to thrive on risk.
Brenda Boultwood is senior vice-president at MetricStream