Quantifying Threats Through Risk Measurement
To be effective, enterprise risk management should not merely identify risks but determine the impact those potential threats would have on the enterprise. Risk measurement builds data sets that quantify the factors that could disrupt the business, then ranks them by importance so that risks can be better prioritized and managed.
This technique should be applied throughout the risk management process, from identification through mitigation. Early on, qualitative techniques categorize which risks are the most important and most likely to occur, so planning efforts can focus on those. A scaled evaluation then estimates the probability of each risk and separates minor threats from severe ones on a one-to-10 scale.
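The scaled evaluation described above can be sketched as a simple scoring exercise. In this illustrative example (the risk names, likelihoods and impacts are assumptions, not data from the source), each risk is rated on a one-to-10 scale for both likelihood and impact, and the product ranks severe threats above minor ones:

```python
# Illustrative scaled risk evaluation: rate each risk 1-10 on
# likelihood and impact, then rank by the composite score.
risks = [
    {"name": "data breach", "likelihood": 7, "impact": 9},
    {"name": "supplier outage", "likelihood": 4, "impact": 6},
    {"name": "minor site defect", "likelihood": 8, "impact": 2},
]

def score(risk):
    """Composite score on a 1-100 scale: likelihood times impact."""
    return risk["likelihood"] * risk["impact"]

# Sort from most to least severe so planning can focus on the top.
ranked = sorted(risks, key=score, reverse=True)
for r in ranked:
    print(f'{r["name"]}: {score(r)}')
```

A frequent and high-impact risk (7 × 9 = 63) rises to the top of the list, while a frequent but low-impact one (8 × 2 = 16) falls to the bottom, which is exactly the minor-versus-severe parsing the process calls for.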
Quantitative Risk Measurement
Research from Deloitte reveals that behavioral modeling, parametric modeling and baseline protection are among the top options available for risk measurement. In terms of specific tools, RiskAMP cites the Monte Carlo simulation as the most predominant. This model forecasts the likely outcomes from the risk data that are put into its simulation. Other tools that can help quantify risks include program evaluation and review techniques, statistical sampling, sensitivity analysis, decision tree analysis, financial ratios and critical chain.
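To make the Monte Carlo idea concrete, here is a minimal sketch of how such a simulation forecasts likely outcomes from risk inputs. All distributions and parameters below are illustrative assumptions (incident counts drawn from a Poisson process, per-incident costs from a lognormal distribution), not figures from any cited tool:

```python
# Minimal Monte Carlo sketch: simulate many trial years of losses,
# then read the expected loss and a tail percentile off the results.
import random
import statistics

random.seed(7)  # fixed seed so the sketch is reproducible

def poisson_count(rate):
    """Number of events in one year of a Poisson process with the
    given annual rate, via exponential inter-arrival times."""
    count, t = 0, random.expovariate(rate)
    while t < 1.0:
        count += 1
        t += random.expovariate(rate)
    return count

def simulate_annual_losses(trials=10_000, event_rate=3.0,
                           cost_mu=10.0, cost_sigma=0.8):
    """Simulate total annual loss across many trial years."""
    losses = []
    for _ in range(trials):
        n_events = poisson_count(event_rate)
        # Sum an independent lognormal cost for each incident.
        losses.append(sum(random.lognormvariate(cost_mu, cost_sigma)
                          for _ in range(n_events)))
    return losses

losses = sorted(simulate_annual_losses())
mean_loss = statistics.mean(losses)
p95_loss = losses[int(0.95 * len(losses))]
print(f"expected annual loss: {mean_loss:,.0f}")
print(f"95th-percentile loss: {p95_loss:,.0f}")
```

Repeating thousands of trial years turns uncertain inputs into a distribution of outcomes, so a risk manager can reason about the expected loss and the worst plausible years rather than a single point estimate.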
Research from AICPA found 18 percent of responding organizations use primarily qualitative risk management methods, including those in large organizations, public companies and financial services firms. Meanwhile, 54 percent use a mixture of qualitative and quantitative methods to measure risk. The remainder have zero to minimal formal assessments of the risks they face.
The Challenge of Poor Data Quality
But the results from any tool are only as good as the data you put in. Many enterprises struggle with the volume of data they collect, much of which exists in organizational silos. Information needs to be aggregated across functional areas so risk management strategies can be set at an organizational level.
Another problem with data quality lies in information related to cyber risks. Since many incidents are not reported, it can be difficult to quantify the volume, frequency and likelihood of attacks with a high degree of accuracy. Risk management strategies become better informed as information sharing increases and threat intelligence feeds are more widely available from a variety of sources.
Standards and frameworks are available to help organizations improve their risk management practices. The National Institute of Standards and Technology has published its Framework for Improving Critical Infrastructure Cybersecurity to help organizations strengthen their security programs. For risk management purposes, it points to standards such as ISO 31000, ISO 27005 and NIST's own SP 800-39.
The ability to effectively measure risk is essential to improve resiliency efforts and enhance enterprise security, but it isn’t an exact science. High-quality data is the best way to ensure an informed outcome.