Almost every enterprise risk management (ERM) program I have seen, consulted on, developed, or read about follows a similar process of identifying, analyzing, responding to, and monitoring risks and opportunities. It doesn't matter which risk framework you use: if you are not effectively making vulnerability and velocity part of that process, your results simply won't be as accurate as they could be.
The Committee of Sponsoring Organizations of the Treadway Commission (COSO) defines vulnerability as “the susceptibility of the entity to a risk event in terms of criteria related to the entity’s preparedness, agility, and adaptability.”
Without vulnerability in the mix, your assessment results may elevate one area over another, guiding your organization and its resources toward mitigation efforts that, while still prudent, may not be the best approach at that time. Introducing vulnerability into your assessment process helps you determine how well you are managing your risks, which directly affects the impact should a given event occur. Understanding how vulnerable you are also shows you where to allocate resources for the greatest effect. So when you are reviewing your assessment data and certain aspects of it seem equal, vulnerability can be the deciding factor that tips the scale.
COSO defines velocity as “the speed of onset or the time it takes for a risk event to manifest itself, or in other words, the time that elapses between the occurrence of an event and the point at which the company first feels its effects.”
When you introduce velocity into your risk assessment process, you gain a better understanding of how quickly the impacts of a risk will be felt by your organization. Determining how much potential reaction time you have for a given risk improves your overall assessment and allows you to better align your prevention, mitigation, and response strategies. This means you can prioritize your resources toward your most pressing risks.
As an example, consider an existing regulation with an upcoming change: the regulation impacts multiple processes, but the change may impact only some of them. When assessing compliance risk on likelihood and impact alone, you might generate similar risk scores across those processes. The impact may be the same based on the fines for non-compliance, and the likelihood may be relatively similar because some controls already exist to comply with the regulation. So, in the standard two-dimensional model (Likelihood x Impact), the risks might appear to pose essentially the same threat to the organization.
By adding vulnerability and velocity to your assessments, you begin to see that the processes impacted by the regulatory change carry greater risk, both because of your vulnerabilities and because of the velocity imposed by how quickly the change requires compliance. In this example, you can clearly determine which processes to allocate resources to first and which can possibly wait. Although this is an overly simplified example, when vulnerability and velocity are applied across your entire risk landscape, you will begin to see a change in how your risk data is presented, which means your risk mitigation strategies will be enhanced, and that saves time and money.
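The scoring effect described above can be sketched in a few lines of code. This is a hypothetical illustration only: the 1-to-5 scales, the multiplicative scoring formula, and the sample values are assumptions made for this sketch, not anything prescribed by COSO or a specific ERM tool.

```python
# Hypothetical four-factor risk scoring sketch (illustrative assumptions only).
# Two processes tie on the standard two-dimensional model, then diverge
# once vulnerability and velocity are added.

def two_factor_score(likelihood: int, impact: int) -> int:
    """Standard two-dimensional score: Likelihood x Impact."""
    return likelihood * impact

def four_factor_score(likelihood: int, impact: int,
                      vulnerability: int, velocity: int) -> int:
    """Extended score that also weighs vulnerability and velocity."""
    return likelihood * impact * vulnerability * velocity

# Process A is hit by the regulatory change: higher vulnerability
# (fewer existing controls) and higher velocity (a near-term deadline).
# Process B is covered by the regulation but not by the change.
process_a = dict(likelihood=3, impact=4, vulnerability=4, velocity=5)
process_b = dict(likelihood=3, impact=4, vulnerability=2, velocity=2)

# Two factors: both score 3 * 4 = 12, an apparent tie.
print(two_factor_score(3, 4), two_factor_score(3, 4))  # 12 12

# Four factors: A scores 240 vs. B's 48, making the priority obvious.
print(four_factor_score(**process_a))  # 240
print(four_factor_score(**process_b))  # 48
```

Any real implementation would tune the scales and weighting to the organization, but the point survives the simplification: the extra dimensions break ties that a two-factor model cannot.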
As Quantivate's Vice President of ERM Services, William "Bill" Hord has over 29 years of experience in executive management within the financial services industry, focused on risk management, business continuity, financial software, and lending & collections.