GRC Keynote: Betting on the Right Risks
Keynote speaker Caspar Barry opened GRC Conference 2020 by asking participants to think about risk the way good poker players do — that is, to use "probabilistic thinking." Barry, whose résumé includes both the study of economics at the University of Cambridge and a career as a professional poker player in Las Vegas, drew on both pursuits to describe a healthy approach to risk at the conference, presented by The IIA and ISACA.
Unlike playing roulette, a game that favors the house and carries a negative expected return on investment (ROI), Barry described poker as "calculated risk-taking" rather than true gambling. The best poker players don't take reckless risks, such as betting on a bad hand, he explained. But they do realize they must play a certain number of hands to have a statistical chance of winning.
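The contrast Barry draws can be sketched as an expected value (EV) calculation. The roulette figures below are the standard European single-number bet; the poker probabilities are purely illustrative, not anything Barry quoted.

```python
# Expected value comparison: negative-EV gambling vs. calculated risk-taking.
def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

# European roulette: a $1 straight-up bet pays $35 with probability 1/37,
# otherwise loses the $1 stake. The house edge makes EV negative.
roulette_ev = expected_value([(1/37, 35), (36/37, -1)])  # about -0.027

# A hypothetical disciplined poker strategy: fold weak hands (small loss),
# press strong ones (larger gain). Probabilities here are illustrative only.
poker_ev = expected_value([(0.7, -1), (0.3, 4)])  # positive EV per hand

print(f"Roulette EV per $1 bet: {roulette_ev:.3f}")
print(f"Illustrative poker EV:  {poker_ev:.2f}")
```

The point is not the specific numbers but the structure: one game has a fixed negative expectation no matter how well you play, while the other rewards playing enough hands to let a positive expectation show through the variance.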
Just as good poker players know the probabilities of the hands they are dealt, businesses must understand the probabilities of both unlikely bad and unlikely good events happening. This means understanding the nature of the Gaussian curve and realizing that it is necessary not only to control for risk by staying in the middle of the curve, but also to reach for positive outlier events to achieve success.
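A quick way to see what "the middle of the curve" leaves out is to compute the standard normal tail probabilities. This is a minimal sketch using Python's standard library; it assumes nothing beyond the textbook Gaussian distribution.

```python
from statistics import NormalDist

# How often do outcomes fall outside k standard deviations of the mean?
# Outlier events, good and bad, are rarer than the middle but far from
# impossible.
std_normal = NormalDist()  # mean 0, standard deviation 1
for k in (1, 2, 3):
    tail = 2 * (1 - std_normal.cdf(k))  # both tails beyond k sigma
    print(f"P(|outcome| > {k} sigma) = {tail:.4f}")
```

Roughly 1 outcome in 20 lands beyond two standard deviations, and about 1 in 370 beyond three, which is why Barry argues organizations must plan for both tails rather than only defending the center.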
While everyone is now focused on devastating events such as the COVID-19 pandemic, the flip side to this is that highly positive events are also much more likely than people think, Barry said. The challenge is investing in things that won't always succeed, but will pay off some of the time. He cited Google as an example of a company that creates opportunities by giving people space and time to pursue innovative ideas.
"To make money in poker, you have to be willing to invest money and be prepared to take risks," Barry said. It's the same for business, he explained. "Those opportunities are out there. But we need to be able to take risks in order to achieve opportunities." — Christine Janesko
Quantification Can Shine a Positive Light on Risk
Businesses have made great strides recently in leveraging technology to monitor and address risk, said Luke Nelson, KPMG managing director, Technology Risk Management, and Fernando Pinto, head of IT Risk at Societe Generale at today's GRC Conference. Yet, they said the next stage of the journey is using available tools to have a conversation with board members and executive management about technology risks in financial terms — terms that can be easily quantified and directly related to the business.
Too often, risk assessments are presented to stakeholders in a backward-looking context, using shifting standards that are difficult to compare with one another, the speakers said at a session titled "Driving Business Success With Technology Risk Quantification." For example, how can operational risks and technology risks be accurately compared on an even plane?
Instead, the speakers said internal auditors should be proactive in their view of emerging risks and threats, and apply accepted statistical principles that can be assessed directly in terms of ROI — that is, what the business stands to gain if the risk is addressed. In many ways, reporting data in this way simplifies the risk conversation and allows stakeholders to make decisions in stark money-based terms, which in turn makes top-level buy-in much easier to achieve.
Risk quantification requires adherence to two key pillars:
- A technology-simulation and decision-optimization model that is repeatable and grounded in sound analysis.
- A board reporting strategy that allows internal audit to present results appropriately.
Quantifying results without the ability to present them in a meaningful manner leads to the same result as no quantification at all, Nelson and Pinto explained.
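A common way to put such a model into financial terms is a Monte Carlo simulation of annual loss. The sketch below is illustrative only: every probability, loss parameter, and cost is an assumption for demonstration, not a figure from the session.

```python
import random

# Monte Carlo sketch of expressing a technology risk, and the ROI of
# mitigating it, in dollar terms. All inputs are illustrative assumptions.
def expected_annual_loss(event_prob, loss_mu, loss_sigma,
                         trials=100_000, seed=1):
    """Average yearly loss: an incident occurs with probability event_prob,
    and its cost is drawn from a lognormal distribution."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        if rng.random() < event_prob:
            total += rng.lognormvariate(loss_mu, loss_sigma)
    return total / trials

# Compare expected loss before and after a $20,000 control that is
# assumed to halve the incident probability.
before = expected_annual_loss(0.30, 12.0, 1.0)
after = expected_annual_loss(0.15, 12.0, 1.0)
control_cost = 20_000
roi = (before - after - control_cost) / control_cost

print(f"Expected annual loss before control: ${before:,.0f}")
print(f"Expected annual loss after control:  ${after:,.0f}")
print(f"ROI of the control:                  {roi:.0%}")
```

Framed this way, the board sees a single comparable number — the expected loss avoided per dollar of control spend — rather than incomparable qualitative ratings across operational and technology risks.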
Risk is almost always presented in a negative light. However, with this more proactive reporting philosophy that shows how reducing the risk allows the business to make more money, the risk conversation is cast in a more positive hue that creates greater interest, quicker action, and confident decision-making from those in positions of power, the speakers said. — Logan Wamsley
Implementing Cyber Resilience
Building cyber resilience is about understanding which systems and data support the organization's mission-central operations, according to presenters at the GRC Conference. Such knowledge can help ensure that the organization prioritizes cyber risk decisions based on the business' goals, said John Eckenrode, director of cyber resilience at Guidehouse, and Geoff Grogan, a management consultant on the firm's Cybersecurity Solutions team.
During a session titled "Cyber Resilience: 'Risk Management' Is Not Enough," Eckenrode and Grogan said organizations need to break away from a siloed approach to risk management. Instead, they described a three-tier model based on the U.S. National Institute of Standards and Technology's Special Publication 800-39, Managing Information Security Risk, consisting of:
- Tier 1: Senior management.
- Tier 2: Business lines.
- Tier 3: Projects and systems.
For successful cyber resilience, "communication is key and it must be intelligible among the tiers," Eckenrode said. For example, senior management should set the tone for business priorities, while business line owners should define which operations are central to the business' mission and communicate that down to Tier 3. Conversely, Tier 3 should communicate cyber risks and vulnerabilities up the pyramid, so that Tier 2 can identify the potential business impact and notify Tier 1.
Grogan said implementing cyber resilience involves four components. First, cyber risk governance facilitates communication of risk up and down the pyramid. Second, mission mapping identifies the systems that are key to mission-central operations, helping the organization determine which vulnerabilities are most important to address. Third, assessment and analysis help the organization understand its cyber risks and risk posture, which then can be continuously monitored. Finally, scoring and reporting can provide risk information to management in ways it can understand and use. "Providing the right information at the right time to the right stakeholders is the recipe for success," Grogan said. — Tim McCollum
Control Automation: Reducing Risk and Cost Through Innovation
The costs related to testing controls over compliance are increasing, said Ethan Rojhani, a Grant Thornton partner specializing in controls advisory services, in Monday's session titled "Control Automation: Reducing Risk and Cost Through Innovation." Publicly listed companies spend billions each year identifying, documenting, analyzing, remedying, and monitoring compliance controls. Yet, technological solutions are changing this situation.
Traditional manual controls testing is highly specialized, time-consuming, costly, and inefficient, Rojhani explained. At their best, manual methods allow testers to examine only a sample of transactions to determine whether controls pass or fail. Root cause analysis is limited and continuous monitoring is nearly impossible, he said.
Mike Golshanara, finance director in controls and compliance for Microsoft Corp., described how he worked with Rojhani to automate controls testing using Microsoft Power Platform and Azure. According to the speakers, these tools require little coding knowledge, and they can be used easily by those with experience assessing controls and using Excel and similar apps.
Automated testing with cloud storage enables a trifecta of coordinated data acquisition, storage, and retrieval. The tools can test entire populations of transactions and generate results in real time. Moreover, trends over time can be analyzed and controls can be monitored continuously.
Rojhani said automation reduces testing time from days to minutes. Once validated, the automated process transforms documentation from thousands of spreadsheets into a single daily dashboard, which control assessors and end users alike can access in real time. Additionally, the automation architecture is scalable and can be modified for use with similar control tests, which reduces costs.
Those seeking to automate control testing should communicate with staff, vendors, and external auditors in advance to help them understand how the transition will help them solve problems more quickly, Rojhani explained. He identified three factors that promote successful adoption: 1) a receptive culture, 2) a clear business case, and 3) "energetic communications" to get people excited about control testing. — Lauressa Nelson