Internal Auditor’s blogs reflect the personal views and opinions of the authors. These views may differ from policies and official statements of The Institute of Internal Auditors and its committees and from opinions endorsed by the bloggers’ employers or the editors of Internal Auditor.

More on That Cybersecurity Thing


Let's face it, you can't swing a dead audit schedule without hitting a reference to cybersecurity. We need to study it. We need to examine it. We need to be trained on it. We need to train others on it. We need to risk assess it. We need to audit it. We need to understand and process the ins and outs and in-betweens of all aspects of it. We need to gird our loins and sharpen our steel and batten the hatches in preparation for cybersecurity's assault on Helm's Deep.

Last week, I posted my thoughts about how our fascination with cybersecurity risk may be distracting us from the new risks on the horizon. But there is another issue. And it relates to how all our perceptions are affected by the onslaught of information we currently endure, making us believe something is more important and more prevalent than it actually is.

I have been reading (very slowly and very carefully) Daniel Kahneman's incredible book Thinking, Fast and Slow. I am reading it slowly because it is packed with insights about the way people think, make judgments, and perceive the world around them. And it is crammed with information that is vitally relevant to internal auditors. No hyperbole: every chapter has caused me to pause and think about what it has to say about the way auditors make judgments and how people make judgments about auditors and their work. (I've already cited the book in a blog post back in October. Two posts, and I'm only up to page 140.)

In Chapter 13, "Availability, Emotion, and Risk," Kahneman was once again speaking directly to internal auditors. I'm sure he didn't know he was speaking directly to us, but he was. (And at this point I will note that the following couple of paragraphs, while not necessarily made up of direct quotes, draw heavily from the information, and sometimes the exact words, within that chapter.)

Kahneman cites research on availability bias — the human tendency to think that examples of things that come readily to mind are more representative than is actually the case. The researchers asked participants to consider pairs of causes of death: diabetes and asthma, or stroke and accidents. For each pair, the participants were to determine which they thought happened more frequently and estimate a ratio between the two frequencies. (For your own dining and dancing pleasure, you might want to take a stab at it right now.)

The results were compared to the current health statistics. Here are some of the discrepancies in thinking that were identified:

  • Strokes cause almost twice as many deaths as all accidents combined, but 80% of respondents judged accidental death to be more likely.
  • Tornadoes were seen as more frequent killers than asthma, although the latter caused 20 times more deaths.
  • Death by lightning was judged less likely than death from botulism even though it is 52 times more frequent.
  • Death by disease is 18 times as likely as accidental death, but the two were considered equally likely.
  • Death by accidents was judged to be more than 300 times more likely than death by diabetes, but the true ratio is 1:4.

My guess is that, if you actually worked through the problem, you came up with similarly inaccurate conclusions. So, why are we so wrong?

Time for a direct quote from the book. "The lesson is clear: Estimates of causes of death are warped by media coverage. The coverage is itself biased toward novelty and poignancy. The media do not just shape what the public is interested in, but also are shaped by it. … Unusual events (such as botulism) attract disproportionate attention and are consequently perceived as less unusual than they really are."

Let's get back to cybersecurity. As noted in the opening of this post, cybersecurity — the scare, the impact, the steps to be taken, the rattling of cybersabers — is everywhere. And it is everywhere because everyone is interested in it. And because it is everywhere, everyone becomes more interested in it. And because everyone is more interested in it, information and scares regarding cybersecurity show up more and more often.

And the snake eats its own tail.

Let me reemphasize that I am not against recognizing cybersecurity as a significant threat. But is it actually the biggest, most important threat faced by any organization? Or are we (and our stakeholders) just responding to the herd mentality that occurs when media drives our perceptions (and our perceptions drive the media)?

Which leads to the bigger question. Has our focus on cybersecurity (an important but perhaps not most important risk) caused us to lose focus on the risks that are always out there — the ones that never change but continue to be a big deal?

Take a look at organizations that have suffered fatal hits. Have any of those occurred because of events related to cybersecurity? Sure, there are organizations that took a beating when cybersecurity raised its ugly, little head. But I believe the number of fatal hits is actually zero.

Is cybersecurity really a bigger risk than, say, financial, economic, regulatory and compliance, crisis management, fraud, reputation, political, human resources and people, and culture? (And the hits just keep on coming — feel free to add your favorite golden oldies to the list.) They are the same old, staid risks that we see time and again. And they have become boring. So, the lure of something new and exciting (particularly when our stakeholders are clamoring for it) is quite enticing. But getting swept up in cybermania will not serve the best interests of anyone.

Again, I am not picking on cybersecurity in particular. Instead, what I hope to be pointing out is that we have to continue to take a broad look at risks — the past ones, the current ones, and the future ones. And we have to be aware of the impact public opinion can have on something like risk assessment and make sure we do not fall into the trap of availability bias. Currently it is cybersecurity, previously it was Sarbanes-Oxley, and in the future it will be — well, I haven't really got a clue. And neither do you. But the important thing — the thing that really speaks to what internal audit should strive to achieve — is keeping a balanced view of all potential risks to help the organization succeed.

Internal Auditor is pleased to provide you an opportunity to share your thoughts about these blog posts. Some comments may be reprinted elsewhere, online or offline.
