During World War II, fighter pilots would come back from raids in planes that were riddled with bullet holes. No surprise; this was, after all, war. But the Allies (like every fighting force before and since) were looking for ways to reduce losses. A group was put together to determine how to better protect the planes. Additional armor was considered an important part of the equation. So, by studying the bullet hole patterns on the planes, the group set out to determine where additional armor might be added.
This was no trivial decision. Obviously, the right solution would save lives. But, because of fuel and speed considerations, there was only so much weight that could be added. The decision on where and how much extra plating should be added wasn't as simple as saying "Slap it over the entire body of the plane."
There was a pattern of bullet holes, areas where the most damage was sustained. The team decided the best approach was to strengthen these most commonly damaged parts of the plane.
Makes perfect sense, doesn't it? However, there is an error in this thinking. And I'll give you a second to see if you can deduce what it is. (And, for those of you who already know the answer, don't shout it out; we'd like everyone to have a chance. And, in case you're wondering, I didn't catch on, either.)
Mathematician Abraham Wald was a part of the team, and he pointed out that there might be a better way to look at the data. He conjectured that the reason certain areas of the planes weren't covered in bullet holes was that any planes shot in those areas never made it back. This becomes more obvious when you learn that those areas were the engines, the fuselage, and the fuel system. Additional armor wasn't necessary for non-fatal areas; it was necessary for areas where damage would result in failure. Therefore, the protection didn't need to be placed where the bullet holes were; it needed to be placed where the bullet holes weren't.
Pretty obvious in hindsight. And a pretty nifty story to boot. But this example does much more than entertain; it reveals that data is not, in and of itself, the answer. Data is important, data is crucial, and data is fundamental to any analysis. But data is just the facts that are available. The important thing is the story the data tells. And that means digging beyond "what" and moving to "why."
About a week ago, I gave a presentation on social media and fraud. I included a slide with a bullet point so fundamental that it seemed almost silly to include. However, I've learned we must constantly remind ourselves — in fraud investigation, in internal audit, and in life — of some of the most fundamental things. Therefore, I included this simple question: "What do you want to know?"
Simple, but one of the trickiest questions to answer correctly. We go in, data-analysis guns a-blazing, confident that we know exactly what we want. But, if we continue without thinking — without asking, again and again, what it is we want to know — the result will be answers that are half-formed, superficially satisfying, and, more often than not, wrong.
The team analyzing the patterns of bullet holes thought they wanted to know the answer to the question "Where are the bullets hitting?" If they had continued without questioning that approach, the results would have been disastrous. However, they continued (at least part of the team continued) to ask questions until they realized the wrong question was being asked. Instead of "Where are the bullets hitting?", they really wanted to know the more fundamental "How are planes shot down?" The location of the bullet holes did not answer that question; it only answered the question "Where have these particular planes been hit?" The simple act of understanding the real question being asked — the simple act of shifting perspective — changed the interpretation of the data. And a more persuasive and important story came from that data.
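The effect Wald spotted can be demonstrated with a toy simulation. This is purely an illustrative sketch, not a reconstruction of Wald's actual analysis: assume two hit zones with made-up loss probabilities, spread hits uniformly, and then tally hits two ways — across all planes, and across only the planes that return.

```python
import random

random.seed(42)

# Toy model of survivorship bias (illustrative numbers only): a hit to
# the "engine" zone usually downs the plane; a hit to the "wings" zone
# rarely does. Each simulated plane takes one hit.
ZONES = ["engine", "wings"]
LOSS_PROBABILITY = {"engine": 0.8, "wings": 0.1}  # assumed, not historical

all_hits = {"engine": 0, "wings": 0}       # the true, unobservable pattern
observed_hits = {"engine": 0, "wings": 0}  # hits seen on returning planes

for _ in range(10_000):
    zone = random.choice(ZONES)            # bullets don't aim: hits are uniform
    all_hits[zone] += 1
    if random.random() > LOSS_PROBABILITY[zone]:  # plane survives the hit
        observed_hits[zone] += 1

# Even though hits were spread evenly, the survivors show mostly wing
# hits -- the engine hits are missing because those planes never came back.
print("hits on all planes:      ", all_hits)
print("hits on returning planes:", observed_hits)
```

Counting only what comes back, the wings look like the danger zone; counting every plane, the zones were hit equally. The data didn't change — only the question being asked of it.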
Internal audit's increased use of data analytics, as well as the implementation of bots, robotic process automation, and artificial intelligence, means we are getting more data. Useful tools, every one of them. But every one of them is useless if we do not know what we want to know. Without an idea of the question we are asking, all we are doing is drowning in data to no real purpose.
We need to ask the right questions — to know whether we care about where the bullet holes are or where they are not. We need the ability to understand the story the data is telling. And we need the ability to turn that data into answers to the questions we should be asking.