If someone asked you whether your internal audit department has an analytics program, would you hesitate and answer something like: "We hired an analytics person," "We bought a tool," or "We have a few tests running"? Those answers describe a team member, a tool, even some results, but none of them is a program. They are no more a program than having someone responsible for quality would mean the department has a quality program.
People, tools, and results are elements of a program; they show evidence that
something is there. But at its core, a program is a new work function, and a work function is defined by whether you know what you are doing, and why and how you do it. When internal auditors understand the function, they can explain it to others, get their support, and if needed, divide work effectively. These are the hallmarks of an analytics program.
To explain analytics programs, it is useful to break them into five elements:
Program approach/fit — Why internal audit wants or has the program, and how the program fits with the mission of the organization and of internal audit, in particular.
Test development process — The workflow, tools, and templates auditors use to approach these projects, get support, and track and obtain consistent results.
Development roadmap — What internal audit will tackle first, what it will tackle next, and when it should re-evaluate the pipeline order, as well as why the audit function chose this order.
Analytic tools and techniques — These tools and techniques start with access to data sources, and extend to analysis and communication approaches needed to process and explore the data to get the answers auditors seek. Beyond a tool, internal audit needs a toolbox.
Key contacts — Members of the audit team and audit clients, as well as stakeholders who have interest in the results and understand the process. These contacts give auditors access to data and advice about how to interpret it. These individuals also can help with research and following up on findings. Key contacts will likely change with each new project under the program.
Together these five elements articulate what internal audit is doing with analytics, why it does this, how it does this, and who is helping auditors make it happen. The elements enable auditors to talk about the program and rally support for it. They also define success and help internal audit track progress towards it. For example, internal audit could report that:
"The analytics program in our organization is embedded in the fraud and forensics unit of internal audit, where we use it to develop detective controls to support fraud-fighting efforts. We use Alteryx and Microsoft Power BI to combine simple red-flag tests to risk-rank transactions for human review. The red flags and their interpretation are developed in collaboration with the data-owning business unit and/or benchmarked with other audit shops. So far, we have tackled payroll and the procurement-to-payment cycle, and now we are scoping tests in travel and entertainment."
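The risk-ranking idea in the example above can be made concrete with a small sketch. The quoted program uses Alteryx and Power BI; the following is a minimal pandas analogue, with column names, red-flag tests, weights, and thresholds all invented for illustration, not taken from any actual audit shop:

```python
import pandas as pd

def risk_rank(txns: pd.DataFrame) -> pd.DataFrame:
    """Combine simple red-flag tests into a weighted score and sort
    transactions so the riskiest surface first for human review.
    All flags and weights here are hypothetical examples."""
    flags = pd.DataFrame(index=txns.index)
    # Red flag 1: suspiciously round amounts (exact multiples of 1,000)
    flags["round_amount"] = txns["amount"] % 1000 == 0
    # Red flag 2: posted on a weekend (Monday=0 ... Sunday=6)
    flags["weekend_post"] = pd.to_datetime(txns["date"]).dt.dayofweek >= 5
    # Red flag 3: same vendor/amount pair appears twice (possible double payment)
    flags["duplicate"] = txns.duplicated(["vendor", "amount"], keep=False)

    # Weights reflect how strongly each flag suggests risk (illustrative values)
    weights = {"round_amount": 1, "weekend_post": 1, "duplicate": 2}
    scored = txns.copy()
    scored["risk_score"] = sum(flags[f] * w for f, w in weights.items())
    return scored.sort_values("risk_score", ascending=False)

# Example: four transactions; the duplicated round-amount payment
# posted on a Saturday rises to the top of the review queue.
txns = pd.DataFrame({
    "vendor": ["A", "B", "A", "C"],
    "amount": [5000.0, 137.5, 5000.0, 42.0],
    "date": ["2023-01-07", "2023-01-09", "2023-02-01", "2023-01-10"],
})
ranked = risk_rank(txns)
```

No single flag is conclusive on its own; the point of combining them is that transactions tripping several cheap tests at once are the ones worth an auditor's limited review time.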
Note that although this is a data analytics program, access to data and business understanding are not called out as independent program elements. Instead, they are embedded in internal audit's network of key contacts. They are key elements of a project, but are not elements of the program that houses those projects.
Is the Program Working?
Internal audit has a successful analytics program when the department can transition it among team members without having to start again from scratch. This handover is possible because team members know what is done, why it is done, and in general terms, how to replicate it to address new problems. When internal audit reaches this stage of development, it will get extra credit if team members can recite the vision and explain what should be the next project on the horizon.
An analytics program establishes the support needed to achieve its aims: from tools and techniques to the relationships that bring ideas on what to test and access to the data needed to test it. These relationships can support understanding of what the data is saying and help in resolving the issues the program reveals. In the end, an analytics program is in place when it becomes "the way we do this here," and is no longer dependent on one key person to keep it going.