Data analytics tools are nearly ubiquitous in today’s high-performance audit functions, with most either developing their analytics capabilities or increasing their use. And while the technology offers significant capabilities for audit enhancement, its value hinges on users’ ability to put analytics tools into practice and to plan analytics engagements effectively. Accordingly, one of the most important steps in implementing a data analytics program is estimating the level of effort required.
Determining the right level of effort for data analytics on each engagement can be difficult, and the consequences of getting it wrong are immediate, including flawed analytics strategies and testing outlines. Some audit shops systematically set aside a given percentage of the engagement budget for data analytics. This approach is suitable for repeated audits or when the audit department has observed resource usage trends over several years. But because the objectives and scope of some engagements can be unique, requiring specific sets of testing hypotheses and data sources, a more systematic and sustainable mechanism for determining level of effort is needed to produce a reasonable and justifiable data analytics budget.
At the author’s organization, tackling analytics budgeting involved three main steps: obtaining audit leadership support for analytics, crafting and following a methodology for determining analytics effort, and considering several critical success factors. Although the audit universe will vary from one setting to the next, and no methodology provides a one-size-fits-all approach, focusing on these three areas can provide a helpful foundation for those looking to enhance their analytics efforts.
Obtaining internal audit leadership support is critical, as it sets the tone at the top for the effort and helps ensure a strong commitment to the use of data analytics on engagements. The CAE ideally should indicate his or her support for analytics use before the start of the annual risk assessment and audit plan development process. When communicating to staff, the CAE needs to explain the data analytics strategy and stress the need to allocate sufficient staff time at the engagement level. The CAE’s open support will also reinforce budget accountability and trigger awareness and staff buy-in for the analytics budgeting process.
Estimate Level of Effort
To determine level of effort, the auditors and data analytics team can begin by using a flagging system to identify potential candidates for data analytics. The list of flagged engagements can then be used to prioritize analytics work for effort estimation. The analytics team should also adopt a methodology to assess the likelihood and intensity of data analytics activities, as well as develop a level-of-effort matrix.
Identify Potential Candidates During audit plan development, internal audit managers should encourage their staff members to be mindful of analytics needs and to flag potential candidates for application of the technology. Because they know the organization’s business processes, auditors should be at the forefront of identifying engagements that may require the use of analytics and determining how it can be best deployed to support audit results. They should also consider challenges that may be encountered on each engagement. Basic questions that auditors can ask themselves include:
- Can the audit team use data to support potential findings?
- Is the entity under consideration for review being monitored through the use of key performance indicators (KPIs)? What are those KPIs? What are the underlying data?
- What are the quick data analytics wins if the audit/review were to be conducted?
- Considering the objectives and scope of the engagements, what are the two or three broad testing hypotheses that can be formulated?
- Are the data needed internal or external to the organization?
- Does access to the data needed require additional effort and approval?
For experienced, data-savvy auditors, brainstorming sessions can be a useful tool for high-level consideration of potential data needs and sources. The exercise can also facilitate development of detailed testing hypotheses and help define testing limitations. Early identification of the data needed and the sources of that data can help shape data access negotiations with the IT team or the data owners.
Assess Likelihood Once flagging is complete, the auditors and data analytics team can assess the likelihood of analytics activity for each engagement. A three-tiered assessment system can be applied:
- None. The engagement will not involve any data analytics activities, as its focus, objectives, and scope suggest that analytics will not be required. Reviews of process design or frameworks may fall into this category.
- Likely. The engagement may involve some data analytics activities. The analytics and audit teams anticipate that analytics work will be carried out — they have identified broad preliminary objectives and scope but cannot confirm them before the start of the engagement.
- Certain. The analytics and audit teams have determined the need for analytics, and the objectives and scope of the engagement provide strong indication that analytics work will be carried out. The auditors have identified a preliminary data analytics scope and comprehensive testing hypotheses.
Some gray areas might appear, as likelihood assessments are not always clear-cut. For example, at the time of audit plan development, the audit staff might not have enough information to decide whether data analytics activities will be carried out for some engagements. Or, the team may determine that analytics objectives and scope will be defined during engagement planning. Engagements with these characteristics should be kept in mind, and a contingent budget should be set aside to cover them should the need for analytics work arise.
In other circumstances, the delineation between Likely and Certain might not be sharply defined. When this occurs, a hybrid assessment can be used — None/Certain, None/Likely, or simply Yes/No.
Estimate Intensity Analytics intensity measures the degree to which analytics activities will be carried out in the selected engagements. The level of intensity can be measured using a low-medium-high scale:
- Low: Basic analysis is expected to be performed, and analytics resource usage is estimated to be low. The analysis may include profiling and pattern identification, as well as stratification, gap analysis, and calculation of statistical parameters to identify outliers. Factors to consider when assessing the intensity as Low may include whether there are few data sources and whether data are readily available.
- Medium: Data analytics activities include profiling and pattern identification, stratification, gap analysis, efficiency measurement, benchmarking, and calculation of statistical parameters to identify outliers. Factors to consider when assessing the intensity as Medium may include whether the data needed are external to the organization, whether the analytics team will need additional effort to gather the internal data needed, and whether the analytics team anticipates joining several data sources in different systems to identify inappropriate matching values.
- High: The engagement is considered to be heavily data-driven, or analytics is the core of the review. Analytics activities include profiling and pattern identification, stratification, gap analysis, efficiency measurement, benchmarking, data sequencing, and calculation of statistical parameters to identify outliers. Additionally, the analytics and audit teams are expected to develop complex analyses and hypotheses. Factors to consider when assessing the intensity as High may include whether any data needed are external to the organization and whether the analytics team will need additional effort to gather the internal data needed.
Develop a Matrix Using the likelihood and intensity data gathered, the analytics and internal audit team can create a level-of-effort matrix to help determine analytics budget estimates. The matrix should capture the thought process for assessing the level of data analytics activities.
“Level-of-effort Matrix” at right depicts an example matrix, showing the extent of data analytics activities at the engagement level. The dark tan color indicates that heavy analytics activities will be carried out in the engagements that fall into that category. For example, Engagement E2, with a likelihood of Certain and High intensity, will receive the highest percentage of the engagement’s total budget — say, 50 percent. Engagement E1, in which likelihood and intensity are assessed as Likely and Low, respectively, will receive a percentage significantly lower than that of Engagement E2 — perhaps 10 percent. Engagements with likelihood assessed as None will receive no budget allocation for analytics activities. The analytics team should set percentages using professional judgment, taking into consideration trends observed in the past.
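The matrix lookup described above can be sketched in code. This is an illustrative sketch only: the percentage values below are hypothetical examples in the spirit of the article's 50 percent and 10 percent illustrations, not prescribed figures, and each team should set its own using professional judgment.

```python
# Hypothetical share of each engagement's total budget reserved for
# analytics, keyed by (likelihood, intensity). Likelihood "None"
# receives no allocation, per the methodology described in the text.
LEVEL_OF_EFFORT = {
    ("Likely", "Low"): 0.10,
    ("Likely", "Medium"): 0.20,
    ("Likely", "High"): 0.30,
    ("Certain", "Low"): 0.20,
    ("Certain", "Medium"): 0.35,
    ("Certain", "High"): 0.50,
}

def analytics_budget(total_days: float, likelihood: str, intensity: str) -> float:
    """Return the analytics budget, in days, for one engagement."""
    if likelihood == "None":
        return 0.0
    return total_days * LEVEL_OF_EFFORT[(likelihood, intensity)]

# Engagement E2 from the example: Certain likelihood, High intensity,
# with a hypothetical 40-day total engagement budget.
print(analytics_budget(40, "Certain", "High"))  # 20.0
```

Keeping the percentages in a single table makes the thought process auditable and easy to recalibrate as usage trends accumulate over the years.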
Key Success Factors
To ensure an adequate level-of-effort estimation, the analytics team should view the budgeting exercise as a dynamic, multidimensional activity that takes into account some additional elements. Specifically, success factors for the continuous improvement of the data analytics level of effort include validation of the analytics budget, adoption of a mechanism for funding the budget, and variance measurement.
Validation Process Although analytics level-of-effort estimation is primarily the analytics team’s responsibility, team members should work closely with internal audit. During level-of-effort formulation, the analytics team should ensure critical inputs are considered, including minutes of relevant audit staff brainstorming sessions, audit clients’ feedback on the proposed audit plan, and, if available, analytics usage trends observed during prior years.
The analytics team should constantly seek feedback from internal audit staff and management to ensure the assumptions and measurement indicators are well-understood. After applying the matrix, the team should conduct validation meetings with stakeholders, which may result in changes to the level of effort for each engagement.
The analytics team should record both calculated and adjusted levels of effort and document significant changes. This documentation is critical, as it can help refine the criteria for assessing likelihood and intensity of data analytics activities for subsequent years.
Funding Mechanism Because data analytics can increase engagement efficiency, support for a specific analytics budget should be clearly communicated across the entire audit department. Before sharing the finalized budget, however, the department must first decide whether to increase the original budget for the engagement by the analytics budget or to make the analytics budget part of the original engagement budget. “Data Analytics Budget Funding” below depicts each of these scenarios.
In Scenario 2, the general budget of Engagement E2 is increased by 20 days, which corresponds to the data analytics level of effort. This scenario suggests that the analytics budget comes out of a central contingency envelope. By nature, this practice might defeat any efficiency gains achieved through the analytics work.
In Scenario 1, Engagement E1 has an unchanged general budget. This scenario reflects the notion of “doing more with less” on an individual engagement. Moreover, it generates a high perception of accountability among the data analytics and audit teams.
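The difference between the two funding scenarios can be made concrete with a small sketch. The figures are hypothetical, echoing the 20-day level of effort mentioned above.

```python
# Two ways of funding a data analytics budget, as hypothetical functions.

def fund_on_top(engagement_days: float, analytics_days: float):
    """Scenario 2 style: the analytics budget is added on top of the
    original engagement budget, drawn from a central contingency envelope."""
    return engagement_days + analytics_days, analytics_days

def fund_within(engagement_days: float, analytics_days: float):
    """Scenario 1 style: the engagement budget is unchanged and the
    analytics effort is absorbed within it ("doing more with less")."""
    return engagement_days, analytics_days

# Hypothetical engagement: 100-day budget, 20-day analytics level of effort.
print(fund_on_top(100, 20))   # (120, 20): total grows by the analytics effort
print(fund_within(100, 20))   # (100, 20): total unchanged, analytics carved out
```

The choice matters for accountability: when the analytics effort is carved out of an unchanged budget, both teams have a direct stake in its efficiency.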
Variance Measurement After each engagement or at year-end, the analytics team should compare the initial or adjusted budget with the actual days spent. Any variances observed can help gauge the quality of level-of-effort matrix estimates. Low variances may indicate that empirical assessment was effective, whereas high variances might be an indicator that the criteria for assessing effort need some refinement. When budget overruns occur, the analytics team should consider two important factors:
- Experience Level. If the data analytics team is too inexperienced, substantial deviations from the initial budget can be expected. But as the team gains more experience, deviations caused by this factor should decrease.
- Analytics Process Maturity. In early years of data analytics use, level of effort can be significant. Factors that may contribute to budget overruns include absence of a strong partnership/relationship with data owners or the IT department, absence of a clear process for identifying data needed, poor quality assurance surrounding the data analytics activities, absence of a robust infrastructure that supports the analytics team’s work, and poor quality of interactions between the analytics and audit teams.
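The variance comparison described above can be sketched as a simple report. The engagement names, day counts, and the 25 percent review threshold are all hypothetical assumptions for illustration.

```python
# Hedged sketch: compare budgeted vs. actual analytics days per engagement.
# The 25 percent threshold is an illustrative assumption, not a standard.

def variance_report(budgeted: dict, actual: dict, threshold: float = 0.25):
    """Flag engagements whose relative variance exceeds the threshold,
    suggesting the level-of-effort criteria may need refinement."""
    report = {}
    for engagement, budget in budgeted.items():
        spent = actual.get(engagement, 0.0)
        variance = (spent - budget) / budget if budget else float("inf")
        report[engagement] = {
            "budget": budget,
            "actual": spent,
            "variance": round(variance, 2),
            "needs_review": abs(variance) > threshold,
        }
    return report

# Hypothetical year-end data: E1 is close to budget; E2 overran by 40 percent.
budgeted = {"E1": 10, "E2": 20}
actual = {"E1": 11, "E2": 28}
print(variance_report(budgeted, actual))
```

Recording both the calculated and adjusted figures, as the article recommends, gives this comparison a clean baseline in subsequent years.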
Benefits and Bottom Line
Upfront identification of engagements that lend themselves to data analytics is critical, and it can yield several benefits. First, not only does it help determine the level of effort required, but it also provides a high-level indication of the types of data needed for those engagements. That way, the data analytics team can engage the IT function or the data owners early enough to avoid the bottlenecks of late requests. Additionally, it can have a direct impact on the CAE’s decision-making process by identifying the analytics skills needed as well as isolating areas where co-sourcing would be cost-effective.
Estimating data analytics level of effort for each engagement within the audit plan can be challenging — even daunting, especially if the assessment is performed during audit plan development. And while the matrix system yields a considerable amount of useful data for decision-making, professional judgment ultimately should be the cornerstone of the entire process. An auditor’s knowledge and experience should guide decision-making, using the level-of-effort methodology as a means of informing and supporting conclusions.