Bringing Clarity to the Foggy World of AI

Strategy and governance should be internal audit’s focus in assessing artificial intelligence systems.


In unveiling the U.S. government’s updated National Artificial Intelligence (AI) Research and Development Strategic Plan last June, U.S. Chief Technology Officer Michael Kratsios framed the reality many organizations face with AI. “The landscape for AI research and development (R&D) is becoming increasingly complex,” Kratsios said, noting the rapid advances in AI and growth in AI investments by companies, governments, and universities. “The federal government must therefore continually reevaluate its priorities for AI R&D investments to ensure that investments continue to advance the cutting edge of the field and are not duplicative of industry investments.”

Organizations are indeed investing in AI. About one-third of companies in Deloitte's most recent State of AI in the Enterprise survey said they were spending $5 million or more on AI technologies in fiscal year 2018. Moreover, 90% expected their level of investment to grow in 2019. These investments are occurring across all facets of business, from production and supply chain to security, finance, marketing, customer service, and internal audit.

With so much money on the line, organizations must invest the right resources in the right places to capitalize on AI. But with the technology evolving rapidly, it’s not clear how they can accurately assess AI-related risks and ensure that projects are consistent with the organization’s mission, culture, and technology strategy. In this sometimes-foggy environment, internal audit can be a valuable ally by focusing on whether the organization has a sound AI strategy and the robust governance needed to execute that strategy.

Defining AI

The definition of artificial intelligence is somewhat ambiguous. There is not universal agreement about what AI is and what types of technologies should be considered AI, so it’s not always clear which technologies should be in scope for internal audits.

Technologies that fall into the realm of AI include deep learning, machine learning, image recognition, natural language processing, cognitive computing, intelligence amplification, cognitive augmentation, machine augmented intelligence, and augmented intelligence. Additionally, some people include robotic process automation (RPA) under AI because of its ability to execute complex algorithms. However, RPA is not AI because bot functions must adhere strictly to predetermined rules.
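
To illustrate the distinction, consider a minimal, hypothetical sketch in Python: the RPA-style function applies a fixed, predetermined rule, while the "learned" function derives its decision threshold from historical data. The invoice-approval scenario, field names, and threshold logic are illustrative assumptions, not drawn from any particular system.

```python
def rpa_bot_approve(invoice_amount: float) -> bool:
    """RPA-style logic: the rule is hard-coded and never changes on its own."""
    return invoice_amount <= 10_000  # predetermined business rule


def train_threshold(history: list) -> float:
    """Toy 'learning' step: infer an approval threshold from labeled past decisions."""
    approved_amounts = [amount for amount, approved in history if approved]
    return max(approved_amounts) if approved_amounts else 0.0


# Hypothetical historical decisions: (invoice amount, was it approved?)
history = [(2_500, True), (8_000, True), (12_000, False), (9_500, True)]
learned_threshold = train_threshold(history)


def ml_style_approve(invoice_amount: float) -> bool:
    """Learned logic: the decision boundary came from data, not a fixed rule."""
    return invoice_amount <= learned_threshold


print(rpa_bot_approve(9_000), ml_style_approve(9_000))  # True True
```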

When considering which technologies fall under the umbrella of AI for internal audit purposes, it is important to understand how the organization defines it. For that reason, ISACA's Auditing Artificial Intelligence guide recommends auditors communicate proactively with stakeholders to answer the question, "What does the organization mean when it says 'AI'?" This alignment can help auditors manage stakeholder expectations about the audit process for AI. Moreover, it may tell auditors whether the organization's definition of AI is broad enough — or narrow enough — for it to perceive risk in the marketplace.

Start With Strategy

However the organization defines AI, most guidance agrees that internal audit should focus its audits on the organization’s AI strategy and governance. Without a clearly articulated and regularly reviewed strategy, investments in AI capability will yield disappointing results. Worse, they could result in financial and reputational damage to the organization. Internal audit should confirm the existence of a documented AI strategy and assess its strength based on these considerations:

  • Does the strategy clearly express the intended result of AI activities? The strategy should describe a future state for the business and how AI is expected to help reach it, as opposed to AI being viewed as an end unto itself.
  • Was it developed collaboratively between business and technology leaders? To provide value, AI endeavors must align business needs and technological capability. Auditors should verify that a diverse group of stakeholders is providing input.
  • Is it consistent and compatible with the organization’s mission, values, and culture? With expanding use of AI comes new ethical concerns such as data privacy. Auditors should look for evidence that the organization has considered whether planned AI uses are consistent with what the organization should be doing. 
  • Does it consider the supporting competencies needed to leverage AI? Successfully implementing AI requires support and expertise around IT, data governance, cybersecurity, and more. These areas should be factored into the organization’s AI strategy. 
  • Is it adaptable? While the cadence will vary by organization, key stakeholders should review the AI strategy periodically to confirm its viability and to ensure it accounts for emerging threats and opportunities.


Organizations need their internal audit departments to ask these types of questions, not just once, but repeatedly. Research shows that organizations want their internal audit departments to be more forward-looking and provide more value in assessing strategic risks. Regarding supporting competencies, board members and C-level leaders are most concerned that their existing operations and infrastructure cannot adjust to meet performance expectations among “born digital” competitors, according to Protiviti’s Executive Perspectives on Top Risks 2019 report. As such, internal auditors can provide assurance that the organization’s AI strategy is appropriate and can be carried out realistically. 

Pay Attention to Data Governance

As with any other major system, organizations need to establish governance structures for AI initiatives to ensure there is appropriate control and accountability. Such structures can help the organization determine whether AI projects are performing as expected and accomplishing their objectives. The problem is that it’s not yet clear what AI governance looks like. 

According to a 2018 Internal Audit Foundation report, Artificial Intelligence: The Data Below, “There is not a template to follow to manage AI governance; the playbook has yet to be written.” Even so, the report advises internal auditors to assess the care business leaders have taken “to develop a robust governance structure in support of these applications.” That exploration should start with the data. 

Big data forms the foundation of AI capability, so internal audit should pay special attention to the organization’s data governance structure. Auditors should understand how the organization ensures that its data infrastructure has the capacity to accommodate the size and complexity of AI activity set forth in the AI strategy. At the same time, auditors should review how the organization manages risks to data quality and consistency, including controls around data collection, access rights, retention, taxonomy (naming), and editing and processing rules. They also should consider security, cyber resiliency, and business continuity, and assess the organization’s preparedness to handle threats to the accuracy, completeness, and availability of data.
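
As an illustration of what such data-quality controls might look like in practice, the following sketch (in Python, using pandas) checks a hypothetical customer dataset for missing consent flags, duplicate keys, and records held past a documented retention date. The field names, dates, and retention cutoff are assumptions for illustration only, not a prescribed control design.

```python
import pandas as pd

# Hypothetical training data for an AI model; field names are illustrative.
records = pd.DataFrame({
    "customer_id": [101, 102, 103, 103, 105],
    "consent_flag": [True, True, None, True, False],
    "created_at": pd.to_datetime(
        ["2019-01-05", "2019-03-12", "2019-06-30", "2019-06-30", "2012-11-01"]
    ),
})

issues = {
    # Completeness: required fields should not be missing.
    "missing_consent": int(records["consent_flag"].isna().sum()),
    # Consistency: keys should be unique before they feed a model.
    "duplicate_ids": int(records["customer_id"].duplicated().sum()),
    # Retention: records older than the (assumed) documented retention period.
    "past_retention": int((records["created_at"] < pd.Timestamp("2014-01-01")).sum()),
}

print(issues)  # e.g. {'missing_consent': 1, 'duplicate_ids': 1, 'past_retention': 1}
```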

AI value and performance also depend on the quality and accuracy of the algorithms that define the processes that AI performs on big data. Documented methodologies for algorithm development, as well as quality controls, must be in place to ensure these algorithms are written correctly, are free from bias, and use data appropriately. Moreover, internal audit should understand how the organization validates AI system decisions and evaluate whether the organization could defend those decisions.
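
For example, one simple validation a review team might be asked to demonstrate is a disparity check that compares model outcome rates across groups and flags large gaps for follow-up. The sketch below is a hypothetical illustration; the group labels, outcomes, and the 0.8 ratio cutoff (a rule of thumb borrowed from the familiar "four-fifths" guideline) are assumptions rather than a prescribed methodology.

```python
from collections import defaultdict

# Hypothetical model decisions: (applicant group, did the model approve?)
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

counts = defaultdict(lambda: [0, 0])  # group -> [approved, total]
for group, approved in decisions:
    counts[group][0] += int(approved)
    counts[group][1] += 1

rates = {group: approved / total for group, (approved, total) in counts.items()}
worst, best = min(rates.values()), max(rates.values())

# Flag the gap if the lowest approval rate falls below 80% of the highest.
if best > 0 and worst / best < 0.8:
    print(f"Potential disparity for follow-up: approval rates {rates}")
```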

In addition to governance around data and AI algorithms, internal audit should examine governance structures to determine whether:

  • Accountability, responsibility, and oversight are clearly established.
  • Policies and procedures are documented and are being followed.
  • Those with AI responsibilities have the necessary skills and expertise.
  • AI activities and related decisions and actions are consistent with the organization's values and its ethical, social, and legal responsibilities.
  • Third-party risk management procedures are being performed for any vendors involved.

AI Gains Momentum

AI poses challenges that make auditing it daunting for many internal audit functions. To audit the technology effectively, internal audit functions must have or acquire sufficient resources, knowledge, and skills. That doesn’t mean they need expert-level knowledge on staff, though. 

Obtaining these capabilities has proved to be challenging. According to The IIA’s 2018 North American Pulse of Internal Audit, 78% of respondent chief audit executives indicated it was very difficult to recruit individuals with data mining and analytics skills. Nevertheless, the internal audit function should work to steadily increase its AI expertise through training and talent recruitment.

However, success in auditing AI does not depend directly on technical expertise. Instead, auditors must be able to assess strategy, governance, risk, and process quality — all things they can bring from an independent, cross-departmental point of view. 

The sooner internal auditors do this, the better, because AI, in all its various forms, is gaining momentum. Soon, it will be difficult to find an area of the business that does not leverage it in some way. And although the constantly evolving technologies and risks can be dizzying, internal audit can provide sound assurance that the organization is pointing its AI investments in the right direction. 

Kevin Alvero
Wade Cassels

About the Authors

Kevin Alvero, CFE, is senior vice president, Internal Audit, Compliance, and Governance, at Nielsen.

Wade Cassels, CIA, CISA, CFE, is a senior auditor at Nielsen in Oldsmar, Fla.