Assessing Data Reliability

Internal auditors can follow practical steps to ensure reports are complete and accurate.

Reports generated from extracted data can be misleading, which is a problem when organizations rely on them to make critical business decisions. Data reliability is especially important for organizations subject to the U.S. Sarbanes-Oxley Act of 2002, where reports are relied on as part of the control testing process.

The U.S. Public Company Accounting Oversight Board warns that inaccurate reports can lead to key control deficiencies, so organizations should ensure that reports used in assessing the operation of key controls are complete and accurate. Internal auditors can apply practical tools and techniques to ensure that reports and data used for decision-making are reliable.

The Impact of Bad Data 

Poor data quality costs organizations an average of $15 million per year, according to recent Gartner research. It also is a primary reason why 40% of business initiatives fail to achieve their targeted goals. Unreliable reports can impact:

  • Strategic Decisions — performing mergers and acquisitions, changing organizational structure, expanding to new locations, or developing new product portfolios.
  • Operational Decisions — costing and pricing of projects, budget-related decisions and priorities, sales forecasts, production and inventory needs, and resource requirements. 
  • Financial Decisions — financial reporting, credits and loans, invoicing, collection, and investments.
  • Regulation and Compliance — employment and labor laws, intellectual property, data privacy, and software licensing.

Start With a Risk Assessment

The first step is to perform a risk assessment to determine which reports should be subject to evaluation. This should include an assessment of the report type, impact of the report for decision-making, key control considerations, change management procedures, and access restriction. 

Reports can be categorized into three main types — canned, customized, and manual. Canned reports are generated from a system where no changes have been made. Those reports usually represent low risk for completeness and accuracy. Customized reports are developed based on user needs and represent higher risk for completeness and accuracy. Manual reports are created by an end user and have not passed a formal change management process for report testing. They usually represent the highest risk.

As each report type represents a different inherent risk level, identifying the report type is crucial for the reliability assessment, and should lead to different validation activities.

Other factors that should be considered when determining reports for testing include:

  • Data Usage. Do the report and underlying data relate to strategic, financial, operational, or regulatory decisions?
  • Impact of the Report. Would a mistake in the report pose a potential strategic, financial, operational, or regulatory risk to the organization?
  • Control Considerations. Is the report used in the execution of key controls to mitigate significant risks?
  • Change Management Procedures. How effective are the change management controls for report creation?
  • Access Restrictions. What access restriction mechanisms — such as password or permissions — are in place?
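
Teams that track these factors can turn them into a simple prioritization score. The Python sketch below is illustrative only: the factor names, weights, and example inputs are assumptions, not a prescribed methodology.

```python
# A minimal sketch of a report risk-scoring worksheet. Weights and factor
# names are illustrative assumptions, not a prescribed methodology.

# Inherent risk by report type (canned < customized < manual), per the article.
REPORT_TYPE_RISK = {"canned": 1, "customized": 2, "manual": 3}

def score_report(report_type, used_in_key_control, decision_impact,
                 change_mgmt_effective, access_restricted):
    """Return a rough inherent-risk score; higher scores suggest the report
    should be prioritized for completeness and accuracy testing."""
    score = REPORT_TYPE_RISK[report_type]
    score += 3 if used_in_key_control else 0                      # control considerations
    score += {"low": 1, "medium": 2, "high": 3}[decision_impact]  # impact of the report
    score += 0 if change_mgmt_effective else 2                    # change management
    score += 0 if access_restricted else 1                        # access restrictions
    return score

# Example: a manually prepared report that feeds a key control.
print(score_report("manual", True, "high", False, False))  # -> 12
```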

Test for Completeness 

Internal auditors need to verify the report type and understand the parameters used to generate it. Just one incorrect parameter can severely impact report reliability. Because several parameters typically are used to generate a report, the internal auditor should spend time with the report owner to understand if the parameters were correctly selected. 

Next, internal auditors should check whether any exclusions have been set up at either the application user-interface level or the code level. If it’s the latter, assistance from developers may be needed. Auditors also should be careful not to be fooled by the report name. A procurement report named “Total Expense for Vendors” may only show expenses that are procurement-related, but not all expenses.

Internal auditors should review several areas when testing reports for completeness.

  • Look at when the report was last modified. Checking the last modification date can highlight whether report changes occurred.
  • Common practice is to limit the data a user can see based on access rights profiles, which should align with job responsibilities. It is critical to verify that the user generating the report has sufficient access to produce a complete report. In many cases, the end user may be indifferent to or unaware of this, so it is always advisable to approach the system owner.
  • Compare different reports that should show the same data. Because each report is built with different logic, this is a good way to test report completeness. Compare the same information from different sources and ask different stakeholders to opine on the reasonableness of the data.
  • Use the “full and false inclusion” method. Take a sample of transactions that should or should not be in the report, and verify accordingly.
  • Verify whether any manual checks or system validations prevent duplicate records. To identify such occurrences, perform a simple but effective duplication test on a sample of data fields (see the sketch after this list).
  • Review blank data fields. Missing data is a good indicator that additional checks need to be performed.
  • When using a reporting tool, such as a business intelligence application, ensure that the latest version is being used. Upgrades usually fix technical defects, and data warehouse interfaces can differ between versions.
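
Several of these checks lend themselves to simple scripting. The Python sketch below illustrates the duplication test, the blank-field review, and a cross-report total comparison; the file names and column names (invoice_id, vendor, amount) are hypothetical and would need to match the actual extract.

```python
# A minimal sketch of three completeness checks, using pandas.
import pandas as pd

report = pd.read_csv("total_expense_for_vendors.csv")  # hypothetical report extract

# Duplication test: flag records that appear more than once on key fields.
dupes = report[report.duplicated(subset=["invoice_id", "vendor"], keep=False)]
print(f"{len(dupes)} potentially duplicated records")

# Blank data fields: count missing values per column; gaps signal follow-up work.
print(report.isna().sum())

# Cross-report comparison: reconcile a control total against an independent
# source built with different logic (here, a hypothetical ledger extract).
ledger = pd.read_csv("gl_expense_extract.csv")
difference = report["amount"].sum() - ledger["amount"].sum()
print(f"Difference between report and ledger totals: {difference:,.2f}")
```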

Test Data Accuracy 

In testing accuracy, internal auditors need to understand how the data was captured, whether on a paper form, entered directly by users, or generated by a system, as each method carries a different level of risk for data reliability. It is also important that auditors recognize the types of controls over system data entry and input validation, such as double keying and upper and lower limits.
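
As a rough illustration of these input validations, the sketch below shows how double keying and upper and lower limit checks might look in code; the field names and limits are assumptions for the example only.

```python
# A minimal sketch of two common data-entry validations: double keying and
# upper/lower limits. The rate limits below are illustrative assumptions.

DAILY_RATE_LIMITS = (10.0, 500.0)  # hypothetical lower and upper bounds

def validate_double_key(first_entry: str, second_entry: str) -> bool:
    """Double keying: accept a value only if two independent entries agree."""
    return first_entry.strip() == second_entry.strip()

def validate_limits(value: float, limits=DAILY_RATE_LIMITS) -> bool:
    """Upper/lower limit check: reject values outside the expected range."""
    lower, upper = limits
    return lower <= value <= upper

print(validate_double_key("1250.00", "1250.00"))  # True: entries match
print(validate_limits(2000.0))                    # False: outside expected range
```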

Other items that should be assessed by internal auditors in the testing of data accuracy include:

  • The meaning of a data field. Internal auditors should never assume, based on the column descriptions, that they understand what the data item is.
  • The source data for key data fields. This can be done by tracing back to identify the source data repository. 
  • Reasonableness. For example, is it reasonable that a car was rented for $2,000 a night? 
  • Date fields. Dual date-format issues might adversely impact any date analysis. For example, a date such as 03/05/2019 might be displayed as either March 5, 2019, or May 3, 2019, depending on the end user’s regional settings (see the sketch after this list).
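
As an illustration of the last two points, the sketch below parses dates with an explicit format so they are not reinterpreted under a regional setting, and flags values that fail a reasonableness threshold. The file name, column names, and the $500 threshold are assumptions.

```python
# A minimal sketch of a date-format and reasonableness check, using pandas.
import pandas as pd

report = pd.read_csv("car_rental_report.csv")  # hypothetical report extract

# Date fields: parse with an explicit format so 03/05/2019 cannot be silently
# read as either March 5 or May 3 depending on regional settings.
report["rental_date"] = pd.to_datetime(report["rental_date"], format="%d/%m/%Y")

# Reasonableness: flag nightly rates outside an expected range.
unreasonable = report[report["nightly_rate"] > 500]
print(f"{len(unreasonable)} rentals exceed the expected nightly rate")
```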

Blind Trust

Unreliable data can negatively impact key decisions. In many cases, organizations are unaware of unreliable reports, resulting in stakeholders grappling with flawed data that, ultimately, might lead to wrong or nonoptimal choices. Unfortunately, this lack of awareness may lead many organizations to blindly trust their data, which can mean disaster. Organizations are data driven, so internal auditors must ensure that decisions are made based on complete and accurate reports. 

About the Authors

Danny Fridman, CIA, CISA, CRISC, is head of internal audit at AMDOCS in Ra’anana, Israel.

Dror Bar Moshe, CIA, CPA, CFE, CISA, is deputy head of internal audit at AMDOCS.

David Gabra, CISA, is an internal auditor at AMDOCS.

 
