Peace in Our Time<p>Too many organizations use internal audit results to drive priorities for the IT function, which can have a devastating effect on morale. This approach sets an example for the entire organization about how to get systems-related objectives met. Initially, this can be benign as leaders try to do the right thing and help uncover systems issues that need attention. Eventually, pointing the auditors to real or suspected issues allows them to elevate any project to the highest priority, whether it is strategic or not.</p><p>For example, a software company starved back-office systems in favor of product development. As a result, IT fell seriously behind in patching internal production systems. Because the organization was audit-driven, at the next opportunity, management pointed auditors at patching, and the inevitable findings in patch management became the flag around which any desired project was wrapped to secure new funding. Step one: Hold IT accountable for not patching that system. Step two: Secure funding to “fix IT’s mess.”</p><p>Allowing audits to drive strategy wastes time and money, and robs management of the audit’s real value — helping it validate that it is appropriately addressing risks to business processes. When the audit becomes the key objective, performing audits becomes an essential business process in its own right. This mistake creates the potential for a wildly inappropriate scope that gives the IT staff the sense that audits are never-ending and self-serving. </p><h2>Fear and Loathing</h2><p>These issues can lead to audit fatigue and poorly executed audit activities. Before long, management is spending its time and attention fixing problems with audits instead of fixing problems found by audits.</p><p>In another example, a large financial services company purchased a much smaller company in an adjacent but highly regulated space. 
As is often the case, the smaller company had a much lower profile than the larger company, but that changed once it was part of a larger organization. The new management, lacking experience as a highly regulated entity, began to ramp up audits to get ahead of the regulators. As operational requirements competed with audit requests, “just get it done” replaced “do it right.” At some point in this dysfunctional downward spiral, “do whatever the auditor says to get this over with” became the strategy to end the pain. </p><p>This example provides context for the skepticism, distrust, and outright fear senior executives and IT staff members have about audits. Some worry about getting in trouble for doing something wrong. Many view the time spent on audit requests as wasted time or busy work. This fear and distrust of audits naturally extends to the auditors, and this leads to an “us versus them” mentality. Both sides dig in and spend more time protecting their flank than solving their problems. </p><p>Some IT departments assign auditors “handlers” to choreograph activity, coach process owners to provide guarded answers, and quickly escalate issues, causing a bottleneck within leadership. Inexperienced auditors bring poor time management skills, poorly thought-out evidence requests, and negative attitudes to audits that put everyone on guard. Auditors then spend extra time gathering overwhelming evidence of control failures, while IT staff may even fabricate control evidence.</p><p>In addition to driving poor decision-making when used unwisely, audits often veer off track. In such cases, people too close to the situation sometimes focus on the audit as the key objective rather than managing the business process under audit. Besides these strategic mistakes, scope creep, poor communication, distrust among teams, and inexperience can plague any project and amplify any problems with an audit because of the extra scrutiny on the outcome. 
</p><p>In some organizations, IT may be severely underfunded and so far behind in resolving previous audit findings that the department gets accustomed to adding the next set to its ever-expanding project list. This forces leadership to spend so much time prioritizing and re-prioritizing work that audit failure becomes the de facto driver for funding. This, more than control failures, may be the finding that the audit should reveal.</p><h2>The Path to Peace</h2><p>It doesn’t have to be like this. When used appropriately to validate assumptions and uncover blind spots, the audit program is a crucial asset for management and plays an essential role in governance. Here are 10 tips to help internal auditors, management, and IT employees get on the right track.</p><p><strong>Audit team</strong> The audit team can become better partners to IT by taking these steps:<br></p><ul><li><em>Agree with senior leadership on the strategy and priorities of the audit program.</em> Establish priorities and understand where to focus audits based on the risks presented by the critical business processes.</li><li><em>Ensure each audit focuses on making the business process better, not finding problems</em>. Internal audit should keep this goal in mind as it sets audit objectives, determines scope, and frames findings. Always solicit recommendations for improvement from management. </li><li><em>Help the organization navigate audits and examinations by external organizations (within the limits of independence).</em> This is particularly important as it pertains to audit scope. For example, it’s not helpful to have nonregulated businesses examined by regulators. It wastes time and exposes the organization to inappropriate jeopardy. Auditors should make sure all parties agree to the scope before the audit starts. </li><li><em>Agree up front on the criteria for identifying the required evidence. 
</em>These include sample selection criteria, the duration of the assessment, and the amount of evidence required to validate each test objective.</li><li><em>Agree on the process and tools to be used for requesting and receiving the evidence. </em>Agree on how quickly evidence is to be gathered once requested.<br><br></li></ul><p><strong>Management</strong> IT management can demonstrate transparency and respect for the audit process by:</p><ul><li><em>Avoiding assigning junior people to handle examiners or auditors.</em> When management tries to offload audit responsibility to the least useful resource, it almost always has a negative impact.</li><li><em>Not coaching employees on how to be coy with auditors. </em>Internal auditors are trained to spot inconsistency and lack of transparency. Trying to hide details from auditors is unprofessional and causes them to dig deeper in that area.<br><br></li></ul><p><strong>Employees</strong> IT staff members who are asked to support audit activities can establish trust by taking these steps:</p><ul><li><em>Don’t assume your competence is being questioned.</em> “I don’t know, but let me find out for you” is a better answer than guessing.</li><li><em>Don’t try to sound like a lawyer. </em>The best way to be understood is for employees to use the language and style that are comfortable to them. The surest way to get management’s attention — and not in a good way — is to call a minor testing deviation a “material weakness.”</li><li><em>The auditor is not a whistleblower hotline. </em>Managers should remind employees to bring internal issues to their manager or a neutral member of the management team.</li></ul><h2>Look in the Mirror<br></h2><p>Internal auditors should ensure their organization doesn’t take a dysfunctional audit approach. 
They should review their audit strategy to make sure it addresses business process risk, provides the necessary governance assistance to management and the board, and covers the organization’s regulatory requirements. They shouldn’t let audits drive the business. <br></p>Bill Bonney
The Threat Hunters<p>They're on the hunt, in companies around the world. Combining technology tools with detective skills, they are hunting for hidden adversaries on their networks. And their numbers are growing.</p><p>More than four in 10 organizations responding to the <a href="" target="_blank">SANS 2018 Threat Hunting Survey</a> (PDF) say they conduct continuous threat hunts, up from 35% in information security training firm SANS Institute's 2017 study. More than one-third commence such hunts to look for underlying problems in response to a security event.</p><p>Their aim is to root out intruders, who can dwell on a network for an average of more than 90 days before they are detected. "Most of the organizations that are hunting tend to be larger enterprises or those that have been heavily targeted in the past," according to co-authors Robert M. Lee, a SANS instructor, and Rob Lee, curriculum lead at the institute. SANS surveyed 600 organizations for the report.</p><p>Threat hunting goes well beyond the intrusion detection most organizations rely on to discover security breaches. The SANS report defines it as an iterative approach for searching for and identifying adversaries on an organization's network. It's about combining threat intelligence and hypothesis generation to home in on the most likely locations that intruders will target. </p><p>Threat hunting can be effective, the report notes. For example, 21% found four to 10 threats during threat hunts. Nearly 17% found as many as 50 such threats.</p><h2>Intelligence Is Key</h2><p>One reason for threat hunting's effectiveness is that hunters are harnessing better threat intelligence, the report finds. Most respondents (58%) say they rely on intelligence generated internally based on previous incidents. 
Moreover, 70% tap into intelligence from third-party sources such as anti-virus signatures.</p><p>"Nothing is more valuable than correctly self-generated intelligence to feed hunting operations," the authors say. However, organizations without such capabilities may need to turn to third parties. In fact, they recommend blending the two forms of intelligence as a way to reduce adversary dwell times.</p><h2>People and Technology</h2><p>Still, respondents depend most on alerts from network monitoring tools for their threat intelligence, which the authors point out isn't really threat hunting — a common misconception. This reliance on sensors may indicate that organizations still see threat hunting as a technology solution. The survey results bear this out, with more than 40% prioritizing technology investments for threat hunts versus 30% for qualified personnel. </p><p>The emphasis on technology is misplaced, the authors say. Yes, threat hunters depend on automation to do things faster, more accurately, and at greater scale. "However, by its definition, hunting is best suited for finding the threats that surpass what automation alone can uncover," they stress. Instead, technology and people must be intertwined.</p><p>The authors recommend that organizations prioritize recruiting and training skilled staff for threat hunts. In particular, they say such professionals are more likely to detect threats and create tools they will need to be effective. </p><p>Respondents say the baseline skills for threat hunters are network, endpoint, threat intelligence, and analytics. More advanced capabilities include digital forensics and incident response.</p><h2>Hunting Tools</h2><p>Hunters need weapons, and this is where technology tools come into use. Nine out of 10 respondents say their threat hunters use the organization's existing IT infrastructure tools, while 62% have developed customized tools. 
</p><p>However, the authors question whether these tools are providing the view of the network needed for successful hunts, noting that they often are detection-based. Such tools may not find all the intruders who have breached the network, they say.</p><p>Whatever their tools, the report notes that threat hunting can be resource-intensive and requires an emphasis on analysis and developing hypotheses about adversaries. Although growing percentages of respondents are basing hunts on continuous monitoring or incident response, it may be more effective to conduct scheduled hunts. "Even a few hunts per year, when done correctly, can be highly effective for the organization," the authors say.<br></p>Tim McCollum
Bias in the Machine<p>Can artificial intelligence (AI) discriminate? That is what Facebook’s AI is accused of doing. In March, the U.S. Department of Housing and Urban Development (HUD) announced it was suing the social media company for violating the Fair Housing Act. HUD alleges that Facebook’s advertising system allowed advertisers to limit housing ads based on race, gender, and other characteristics. The agency also claims Facebook’s ad system discriminates against users even when advertisers did not choose to do so.</p><p>Although it has yet to be proven whether Facebook committed any deliberate discrimination, the result is still the same. “Using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face,” HUD Secretary Ben Carson said in announcing the lawsuit.</p><p>Each day, machine learning and AI (ML/AI) models make decisions that affect the lives of millions of people. As these models become more integrated with everyday decision-making, organizations need to be increasingly vigilant of the risk created by potentially discriminatory algorithms.</p><p>But who within those organizations is responsible for ensuring the ML/AI model is making fair, unbiased decisions? The model developer should not be responsible, because internal control principles dictate that the persons who create a system cannot be impartial evaluators of that same system. The model’s users also should not be responsible, because they typically lack the expertise to evaluate an ML/AI model. Users also may not question a model that seems to be performing well. For example, if a predictive policing model leads to more arrests and less crime, users are not likely to question whether that system unfairly targets a particular group. 
</p><p>Internal audit may be best suited to provide assurance to the board and senior management that the organization is mitigating the reputational, financial, and legal risks of implementing a biased ML/AI model. However, because this is a new assurance domain for the profession, auditors need a methodology for auditing the fairness of these models. </p><h2>Why Models Need to Be Fair</h2><p>An ML/AI model is a mathematical equation that uses data to produce a calculation such as a score, ranking, classification, or prediction. It is a specific set of instructions on how to analyze data to deliver a particular result — behavior, decision, action, or cause — to support a business process. </p><p>There are three main categories of analytic models. <em>Descriptive models</em> summarize large amounts of data into small bits of information that are easier for organizations to analyze and work with. <em>Predictive models</em> are more complex models used to identify patterns and correlations in data that can be used to predict future results. <em>Prescriptive models</em> enable data analysts to see how a decision today can create multiple future scenarios. </p><p>ML/AI models need to be fair and nondiscriminatory because the decisions they support can expose organizations to substantial risk if the classification criteria they use are unethical, illegal, or publicly unacceptable. Such criteria are referred to as inappropriate classification criteria (ICCs) and include race, gender, religion, sexual orientation, and age.<br></p><table cellspacing="0" width="100%" class="ms-rteTable-default"><tbody><tr><td class="ms-rteTable-default" style="width:100%;">​<strong>Controlling for Exogenous Variables</strong><br><p><br>Often, despite the best efforts to eliminate it, discrimination creeps into an organization’s analytic models through external data that has a systemic bias, thus exposing the organization to risk. 
Appropriate exogenous variables (AEV) are variables that provide appropriate classification criteria but have been subject to external systemic bias that has not been detected. Examples of AEVs include credit scores for individuals from minority communities and salary information for women.<br></p><p>Fortunately, analytic models can be used to control for this bias. For example, after controlling for gender differences in industry, occupation, education, age, job tenure, province of residence, marital status, and union status, an 8% wage gap persists between men and women in Canada, according to a February 2018 Maclean’s article. It is a relatively simple exercise to adjust the salary variable in a classification model by +8% for female subjects. <br></p></td></tr></tbody></table><p>In assurance engagements regarding bias, internal auditors primarily will be concerned with a type of predictive model known as a classification model. This model is used to separate people into groups based on certain attributes that an organization can use to support decisions. Examples include:</p><ul><li>Identifying borrowers who are most likely to default on a loan.</li><li>Classifying employees as future high performers.</li><li>Selecting persons who are least likely to commit further crimes if granted probation.</li><li>Targeting consumers to receive special promotions or opportunities. In one case, the Communications Workers of America sued T-Mobile, Facebook, and a host of other companies, alleging that those companies discriminated by excluding older workers from seeing their job ads.</li></ul><p><br>To provide assurance to management and the audit committee that the organization’s ML/AI model does not discriminate, auditors need to assess two things: 1) that the model does not benefit or penalize a certain classification of people; and 2) that the model still provides useful results if a classification is removed. 
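The salary adjustment described in “Controlling for Exogenous Variables” amounts to a one-line transformation of the input variable. A minimal sketch, assuming the 8% gap cited above; the function name and parameters are illustrative, not part of any standard methodology:

```python
# Hypothetical sketch: compensating for a known systemic bias in an
# appropriate exogenous variable (AEV) before it feeds a classification
# model. The 8% figure is the residual wage gap cited above; the
# function and its parameters are assumptions for illustration.
def adjust_salary(salary: float, is_female: bool, gap: float = 0.08) -> float:
    """Uplift a salary figure by the residual wage gap for female subjects."""
    return salary * (1 + gap) if is_female else salary

print(adjust_salary(50_000, True))   # uplifted by 8%
print(adjust_salary(50_000, False))  # unchanged
```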
</p><p>Internal auditors can test for bias using a model fairness review methodology. This methodology comprises: </p><ol><li>Understanding the model’s business purpose.</li><li>Working with the audit client to determine and identify ICCs. In this step, auditors also may discuss possible appropriate exogenous variables (see “Controlling for Exogenous Variables” on this page). </li><li>Selecting a large sample — or the entire data set — of input data and classification results.</li><li>Conducting statistical analysis of the results to determine whether distribution of ICCs is within acceptable parameters.</li><li>Discussing initial results with the client.</li><li>Removing ICCs and re-running the classification model. Auditors also can replace ICCs with uniform values depending on the nature of the model.</li><li>Comparing distribution of ICCs before and after removal. </li></ol><h2>A Bias Audit</h2><p>As an example of how internal auditors can use this methodology, consider a marketing department at a credit card company that used a classification model to determine which customers should be given a discount. The data used for the model is half women and half men. Management wanted assurance that this model was not exposing the organization to potential liability by discriminating against either group.</p><p>Internal audit met with Marketing and confirmed that it used the model to select customers for preferred rates. These preferred rates are substantially lower than the rates offered to customers in general. 
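The statistical comparison at the heart of steps 3 through 7 of the methodology can be sketched in Python with pandas. The column names and toy data below are illustrative assumptions, not the actual model or data:

```python
# Illustrative sketch of steps 3-7 of the model fairness review:
# compute per-group selection rates from the model's input data and
# classification results, then compare a run that includes the ICCs
# with a re-run after the ICCs were removed. Column names and toy
# data are assumptions for illustration.
import pandas as pd

def selection_rates(results: pd.DataFrame, icc: str) -> pd.Series:
    """Step 4: share of each ICC group classified as good candidates."""
    return results.groupby(icc)["selected"].mean()

def compare_runs(with_icc: pd.DataFrame, without_icc: pd.DataFrame,
                 iccs: list) -> dict:
    """Step 7: per-group selection rates before and after ICC removal."""
    return {
        icc: pd.DataFrame({
            "with_icc": selection_rates(with_icc, icc),
            "without_icc": selection_rates(without_icc, icc),
        })
        for icc in iccs
    }

# Toy classification results echoing the credit card example.
before = pd.DataFrame({"gender": ["M", "M", "F", "F"], "selected": [1, 1, 0, 1]})
after = pd.DataFrame({"gender": ["M", "M", "F", "F"], "selected": [1, 0, 1, 1]})
report = compare_runs(before, after, ["gender"])
print(report["gender"])
```

Rates that shift sharply between the two runs point auditors to the groups the model may be treating unequally.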
After reviewing the information used by the model, internal audit noted these variables:</p><ul><li>Customer ID (metadata — not used as a variable).</li><li>Surname (ICC).</li><li>Credit score.</li><li>Geography (ICC).</li><li>Gender (ICC).</li><li>Age (ICC).</li><li>Tenure.</li><li>Balance.</li><li>Number of products.</li><li>Has credit card.</li><li>Estimated salary.</li></ul><p><br>In some cases, a variable may be an ICC for one type of model but not for another. For example, gender is an appropriate classification criterion for a clothing company promotion but not for a loan approval. Age may be appropriate in a health-care model but not in an applicant screening.</p><p>In the marketing example, internal audit analyzed the initial results of the classification model and observed that 35% of customers were classified as good candidates. However:</p><ul><li>50% of men and 20% of women were classified as good candidates.</li><li>6% of customers over 50 were classified as good candidates.</li><li>1% of women over 50 were classified as good candidates.</li></ul><p><br>Internal audit discussed the initial classification results with the marketing department to determine whether there were valid, defensible, and nondiscriminatory business reasons for the observed results that would mitigate the risk of legal liability. Based on this discussion, internal audit removed the identified ICCs from the input data and re-ran the classification model. </p><p>In reporting the results to Marketing, internal audit noted that the model still produced useful results: 45% of customers were classified as good candidates, a finding with which Marketing concurred. 
However:</p><ul><li>50% of men and 40% of women were classified as good candidates.</li><li>21% of customers over 50 were classified as good candidates.</li><li>10% of women over 50 were classified as good candidates.</li></ul><p><br>Internal auditors noted that the model still appeared to be biased against groups such as women and people over 50, which is likely the result of exogenous variables. Auditors recommended that Marketing adjust its model to compensate for these variables.</p><h2>New Models, Old Risks</h2><p>Although the subject of bias in analytic models may be unfamiliar to internal auditors, their risk management role in this domain is crucial. Bias introduces an unacceptable risk to any organization regardless of where that bias originates. A decision made by an organization’s analytic model is a decision made by that entity’s senior management team. Internal audit can help management by providing risk-based and objective assurance, advice, and insight. As such, auditors should learn and adapt their methods to meet the challenges organizations face in adopting AI. <br></p>Allan Sammy
Crime's Digital Transformation<p>An international police operation has taken down a gang that allegedly stole an estimated $100 million from more than 41,000 victims using malware, European police organization Europol announced this month. The gang allegedly infected computers with the GozNym malware, enabling its members to obtain online banking credentials and access to victims' bank accounts. They also used those accounts to launder the money they stole and transfer the funds to their own accounts, Europol alleges.</p><p>What sets the GozNym gang apart is its use of cloud and digital platforms to carry out its operations and recruit service providers, technical expertise, and other accomplices. A U.S. federal grand jury in Pittsburgh has indicted 10 gang members, while prosecutions are underway in Georgia, Moldova, and Ukraine. Law enforcement agencies in Bulgaria and Germany also were involved. </p><p>"The collaborative and simultaneous prosecution of the members of the GozNym criminal conspiracy in four countries represents a paradigm shift in how we investigate and prosecute cybercrime," says U.S. Attorney Scott Brady of the Western District of Pennsylvania. </p><p>But criminals are shifting the paradigm, too. A new wave of organized crime groups is using the tools of digital transformation to carry out crimes throughout the world. </p><p>"Digital transformation is making it easier not only for legitimate organizations to expand their reach, but also for fraudsters and other bad actors to expand theirs," notes the <a href="" target="_blank">2019 Current State of Cybercrime</a> report from cybersecurity firm RSA. The RSA study spotlights trends spanning mobile, legitimate platforms, and digital crime.</p><h2>Mobile</h2><p>Last year, mobile communications was the source of seven out of 10 fraudulent transactions, RSA notes. 
Such transactions via mobile apps have increased nearly seven-fold since 2015.</p><p>But it's not just fraud that has gone mobile. One in five cyberattacks could be attributed to rogue mobile apps. RSA reports that on average 82 rogue apps are identified each day. RSA expects that trend to continue this year, "especially as cybercriminals keep finding ways to introduce tactics and technologies such as phishing and malware to the mobile channel." </p><h2>Leveraging Legitimate Platforms</h2><p>Last year, RSA's report pointed out that criminals were using social media networks and messaging platforms such as Facebook, Instagram, and WhatsApp to communicate and to sell stolen credit card numbers and identities. That warning has been borne out by a 43% increase in social media fraud attacks, according to this year's report.</p><p>These platforms are attractive to criminals because they are free of charge and easy to use, the report notes. RSA predicts criminals will open more stores on social media platforms to trade in stolen identities and similar data. </p><p>Moreover, cybercriminals "are developing their own apps to increase their anonymity, avoid detection, and otherwise keep anti-fraud forces from tracking them down," RSA says. Another threat to watch is criminals exploiting on-demand service platforms such as Airbnb and Uber for money laundering and to commit fraud. </p><h2>Digital Crime</h2><p>Criminals are turning to digital technologies to aid and abet their crimes, RSA reports. For example, they are automating the process of verifying stolen user names and passwords, using account-checking tools. They also are targeting ever-more-ubiquitous Internet of Things devices. </p><p>Moreover, RSA warns that criminals are exploiting cross-channel vulnerabilities by combining mobile, cloud, and other digital channels to launch attacks. 
An example would be using social engineering tactics to have an organization's call center change the password on a victim's online account so that the criminal would have access.</p><h2>Crime as a Service</h2><p>The GozNym case highlights another trend not mentioned in the RSA report: leveraging underground criminal networks to recruit accomplices. According to Europol, the gang came together as a "cybercrime as a service" operation. Its ringleaders used Russian-language online criminal forums to connect with people who acted as hosts, money "mules," encryption providers, spammers, computer coders, and technical support.</p><p>For example, the gang's leader obtained online hosting services for the attacks from the Avalanche network, which provided services to more than 200 cybercriminals and hosted more than 20 malware campaigns. </p><h2>The Risk of Copycats</h2><p>The U.S. Justice Department reports that five of the accused GozNym members are still at large — complete with a Federal Bureau of Investigation wanted poster. But as with many technology advances, other criminals are likely to copy the GozNym gang's tactics and add their own innovations. </p><p>To protect themselves, organizations need to combine vigilance and technology. "In this way, digital transformation becomes both a critical contributing factor in the problem of growing cyber risks today — and a critical resource for solving it," RSA says. </p>Tim McCollum
Auditing the Smart City<p>As cities aggressively adopt “smart” technology — especially in the very public-facing transportation and safety arenas — municipal auditors will increasingly find themselves facing a new version of a familiar risk: cybersecurity. The underpinning of Internet-of-Things (IoT) connectedness that makes smart tech so smart is also its Achilles’ heel, offering hackers access, on a vast scale, to all kinds of complicated technologies — and the people they affect. And countering that risk may require new internal audit skills and tools. </p><p>When the technology works, smart sensors create massive amounts of data that trigger mechanical responses: roadways charge electric vehicles as they pass above; connected cars find the best parking spots. But cybercrime experts take smart tech risks — and their implications for municipalities — quite seriously, painting a dark future portrait in the event things go awry. What would happen, for example, if cybercriminals made every traffic light in a city green at the same time or scrambled the entire grid’s color cycles during rush hour? What if they completely shut down the city’s smart power grid? What if an attacker targeted water and sewage systems, tampering with automated meters that detect and respond to flood conditions? </p><p>Auditors take those risks seriously, too. “The benefits that smart and emerging technologies can deliver are accompanied by multiple new risks,” says Tonia Lediju, chief audit executive (CAE) for the City and County of San Francisco. “We need to ensure that cities have the right security governance, processes, and controls in place.”</p><h2>Smart City by the Bay</h2><p>In San Francisco, there’s a lot of smart tech to audit. Lediju says it’s one of the leading smart cities globally, and it’s working on even more smart mobility solutions — often in partnership with private companies or with the U.S. federal government. 
Initiatives include smart traffic signals, an electronic toll system with congestion pricing, and autonomous electric shuttles to Treasure Island in the San Francisco Bay. The city also uses smart parking meters that change prices according to the time and day of the week.</p><p>Lediju says her auditors tackle the new risks of smart tech head-on. The City Services Auditor Division assists the various city departments affected by new transportation technology, for example, in understanding the risks, monitoring the application controls designed to rein them in, and crafting preventive responses. Lediju says her team’s annual work plan includes auditing new technologies when deemed necessary, based on a risk assessment. </p><p>The division works closely day to day with the City and County of San Francisco’s Department of Technology, its Committee on Information Technology, and the departments adopting new technologies to ensure all risks are managed adequately, before adoption, Lediju says. She follows three key steps: understand the pipeline of emerging technologies being considered, identify risk trends, and help departments actively manage risks as they navigate relevant regulations. </p><p>In the cybersecurity space, the City Services Auditor Division “identifies systems’ vulnerabilities and risks through penetration and assessment tests, and recommends remediation,” Lediju explains. Testing encompasses several areas, including cybersecurity framework adoption, security awareness training, IT governance, systems and network security, and business continuity.</p><p>“We also contribute insight gleaned from our extensive scope of work to help departments evolve and improve their strategies and protocols to better prepare for cyberattacks,” Lediju adds. Her team’s work is based on the Cybersecurity Framework Core Functions outlined by the U.S. 
Department of Commerce’s National Institute of Standards and Technology (NIST): identify, protect, detect, respond, and recover. The City Services Auditor Division, she notes, also makes recommendations based on the CIS Controls and CIS Benchmarks guidance developed by the Center for Internet Security (CIS). “The CIS recommendations highlight for clients the numerous opportunities for control and process improvements or other enhancements that could ultimately increase their effectiveness in managing data security and fulfilling the organizations’ missions and goals in serving the city,” Lediju says. </p><h2>Sweden’s Smart Tech</h2><p>At Sweden’s Borlänge-based Trafikverket — the Swedish Transport Administration — the audit unit also gets involved early on, says Peter Funck, CAE. “The Agency,” as he calls it, is the national government authority responsible for public roads and railways; Funck’s office focuses on the planning and development phases, which is where he says his unit delivers the greatest added value. Audit and The Agency, he adds, have learned to manage large software and infrastructure development projects in similar ways, meaning audit is involved “several times before coding starts, as well as before the first spade is put in the ground,” Funck says. That’s been the case with two of Sweden’s key smart tech endeavors:</p><p></p><ul><li>The European Rail Traffic Management System (ERTMS) is a major industrial project underway in the European Union, Funck notes, and Sweden is one of the early adopters in developing and implementing it. ERTMS is a safety system that “enforces compliance by the train with speed restrictions and signaling status,” he says. </li><li>Sweden is also developing a national system for controlling and scheduling all trains that will integrate train operator scheduling. “It’s one of the biggest software-based projects ever in the country,” Funck says. 
“The project brings a lot of opportunities, but, of course, size and complexity imply challenges: Will it work? Is it safe?”</li></ul><p>Funck points out that his unit audited both the ERTMS and national integration projects several times, before they were even deployed on a test basis. “Those audits had different focuses,” he says, “but the common denominator has been whether internal controls provide prerequisites to make it work and make it safe.”</p><p>The projects aren’t yet far enough along for after-the-fact performance audits. But Funck notes that, in all of his office’s smart tech projects, health and safety, including terror attacks, are the largest risk concerns. “Information security often brings those risks down to some kind of acceptable level,” he says. Indeed, Funck emphasizes that available information security technology in general is up to the smart tech challenge; the bigger problem lies in people and their roles in keeping smart cities humming.</p><p>Funck adds: “There is always a need for some kind of security and safety risk acceptance in developing business processes to balance with productivity requirements.” At the end of the day, he points out, “railroads and roads are safer if we remove all trains and cars.” </p><h2>Data and Privacy Safeguards</h2><p>Jim Thompson, city auditor in the Albuquerque Office of Internal Audit (OIA), takes smart tech in stride, too, though he’s also well aware of the risks it poses — including those related to cybersecurity. “OIA performs an annual risk assessment of the city, which includes consideration of the city’s information technology risk,” he says. 
“As the city increases its use and reliance on information technologies, including smart technologies, the risk of cybersecurity and data breach — as well as the liability risk — increase.” </p><p>The city’s Technology and Innovation Department maintains internal controls over IT and also uses outside experts for IT vulnerability risk assessments and intrusion testing. Thompson maintains in-house technology expertise on his team as well. One senior information systems auditor, he says, holds several IT certifications, including CISA, CITP, and ITIL v3 Foundation.</p><p>The City of Albuquerque, Thompson says, has implemented various smart technologies, including government document and data transparency, ride apps, enhanced wireless access, and online police services. Planned audit engagements assessing privacy concerns will target some of those enhancements. “Our annual audit plan this year includes an audit of all city systems and devices that contain personal identifiable information [PII],” Thompson notes. “Some of the city’s smart technologies will be included.”</p><p>Thompson says the audit will consider whether the city maintains a listing of all systems and devices containing PII and if it has controls in place to classify and safeguard PII correctly, including intake points, release and data sharing points, and storage. It will also examine whether individuals with access to the city’s computer environment are trained on and aware of their responsibility to safeguard PII and what to do in the event of a data breach. OIA will consider federal, state, local, and contractual requirements for PII and compare the city’s current practices with IT governance framework best practices recommended by ISACA’s COBIT framework, as well as NIST. 
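</p><p>Inventory checks like the one Thompson describes lend themselves to partial automation. The sketch below is a minimal illustration rather than OIA's actual approach: it pattern-matches sample records for common PII types and flags systems that hold PII but are missing from a declared inventory. The system names, record formats, and detection patterns are all hypothetical assumptions.</p>

```python
import re

# Illustrative patterns for two common U.S. PII types. A real audit would
# use vetted detection tooling and validate matches in context.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan_records(system_name, records):
    """Return the PII types detected in a system's sample records."""
    found = set()
    for record in records:
        for pii_type, pattern in PII_PATTERNS.items():
            if pattern.search(record):
                found.add(pii_type)
    return {"system": system_name, "pii_types": sorted(found)}

def inventory_gaps(declared_inventory, scan_results):
    """List systems where PII was detected but not declared in the inventory."""
    return [r["system"] for r in scan_results
            if r["pii_types"] and r["system"] not in declared_inventory]
```

<p>A scan like this only supplements the interviews and control testing the audit plan describes; it simply gives auditors an independent check on whether the city's PII listing is complete.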
</p><table class="ms-rteTable-default" width="100%" cellspacing="0"><tbody><tr><td class="ms-rteTable-default" style="width:100%;"><p><strong>​Down the Pike</strong></p><p>For municipal auditors who are not engaged to audit their city's smart tech right now, there's a good chance they will be soon. Indeed, Kansas City, Mo.'s Chief Innovation Officer Bob Bennett declared last year at the Smart Cities Connect Conference and Expo that municipalities that don't get on the smart tech bandwagon soon will find themselves part of a "digital Rust Belt." </p><ul><li>66 percent of cities say they're investing in smart tech, according to a 2017 report from the National League of Cities called Cities and the Innovation Economy: Perspectives of Local Leaders; one-fourth of the rest are looking into it. </li><li>International Data Corp. reported in January that worldwide spending on smart cities initiatives would reach $95.8 billion in 2019, an increase of 17.7 percent over 2018; by 2021, the total could hit $135 billion. Singapore, New York, Tokyo, and London are expected to invest more than $1 billion each this year, IDC added; the applications receiving the most funding are fixed visual surveillance, advanced public transit, smart outdoor lighting, and intelligent traffic management.</li><li>IoT Analytics said late last year that there were 17 billion connected devices worldwide; the number of IoT devices — excluding smartphones, tablets, laptops, and fixed line phones — was pegged at 7 billion. "The number of IoT devices is expected to grow to 10 billion by 2020," the firm points out, "and 22 billion by 2025."</li><li>Mobility is the most common area for smart tech investment, according to the National League of Cities report. Other key applications include lighting solutions, security, and utilities management, according to the McKinsey Global Institute 2018 report, Smart Cities: Digital Solutions for a More Livable Future. 
</li></ul></td></tr></tbody></table><h2>Protecting the Vision</h2><p>Chattanooga, Tenn., City Auditor Stan Sewell also points to cybersecurity risk associated with his municipality’s emerging technologies. And while it’s not the No. 1 priority, the city’s tech-focused initiatives provide ample reason to ensure online security issues are addressed. “It’s definitely a risk, but it’s more of a ‘black swan’ concern,” he says.</p><p>Chattanooga’s Smart City Division, which manages street lights and traffic signals, acknowledges that “technical challenges may result from our vision in cybersecurity, hacking, and privacy issues.” “Vision” in Chattanooga includes autonomous vehicles and robust vehicle-to-vehicle and vehicle-to-infrastructure communications. The city won a 2019 Smart Cities Connect Smart 50 Award, a global recognition of transformative smart city project work, for its Chattanooga Smart Community Collaborative research partnership.</p><p>Sewell’s primary concern is supervisory control and data acquisition (SCADA) systems, composed of computers and both wired and wireless data communications modules that provide remote access to and control of a city’s infrastructure processes. “SCADA systems are vulnerable to cyberattacks,” he says, “which are occurring with an increased frequency.” A cyberattacker could gain remote control of the city’s water treatment, for example, “commanding the release of wastewater or sending false pressure sensor data, resulting in a catastrophic failure of water pumps and controls.” Sewell adds: “The various smart technologies increase the number of potential access points to enter the city’s systems to gain access to other areas.”</p><h2>Tried and True</h2><p>In some municipalities, the audit function’s treatment of smart tech doesn’t differ much from how it handles other city initiatives. 
Smart tech constitutes a largely routine subject, for example, for the City Auditor’s Office in Kansas City, Mo.</p><p>City Auditor Douglas Jones says he is aware of many of the city’s initiatives, one of which earned Kansas City a 2019 Smart 50 Award; plus, he knows smart tech is “timely and topical” and that it poses some reputation risk, as well as risks related to IT and operations. But from his perspective, newness can work against a program’s auditability. “It often makes little sense to audit a program with no track record,” he says. “And there’s always risk with a new program.”</p><p>Indeed, smart tech, Jones emphasizes, is “just one more thing that would be in our universe of potential audit topics. We cover everything from airports to the zoo, and we don’t put a specific emphasis on one thing or the other.” </p><p>Austin, Texas, another 2019 Smart 50 Award recipient, also places high priority on leveraging tech. In fact, Assistant City Auditor Andrew Keegan says Austin is trying to use its technology to help save lives. “Austin is committed to a Vision Zero plan, which calls for zero fatalities or serious injuries resulting from vehicle collisions by 2025,” he explains. “Part of that plan is focused on implementing new technologies.”</p><p>But Keegan’s team likely won’t be involved until after those plans and programs have been implemented. “Selecting a particular technology to audit depends on the risk posed by the new technology as compared to other risks facing the city,” he says. “This is our practice regardless of the topic.” Indeed, right now, his office is conducting an audit related to motorists’ well-being. 
“While part of that project includes reviewing the implementation of new technology,” he comments, “the audit is focused on the general issue of traffic safety.”</p><p>Amanda Noble, city auditor in the City of Atlanta’s City Auditor’s Office, notes that Atlanta has implemented smart mobility tech, but she, too, says the audit function didn’t have a role in assessing risk on the front end. “As the city was implementing the technology, we became aware of it and went to a demonstration,” she says. “But we looked at the data the city was connected to and its potential uses in risk assessments and audit work. We hadn’t thought about auditing the technology itself.”</p><p>Would it help? “I think it would,” Noble says. She notes that her team has assessed controls on financial systems installations, but “possibly because smart tech is not financial data, the audit function has not been asked to play a role.” It can be frustrating when stakeholders view the profession as dealing primarily with financial information, she adds, given internal audit training that emphasizes the importance of foresight in all areas of the enterprise. </p><p>“So much of our role is looking backward,” Noble says. “There’s not really a process for emerging risk, unless we do it as one-offs. There’s nothing systematic.” She adds that resource constraints limit the audit function’s ability to tackle emerging issues, so new risks may not be audited until nearly a year has passed. She’d like to do more.</p><p>“Decision-makers value our input,” Noble emphasizes. “We need a way to assess and report on emerging technology.”</p><h2>Expanded Services, New Skills</h2><p>Lediju sees a balance between tried and true audit services and helping organizations see around the corner. “We’ll need to remain focused on our existing foundation of auditing standards and principles to detect internal control weaknesses and fraud risks,” she says. 
“But the profession must be ready to take on more of an advisory role and help cities keep pace with and get ahead of emerging risks, maintaining its unique perspective on people, processes, and governance when striving to strengthen its risk management programs.” </p><p>Because of the specialized knowledge required for new and smart technologies, she adds, internal auditors who possess a mix of business and technology skills will be needed. In fact, more of them will be needed. “Smart tech requires more internal audit resources because the pool of tools is constantly expanding and being used for various operations across government services,” Lediju explains. As a result, she says, information and software oversight and accountability, including human and technology resources, become more necessary.</p><p>Internal auditors will need to adopt new tools and techniques, she adds, such as artificial intelligence and blockchain auditing and reconciliations, to increase continuous audit activities, rapidly pinpoint control gaps, and identify nonconformance and process improvement opportunities in real time. She says her office “currently relies on outside contracting and consulting services to keep abreast of the rapidly evolving trends and practices in technology, governance, security, and privacy relevant to the respective technologies.” </p><p>Lediju adds: “With the requirements of continuing professional education and the goal to help businesses and government adopt best or leading practices, internal audit can remain a necessary and beneficial agent of change.” Maybe, in fact, the profession could do more when it comes to smart tech. </p><table cellspacing="0" width="100%" class="ms-rteTable-4"><tbody><tr class="ms-rteTableEvenRow-4"><td class="ms-rteTableEvenCol-4" style="width:100%;"><strong>​IoT Risks</strong><p><br>The risk issues every public entity project faces are amplified when the connectivity required for smart technology is at play. 
</p><ul><li><em>Human error.</em> Hackers are one kind of human risk; simple mistakes are another. Often overlooked as a threat, the public entity employees who read the meters and monitor the system outputs — and decide when to override — are likely inexperienced with smart city technology, <em>Risk Management</em> magazine noted recently. Their ethics and judgment may also come into play in a smart tech crisis.</li><li><em>Technical difficulties.</em> The connectedness needed for smart technology to work may require integrating powerful, cutting-edge IT infrastructures with, as Travelers calls them in its 2017 Public Safety for the Smart City report, "legacy IT infrastructures that may not be fully up to the task of handling the extreme volumes and types of data." This includes, for example, the vehicle-to-infrastructure data that smart devices generate. Plus, sometimes software fails, or lightning strikes, or the power goes out. </li></ul><ul><li><em>Complicated connections.</em> Many smart tech projects, especially in transportation, involve public sector entities, academia, and private industry — and each often has its own data management infrastructure already in place. Many also involve multiple — in some cases, dozens of — local, county, and state jurisdictions. The City of San Diego General Plan, for example, includes a "mobility element" that will guide implementation of the city's part in the multi-stakeholder Mobility 2030 Regional Transportation Plan prepared by the San Diego Association of Governments, an organization of 18 local and county public entities. 
In addition, Southern California is a national Intelligent Transportation System Priority Corridor Program participant; the Southern California Association of Governments represents six counties and 191 cities.</li></ul><p>Even the familiar risks posed by smart tech can cause greater concern to internal auditors because of their vast scale — especially if, as <em>Risk Management</em> puts it, "policies, procedures, and training do not adequately address the new capabilities." Additional education and new tools may be required to meet the challenge.</p></td></tr></tbody></table>Russell A. Jackson
Fit for Digital<p>Internal auditors better get fit. Digitally fit, that is. They will need to be in top shape to take action in digital transformation initiatives.</p><p>Digital fitness means embracing new technology to gain insights that can help digital transformation. It can make risk professionals "important partners and leaders in helping their organizations get better benefits from their digital initiatives," says Jim Woods, global risk assurance leader at PwC.</p><p>But many internal audit, risk, and compliance functions are falling short, according to PwC's latest <a href="" target="_blank">Risk in Review Study</a>. The global study surveyed more than 2,000 board members, CEOs, and senior executives, as well as internal audit, compliance, and risk professionals. </p><p>Woods says internal audit and other risk functions are at a "critical juncture" in which automation, data analytics, and other technologies are transforming businesses. But that transformation brings new risks, as well. "Digital transformation is also driving the potential for identifying risk and making smarter decisions," he adds. </p><p>Woods points to findings from PwC's most recent CEO survey in which about one in five CEOs said they receive risk exposure data that is comprehensive enough to make long-term decisions. That number hasn't increased in 10 years. </p><p>"Dynamics" is what PwC labels internal audit, compliance, and risk functions that are in the top quartile of surveyed organizations. These functions are developing digital capabilities faster than their peers, are confident in taking risks that are consistent with their strategies, manage transformation-related risks more effectively, and get better-than-expected value from digital investments. </p><p>The Risk in Review study outlines six components of digitally fit organizations. 
Internal audit functions might consider them a fitness regime.</p><h2>All-in on the Organization's Digital Plan</h2><p>Dynamics have aligned their function's digital strategy with that of their organization, enabling them to provide "strategic advice and assurance over the new and changing risks that digital transformation brings," the report notes. Three-fourths of dynamic functions seek specific outcomes from their digital investments, and 73% change performance metrics to support behaviors and manage against an "aspirational" digital operating model. Less than half of other organizations do those three things.</p><h2>Boost Digital Skills and Talent </h2><p>Auditors and risk professionals in dynamic functions have become data-driven and use digital tools to provide risk insights at the pace of the organization's transformation efforts, the report says. Executives say their organizations need critical thinking, technology, analytics, cybersecurity, project management, and change management skills. Eight in ten dynamic functions use performance metrics to assess and reward new digital ways of working, and seven in ten have created a talent management program to hire digital personnel or enhance the skills of existing people. </p><h2>Find the Right Fit for Emerging Technologies</h2><p>Overall, one-third of surveyed functions are using technologies such as artificial intelligence (AI), the Internet of Things (IoT), and robotic process automation. Dynamics are more likely than other functions to automate their activities to free people to work on more valuable analyses and to expand risk coverage. Thirty-six percent of dynamic functions use IoT sensors to respond to risks, and 39% use AI for population testing, controls, or risk modeling. Those are more than double the percentages of other functions. 
</p><h2>Enable the Organization to Act on Risk in Real Time</h2><p>To support transformation decisions, internal audit and other risk professionals must provide insights about a fast-changing set of risks that can impact the organization quickly. Nearly three-fourths of dynamic functions are redesigning current processes to deliver services and developing new services for stakeholders. Half are using intelligent automation or machine learning to prioritize risks. </p><h2>Engage Decision-makers of Key Digital Initiatives</h2><p>Audit and risk functions that provide the most value to transformation projects are in communication with decision-makers, participating in key meetings and consulting on projects and plans. About eight in 10 dynamic functions use dashboards or visualization tools to provide more strategic risk reports to the board. A similar number influence strategic decisions about digital initiatives.</p><h2>Provide a Consolidated View of Risks</h2><p>Dynamic audit and risk functions collaborate across the lines of defense on digital projects. With this component, the numbers are small, though. One in five dynamic functions have a common policy framework and a single set of risk metrics or key performance indicators. About one-fourth provide consolidated reports to the board. </p>Tim McCollum
Internal Audit's Technology Challenge Is No Easy Road<p><img src="/2019/PublishingImages/Cracked%20Road_445x300.jpg" class="ms-rtePosition-2" alt="" style="margin:5px;" />I often refer to the end of the first quarter as "whitepaper season" for the internal audit profession. Typically starting in late February and into March, several key players in the profession publish reports that offer a glimpse into how we're doing. </p><p>Two recently released reports should raise serious concerns about internal audit's slow progress in adopting and adapting technology to execute its responsibilities: Protiviti's Internal Audit Capabilities and Needs survey, <a href=""><span style="text-decoration:underline;"><em>Embracing the Next Generation of Internal Auditing</em></span></a>; and PwC's State of the Internal Audit Profession report, <a href=""><span style="text-decoration:underline;"><em>Elevating Internal Audit's Role: The Digitally Fit Function</em></span></a>.</p><p>Protiviti's report indicates that three out of four internal audit groups are undertaking some form of innovation or transformation, but most are only beginning that journey. What's more, a significant number of functions have yet to get started. Worse yet, "Less than one in three internal audit functions currently have a roadmap in place to guide their innovation and transformation journeys," according to the Protiviti report.</p><p>Internal audit's efforts to become "digitally fit," as described in PwC's report, are equally uninspiring. For the purposes of the report, PwC defines digital fitness in terms of two important components. First, the function has in place the skills and competencies to provide strategic advice to stakeholders and to provide assurance with regard to risks from the organization's digital transformation. 
Second, the function is changing its own processes and services so as to become more data driven and digitally enabled.</p><p>While the PwC report finds 19 percent of internal audit functions are digitally fit, and another 27 percent are taking definitive steps toward digital fitness, 54 percent are described as beginners who are just starting relevant activities or planning them "in far more ad hoc ways."</p><p>This is particularly troubling when one considers how long the profession has been talking about updating and transforming its processes. My message to attendees of The IIA's General Audit Management conference in March addressed many of the same themes. In my presentation, Auditing at the Speed of Risk, I lamented the profession's low adoption rates of next generation technology, reliance on weak approaches to identifying emerging and atypical risks, and minimal changes in decades-old audit processes.</p><p>What is needed is transformational change, which was the theme of The IIA's 2018 Pulse of Internal Audit. <a href=""><span style="text-decoration:underline;"><em>The Internal Audit Transformation Imperative</em></span></a> urged practitioners to embrace agility, innovate, raise the level of talent, and engage more closely with boards. The closing words of the report are as relevant and urgent today as ever:</p><p><em class="ms-rteStyle-BQ">Internal audit's progress over the past, and the successes accomplished, will not be enough to carry the profession forward. Current times require changes in mindset and actions from all internal auditors. Complacency will lead to irrelevance, but decisive moves by CAEs will propel internal audit forward through the transformation required.</em></p><p>I don't wish to be an alarmist, but stakeholders are demanding more from internal audit. Reports such as the ones from Protiviti and PwC suggest we are not ready to meet those demands.</p><p>I debated whether to use a crossroads analogy to hammer home this message. 
At first I thought it a bit cliché. Not every crisis has to be a choice of one path or another. But in this case a slight modification to the crossroads discussion offers an important added level of clarity and urgency.</p><p>Most people envision a crossroads as intersecting roads that offer alternatives on which way to go. But that vision is typically two-dimensional, like the x- and y-axes on a quadrant graph. The reality is that there is a third dimension.</p><p>In this case, the z-axis offers a measure of how steep the journey toward digital fitness has become. Until we commit as a profession to innovate our approaches, update stale and slow processes, and embrace technology, the grade will get steeper with each passing day. The other direction on the z-axis provides an easier road. Of course, that road is all downhill, which is a direction we can't afford to go.</p><p>As always, I look forward to your comments.<br></p>Richard Chambers
The Single Point of Failure<p>When Canadian cryptocurrency exchange CEO Gerald Cotten died unexpectedly in December, he took key corporate passwords to his grave. Those passwords could unlock $137 million in customer funds that were trapped on Cotten’s encrypted notebook computer. Without the recovery key to access those funds, his company, QuadrigaCX, filed for bankruptcy, according to Nova Scotia’s Supreme Court records. </p><p>In March, court-appointed monitor Ernst & Young (EY) cracked Cotten’s code and found the funds had been transferred out of customers’ crypto wallets in April 2018. Moreover, EY says QuadrigaCX kept limited records and never reported its financials.</p><p>This incident takes the meaning of a single point of failure to a higher level. It also suggests some considerations for internal auditors now and in the future.</p><p>At QuadrigaCX, basic governance, risk management, and controls failed to prevent this unexpected and disastrous event or allow for a timely recovery. Clearly, access controls stopped the company from running the key cryptocurrency exchange process and transacting with its customers normally. </p><p>All organizations need to think about single-point-of-failure risks such as one person knowing all the key passwords to a critical process. This risk occurs when failure of one part of a system stops the entire system from working. This condition is undesirable in any system with a goal of high availability or reliability. This is what happened at QuadrigaCX, which raises important questions and lessons in three key areas.</p><h2>Technology Governance, Risks, and Controls</h2><p>Internal auditors should identify critical business technology governance, risks, processes, and systems to determine whether single points of failure exist. 
IIA Standard 1210.A3: Proficiency calls on auditors to know the business and technology they review, which they can accomplish by learning, documenting, and mapping key processes and systems. As part of that process, the auditor may analyze the process flow and identify whether certain devices or processes could become a single point of failure. For example, in some network configurations, a single router or device may serve as a key gateway. But if the one device fails, the gateway may become unavailable to users. </p><p>Likewise, a single software failure can have a calamitous impact on a business. In 2012, a failed software test at Knight Capital caused the company’s new trading system to start trading repeatedly, resulting in a $440 million loss within 45 minutes.</p><p>Information security tools or systems can become a single point of failure, too. For example, a retail company requested that all of its customers update their sign-on passwords, telling them it would give them promotional discounts and improve account security. However, the password security system became a single point of failure when suddenly too many customers logged on to update their passwords, which crashed the system. The system was not designed to handle the volume. </p><p>In addressing single points of failure, internal auditors should focus on the highest business process and technology risks. For example, Deloitte’s An Eye on the Future 2019: Hot Topics for IT Internal Audit in Financial Services report lists cybersecurity, technology transformation and change, technology resilience, and extended enterprise risks among its hot risk topics. Several of these topics apply to all organizations.</p><p>Knowing the top risks represents a start, but finding single points of failure in those areas can be challenging. Internal auditors cover program changes by testing governance and controls, but at best, auditors can only sample certain testing procedures and processes. 
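</p><p>The router example has a direct analogue in graph theory: a single point of failure in a mapped network or process is an articulation point, a node whose removal disconnects the rest. The sketch below finds such nodes with a standard depth-first search over a hypothetical topology; it is a simplified illustration of how an auditor might reason about a documented process map, not a substitute for the analysis itself.</p>

```python
def single_points_of_failure(graph):
    """Find articulation points: nodes whose removal disconnects the graph.

    graph: dict mapping each node to a list of neighbors (undirected).
    """
    disc, low, cut = {}, {}, set()
    timer = [0]

    def dfs(node, parent):
        disc[node] = low[node] = timer[0]
        timer[0] += 1
        children = 0
        for nbr in graph[node]:
            if nbr == parent:
                continue
            if nbr in disc:
                low[node] = min(low[node], disc[nbr])
            else:
                children += 1
                dfs(nbr, node)
                low[node] = min(low[node], low[nbr])
                # A non-root node is a cut vertex if a child subtree
                # cannot reach back above it.
                if parent is not None and low[nbr] >= disc[node]:
                    cut.add(node)
        # The DFS root is a cut vertex if it has two or more children.
        if parent is None and children > 1:
            cut.add(node)

    for node in graph:
        if node not in disc:
            dfs(node, None)
    return cut

# Hypothetical topology: every office reaches the data center through
# one router, so the router is the single point of failure.
topology = {
    "office_a": ["router"],
    "office_b": ["router"],
    "router": ["office_a", "office_b", "datacenter"],
    "datacenter": ["router"],
}
```

<p>Running the function on the sample topology flags only the router; adding a second, redundant gateway would empty the result, which is exactly the remediation an auditor would recommend.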
</p><h2>Disaster Recovery Backup Testing</h2><p>Internal auditors should determine what recovery or backup plans are in place for the organization’s critical systems. Disaster recovery plans serve as a high-level control process to restore critical systems that were lost or disrupted. Reviewing the governance, risks, and controls over backup or disaster recovery tests allows the auditor to determine how rapidly a critical system can be recovered. The objective of recovery testing should include looking at any single points of failure such as testing for missing documents, devices, or key individuals. </p><p>Use of cloud technology and software as a service adds different factors that the auditor needs to review. For example, how frequent and how realistic are the testing plans? What mistakes or setbacks are uncovered, and more importantly, are there any single points of failure? If a critical system recovery was performed but needed a single person to provide the only passwords to transact or start the system, then the auditor or recovery team should consider this a single point of failure.</p><p>Some technology recovery plans are not completely tested or exercised because they are too complex, no resources are budgeted, or the governance is too weak. Sometimes limited recovery is considered successful. </p><p>Several years ago, during a large payroll processor’s data center disaster recovery test, an IT audit team observed that a critical system failed to restore several times. The culprit: One backup medium failed and could not be read. The disaster recovery team was able to get a new backup made but from the existing data center. This backup took more than two days to create. What would have happened if the existing data center had been unavailable or if it took weeks to restore? Would the payroll processor’s customers accept this critical service disruption? 
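</p><p>The unreadable backup medium in the payroll example is the kind of failure that routine, automated restore verification can catch between full disaster recovery exercises. A minimal sketch, assuming for illustration that backups are files accompanied by a manifest of expected SHA-256 digests:</p>

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Hash a backup file in chunks so large media don't exhaust memory."""
    digest = hashlib.sha256()
    try:
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
    except OSError:
        return None  # unreadable medium, the failure mode in the example
    return digest.hexdigest()

def verify_backups(manifest):
    """manifest: dict of backup path -> expected SHA-256 hex digest.

    Returns the paths that are missing, unreadable, or corrupted.
    """
    failures = []
    for path, expected in manifest.items():
        if sha256_of(path) != expected:
            failures.append(path)
    return failures
```

<p>Checksum verification does not prove a system can be restored within its recovery objective, but it surfaces dead media early enough to recreate the backup while the source data still exists.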
</p><h2>Key Personnel</h2><p>Auditors should look for key personnel or executives as a single point of failure in their audit universe or audit program. If a privileged account user, system administrator, or CEO is the person who knows the key password, and no other person or recovery process is in place, then the risk of a single point of failure increases.</p><p>To begin, internal auditors should identify who the key stakeholders — customers, vendors, or users — are for the critical systems. They should inquire and document whether any single individual performs a critical task or function and consider the single-point-of-failure risk. </p><p>A key person does not need to be the CEO to become a single point of failure. During a review of a large retailer’s critical key management system, an IT auditor discovered that one of the two individuals who each held half of the primary encryption key had left the company. The situation had gone unnoticed because the company had not needed to generate a new key since the employee departed. If it had needed to generate a new key, a serious delay or security incident might have occurred.</p><h2>Prepare for the Future</h2><p>Preparing for the future, internal auditors need to continue assessing complex IT processes based on risk. The QuadrigaCX incident demonstrates that auditors need to assess possible technology single points of failure. When a single point of failure can disrupt an organization’s business or technology process, auditors need to carefully assess this threat. Ignoring it could be hazardous to the organization’s health. <br></p>Steve Mar
The Consumer's Data Anxiety<p>Consumers are fatigued from data breaches, ransomware attacks, and data misuse scandals, and are anxious about their privacy. Now a trio of surveys shows U.S. consumers are losing trust in organizations to protect their personal information and support government regulation of the companies that collect it.</p><p>More than four out of five consumers say they are concerned about how companies use their data, according to an IBM Institute for Business Value survey. Three-fourths of respondents say they do not trust companies with their data, notes an Axios report on the survey.</p><p>Those concerns are echoed by <a href="" target="_blank">a survey</a> of 1,000 U.S. adults by security hardware company nCipher Security. One in five respondents says they don't trust anyone to protect their data, and more than two-thirds are concerned about identity theft. "Consumers are grasping for a semblance of control," says Peter Galvin, nCipher's chief strategy and marketing officer.</p><p>Respondents to both these surveys want companies to be held accountable for protecting their data. In the IBM survey, 87 percent of respondents say governments should regulate companies that manage personal data. Nearly 40 percent of nCipher survey respondents say organizations should fire their chief information security officer following a breach by an intruder. The same percentage say hacking should be a federal offense and that C-level executives should be fined or imprisoned for failing to protect data.</p><h2>The Problem With Facebook</h2><p>One case in point is Facebook, which has drawn the ire of legislators, regulators, and consumer and privacy advocates over the past year. Facebook's troubles began with reports that U.K. research firm Cambridge Analytica had obtained personal data on tens of millions of Facebook users and built profiles on them before the 2016 U.S. presidential election. 
Similar privacy lapses have emerged since then, despite Facebook executives' assurances to the U.S. Congress and European Commission that the company was working to ensure users' privacy.</p><p>Now with Facebook CEO Mark Zuckerberg recently touting a more "privacy-focused social network," <a href="" target="_blank">a <em>Consumer Reports</em> survey</a> finds that one-fourth of Facebook users say they are very concerned about the amount of personal data Facebook collects about them. They're just not willing to do much about it themselves.</p><p>For example, only 10 percent of Facebook account holders surveyed stopped using Facebook after learning about the Cambridge Analytica scandal, <em>Consumer Reports</em> found. By far, the biggest reason for staying with Facebook was that it was the easiest way to stay connected with people, respondents say. </p><p>While they didn't quit, 70 percent report they have changed how they use Facebook. Forty-four percent of those surveyed say they have revised their privacy settings, and nearly 40 percent have cut back on posting and viewing content and turned off location tracking on Facebook's mobile app.</p><h2>Companies Leaking Data</h2><p>Consumers might act differently if they knew how frequently organizations compromise their data. In <a href="" target="_blank">a recent U.S. survey</a> conducted by Opinion Matters, 83 percent of security professionals say their organization has accidentally exposed customer or business-sensitive data. </p><p>The study, commissioned by Boston-based security firm Egress, blames the proliferation of unstructured data contained in emails and document files combined with the number of internal and external channels that employees can communicate through. 
This combination "has made it easier than ever for employees to share data beyond traditional security platforms," says Mark Bower, chief revenue officer for Egress.</p><p>The survey notes five technologies that have contributed to accidental data breaches by employees: external email services such as Gmail and Yahoo Mail, corporate email, file-sharing services, collaboration platforms such as Slack and Dropbox, and messaging apps. </p><p>Most respondents say their organizations have implemented new security policies and invested in security technologies and employee training. What they haven't done is encrypt data. Nearly 80 percent share sensitive data internally without encryption, and almost two-thirds share it externally without encrypting it. Without encryption, an employee mistake is more likely to result in data exposure, the survey notes.</p><p>With the European Union's General Data Protection Regulation in effect and other laws due to come online in the next year, organizations will need to plug these data leaks or pay a stiff price (see "GDPR's Global Reach" in the April issue of <em>Internal Auditor</em>). As these surveys indicate, consumers are growing tired of unfulfilled privacy promises and want companies and governments to act.</p>Tim McCollum
Beneath the Data<p>Big data can tell unexpected stories: The chief financial officer who had a conflict of interest with a supplier to whom he had awarded a multimillion-dollar contract. The two employees who provided their company-supplied fuel cards to family members to refuel their personal vehicles. The executive who had an affair with a union official during wage negotiations. </p><p>Internal auditors never could have discovered such wrongdoing through traditional audit sampling, walk-throughs, or reliance on the representations of management. These schemes were found only by using business intelligence tools to mine data sources that are now routinely available.</p><h2>Business Intelligence for Auditors</h2><p>Audits typically entail inquiries of management, walk-throughs, and transaction sampling as a basis for statistically inferring the effectiveness of each internal control attribute under review. To be generalizable within a given confidence interval, transaction samples need to be both large and randomized to represent the entire population. In doing so, internal auditors usually presume that the population conforms to a normal bell curve. This brings with it the risk that if the sample is too small, the tests are performed with insufficient care, or the population is skewed differently from a normal bell curve, the auditor may form the wrong conclusions about the control’s true characteristics. If the population contains any erroneous or fraudulent transactions, it is unlikely they will turn up in a walk-through or random sample. </p><p>Today’s self-service business intelligence tools expand internal audit’s toolkit from mere questionnaires and sampling to mining entire data populations. These tools make it easier for auditors to mine data for errors such as anomalous transactions and fraudulent data correlations (see “Mining for Errors” below). 
In this way, auditors can pinpoint actual error, fraud, and cost savings that demand action.</p><p><img src="/2019/PublishingImages/Kelly-Mining%20for%20Errors.jpg" alt="" style="margin:5px;" /><br></p><p>Beyond financial transactions, auditors can use business intelligence tools to access newly available data sources such as telecommunications, email, internet usage, road tolls, time sheets, maintenance schedules, security incident logs, clocking on/off, and electronic point-of-sale transactions. Previously, many of these sources either were not auditable or were stored as manual records. Business intelligence tools open the door to a variety of audits.</p><p><strong>Inventory</strong> For many organizations, inventory is a complex and poorly understood process. Organizations record movements in cash, debtors, and creditors within their financial systems. Yet, inventory data easily can get out of step with the physical daily movement of thousands of nonhomogeneous goods. Inventory is vulnerable to receipting errors, barcode misreads, obsolescence, rot, and shrinkage. </p><p>Things often go wrong in inventory, and audits often have revealed downside errors of 10 percent of inventory value. Therefore, internal audit could focus on ensuring quantity and description data matches physical reality through accurate goods receipting into the accounting system, precise sales capture, and reliable stock-taking. Once inventory data reflects the physical goods on hand, data mining can assist with identifying: </p><p></p><ul><li>Slow-moving and excessive inventory build-up. </li><li>Book-to-physical adjustments pointing to shrinkage or theft by location. </li><li>Refundable stock that can be returned to suppliers. </li><li>Stock-outs where the organization lost sales because of insufficient demand analysis. </li><li>Negative quantities revealing goods receipting or similar process errors. 
</li></ul><p><br></p><p>This kind of audit analysis demonstrates the informational value of accurate inventory data. Such information can lead the organization to prioritize which inventory processes most need fixing.</p><p><strong>Supply Chain</strong> Organizations need to know that supplier agreements do not conceal undeclared conflicts of interest and that suppliers are paid no more than their contractual entitlements. Even small organizations process thousands of supplier payments daily, so errors are likely. Data mining can include: </p><p></p><ul><li>Matching supplier master data such as bank account numbers, addresses, and telephone numbers to employee and next-of-kin master data for unexpected relationships. </li><li>Isolating purchase orders or payments just below authorization thresholds. </li><li>Detecting erroneous duplicate invoice payments caused by optical character recognition or human error when entering invoice references, such as “I” instead of “1,” “S” instead of “5,” or “/” instead of “\.” </li><li>Identifying historic credit notes that have never been offset against subsequent payments and remain recoverable from suppliers. </li></ul><p><br></p><p>In practice, audits using these tests have revealed an average error rate of 0.1 percent, enabling organizations to recover cash refunds from suppliers. Auditing over several prior years can result in material financial recoveries. <br></p><p><strong>Payroll</strong> For most organizations, payroll is the largest single cost. The board and audit committee need to know that overpayment and underpayment of employees are minimized. Payroll data mining can include comparing hours paid to hours actually worked by matching sick leave and holiday to other time- and location-stamped data such as building entry/exit data, cell phone metadata, and email data. In doing this, internal auditors can present management with compelling evidence that supports corrective action. 
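</p><p>As an illustration of this matching approach, the following sketch compares hours claimed on time sheets with hours implied by building entry/exit records. The column layout, one-hour tolerance, and sample data are assumptions for illustration, not a prescribed audit test.</p>

```python
# Hypothetical sketch: match hours paid on time sheets against hours implied
# by building entry/exit (badge) records. Data layouts are illustrative.
from collections import defaultdict

def hours_from_badge(events):
    """Sum hours between paired entry/exit timestamps per employee-day."""
    worked = defaultdict(float)
    for emp, day, t_in, t_out in events:  # t_in/t_out in seconds since midnight
        worked[(emp, day)] += (t_out - t_in) / 3600.0
    return worked

def flag_discrepancies(timesheet, badge_events, tolerance=1.0):
    """Flag employee-days where hours claimed exceed badge hours by > tolerance."""
    worked = hours_from_badge(badge_events)
    flags = []
    for emp, day, hours_claimed in timesheet:
        gap = hours_claimed - worked.get((emp, day), 0.0)
        if gap > tolerance:
            flags.append((emp, day, round(gap, 2)))
    return flags

# Illustrative data: employee 1002 claims 8 hours but was badged in for only 3.
timesheet = [(1001, "2019-03-04", 8.0), (1002, "2019-03-04", 8.0)]
badge = [(1001, "2019-03-04", 0, 8 * 3600), (1002, "2019-03-04", 0, 3 * 3600)]
print(flag_discrepancies(timesheet, badge))  # [(1002, '2019-03-04', 5.0)]
```

<p>The same join-and-compare logic can run against full extracts of the payroll and badge systems, with the tolerance set to the organization's materiality threshold.</p><p>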
Moreover, previous audits have uncovered savings of about 1 percent of total payroll cost from: </p><p></p><ul><li>Employees claiming fictitious hours on time sheets. </li><li>Employees falsely claiming to be working at home or on paid sick leave. </li><li>Employees missing scheduled training. </li><li>Repetitive patterns of fictitious sick leave taken on Mondays, Fridays, and the day before or after public holidays. </li></ul><p><br><strong>Company Motor Vehicles</strong> Auditors can mine data gathered from vehicles, including road tolls, refueling, traffic penalties, and insurance claims. This jigsaw puzzle of data can show auditors whether vehicles are being used for business purposes, reveal possible abuse of vehicles, and identify drivers whose poor driving histories result in unnecessary cost. This data can be obtained from external motor fleet providers and insurers. Such audits can recover around 5 percent of fleet costs.</p><p><strong>Metadata</strong> While the content of company-issued cell phone calls and text messages is confidential, the accompanying nonconfidential metadata includes called numbers, durations, date and time stamps, and base station geographical locations. Auditors can discern employee activity, interconnections, and external relationships during work hours or while on paid sick leave by matching this metadata to other sources such as the organization’s telephone list and employee and supplier master files. Internet usage metadata provides similar insights. These data sources can help when investigating white-collar conflicts of interest and fraud. </p><p>These are just a few areas where business intelligence opens new portholes. Partnering with the chief information officer can help internal audit access the organization’s databases. Once access is granted, auditors can use business intelligence tools with minimal assistance. </p><h2>Getting Started</h2><p>With business intelligence, auditors are no longer constrained by Microsoft Excel’s 1,048,576-row limit. 
Excel 2016 includes built-in business intelligence tools, Power Query and Power Pivot. Power Query is an extract, transform, and load (ETL) tool that reads source data and makes it available to Power Pivot for data modeling. This source data typically comes from comma- or tab-separated outputs from other systems. Auditors can access Power Query under Excel 2016’s Data ribbon, where it is also known as Get Data and, once opened, Query Editor.</p><p>Power Query and Power Pivot have formula languages that allow users to create new data columns specific to their own unique needs. Power Query uses the M formula language and Power Pivot uses Data Analysis Expressions (DAX). Both languages differ from Excel formulas. Whereas Excel formulas are not case sensitive and usually do not distinguish among string, date, and numeric data types, M is case sensitive, and both M and DAX are strict about data types. This distinction is important when manipulating data and performing calculations. </p><p>Once internal auditors have loaded and edited the raw data down to only the needed columns in Power Query, they can add each table to the Power Pivot data model under the “Add to data model” option. Auditors can then access Power Pivot from Excel under “Manage data model.” From there, they can use the “Diagram view” to link tables such as transaction files keyed to their corresponding master files. The data model can handle multiple external data sources as well as normal Excel tables. This capability allows auditors to create multidimensional relational databases rather than two-dimensional flat files. </p><p>Power Pivot enables auditors to annotate the relational databases retrieved in Power Query with unique columns and measures specific to audit needs, which can be analyzed using Excel’s pivot tables. “Applying Business Intelligence Using Benford’s Law” at the bottom of the page illustrates how Power Query, M, Power Pivot, and Excel can work together to search for irregularities. 
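</p><p>The Benford's Law screen can be expressed in any of these tools; the sketch below uses Python rather than M or DAX purely so the logic is easy to follow, and the sample amounts are illustrative.</p>

```python
# Benford's law first-digit screen: compare observed leading-digit frequencies
# in a population of amounts with Benford's expected frequencies.
import math
from collections import Counter

def benford_deviation(amounts):
    """Return observed-minus-expected proportion for each leading digit 1-9."""
    digits = [int(str(abs(a)).lstrip("0.")[0]) for a in amounts if a]
    counts = Counter(digits)
    n = len(digits)
    report = {}
    for d in range(1, 10):
        expected = math.log10(1 + 1 / d)           # Benford expected proportion
        observed = counts.get(d, 0) / n
        report[d] = round(observed - expected, 3)  # positive = over-represented
    return report

# Illustrative call on a handful of invoice amounts; in a real audit the input
# would be the full population extracted via Power Query.
print(benford_deviation([112, 127, 190, 214, 926, 150, 135, 188, 147, 310]))
```

<p>On a sufficiently large population, a strongly over-represented digit points to transaction values worth drilling into, such as amounts manufactured to sit just below an authorization threshold.</p><p>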
</p><h2>Data Cleansing</h2><p>Data files usually need to be cleansed before analysis. That is because over time, original source data is input by a variety of users whose training and attention to accuracy may be inconsistent. Some fields may hold invalid data as a result of being migrated from different systems or different versions of the same system. Moreover, overflows and other error types may lurk in historic data, the text files may have misaligned some fields, and records may be broken across two or more rows. </p><p>Comma-separated text files can present extra cleansing problems if users have input commas into individual fields and the fields are not enclosed in quotation marks. For example, an unquoted “Kelly & Yang, Inc” would translate into two separate fields because of the comma, whereas “Kelly & Yang Inc” would translate into one field. </p><p>ETL tools will attempt to read all transactions from the raw data files. But if the tool encounters errors, it may exclude those records from the upload, resulting in a loss of data that dilutes the objective of testing the entire population. If time allows, the auditor may cleanse the text files field by field in a spreadsheet or word processor by rejoining broken records, realigning misaligned fields, trimming stray characters or spaces, replacing known error values with blanks or zeros, and converting dates stored as text to real dates. </p><p>Further cleansing may be required if source files are fragmented across different years or subsidiaries and need to be joined into a single table, or if source files are tabulated differently from how internal audit wants to use them. In the first case, Power Query can append files into a single data source provided the field headings are identical. In the second case, auditors can unpivot cross-tabulated source files back into a single column of data using Power Query’s Unpivot command.</p><p>Internal auditors should keep a record of data cleansing actions in case future rework is required. 
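</p><p>A few of the cleansing steps just described, such as trimming stray spaces, blanking known error values, and converting text dates to real dates, can be sketched as simple helper routines. The error markers and date format below are assumptions for illustration.</p>

```python
# Illustrative field-by-field cleansing helpers of the kind described above.
from datetime import datetime

ERROR_VALUES = {"#REF!", "N/A", "NULL"}  # assumed placeholder junk values

def clean_field(value):
    """Trim stray whitespace and blank out known error markers."""
    value = value.strip()
    return "" if value.upper() in ERROR_VALUES else value

def text_to_date(value, fmt="%d/%m/%Y"):
    """Convert a date stored as text to a real date; leave blanks as None."""
    value = clean_field(value)
    if not value:
        return None
    return datetime.strptime(value, fmt).date()

row = ["  Kelly & Yang Inc ", "N/A", "05/04/2019"]
cleaned = [clean_field(row[0]), clean_field(row[1]), text_to_date(row[2])]
print(cleaned)  # ['Kelly & Yang Inc', '', datetime.date(2019, 4, 5)]
```

<p>Keeping helpers like these in a script, rather than editing files by hand, also doubles as the record of cleansing actions that future rework may require.</p><p>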
Any updates to source data made in Power Query will need to be refreshed in the Power Pivot data model as well as in dependent pivot tables. </p><h2>Efficient Queries</h2><p>Business intelligence tools are faster than earlier versions of Excel, but internal auditors still need to be mindful of formula efficiency. If the auditor adds a new calculated field that requires a row-by-row lookup of each element in a two-million-row database, the result could easily be two million × two million = four trillion separate lookups. </p><p>Even on fast hardware, four trillion lookups could take several hours. Auditors can increase query efficiency by indexing, compartmentalizing a large query into efficient calculated fields, and filtering out unwanted columns or transactions that are blank or below a given materiality threshold. </p><h2>Securing Data</h2><p>To avoid internal audit being the source of a leak, or to limit the damage if the unthinkable occurs, auditors should take care with data. Auditors can exclude fields that identify living individuals, home addresses, or bank account numbers from downloads, or replace them with codes such as an employee number instead of a name. They should be cautious when transmitting data, ensuring USB drives are secure and electronic data is not emailed to unintended recipients. Auditors should check recipient email addresses before hitting “send.” Password protection and encryption should be used when practical. As auditors only need to work on copies of data — rather than live data — they usually can destroy their version and wipe USB drives after the audit is completed. </p><h2>Original Insights</h2><p>Business intelligence tools unlock new ways to audit. With only a little new learning, internal auditors can use these tools to explore new pools of financial and operational data that may reveal risk and control insights. 
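</p><p>The substitution of codes for names described under Securing Data can be automated at extract time. The keyed-hash approach below is one possible technique, not a mandated one, and the key and records are illustrative.</p>

```python
# Sketch: replace direct identifiers with stable, non-reversible codes before
# data leaves the source system. Key and records are illustrative only.
import hashlib
import hmac

SECRET_KEY = b"audit-2019"  # illustrative; store outside the extract in practice

def pseudonym(name):
    """Derive a stable code from an identifying field via a keyed hash."""
    return hmac.new(SECRET_KEY, name.encode(), hashlib.sha256).hexdigest()[:10]

records = [("Ana Silva", 1520.00), ("Ana Silva", 310.50), ("Joe Ng", 99.99)]
masked = [(pseudonym(name), amount) for name, amount in records]
# The same person always maps to the same code, so joins and pivot tables
# still work without exposing names in the audit working papers.
```

<p>Because the mapping is consistent, auditors can still aggregate and join by person; only someone holding the key could regenerate the codes.</p><p>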
Moreover, because even the most innocuous transactions leave data trails, imaginative analysis can uncover errors, fraud, and cost savings that transform audit reports into compelling reading for executives and the board.</p><p><img src="/2019/PublishingImages/Kelly-Applying-BI-Using-Benfords-Law_web.jpg" alt="" style="margin:5px;" /><br></p>Christopher Kelly
