Technology

 

 

The Runaway Threat of Identity Fraudhttps://iaonline.theiia.org/2018/Pages/The-Runaway-Threat-of-Identity-Fraud.aspxThe Runaway Threat of Identity Fraud<p>Just a reminder: The European Union's General Data Protection Regulation (GDPR) takes effect on May 25. The new regulation enacts strict rules requiring organizations to protect consumer data, and it applies to any organization worldwide that gathers data on EU consumers. The aim is to protect the privacy of consumers and to combat identity theft and fraud.</p><p>Now here's another reminder: Identity fraud is getting worse. In the U.S., 16.7 million consumers were victims of identity fraud in 2017, up 8 percent from 2016, according to Javelin Strategy & Research's <a href="https://www.javelinstrategy.com/coverage-area/2018-identity-fraud-fraud-enters-new-era-complexity" target="_blank">2018 Identity Fraud Study</a>. That's one out of every 15 U.S. consumers. Javelin surveyed 5,000 U.S. adults for the study.</p><p>What's the bottom line for internal auditors and their organizations? It's time to get serious about protecting consumer data. </p><p>"2017 was a runaway year for fraudsters, and with the amount of valid information they have on consumers, their attacks are just getting more complex," says Al Pascual, senior vice president and research director at San Francisco-based Javelin.</p><p>The Javelin report makes a distinction between identity theft and identity fraud. Identity theft is unauthorized access to personal information, such as through a data breach. Identity fraud happens when that personal information is used for financial gain.</p><h2>A New Target</h2><p>The nature of identity theft and fraud shifted in 2017, the report notes. For the first time, more Social Security numbers were stolen than credit card numbers. Last year's massive Equifax hack was the most glaring example. 
Those Social Security numbers make it easy for criminals to open accounts in a victim's name or to take over their existing accounts. </p><p>Javelin says account takeover was one of two drivers of identity fraud last year, along with existing noncard fraud. Account takeover incidence tripled, and losses reached $5.1 billion, a 120 percent increase over 2016. This type of fraud is particularly costly for consumers, who spend on average $290 and 16 hours to resolve incidents.</p><p>Small wonder then that consumers "shift the perceived responsibility for preventing fraud from themselves to other entities, such as their financial institution or the companies storing their data," as Javelin's press release notes. Respondents rate security breaches at companies as the top identity-related threat, with 63 percent saying they are "very" or "extremely" concerned about such incidents. Nearly two-thirds of victims say breach notifications don't protect them and are just a way for organizations to avoid legal trouble. </p><h2>Going Online</h2><p>Another trend is that identity fraud has moved online in response to the introduction of EMV chip cards in the U.S. Credit and bank cards with these chips make it harder for fraudsters to use stolen cards in person, but the cards still can be used online, where many people shop. Indeed, card-not-present fraud is 81 percent more likely than point-of-sale fraud, Javelin reports.</p><p>These frauds are becoming more sophisticated, too, according to Javelin. For example, fraudsters opened intermediary accounts in the names of 1.5 million victims of existing card frauds. 
Such accounts include email payment services such as PayPal or accounts with online merchants.</p><h2>Protecting Consumers</h2><p>Javelin's recommendations for preventing identity fraud focus more on what consumers can do to protect themselves, including:</p><ul><li>Using two-factor authentication.</li><li>Securing devices.</li><li>Putting a security freeze on credit reports to prevent accounts from being opened.</li><li>Signing up for account alerts.</li><li>Setting controls to prevent unauthorized online transactions.</li></ul><p> <br> </p><p>Such vigilance can help, but consumers expect financial institutions, retailers, and others they do business with to protect their information. Now they have a powerful ally in the GDPR, which puts responsibility squarely on businesses.</p><p>The GDPR requires organizations to provide a reasonable level of protection for personal data and mandates that they notify data protection authorities within 72 hours of becoming aware that consumer records have been breached. Compare that with some recent U.S. breaches in which several weeks passed between when the incident was discovered and the time when the organization disclosed it. </p><p>GDPR regulators can impose harsh penalties on organizations that don't comply. Fines can run up to €20 million ($24.6 million) or 4 percent of an organization's annual worldwide turnover, whichever is greater. If protecting customers' personal data isn't a priority in itself, the potential financial penalties should raise the stakes for organizations.</p><p> <br> </p>Tim McCollum0
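The GDPR's fine ceiling is straightforward arithmetic: for the most serious infringements, the cap is the greater of €20 million or 4 percent of annual worldwide turnover. A minimal sketch, with illustrative turnover figures only:

```python
def gdpr_max_fine(annual_turnover_eur: float) -> float:
    """Ceiling on a GDPR fine for the most serious infringements:
    the greater of EUR 20 million or 4 percent of annual worldwide turnover."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# Illustrative figures: a large firm's ceiling scales with turnover,
# while a small firm still faces the flat EUR 20 million cap.
print(gdpr_max_fine(2_000_000_000))  # 80000000.0
print(gdpr_max_fine(10_000_000))     # 20000000.0
```

Either way, the exposure dwarfs the cost of most data-protection programs, which is the regulation's point.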
The Rising Tide of Cyber Riskshttps://iaonline.theiia.org/2018/Pages/The-Rising-Tide-of-Cyber-Risks.aspxThe Rising Tide of Cyber Risks<p>Large-scale cyberattacks rank third in likelihood among global risks identified by the World Economic Forum's <a href="http://reports.weforum.org/global-risks-2018/" target="_blank">Global Risks Report 2018</a>. Released this month ahead of the forum's annual gathering of world and business leaders in Davos, Switzerland, the survey report predicts a heightened global risk environment, with the tentacles of cyber threats factoring into business and geopolitical risks. Think cyberwarfare and attacks on major companies, banks, and markets.</p><p>"Geopolitical friction is contributing to a surge in the scale and sophistication of cyberattacks," says John Drzik, president of Global Risk and Digital with insurer Marsh, in a press release accompanying the report. That risk continues to grow for businesses, as well, even as they become more aware of cyber threats, Drzik points out. "While cyber risk management is improving, business and government need to invest far more in resilience efforts" to avoid protection gaps. </p><p>Dire warnings about cyber threats are pushing boards to reconsider their business plans. In EY's latest <a href="http://www.ey.com/gl/en/services/advisory/ey-global-information-security-survey-2017-18" target="_blank">Global Information Security Survey</a>, 56 percent of C-suite respondents say the increased impact of cyber threats and vulnerabilities has led their organization to change or plan to change business strategies. Only 4 percent say they have fully considered the security issues arising from their current strategy.</p><p>It's not the large-scale attacks envisioned by the World Economic Forum report that worry the nearly 1,200 respondents to the EY survey. It's the less sophisticated attackers that have targeted their organizations. 
"The most successful recent cyberattacks employed common methods that leveraged known vulnerabilities of organizations," says Paul van Kessel, cybersecurity leader for EY's Global Advisory. </p><p>Couple that with new technologies and increased connectivity, and organizations are facing more vulnerabilities than before, he notes. As they look to transform their businesses, organizations need to assess their digital environment "from every angle to protect their businesses today, tomorrow, and far into the future," he says.</p><h2>A Question of Money</h2><p>Executives clearly see a need for more resources to face cyber threats. Although 59 percent of respondents say their cybersecurity budgets increased in 2017, 87 percent say they need to allocate as much as 50 percent more. Twelve percent expect more than a 25 percent increase this year. </p><p>For many organizations, it might take a major breach for them to make significant cybersecurity investments, respondents report. Three-fourths of respondents say an incident that caused damage would result in a higher cybersecurity outlay. Conversely, nearly two-thirds say a less damaging attack would not lead to an increase.</p><h2>Three Levels of Attack</h2><p>Budgets aside, respondents acknowledge the vulnerabilities and threats are rising. Chief among the vulnerabilities are employees who aren't following good cybersecurity practices. Malware and phishing far outpace other threats. </p><p>In the face of increased threats, resilience may be the best way for organizations to fight back. "To get there, the organization needs to understand the relationship between cyber resilience and the objectives of the business, as well as the nature of the risks it is facing and the status of the current safeguards," the EY report says. 
"It must also assess how much risk it is prepared to take and define an acceptable loss."</p><p>To become more resilient, the EY report notes that organizations need to take steps to address three levels of attack: common, advanced, and emerging.</p><p> <strong>Common.</strong> Although the vast majority of attacks target known weaknesses, three-fourths of respondents say their organization's ability to identify vulnerabilities is immature or moderately mature. Twelve percent lack a breach detection program, and 35 percent say their data protection policies are ad hoc or don't exist. </p><p>To defend against common threats, EY proposes five components:</p><ul><li>Talent-centric, with everyone in the organization responsible for cybersecurity.</li><li>Strategic and innovative, with cybersecurity embedded into decision-making.</li><li>Risk-focused, with "well-governed risk alignment."</li><li>Intelligent and agile, to detect and respond to threats timely.</li><li>Resilient and scalable, to minimize disruptions and grow with the business.</li></ul><p> <strong><br></strong></p><p> <strong>Advanced.</strong> These sophisticated attacks target unknown or complex vulnerabilities, and are carried out by organized crime groups, cyber terrorists, and nation states. To respond to such attacks, the EY report recommends organizations centralize cybersecurity activities within a security operations center (SOC). This center should focus on protecting the organization's most valuable assets, defining normal operating conditions as a basis for identifying unusual activity, gathering threat intelligence, and carrying out "active defense" missions to identify hidden intruders.</p><p> <strong>Emerging.</strong> These unanticipated attacks are made possible by advancing technologies. Responding to them requires agility to imagine the attacks that could be possible and act quickly when they happen, the report notes. 
</p><h2>In Case of Emergency</h2><p>Beyond these measures, the EY report says organizations need a cyber breach response plan that automatically springs into action when an incident occurs. The cybersecurity function plays a part, but the plan also involves business continuity planning, compliance, insurance, legal, and public relations. This is an area where many respondents fall short. Nearly 70 percent have a formal incident response capability, but problems arise when drilling down to specifics. </p><p>Communication is a glaring problem, with 43 percent saying their organization doesn't have a communication strategy to respond to attacks. Just 56 percent say they would notify the media within a month of an incident that compromised data. That could prove costly, with the European Union's General Data Protection Regulation set to take effect in May. Organizations that fail to respond to data breaches in a timely manner could face tangible penalties beyond the damage caused by attacks. </p>Tim McCollum0
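The EY report's advice that a security operations center define normal operating conditions as a basis for identifying unusual activity boils down to baselining. A minimal statistical sketch, using hypothetical failed-login counts (real SOC tooling layers far more context on top of this idea):

```python
from statistics import mean, stdev

def flag_unusual(baseline: list[float], observed: float,
                 threshold: float = 3.0) -> bool:
    """Flag an observation that deviates from the baseline by more than
    `threshold` standard deviations -- 'normal operating conditions'
    expressed in the simplest statistical form."""
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return observed != mu
    return abs(observed - mu) / sigma > threshold

# Hypothetical hourly failed-login counts from a quiet week:
normal_hours = [3, 5, 4, 6, 5, 4, 3, 5, 4, 6]
print(flag_unusual(normal_hours, 5))   # False -- within normal range
print(flag_unusual(normal_hours, 60))  # True  -- worth investigating
```

The value of the approach comes less from the math than from the discipline of deciding, in advance, what "normal" looks like for each critical asset.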
Fundamentals of a Cybersecurity Programhttps://iaonline.theiia.org/2017/Pages/Fundamentals-of-a-Cybersecurity-Program.aspxFundamentals of a Cybersecurity Program<p>​Recent major data breaches at Equifax and Deloitte are reminders of the dangers of failing to practice cybersecurity fundamentals. At Equifax, more than 143 million records were exposed, including names, addresses, Social Security numbers, and credit information. The Deloitte breach compromised hundreds of global clients' information.</p><p>Cybersecurity risk is not just an IT issue — it's a business and audit issue. Collectively, the advice information security and internal audit professionals provide to business leaders has never been more important. To partner in addressing today's cybersecurity challenges, audit and security leaders must start with a little common sense.</p><p>Take, for example, a homeowner. There are valuables in the home, so it's important that only trusted people have a copy of the house key. To be prudent, the homeowner should take an inventory of the items in the ho​me and estimate their value so he or she knows how much needs protecting and ensures items are stored securely. The homeowner also should make sure the smoke detectors are working and set up a security monitoring service with video surveillance so he or she can be alerted and react quickly to a potential fire or break-in. </p><p>Organizations need to exercise the same principles when assessing the digital risk to customer, employee, and other company information. Auditors and security professionals should prioritize three fundamentals to help make an information security program more impactful and effective. </p><h2>1. Improve Visibility</h2><p>How can organizations protect what they can't see? Identifying the valuables, or assets, within an organization is probably the most foundational aspect of a security program, and yet it continues to be a pain point. 
Technical solutions can help, with the right support and funding, but asset management is a process and a discipline, not just a tool. </p><p>Knowing the organization's assets and their value will inform what gets monitored and how. Security monitoring solutions are improving, with richer analytics and machine-learning capabilities as well as more expansive integration. Organizations should monitor their environments around the clock. For small and mid-size organizations that lack in-house resources for such monitoring, partnering with a trusted third party or managed security service provider is an option.</p><p>Another fundamental aspect of improving visibility and monitoring is to proactively look for existing weaknesses or vulnerabilities and patch them. Failure to patch systems with the Apache Struts vulnerability led to the Equifax data breach. The vulnerability allows command injection attacks to occur because of incorrect exception handling. As a result, an unauthorized user can gain privileged user access to a web server and execute remote commands against it. This vulnerability could have been addressed by standardizing and increasing the frequency of scanning and patch cycles.</p><p>Security and audit teams can work together to ensure the right risks are being mitigated and help their business partners think about risk rather than checking off a compliance requirement. They also can partner on implementing a repeatable risk assessment process. This is no longer just a best practice or standard. It is now a matter of compliance with regulations such as the European Union General Data Protection Regulation and the New York State Department of Financial Services' cybersecurity regulation (23 NYCRR 500).</p><h2>2. Improve Resiliency</h2><p>Is the organization prepared to handle the inevitable, and how well can it recover? Improving visibility and being notified of threats and incidents is great, but an inappropriate or untimely response can incur a much greater cost. 
The organization's ability to quickly diagnose, contain, and recover from a potential or actual data breach or privacy incident directly impacts business operations and the cost to the organization. A well-planned and tested incident response plan can reduce the overall impact and cost of the incident. </p><p>Rapid response is a must, with many global and U.S. state data breach notification laws having aggressive notification timelines. One of the ways in which internal audit and information security functions can increase the speed of their investigations and response times is maintaining a good asset-management process. </p><p>Maintaining a state of preparedness is more than having a document or periodically testing the plan. It's about having a good team of people from the right areas of the organization. Security and audit teams can partner to ensure that the incident response plan has all the necessary elements in place and ensure it is being followed. Responding to a crisis requires people to work together in ways they normally do not, which requires building and maintaining good relationships.</p><h2>3. Improve Sensitivity</h2><p>Do the organization's employees and associates understand what is at stake with cybersecurity? Increasing sensitivity to cyber risks needs to be tied to personal relevance, because people respond better when it impacts them directly. </p><p>Recall the homeowner analogy. For some people, it may be easy to get too comfortable within their neighborhood and become desensitized to the potential risks of home theft, to the point of forgetting to lock doors and windows. Or they may become too liberal about who has a copy of their house key and what they do with it. There are lessons here for employees that should prompt their response.</p><p>Social engineering, including phishing simulations and physical security, must be a regular and primary aspect of cyber risk sensitivity training programs. 
Phishing attacks aimed at stealing user login credentials cause most reported data breaches. These types of attacks can be thwarted through a more expansive use of multi-factor authentication, which is a combination of something the person knows, such as a password or PIN, and something the person has, such as a token or smartphone. Technical controls can be effective, but they also must be accompanied by user education. As a training method, phishing simulations confirm what internal auditors and security professionals already know: There is never going to be a 0 percent click rate. However, they provide an opportunity to reiterate training content.</p><h2>Practicing Security Basics</h2><p>Shortly after the 2014 Sony hack, then-President Barack Obama compared cybersecurity to a basketball game, "in the sense that there's no clear line between offense and defense. Things are going back and forth all the time." There is some truth to that. </p><p>In basketball, teams often lose because they overreact to a new play and forget the fundamentals. Coaches usually react by having teams practice basics such as passing, layups, and free throws. Similarly, organizations all have various priorities, and many of them are competing. Sometimes when it appears organizations are getting beaten by cyber risks, they need to revisit the fundamentals such as visibility, resiliency, and sensitivity. Auditors can partner with chief information security officers in this effort to ensure that the program is taking a balanced, risk-based, and business-oriented approach.</p>Jon West1
Innovations on the Horizonhttps://iaonline.theiia.org/2017/Pages/Innovations-on-the-Horizon.aspxInnovations on the Horizon<p>​Organizations are beginning to look at emerging technologies more holistically, with an eye toward coordinating them in pursuit of objectives, according to Deloitte's <a href="https://www2.deloitte.com/insights/us/en/focus/tech-trends.html" target="_blank">Tech Trends 2018​</a> report. These organizations aren't thinking of big data, the cloud, and other disruptive technologies as separate domains. They are looking at how the technologies can complement each other, the report finds.</p><p>They also are pushing responsibility for technology up the corporate ladder, from chief information and technology officers all the way to the CEO and board. "We now see many forward-thinking organizations approach disruptive change more strategically," says Bill Briggs, chief technology officer at Deloitte Consulting LLP. "Increasingly, they are focusing on how multiple disruptive technologies can work together to drive meaningful and measurable impact across the enterprise." Tech Trends 2018 identifies eight trends that may drive organizations over the next two years.</p><h2>Reengineering Technology</h2><p>After many years of using IT to reengineer the organization, IT departments need to reengineer themselves, the report states. Bottom-up change should focus on modernizing the organization's underlying IT infrastructure through automation, and by repaying "technical debt" accrued from software design, physical infrastructure and systems, and maintaining legacy systems. Top-down reengineering should focus on building a new operating model for the IT function that breaks down silos and establishes multi-skill teams aimed at delivering specific outcomes. Rather than seeking funding for specific IT needs, the report recommends IT functions budget in a way that applies resources to support strategic goals. 
</p><h2>No-collar Workforce</h2><p>Forget white collar and blue collar. The workforce of the future will combine people and machines working together, the report predicts. "As automation, cognitive technologies, and artificial intelligence gain traction, companies may need to reinvent worker roles, assigning some to humans, others to machines, and still others to a hybrid model in which technology augments human performance," the report states. The good news is automation probably will not displace most workers. Instead, people and machines each will bring specialized abilities to the equation. Organizations will need to redesign jobs and reimagine how work gets done, the report notes. </p><h2>Enterprise Data Sovereignty</h2><p>Increasingly, organizations want to make information accessible across business units, departments, and locations, the report finds. Within the next two years, many organizations will modernize their data management approach in a way that balances the need for control and accessibility. Setting data "free" will take more "modern approaches to data architecture and data governance" in making decisions about data storage, usage rights, and understanding relationships among data, the report notes. Moreover, organizations will need to address data issues in three areas: management and architecture, global regulatory compliance, and data ownership.</p><h2>The New Core</h2><p>Discussions of disruptive technologies often overlook how technology can "fundamentally change the way work gets done" in an organization's back-office operations and systems, such as finance and the supply chain, the report states. Organizations have much to gain from connecting front-office systems to back-office operations that support pricing, product availability, logistics, and financial information. Over the next two years, the report predicts organizations will build a new core that incorporates automation, analytics, and interconnections with systems and processes. 
Instead of seeking tools to address specific tasks, organizations will look for technologies that can support complex operating networks and new ways of working, the report says.</p><h2>Digital Reality</h2><p>The report notes that organizations implementing technologies such as augmented reality, virtual reality, and immersive technology are starting to move beyond experimentation to focus on building mission-critical applications for the workplace. It suggests three design breakthroughs that may accelerate digital reality: </p><ul><li>Transparent interfaces that allow users to interact with data, software applications, and their surroundings. </li><li>Wearable augmented reality/virtual reality gear that gives people "ubiquitous access" to the internet and organizational networks. </li><li>Contextual filters that enable users to adapt their level of engagement in virtual environments — like a virtual reality mute button.</li></ul><h2>Blockchains to Blockchains</h2><p>Although many organizations are testing the waters, the report urges them to start standardizing on the technology, people, and platforms needed to build blockchain initiatives. The report predicts organizations will go from initial use cases to fully deploying production solutions, with a focus on applications that can be commercialized. It also expects organizations to integrate multiple blockchains within their value chain.</p><h2>API Imperative</h2><p>Application programming interfaces (APIs) traditionally have been an IT concern, but the report notes they are becoming a business matter. While APIs enable systems to interact, many businesses want to use them to make technology assets available for reuse enterprisewide. The ability to build and reuse APIs "is key to achieving business agility, unlocking new value in existing assets, and accelerating the process of delivering new ideas to the market," the report says. 
To do so, organizations need to find ways to make APIs known throughout the organization, and manage and control them. </p><h2>Exponential Technology Watch List</h2><p>The previous trends focus on technologies that are moving into the mainstream, but the report's final trend looks forward to future innovations and their potential impact on organizations. These "exponentials" may emerge at different times, with some coming within the next five years and others likely to take longer to arrive. That doesn't mean organizations should wait to plan for new innovations. Indeed, without the capabilities, processes, and structures needed to innovate, organizations may risk missing out on opportunities that could bring "transformative outcomes," the report concludes.</p><p><br></p>Tim McCollum0
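The report's call to make APIs known throughout the organization, and to manage and control them, is often answered with an internal API catalog. A toy sketch of that idea; the class, field names, and example URL below are hypothetical, not drawn from the Deloitte report:

```python
from dataclasses import dataclass

@dataclass
class ApiEntry:
    name: str
    version: str
    owner: str
    endpoint: str

class ApiCatalog:
    """A toy enterprise API catalog: registering an API makes it
    discoverable, and lookup is how other teams find it for reuse."""
    def __init__(self) -> None:
        self._entries: dict[tuple[str, str], ApiEntry] = {}

    def register(self, entry: ApiEntry) -> None:
        key = (entry.name, entry.version)
        if key in self._entries:
            raise ValueError(f"{entry.name} v{entry.version} already registered")
        self._entries[key] = entry

    def find(self, name: str) -> list[ApiEntry]:
        return [e for (n, _), e in self._entries.items() if n == name]

catalog = ApiCatalog()
catalog.register(ApiEntry("customer-profile", "1.0", "crm-team",
                          "https://api.example.com/v1/customers"))
print([e.version for e in catalog.find("customer-profile")])  # ['1.0']
```

Even a simple registry like this gives the governance hooks the report asks for: an owner for every API, explicit versions, and one place to look before building something that already exists.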
The Robots Are Coming ... for My Familyhttps://iaonline.theiia.org/2017/Pages/The-Robots-Are-Coming-for-My-Family.aspxThe Robots Are Coming ... for My Family<p>​My husband and I had lunch with our 19-year-old college sophomore last weekend. He's majoring in IT. I tried to persuade him to take a look at artificial intelligence (AI) as a career option. After all, it will likely be taking over his family's jobs — and we'll need him to support us. </p><p>You see, his dad is an accountant, one of "The Five Jobs Robots Will Take First," according to <em>AdAge</em> magazine. "Robo-accounting is in its infancy," the article explains, "but it's awesome at dealing with accounts payable and receivable, inventory control, auditing, and several other accounting functions that humans used to be needed to do."</p><p>Another of the top five jobs robots will take according to <em>AdAge</em>? His mother's. Given the fact that, last year, IBM and marketing company The Drum announced that Watson, IBM's AI tool, edited an entire magazine on its own, my days in publishing may, indeed, be numbered. </p><p>And, finally, there's his sister. She plans to follow in the footsteps of a long line of teachers in our family — unfortunately, it may be the end of the line. IBM's Teacher Advisor With Watson "is loaded with the lesson plans and proven strategies [needed] to teach across a variety of elementary grade levels and student abilities," reports 3BL Media. "And because it's cognitive, Teacher Advisor will get smarter — and better — with training and use." </p><p>According to Harnessing Automation for a Future That Works, a McKinsey Global Institute Report, "almost every occupation has partial automation potential." The report estimates that about half of all the activities employees are paid to do in the world's workforce could be automated by adapting current technologies. 
</p><p>The good news, according to McKinsey, is that less than 5 percent of occupations are candidates for <em>full</em> automation. Take internal auditing, for example. In this month's cover story, <a href="/2017/Pages/Audit-in-an-Age-of-Intelligent-Machines.aspx">"Audit in an Age of Intelligent Machines,"</a> David Schubmehl, research director for Cognitive/AI Systems at IDC, says "There's going to be tremendous growth in AI-based auditing, looking at risk and bias, looking at data."</p><p>So maybe there's hope after all. Maybe these technologies will just supplement and enhance our jobs. Maybe they will even make us more productive. Maybe my family and the pugs won't have to move in with my son.</p><p>While I'm still the editor, I'd like to welcome Kayla Flanders, senior audit manager at Pella Corp., who joins us as the new contributing editor of "Governance Perspectives." A big thank you to Mark Brinkley for his years serving in that position. And, finally, we will be saying goodbye to InternalAuditor.org's "Marks on Governance" blog at the end of December. Norman Marks' contributions to the magazine have been invaluable. In addition to his blog, he has served as a contributing editor and written numerous articles throughout the years. Norman also was a member of The IIA's Publications Advisory Committee and continues to serve on the magazine's Editorial Advisory Board. We look forward to continued collaborations.</p>Anne Millage0
Audit in an Age of Intelligent Machineshttps://iaonline.theiia.org/2017/Pages/Audit-in-an-Age-of-Intelligent-Machines.aspxAudit in an Age of Intelligent Machines<p>​While monitoring transactions, an alert bank data analyst noticed unusual payments from a computer manufacturer to a casino. Because casinos are heavily computerized, one would expect the payments to go to the computer company. The analyst alerted an investigative agent, who rapidly scoured websites, proprietary data stores, and dark web sources to find detailed information about the two parties. The data revealed that the computer manufacturer was facing a criminal indictment and a civil law suit. Meanwhile, the casino had lost its gambling license due to money laundering and had set up shop in another country. Further investigation revealed the computer manufacturer was using the casino to launder money before the company’s legal issues drove it out of business.</p><p>The bank’s data analyst was a machine learning algorithm. The investigative agent was an artificial intelligence (AI) agent.</p><table class="ms-rteTable-default" width="100%" cellspacing="0"><tbody><tr><td class="ms-rteTable-default" style="width:100%;">​To learn more about internal audit's role in AI, download The IIA's <a href="http://bit.ly/2ned9uk" target="_blank">Artificial Intelligence: Considerations for the Profession of Internal Auditing.</a><br></td></tr></tbody></table><p>AI is all around. It’s monitoring financial transactions. It’s diagnosing illnesses, often more accurately than doctors. It’s carrying out stock trades, screening job applicants, recommending products and services, and telling people what to watch on TV. It’s in their phones and soon it will be driving their cars. </p><p>And it’s coming to organizations, maybe sooner than people realize. Research firm International Data Corp. says worldwide spending on cognitive and AI systems will be $12 billion this year. 
It predicts spending will top $57 billion by 2021.</p><p>“If you think AI is not coming your way, it’s probably coming sooner than you think it is,” says Yulia Gurman, director of internal audit and corporate security for the Packaging Corporation of America in Lake Forest, Ill. Fresh off of attending a chief audit executive roundtable about AI, Gurman says AI wouldn’t have been on the agenda a year ago. Like most of her peers present, she hasn’t had to address AI within her organization yet. Now it’s on her risk assessment radar. “Internal auditors should be alerting the board about what’s coming their way,” she says.</p><h2>The Learning Algorithm</h2><p>Intelligent technology has already found a place on everyday devices. That personal assistant on the kitchen counter or on the phone is an AI. Alexa, Cortana, and Siri can find all sorts of information for people, and they can talk to other machines such as alarm systems, climate control, and cleaning robots.</p><p>Yet, most people don’t realize they are interacting with AI. Nearly two-thirds of respondents to a recent survey by software company Pegasystems say they have not or aren’t sure they have interacted with AI. But questions about the technologies they use — such as personal assistants, email spam filters, predictive search terms, recommended news on Facebook, and online shopping recommendations — reveal that 84 percent are interacting with AI, according to the What Consumers Really Think About AI report. </p><p>What makes AI possible is today’s massive availability of data and computing power, as well as significant advances in the quality of the machine learning algorithms that make AI applications possible, says Pedro Domingos, a professor of computer science at the University of Washington in Seattle and author of The Master Algorithm. When AI researchers like Domingos talk about the technology, they often are referring to machine learning. 
Unlike other computer applications that must be written step-by-step by people, machine learning algorithms are designed to program themselves. The algorithm does this by analyzing huge amounts of data, learning about that data, and building a predictive model based on what it’s learned. For example, the algorithm can build a model to predict the risk that a person will default on his or her credit card based on various factors about the individual, as well as historical factors that lead to default. </p><table class="ms-rteTable-4" width="100%" cellspacing="0"><tbody><tr class="ms-rteTableEvenRow-4"><td class="ms-rteTableEvenCol-4" style="width:100%;">​ <style> div.WordSection1 { } </style> <p> <strong>Alexa, Are You Monitoring Me?</strong><br></p><p>Between them, the world’s e-commerce, social media, and technology companies are getting to know people very well. Amazon knows their shopping habits, Apple and Google know what they search for and what questions they ask, Facebook knows what engages them online, and Netflix knows what they watch on TV.</p><p>Artificial intelligence researcher Pedro Domingos says the companies using personalization algorithms are getting to the point where they could build a good model of each of their customers. But if the data they had on those people were consolidated in one place, it would enable an algorithm to build a comprehensive model of each person. Domingos calls this the personal data model.</p><p>Imagine an AI algorithm that worked on your behalf, he says — search for your next car, apply for jobs, and even find you a date. “The big technology companies are in a competition to see who can do this better,” he says. “This is something that we’re going to see pick up steam in the next several years.”</p><p>Whether that is a good thing or a bad thing may depend on who controls that data. That’s something that worries John C. Havens, executive director of the IEEE Global AI Ethics Initiative. 
He says the misunderstanding and misuse of personal data is AI’s biggest risk. Despite the benefits of personalization, “most people don’t understand the depth of how their data is used by second and third parties,” he notes. </p><p>Havens says there’s a need to reorient that approach now to put people at the center of their data. Such an approach would allow people to gather copies of their data in a personal data cloud tied to an identity source, and set terms and conditions for how their data can be used. “People can still be tracked and get all the benefits,” Havens explains. “But then they also get to say, ‘These are my values and ethics, and this is how I’m willing to share my data.’ It doesn’t mean the seller will always agree, but it puts the symmetry back into the relationship.”</p><p>Similarly, Domingos sees an opportunity for a new kind of business that could safeguard a personal data model in the same way that a bank protects someone’s money and uses it on the person’s behalf. “It would need to have an actual commitment to your privacy and to always work in your best interest,” he says. “And it has to have an absolute commitment to ensuring it is secure.”</p></td></tr></tbody></table><h2>Driven by Data</h2><p>Using AI to make predictions takes huge amounts of data. But data isn’t just the fuel for AI, it’s also the killer application. In recent years, organizations have been trying to harness the power of big data. The problem is there’s too much data for people and existing data mining tools to analyze quickly. </p><p>That is among the reasons why data-driven businesses are turning to AI. Five industries — banking, retail, discrete manufacturing, health care, and process manufacturing — will each spend more than $1 billion on AI this year and are forecast to account for nearly 55 percent of worldwide AI spending by 2021, according to IDC’s latest Worldwide Semiannual Cognitive Artificial Intelligence Systems Spending Guide. 
What these industries have in common is lots of good data, says David Schubmehl, research director, Cognitive/AI Systems, at IDC. “If you don’t have the data, you can’t build an AI application,” he explains. “Owning the right kind of data is what makes these uses possible.”</p><p>Retail and financial services are leading the way with AI. In retail, Amazon’s AI-based product recommendation solutions have pushed other traditional and online retailers like Macy’s and Wal-Mart Stores Inc. to follow suit. But it’s not just the retailers themselves that are driving product recommendations, Schubmehl says. Image recognition AI apps can enable people to take a picture of a product they saw on Facebook or Pinterest and search for that product — or something similar and less expensive. “It’s a huge opportunity in the marketplace,” he says.</p><p>Meanwhile, banks and financial service firms are using AI for customer care and recommendation systems for financial advice and products. Fraud investigation is a big focus. “The idea of using machine learning and deep learning to connect the dots is something that is very helpful to organizations that have traditionally relied on experienced investigators to have that ‘aha moment,’” Schubmehl says.</p><p>That’s what happened with the casino and the computer manufacturer. “The way AI works in that scenario is to say, ‘Something is different. Let’s bring it back to the central brain and analyze whether this is risky or not risky,’” says David McLaughlin, CEO and founder of AI software company QuantaVerse, based in Wayne, Pa. “The technology is never going to accuse somebody of a crime or a regulatory violation. What it’s going to do is allow the people who need to make that determination focus in the right areas.”</p><p>Currently, IDC says automated customer service agents and health-care diagnostic and treatment systems are the applications where organizations are investing the most. 
Some of the AI uses expected to rise the most over the next few years are intelligent processing automation, expert shopping advisors, and public safety and emergency response. </p><p>Regardless of the use, Schubmehl says it’s the business units that are pushing organizations to adopt AI to advance their business and deal with potential disrupters. Because of the computing power needed, most industries are turning to cloud vendors, some of whom may also be able to help build machine learning algorithms.</p><h2>Is AI Something to Fear?</h2><p>Despite its potential, there is much fear about the risks that AI poses to both businesses and society at large. Some worry that machines will become too smart or get out of control.</p><p>There have been some well-publicized problems. Microsoft developed an AI chatbot, Tay, that, after interacting with people, started using insulting and racist language and had to be shut down. More recently, Facebook shut down an experimental AI system after its chatbots started communicating with each other in their own language, in violation of their programming. In the financial sector, two recent stock market “flash crashes” were attributed to AI applications with unintended consequences.</p><p>Respondents to the World Economic Forum’s (WEF’s) 2017 Global Risks Perception Survey rated AI highest in potential negative consequences among 12 emerging technologies. Specifically, AI ranked highest among technologies in economic, geopolitical, and technological risk, and ranked third in societal risk, according to the WEF’s Global Risks Report 2017. </p><p> <strong>Employment</strong> One of the biggest concerns is whether AI might eliminate many jobs and what that might mean to people both economically and personally. Take truck driving, one of the world’s most common professions. More than 3 million people in the U.S. earn their living driving trucks and vans. 
Consulting firm McKinsey predicts that one-third of commercial trucks will be replaced by self-driving vehicles by 2025.<br></p><table class="ms-rteTable-default" width="100%" cellspacing="0"><tbody><tr><td class="ms-rteTable-default" style="width:100%;">​<strong>The Jobs Question</strong><br>By now, internal auditors may be asking themselves, “Is AI going to take my job?” After all, an Oxford University study rated accountants and auditors among the professionals most vulnerable to automation. Of course, internal auditors aren’t accountants. But are their jobs safe?<br><br>Actually, AI may be an opportunity, says IDC’s David Schubmehl. He says many of the manual processes internal auditors review are going to be automated. Auditors will need to check how machine learning algorithms are derived and validate the data on which they are based. And, they’ll need to help senior executives understand AI-related risks. “There’s going to be tremendous growth in AI-based auditing, looking at risk and bias, looking at data,” Schubmehl explains. “Auditors will help identify and certify that machine learning and AI applications are being fair.”<br><br>Using AI to automate business processes will create new risks for auditors to address, says Deloitte & Touche LLP’s Will Bible. He likens it to when organizations began to deploy enterprise resource planning systems, which shifted some auditors’ focus from reviewing documents to auditing system controls. “I don’t foresee an end to the audit profession because of AI,” he says. “But as digital transformation occurs, I see the audit profession re-evaluating the risks that are relevant to the audit.”</td></tr></tbody></table><p>According to the Pew Research Center’s recent U.S.-based Automation in Everyday Life survey, 72 percent of respondents are worried about robots doing human jobs. But only 30 percent think their own job could be replaced (see “The Jobs Question” at right). That may be wishful thinking. 
“However long it takes, there’s not going to be any vertical industry where there’s not the opportunity to automate humans out of a job,” says John C. Havens, executive director of the IEEE Global AI Ethics Initiative. He says that will be the case as long as businesses are measured primarily by their ability to meet financial targets. “The bigger question is not AI. It’s economics.” </p><p> <strong>Ethics</strong> With organizations racing to develop AI, there is concern that human values will be lost along the way. Havens and the IEEE AI Ethics Initiative are advocating for putting applied ethics at the front end of AI development work. Consider the emotional factors of children or elderly persons who come to think of a companion robot in the same way they would a person or animal. And who would be accountable in an accident involving a self-driving car — the vehicle or the person riding in it?<br></p><p>“The phrase we use is ‘ethics is the new green,’” Havens explains, likening AI ethics to the corporate responsibility world. “When you address these very human aspects of emotion and agency early on — much earlier than they are addressed now — then you build systems that are more aligned to people’s values. You avoid negative unintended consequences and you identify more positive opportunities for innovation.”</p><p> <strong>Privacy and Security</strong> Using AI to gather data poses privacy risks for both individuals and businesses. All those personal assistant requests, product recommendations, and customer service interactions are gathering data on people — data that organizations eventually could use to build a comprehensive model about their customers. Organizations using personalization agents must walk a fine line. 
“You want to personalize something to the point where you can get the purchase offer,” Schubmehl says, “but you don’t want to personalize it so much that they say, ‘This is really creepy and knows stuff about me that I don’t want it to know.’”<br></p><p>All that data creates a compliance obligation for organizations, as well. And it is also valuable to cyber attackers.</p><p> <strong>Output</strong> Although AI has potential to help organizations make decisions more quickly, organizations need to determine whether they can trust the AI model’s recommendations and predictions. That all depends on the reliability of the data, Domingos says. If the data isn’t reliable or it’s biased, then the model won’t be reliable either. Moreover, machine learning algorithms can overinterpret data or interpret it incorrectly. “They can show patterns,” he points out. “But there are other patterns that would do equally well at explaining what you are seeing.”<br></p><p> <strong>Control</strong> If machine learning algorithms become too smart, can they be controlled? Domingos says there are ways to control machine learning algorithms, most notably by raising or lowering their ability to fit the data such as through limiting the amount of computation, using statistical significance tests, and penalizing the complexity of the model. <br></p><p>He says one big misconception about AI is that algorithms are smarter than they actually are. “Machine learning systems are not very smart when they are making important decisions,” he says. Because they lack common sense, they can make mistakes that people can’t make. And it’s difficult to know from looking at the model where the potential for error is. His solution is making algorithms more transparent and making them smarter. “The risk is not from malevolence. It’s from incompetence,” he says. “To reduce the risk from AI, what we need to do is make the computer smarter. 
The big risk is dumb computers doing dumb things.”</p><p> <strong>Knowledge</strong> Domingos says concerns about AI’s competence apply as well to the people who are charged with putting it to use in businesses. He sees a large knowledge gap between academic researchers working on developing AI and the business employees building machine learning algorithms, who may not understand what it is they are doing. And he says, “Part of the problem is their bosses don’t understand it either.”<br></p><p> <strong>Governance</strong> That concern for governance is one area the WEF’s Global Risk Report questions — specifically, whether AI can be governed or regulated. Components of AI fall under various standards bodies: industrial robots by ISO standards, domestic robotics by product certification regulations, and in some cases the data used for machine learning by data governance and privacy regulations. On their own, those pieces may not be a big risk, but collectively they could be a problem. “It would be difficult to regulate such things before they happen,” the report notes, “and any unforeseeable consequences or control issues may be beyond governance once they occur.”<br></p><h2>AI in IA </h2><p>Questions of risk, governance, and control are where internal auditors come into the picture. There are similarities between deploying AI and implementing other software and technology, with similar risks, notes Will Bible, audit and assurance partner with Deloitte & Touche LLP in Parsippany, N.J. “The important thing to remember is that AI is still computer software, no matter what we call it,” he says. One area where internal auditors could be useful, Bible says, is assessing controls around the AI algorithms — specifically whether people are making sure the machine is operating correctly.</p><p>If internal auditors are just getting started with AI, their external audit peers at the Big 4 firms are already putting it to work as an audit tool. 
Bible and his Deloitte colleagues are using optical character recognition technology called Argus to digitize documents and convert them to a readable form for analysis. This enables auditors to use data extraction routines to locate data from a large population of documents that is relevant to the audit. </p><p>For auditors, AI speeds the process of getting to a decision point and improves the quality of the work because it makes fewer mistakes in data extraction. “You can imagine a day when you push a button and you’re given the things you need to follow up on,” Bible says. “There’s still that interrogation and investigation, but you get to that faster, which makes it a better experience for audit clients.”</p><p>QuantaVerse’s McLaughlin says internal auditors could take AI even farther by applying it to areas such as fraud investigation and compliance work. For example, rather than relying on auditors or compliance personnel to catch potential anti-bribery violations, internal audit could use AI to analyze an entire data set of expense reports to identify cases of anomalous behavior that require the most scrutiny. “Now internal audit has the five cases that really need a human to understand and investigate,” McLaughlin says. “That dramatically changes the effectiveness of an internal audit department to protect the organization.” </p><p>The key there is making sure a person is still in the loop, Bible says. “The nature of AI systems is you are throwing them into situations they probably have not seen yet,” he notes. A person involved in the process can evaluate the output and correct the machine when it is wrong. </p><h2>Building Intelligence</h2><p>Bible and McLaughlin both advise internal audit departments to start with a small project, before expanding their use of AI tools. That goes for the organization, as well. Organizations first will need to take stock of their data assets and get them organized, a task where internal auditors can provide assistance. 
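As one way to start small, McLaughlin's expense-report scenario can be reduced to a minimal statistical screen in Python. The data and the two-standard-deviation threshold below are hypothetical illustrations, not the QuantaVerse approach; the point is simply that an automated pass over the full data set surfaces the few cases that need a human investigator.

```python
from statistics import mean, stdev

# Hypothetical monthly expense-report totals by employee (in dollars).
expenses = {
    "E01": 1200, "E02": 1350, "E03": 1100, "E04": 1280,
    "E05": 9800,  # an outlier worth a human look
    "E06": 1250, "E07": 1190, "E08": 1310,
}

mu = mean(expenses.values())
sigma = stdev(expenses.values())

# Flag reports more than two standard deviations above the norm,
# leaving only the anomalous cases for a person to investigate.
flagged = [emp for emp, amt in expenses.items() if (amt - mu) / sigma > 2]
print(flagged)
```

A real AI system would model far richer behavior than a single z-score, but the workflow is the same: the machine narrows thousands of records to the handful that need human judgment.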
</p><p>For audit executives such as Gurman, the objective is to get up to speed as fast as possible on AI and all its related risks, so they can educate the audit committee and the board. “There is a lot of unknown,” she concedes. “What risks are we bringing into the organization by being more efficient and using robots instead of human beings? Use of new technologies brings new risks.” ​​</p><table class="ms-rteTable-4" width="100%" cellspacing="0"><tbody><tr class="ms-rteTableEvenRow-4"><td class="ms-rteTableEvenCol-4" style="width:100%;"> <br> <p> <strong>AI in the Real World </strong></p><p>Still think artificial intelligence is science fiction? Here are some examples of how companies are putting it to use.</p><ul><li> <strong>Agriculture</strong>. Produce grower NatureSweet uses AI to examine data that it can apply to better control pests and diseases that affect crop production. The company estimates AI could help it increase greenhouse output by 20 percent annually, CNN reports. Meanwhile, equipment maker John Deere recently spent $305 million to purchase robotics firm Blue River Technology, whose AI-based equipment scans fields, assesses crops, and sprays weeds only where they are present. And Coca-Cola uses AI algorithms to predict weather patterns and other conditions that might impact crop yields for its orange juice products.</li><li> <strong>Aviation.</strong> GE Digital uses AI to cull through data collected from sensors to assess the safety and life expectancy of jet engines, including their likelihood for failure. The company estimates that a single flight can generate as much data as a full day of Twitter posts.</li><li> <strong>Finance.</strong> Machine learning enables lawyers and loan officers at JPMorgan to identify patterns and relationships in commercial loan agreements, a task that once required 360,000 man hours, Bloomberg reports. 
Bank of America uses natural language technology to extract information from voice calls that might reveal things like sales practice or regulatory issues. On the stock market, an estimated 60 percent of Wall Street trades are executed by AI, according to Christopher Steiner's book <em>Automate This</em>. </li><li> <strong>Marketing.</strong> Kraft used an AI algorithm to analyze customer preference data that helped it make changes to its cream cheese brand.</li><li> <strong>Retail.</strong> Fashion retailer Burberry applies AI-based image recognition technology to determine whether products in photographs are genuine, spotting counterfeits with 98 percent accuracy, according to a <em>Forbes Online</em> report.<br><br></li></ul></td></tr></tbody></table><p></p>Tim McCollum1
The IT Governance Gaphttps://iaonline.theiia.org/2017/Pages/The-IT-Governance-Gap.aspxThe IT Governance Gap<p>​Executives see benefits from IT governance, but many aren't doing enough about it, according to ISACA's <a href="http://www.isaca.org/tech-governance-impact" target="_blank">Better Tech Governance Is Better for Business</a> report. Ninety-two percent of executives surveyed report IT governance has led to better outcomes, and 89 percent say it makes the business more agile. Yet, 69 percent say organizations need a stronger alignment of IT and business goals. ISACA surveyed 732 members, in 87 countries, who are board members, senior executives, managers, and professionals.</p><p>"The boardroom must become hyper-vigilant in ensuring a tight linkage between business goals and IT goals, fully leveraging business technology to improve business outcomes while diligently safeguarding the organization's digital assets," ISACA CEO Matt Loeb says.</p><p>The top IT governance challenges respondents foresee in the next 12 months are cybersecurity policies and defenses (44 percent), risk management priorities (36 percent), and alignment between IT objectives and overall business objectives (35 percent). It's not surprising that cybersecurity ranked so highly, the report notes. "Boardroom worries over increased internal and external threats are so great (61 percent) that almost half (48 percent) of leadership teams have prioritized investments in cyber defense improvements over other programs, including digital transformation and cloud," it states.</p><p>Despite their concern, only 55 percent of respondents say the board and senior executives are doing all they can to protect digital assets and data records. 
Just 29 percent report their organization continuously assesses IT risk.</p><p>The good news is leadership teams are increasing spending on cybersecurity through security consultants (27 percent), network perimeter defense upgrades (25 percent), and cyber insurance (17 percent). However, most organizations aren't planning to spend more on cybersecurity and privacy-related training for employees and board members in the next 12 months.</p><p>Although respondents say boards and executives are taking greater interest in IT governance, the content of their meetings doesn't reflect that interest. Twenty-one percent of respondents say their board and senior management discuss IT risk topics such as cybersecurity and disaster recovery at every meeting, while 39 percent say they discuss them at some meetings. About one-third only discuss IT risk topics as needed.</p><p>Going forward, respondents say senior leaders must demonstrate that their organization has effective IT governance by:</p><ul><li>Ensuring alignment between IT and stakeholder needs (58 percent).</li><li>Monitoring and measuring results toward goals (39 percent).</li><li>Providing strong chairman, CEO, or executive guidance (33 percent).</li><li>Having strong engagement by business units and employees (30 percent).</li></ul><p><br></p><p>"There is much work to do in information and technology governance," Loeb acknowledges. "Committing to a boardroom with technology savvy and experience strongly represented provides the needed foundation for organizations to effectively and securely innovate through technology." </p><p><br></p>Tim McCollum0
Tech Vs. Fraudhttps://iaonline.theiia.org/2017/Pages/Tech-Vs.-Fraud.aspxTech Vs. Fraud<p>​Organizations are adding more technology capabilities to their fraud investigation teams, according to the <a href="http://www.acfe.com/press-release.aspx?id=4294999251" target="_blank">Association of Certified Fraud Examiners'</a> (ACFE's) In-house Fraud Investigation Teams: 2017 Benchmarking Report. Building forensics and cybersecurity expertise is a big focus among the nearly 1,500 anti-fraud professionals who responded to the global survey. Forty-three percent say their organization is seeking or expects to add expertise in digital forensics to its fraud investigation team. By comparison, 39 percent say their team currently has such skills.</p><p>Additionally, 36 percent of respondents say their fraud team has cybersecurity skills. A similar percentage (37 percent) is looking to add those skills. Most fraud investigation teams aren't investigating cyber fraud and hacking, possibly reflecting a lack of expertise. Only 16 percent investigate such incidents frequently, while 27 percent investigate them occasionally. </p><p>Frauds that teams investigate frequently are:</p><ul><li>Employee embezzlement (40 percent).</li><li>Frauds committed by customers (40 percent).</li><li>Frauds committed by vendors or contractors (32 percent).</li><li>Human resources issues (30 percent).</li></ul><p> <br> </p><p>Some fraud investigators have a lot on their plates, the survey notes. Although most (51 percent) work on fewer than five cases at a time, 30 percent work on 10 or more cases concurrently. </p><p>And fraud investigations are only part of their jobs. The average team spends 56 percent of its time on investigations. 
The rest of the time, investigators are working in areas such as internal audit, compliance, and information security.</p><p>The high case load and demands on investigators' time would seem to call for some automated assistance, but most respondents say their organization isn't using such technologies. Just 38 percent use case management software. Forty-seven percent use data analytics software, including spreadsheet software, in their work. That's not because they don't know how to use it — most respondents (62 percent) say they have analytics and data mining skills on their team.</p><p>And what about the results of these investigations? Most respondents say their team substantiates the majority of cases, with 34 percent reporting that they substantiate more than three-fourths of alleged frauds. Forty-six percent of respondents say most fraud investigations result in disciplinary action, but just 17 percent say their team refers most investigations for prosecution. Indeed, 70 percent of respondents report that they refer one-fourth of cases or fewer for prosecution.</p><p>Organizations are even less likely to recover fraud losses, the report finds. Just 24 percent say they recover more than half of fraud losses, while 59 percent recover one-fourth or fewer. </p><p> <br> </p>Tim McCollum0
Stop Clicking, Start Codinghttps://iaonline.theiia.org/2017/Pages/Stop-Clicking,-Start-Coding.aspxStop Clicking, Start Coding<p>​As data grows in volume and complexity, the effective use of it is critical for making better, faster, and more informed decisions. Organizations increasingly are seeking internal auditors who can analyze data and generate insights that bring new value to the business. <br></p><p>While internal auditors typically perform data analysis using specialized audit software packages or a general spreadsheet application, there is a growing need for auditors to develop technical skills beyond those tools. For example, Fortune 500 firms such as Google and Verizon have made proficiency in structured query language (SQL) part of their job requirements for hiring internal auditors. <br></p><p>SQL is a special-purpose programming language designed for managing data held in database management systems that support widely used enterprise resource planning systems. Designing SQL procedures for transforming data into useful information requires a good understanding of data structure and the logic of how a system works. Such understanding is particularly important for internal auditors when they work with large volumes of data in today’s complex business environment. From the learning perspective, logical thinking and reasoning inherent in the SQL coding process helps internal auditors develop the critical thinking and problem-solving skills desired by the profession. <br></p><p>Moreover, SQL-based analysis has gained increasing importance with the advent of big data. SQL tools enable fast access to relational databases that store vast amounts of data, offer flexibility in developing ad hoc queries on an as-needed basis, and can be tailored to the specific needs of auditing. 
Furthermore, because SQL is an international standard, internal auditors are not constrained to using a specific software tool.<br></p><h2>Asking Questions of Data</h2><p>Internal auditors can write and refine SQL queries in a relational database to arrive at incrementally better solutions until the desired outcome is achieved. Consider the example of an Employees table that contains data such as employee ID, first name, last name, birth date, and hire date. Auditors can ask many interesting questions about this data, such as whether the company has complied with all employment regulations. In the context of The Committee of Sponsoring Organizations of the Treadway Commission’s <em>Enterprise Risk Management–Integrated Framework</em>, this inquiry addresses the company’s conformance with its compliance objectives.<br>To check compliance with child labor laws, internal auditors can query the data to determine whether any employees were underage at the time of their hiring. For example, the minimum age for employment in the U.S. is 14, and there are specific requirements for the age group between 14 and 18. Auditors can begin answering this question using this code:<br><br><em>SELECT EmployeeID, FirstName, LastName, </em><br><em>(HireDate-BirthDate)/365</em><br><em>FROM Employees;</em><br><br>The SELECT statement in the code retrieves all of the values in the EmployeeID, FirstName, and LastName columns, and calculates the age of the employee at the time of hiring as the difference between the HireDate and BirthDate divided by 365 days. The FROM clause specifies the tables from which the data are selected. <br>The query returns a total of 11 employees. Of these employees, the results identify four questionable employees: two are under 18 and the other two have no age information. At first glance, the design of the query seems to answer the question, but this solution only works well for small organizations. Imagine a large company that has thousands of employees. 
In such a situation, auditors would have to sift through a long list of employees to identify those with age problems. An additional issue is that the system-generated title of the column for the age data, “Expr1003,” is not descriptive, and the data itself has 10 decimal places. To address these drawbacks, internal auditors can improve the SQL statement: <br><br><em>SELECT</em><br><em>EmployeeID, FirstName, LastName,</em><br><em>ROUND((HireDate-BirthDate)/365, 1) </em><br><em>AS AgeAtHire</em><br><em>FROM Employees</em><br><em>WHERE (HireDate-BirthDate)/365 < 18;</em><br><br>This revision aims to filter out unnecessary data and improve the readability of the report. Adding the WHERE clause restricts the result to employees under age 18. The ROUND function rounds the age number off to one decimal place. The heading of the column containing the age data is also renamed to AgeAtHire. The query result now contains only two suspicious employees who were under 18 at the time of their hiring. <br>However, there is something missing from the report. The first query uncovered two additional suspicious employees without any age information. Further examination of the Employees table reveals that birth and hiring dates are not available for these two employees. While only a conjecture, these two individuals may have been “ghost employees” as the result of payroll fraud. Internal auditors should include these two suspicious employees in the report, as well.<br>To find this information, internal auditors can amend the SQL query:<br><br><em>SELECT</em><br><em>EmployeeID, FirstName, LastName,</em><br><em>ROUND((HireDate-BirthDate)/365, 1) </em><br><em>AS AgeAtHire</em><br><em>FROM Employees</em><br><em>WHERE (HireDate-BirthDate)/365 < 18</em><br><em>OR (HireDate-BirthDate) IS NULL;</em><br><br>In this solution, auditors add another condition “(HireDate-BirthDate) IS NULL” in the WHERE clause with the OR operator. 
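Readers can run this final query themselves with a few lines of Python and SQLite. The Employees rows below are made up for illustration, and the article's Access-style date arithmetic (HireDate-BirthDate) is expressed in SQLite with the julianday() function; NULL dates propagate through the arithmetic, so the IS NULL condition catches the employees with missing information:

```python
import sqlite3

# Hypothetical Employees rows; column names follow the article's example.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE Employees (
    EmployeeID INTEGER, FirstName TEXT, LastName TEXT,
    BirthDate TEXT, HireDate TEXT)""")
conn.executemany("INSERT INTO Employees VALUES (?, ?, ?, ?, ?)", [
    (1, "Ana", "Lee", "1980-03-01", "2005-06-15"),   # adult at hire
    (2, "Ben", "Park", "2001-01-10", "2016-07-01"),  # under 18 at hire
    (3, "Cara", "Diaz", None, None),                 # missing dates
    (4, "Dan", "Wu", "1990-05-20", "2015-02-02")])   # adult at hire

# The amended query: underage hires OR rows with no age information.
rows = conn.execute("""
    SELECT EmployeeID, FirstName, LastName,
           ROUND((julianday(HireDate) - julianday(BirthDate)) / 365, 1)
               AS AgeAtHire
    FROM Employees
    WHERE (julianday(HireDate) - julianday(BirthDate)) / 365 < 18
       OR (julianday(HireDate) - julianday(BirthDate)) IS NULL""").fetchall()
print(rows)  # flags the underage hire and the employee with no dates
```
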
The OR operator performs a logical comparison and specifies that an employee should be included in the report if either of the two conditions is met: age at the time of hiring is less than 18, or age data for this employee is NULL (i.e., left blank). Now the report shows all four suspicious employees. <br></p><p>This is not the end of the data analysis, however. Based on this result, internal auditors would need to investigate further to determine why the age information is missing for two employees and how the two underage employees were hired in the first place. <br></p><h2>Powerful Analytical Tools</h2><p>The underage employee example demonstrates how SQL can be a useful database tool for solving audit-related problems. However, it has only scratched the surface of the capabilities of SQL-based data analysis. Indeed, SQL and other audit software can form a powerful set of analytical tools for internal auditors, particularly in the context of ever-growing volumes of data available for business use. <br></p>Ken Guo
Building a Data Analytics Program<p>In today’s data-hungry world, an analytics-capable audit function is a necessity. However, relatively few audit teams have developed sophisticated analytics capabilities and an embedded, integrated approach to analytics. So how can internal audit functions initiate and advance their analytics capabilities? Internal audit functions that have successfully implemented sustainable analytics activities have not only been able to clearly visualize and articulate the value analytics can deliver to their functions and the broader business, but also have started to realize that value in enhanced efficiency, effectiveness, and risk awareness. <br></p><p>Along the way, many functions have experienced missteps and setbacks. The lessons they have learned should benefit internal audit departments embarking on their own analytics journeys or those attempting to overcome false starts of the past. Some of these hard-earned insights are what one might expect. Difficult access to enterprise data stores marks a widespread pitfall, as does insufficient planning. Other data analytics lessons will surprise the uninitiated. Investing in robust technical skills training and analytics tools implementation often can be a distraction from getting an analytics program off the ground. By knowing what to avoid, internal audit departments can keep a data analytics program on track to reach its full potential.<br></p><h2>Tools for Success </h2><p>When internal audit leaders commit to introducing or furthering a data analytics program, there are six strategies that can positively impact these initiatives.<br></p><h2>1. Create awareness rather than a silo </h2><p>Internal audit leaders should resist the inclination to start by creating a data analytics silo within the larger function. 
While dedicated analytics functions are present within many internal audit functions with advanced analytics capabilities, this structure should be treated as a long-term goal or possible target state rather than an immediate to-do item when getting started.<br></p><p>While it is necessary to have the appropriate technical competence within the team, creating a silo structure from the start can reduce focus on a more important driver of success: data and analytics awareness. This mindset helps internal auditors understand how data is created, processed, and consumed as it flows throughout the organization, the key systems where it resides, and the key business processes and decisions that it supports. This understanding represents a business-centric view of analytics as opposed to a technology-only view, a critical distinction in developing the right kind of thinking among the internal audit team.<br></p><p>When an internal audit function decides to reassign a technical resource as the team’s analytics champion, problems often ensue. Creating this type of structure too soon can cause the rest of internal audit, as well as the business, to view audit analytics as a purely technical exercise as opposed to an integrated component of internal audit’s culture, strategy, and activities. Insights from analytics are the result of the intersection between business awareness and the application of analytics tools and methodologies. These are two sides of the same coin and both must be present for success.<br></p><p>Internal audit leaders also should reflect on how they source their analytics talent. While there is no one way to do this, leaders should recognize that hiring analytics professionals or repurposing technical resources can pose risks to the development of an analytics mindset throughout the entire internal audit team. 
It takes time to understand business processes and what valuable information can be gleaned from the systems and data that underpin them. Building a more pervasive analytics mindset across the internal audit department is critical. The most effective audit analytics programs operate in a tightly coordinated — if not seamless — manner with all other parts of the audit team. All members of the team think about the data that exists in the environment, its business relevance, and the stories it can tell. The analytics teams then layer in their view and capabilities.<br></p><p>Dedicated analytics functions and externally hired analytics experts are common hallmarks of top-performing analytics capabilities; however, neither of these elements should be used in place of the initial establishment of the right analytics mindset throughout the internal audit function.<br></p><h2>2. Understand the data before investing in a tool </h2><p>One of the most common start-up lessons involves resisting the desire to acquire the latest and greatest analytical tool. Given the impressive power, look, and feel of analytics tools, it’s difficult not to be sold on a new piece of software with the promise that, within hours, internal audit will be generating a flurry of queries and new intelligence insights.<br></p><p>Rather than a first step, however, implementing an analytics tool should be a more deliberate step in the rollout of an analytics program. A rush to start using these tools, without establishing a plan and set of initial, high-value use cases, often leads to results that lack business impact, which can be detrimental for a start-up analytics activity.<br></p><p>Before using a tool, internal auditors should carefully evaluate a high-value area to target, understand the data source, validate it, and identify how the results will be evaluated and shared. 
When it comes to analytics tools, it is helpful to adhere to the 80/20 rule: 80 percent of the analytics team’s work should consist of understanding the data, the business process it supports, and the activities and decision-making that it drives, along with the business value the analysis is designed to deliver; 20 percent of the effort should focus on the technical aspects of the analysis, including the audit tool.<br></p><h2>3. Plan sufficiently </h2><p>Too many analytics initiatives suffer from too little planning. Plunging into data analytics does not mean that internal audit functions should give short shrift to key planning considerations.<br></p><p>The most effective and sustainable analytics programs tend to begin with a planning effort that includes:<br></p><ul><li>Understanding the system and data landscape; how data is created, processed, and consumed; and how it drives business activities and decision-making.</li><li>Educating internal auditors on the power, benefits, and applications of audit analytics (the analytics mindset).</li><li>Laying out how analytical talent will be trained or hired and retained.</li><li>Seeking business partners’ input on areas of their domains that might benefit from audit analytics. </li><li>Carefully identifying which initial analytics are likely to yield the most valuable results — and, as a result, support from business partners.</li></ul><p><br></p><p>Neglecting any one of these items can lead to initial results that are low impact or miss the mark entirely.<br>When educating internal audit team members about the use of data analytics, it is helpful to steer the focus away from the technical inner workings of the capability by presenting real examples that demonstrate how analytics enhance the efficiency, effectiveness, or risk awareness of the internal audit function and the broader organization (i.e., how data can be turned into information that provides risk and business insights).<br></p><h2>4. 
Think big picture </h2><p>The expansive reach of audit analytics has, oddly enough, resulted in narrow thinking about its application. For years, internal audit professionals and experts have marveled at the way analytics and continuous auditing techniques can be deployed to test massive populations of transactions. This capability is rightly trumpeted as a massive improvement over the traditional approach of manually sampling large data sets, often months after the associated activity has occurred, to identify problems. While accurate, this view of analytics is severely limited.<br></p><p>Leading internal audit functions now use analytics throughout the audit life cycle to support dynamic risk assessments; monitor trends, fraud, and risk and performance indicators, or deviations from acceptable performance levels; and model business outcomes. These functions tend to view analytics as a way to interpret data that helps tell a story to the business that may not have been told before. To be successful here, there has to be an acute understanding of the data that is created, processed, and consumed within — and across — the organization and how it is used to drive business activity and decision-making.</p><h2>5. Partner with IT </h2><p>Given that data typically exists in a multitude of different systems throughout organizations as well as within third-party (e.g., cloud) environments, internal audit frequently encounters difficulties when attempting to access data for analytics. This problem relates not only to accessibility (the protracted data request process with IT), but also to completeness, accuracy, and validity of the data. Without understanding the specifics of what they are asking for, internal auditors cannot reasonably expect to get what they need — at least, not the first time around. 
In some cases, the lengthy and ineffective back-and-forth of data requests between internal audit and IT results, at best, in data integrity issues and, at worst, in the planned analytic being canceled entirely.<br></p><p>To succeed, audit analytics teams need to partner with IT departments to develop a robust process for data acquisition — either through specific and easily understood data requests or through direct connections to data repositories. This all starts by understanding the data environment. While this marks a common goal, it takes time, effort, and coordination to get there. Auditors should consider discussing how to decide which data elements should be created and captured, the business rationale for doing so, and how internal audit and business partners will use the information that analytics produce.<br></p><p>Thanks to recent advancements, current analytical tools more easily integrate with other enterprise systems. Internal audit functions’ growing tendency to use dedicated data warehouses also helps address data access and quality challenges, which can reduce stress on business production systems by giving internal auditors their own sandbox to play with data. However, there are risks with this approach, particularly with regard to security and privacy. Ultimately, establishing a dedicated data warehouse requires a sound business case that, among other things, addresses these risks.<br></p><p>Other, less technical qualities and practices also come in handy. Internal audit functions that have earned a reputation for collaborating with the business consistently encounter fewer data management obstacles when deploying data analytics. Their success stems partly from the fact that collaborative internal auditors are more apt to learn about, and apply, data governance standards and practices from their IT colleagues, which can help ease access to quality data residing in systems scattered throughout the organization.<br></p><h2>6. 
Take advantage of visualization tools for inspired reporting </h2><p>A picture is worth a thousand words. The same principle applies to the presentation — or visualization — of analytics results. Tabular formats and simple charts are a thing of the past. Analytics reporting packages should be making use of widely available visualization tools. These tools allow for the dynamic presentation of results (e.g., a country map that shows the top locations where purchase card spending occurs) and real-time, drill-down capability that represents a far cry from static analytics presentations. Visually compelling, high-impact reports can help internal audit’s clients quickly draw insights from the data.<br></p><h2>A Fundamental Shift</h2><p>At present, data is being created and collected at a pace far beyond anything seen before. While there is always some risk in undertaking a new program — and a desire to prove the return on investment — the bigger risk is doing nothing. Standing still is simply not an approach that internal audit functions can afford to take if they want to keep up with the business, stay relevant, and deliver value and insight. The most innovative companies are looking at ways to capture and use data to transform their business operations as part of digitalization initiatives. Internal audit must be equally innovative, embracing the need to make the company’s data work for the function and the organization.<br></p><p>A key way to overcome the time and resource constraints of setting up a discrete analytics group within internal audit is to focus on an “analytics mindset.” Further, internal audit functions are encouraged to work with business partners to identify areas where analytics can have high impact and high value, provide real business insight, and help address business challenges (rather than focus on a return on investment calculation). The value delivered in these initial analytics projects will set the stage for the program. 
Internal audit should look for parts of the business that are particularly data dense, or that have high volumes of data processing but still rely heavily on manual procedures. For example, focus on ways to: <br></p><ul><li>Pull business insight from the data-heavy areas (and show management a story they have not seen before). </li><li>Work with management to convert audit analytics into reports that can be used in place of time-intensive procedures (e.g., “real time” monitoring of large, disparate data sets for key fraud indicators). </li><li>Quantify the impact of findings and deliver more insight through audit reports.</li></ul><p><br></p><p>These are some of the ways that internal audit functions are able to quickly demonstrate and communicate the value of their investment in, and use of, analytics. Ultimately, however, stakeholders must recognize that there is a fundamental shift in how business is being conducted, and auditors must match it with a fundamental shift in how they audit.<br></p><h2>Each Journey is Unique</h2><p>A robust analytics program may take several years to mature, and the process for developing the capability tends to be unique for each internal audit function. Some standard general assessments exist and can help, but each internal audit leader should chart a path forward that reflects the unique qualities and needs of his or her function and the unique characteristics of the industry, the organization, and the team’s relationships with business partners.</p><p>For additional guidance, download <a href="http://bit.ly/2u6iVQv">GTAG: Understanding and Auditing Big Data</a>. <br> </p>Gordon Braun
