A Change in Mindset<h3>How far have audit functions come in terms of data analytics usage?</h3><p><strong>Petersen</strong> Progressing audit analytics is a journey that doesn’t have an end, but I’m excited to hear organizations describe how they continue to progress year over year. These organizations know the direction they need to go, continue to raise the bar for themselves, and set new objectives to achieve. They face the same resource limitations many audit teams do, so they encourage all their auditors to progress, not just those assigned as the data analytics expert. </p><p><strong>Zitting</strong> Not far enough. Recently, my company’s State of the GRC Profession survey revealed 43% of professionals want to grow their data analysis skills, but those figures have been the same for years — if not decades. Leading audit teams that are willing to embrace change and take risks are indeed creating a new future by delivering and sharing successes in data analysis, advanced analytics, robotic process automation, and even machine learning/artificial intelligence; unfortunately, these leaders are the exception. They inspire us, yet other corporate functions like marketing, IT/digital transformation, security, and even risk management are leaving internal audit behind. <br></p><h3>What are examples, beyond typical usages, of analytics that auditors should be undertaking?</h3><p><strong><img src="/2019/PublishingImages/Dan-Zitting.jpg" class="ms-rtePosition-1" alt="" style="margin:5px;" />Zitting</strong> Let’s not write off the “typical usages” of data analytics, because the vast majority of audit teams aren’t even doing those. The key control areas that virtually every organization’s audit and internal control teams test are completely automatable, yet few seem to do it. 
Areas like user access, IT administrator activity (or other activity log testing), journal entry, payment, and payroll should never again be tested with anything but data analytics.</p><p>Beyond that, the universe of possibility for the data-savvy audit team is limitless. I’m seeing leading audit teams even turn analytics in on themselves — like doing textual analytics on the text of the past several years’ audit findings to indicate where risk is increasing or not being addressed. It’s incredibly impactful. I’ve also seen practitioners develop analytics that use machine learning to create “hot clusters” of employees who are at high risk of churn, or to see “hot clusters” of payments that could be bribes, money laundering, or sanctions violations. </p><p><strong>Petersen</strong> How about running data analysis on the audit analytics program? Start by ascertaining how many audits contain some level of data analysis — sampling doesn’t count. Now compare that to how many should contain some analysis. I don’t know of any organizations that would find they should be doing analytics on 100% of their audits, but if they are honest, they’ll find a significant gap between those audits that could have some analytics performed and those that do.</p><p>Now that we have determined breadth of coverage, let’s determine depth of coverage: for each audit that could have analytics performed on it, identify the analytics that ideally would be performed. Internal audit should focus on those analytics it would be proud to report to the audit committee that it performed, considering the risks and audit objective. Don’t be discouraged by the thought that internal audit can never achieve the coverage it has identified. 
Instead, plan to increase coverage each year.<br></p><h3>How can small audit functions that can’t afford a data scientist jump into data analytics?<br></h3><p><strong><img src="/2019/PublishingImages/Ken-Petersen.jpg" class="ms-rtePosition-1" alt="" style="margin:5px;" />Petersen</strong> Start with basic analytics functions. Audit leadership needs to lead the organization to continually progress the analytics being performed. Leverage those individuals in your organization who have an aptitude for analytics, and communicate successes, new ideas, and new ways of doing things within the team. Use familiar tools such as Excel, along with audit analytics tools that are easy to use and learn. Leverage existing audit techniques across different types of audits. For example, testing for duplicate payments, separation of duties violations, and several other routines apply across many types of audits. Once you’ve determined how to identify these in one audit, you can apply the same routines to other audits. Teams without a data scientist can still have a strong audit analytics program.</p><p><strong>Zitting</strong> Every audit function that can hire a single auditor can afford a person with data skills. The problem is that we accept the status quo of the short-term demands of internal audit’s stakeholders; thus, we elect to hire a “traditional” auditor over a person with technical data skills and the ability to think critically. Obviously, that is a necessity in real life, but it also illustrates that the “can’t afford” or “can’t find the skills” arguments are basically bad excuses that abdicate our responsibility as corporate leaders to evolve with the economic demands of the modern environment. Consider a complete shift in mindset. What if we were building a small data science team that had some audit skills instead of a small audit team with some data skills? 
Wouldn’t that change our perspective on staffing for a truly modern form of auditing?<br></p><h3>What skills should audit functions be looking for when hiring a data analytics expert?</h3><p><strong>Zitting</strong> Most importantly, audit functions should be looking for critical thinking skills. Technical skills in data analytics can be taught. What is difficult to teach is critical thinking, particularly as it relates to knowledge of audit process/risk assessment/internal control, knowledge of the business and its strategy/operations, and the ability to navigate corporate access challenges — access to data and executive time — by asking really smart questions. Next, look for an understanding of and desire to work in an Agile mindset. Specific tools and approaches will always change, but if the candidate understands Agile methodology — minimum viable product, sprints and iteration, continuous improvement — he or she will be able to deliver business results in both the short and long term regardless of issues of tool preference. </p><p><strong>Petersen</strong> Communication and collaboration skills can exponentially increase the team’s analytics effectiveness. Without these skills, there is one expert off doing analytics alone. However, with these skills and easy-to-use analytics tools, the expert can guide the entire team through its analytics needs, greatly increasing the overall effectiveness of the team. When not providing this guidance, the expert can work on more complex analytical projects. This approach also increases employee satisfaction for both the expert and the other team members.<br></p><h3>What does a best-in-class audit function that is fully embedded in data analytics look like?</h3><p><strong>Petersen</strong> These teams apply quantitative analysis and measurement to their audit analytics. They do this by measuring the depth and breadth of their analytics coverage. 
They have strong leaders who promote the value of analytics and make it a part of the team’s culture. They also understand that there is no finish line; the analytics program will continually evolve and grow. Leaders of these teams incorporate all team members into the analytics process, understanding that some have a stronger aptitude for it than others, but still expecting all to participate, and they set appropriate analytics goals for each. Not only are organizations like this best-in-class with respect to their analytics functions, but, as a surprise to some, they also have happier team members.</p><p><strong>Zitting</strong> The best audit organizations already are demonstrating that their core skill is data analysis. It’s the only way to get large-scale insight on risk, control, and assurance across globally dispersed organizations using constrained resources. Best-in-class audit functions don’t embed data analytics; they provide 90% of all assurance they report through analytics and reserve “traditional” auditing for manual deep dives into areas of significant risk or deviation from policy, regulation, or other standards of control. For example, one of our clients moved its entire internal audit team into the core business operation and began rebuilding internal audit from scratch in the last two years. This was because audit was providing so much value via its complete focus on data and analytics that the business demanded to consume the function, and the audit committee agreed to rebuild. That’s one example of internal audit driving real value through a data-centric mindset and practice.<br></p>Staff
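Petersen’s duplicate-payments routine, cited above as a test that transfers readily from one audit to the next, can be sketched in a few lines. This is a minimal illustration under stated assumptions, not any firm’s actual procedure: the field names (vendor, invoice, amount) and the matching key are invented for the example.

```python
# Hypothetical duplicate-payment test: flag groups of payments that share
# the same vendor, invoice number, and amount. Field names are assumptions.
from collections import defaultdict

def find_duplicate_payments(payments):
    """Group payments by (vendor, invoice, amount); return groups of size > 1."""
    groups = defaultdict(list)
    for p in payments:
        # Round the amount so 500.0 and 500.00 match on the same key.
        key = (p["vendor"], p["invoice"], round(p["amount"], 2))
        groups[key].append(p["payment_id"])
    return {k: v for k, v in groups.items() if len(v) > 1}

payments = [
    {"payment_id": 1, "vendor": "Acme", "invoice": "INV-100", "amount": 500.0},
    {"payment_id": 2, "vendor": "Acme", "invoice": "INV-100", "amount": 500.0},
    {"payment_id": 3, "vendor": "Bolt", "invoice": "INV-200", "amount": 75.5},
]
print(find_duplicate_payments(payments))
# → {('Acme', 'INV-100', 500.0): [1, 2]}
```

In practice the same routine, pointed at a different extract, covers accounts payable, expense reimbursement, and payroll audits alike, which is Petersen’s point about reuse.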
Editor's Note: Fortress in the Cloud<p>Cloud computing has quickly soared to become a dominant business technology. Public cloud adoption, in fact, now stands at 91% among organizations, according to software company Flexera's State of the Cloud Survey. And it's only expected to grow from there. Analysts at Gartner say more than half of global enterprises already using the cloud will have gone all-in by 2021. </p><p>Collectively, that places a lot of responsibility for organizational data outside the enterprise. And while cloud migration can lead to significant efficiencies and cost savings, the potential risks of third-party data management cannot be ignored. Reuters, for example, recently reported that several large cloud providers were affected by a series of cyber intrusions suspected to originate in China. Victims, Reuters reports, include Computer Sciences Corp., Fujitsu, IBM, and Tata Consultancy Services. The news agency's chilling quote from Mike Rogers, former director of the U.S. National Security Agency, emphasizes the gravity of these breaches: "For those that thought the cloud was a panacea, I would say you haven't been paying attention." </p><p>As noted in this issue's cover story, <a href="/2019/Pages/Security-in-the-Cloud.aspx">"Security in the Cloud,"</a> growing use of cloud services creates new challenges for internal auditors. Writer Arthur Piper, for example, points to issues arising from the cloud's unique infrastructure and the "lack of visibility of fourth- and fifth-level suppliers." He also cites the cloud's opaque nature and rapid pace of development as potential areas of difficulty. 
Addressing these issues, he says, requires internal audit to work with a wide range of business stakeholders — especially those in IT — and to secure staff with the right type of expertise.</p><p>The need to focus on these areas is supported by a recent report from the Internal Audit Foundation, <em>Internal Auditors' Response to Disruptive Innovation</em>. Among practitioners surveyed for the research, a consistent theme emerged with regard to cloud computing — to be successful, internal audit should build relationships with IT before moving to the cloud. Multiple respondents also recommend bringing in personnel with specialized IT skills to facilitate the evaluation of cloud controls. Moreover, they noted the importance of evaluating not only standard internal controls in areas like data security and privacy, but soft controls, such as institutional knowledge, as well.</p><p>Of course, cloud computing is only the tip of the iceberg when it comes to challenges around disruptive technology. Among other IT innovations affecting practitioners, artificial intelligence and the Internet of Things are equally impactful. We examine each of these areas in <a href="/2019/Pages/Stronger-Assurance-Through-Machine-Learning.aspx">"Stronger Assurance Through Machine Learning"</a> and <a href="/2019/Pages/Wrangling-the-Internet-of-Things.aspx">"Wrangling the Internet of Things,"</a> respectively. And be sure to visit the <a href="/_layouts/15/FIXUPREDIRECT.ASPX?WebId=85b83afb-e83f-45b9-8ef5-505e3b5d1501&TermSetId=2a58f91d-9a68-446d-bcc3-92c79740a123&TermId=75ea3310-ffa9-45b9-9280-c69105326f09">Technology section</a> of our website, for insights and perspectives on other IT-related developments affecting the profession.</p>David Salierno
Security in the Cloud<p>Although Jean-Michel Garcia-Alvarez was used to working as a high-level internal auditor in the financial services sector, 2015 presented him with several novel challenges. First, he was appointed head of internal audit — and later also data protection officer — at a new fintech challenger bank in London called OakNorth. It had received regulatory approval from both the Prudential Regulation Authority and the Financial Conduct Authority in August 2015 — one of only three U.K. banks to do so in the past 150 years. Second, OakNorth wanted to be the first U.K. bank with a cloud-only IT infrastructure, which was not an area he specialized in during his previous audit roles at Nationwide Building Society, RBS, or Barclays.</p><p>Garcia-Alvarez realized that traditional audit skills would be of limited use because of the cloud’s newness and evolving nature, with little precedent in the scope and range of how to approach it as an internal auditor. So, he decided to obtain an IT audit certificate from the U.K.’s Chartered Institute of Internal Auditors (CIIA). It boosted his IT audit skills and forced him to get to grips with how to approach cloud auditing and security. It also made him a credible security player in the business.</p><p>At the same time, he says internal auditors must adhere to the fundamental remit of audit, which, for OakNorth, is the CIIA’s Financial Services Code. One of the first sentences of that document says internal audit’s primary role is to help senior management protect the assets of the business — in this case from hacking, data breach, and leakage.</p><p>“That is absolutely the role of internal audit in cloud security,” Garcia-Alvarez says. When businesses are migrating to and operating in the cloud, internal audit needs to provide assurance that the cloud infrastructure is safe, secure, and able to meet the firm’s objectives — not just now, but in the future. 
“The way to do that is to be embedded as the third line of defense and to provide real-time feedback on risk and controls, and to assure the board that you are mitigating risks with data — not creating new ones.” </p><p>While cybersecurity has long been on auditors’ lists of regular assignments, securing today’s cloud poses fresh challenges. The very structure, speed, and opacity of the cloud demand a focus away from traditional auditing. Having systems in place to deal with data breaches, data loss, and ransomware attacks is mostly standard today, but issues arising from the unique infrastructure of the cloud, the lack of visibility of fourth- and fifth-level suppliers, and the need to work in tandem with both the cloud provider’s own security teams and a wider range of stakeholders across the business are growing challenges for internal auditors dealing with cloud security. </p><h2>Changing Purpose</h2><p>OakNorth’s journey is a good example of how the speed of change impacts internal audit’s security concerns. Like many businesses, OakNorth used Amazon Web Services (AWS) as its cloud provider in 2016. Because AWS is a large global player, Garcia-Alvarez was happy that it could be responsible for the security of the cloud, while OakNorth was responsible for security in the cloud. That theoretically makes it easier for internal audit because the function can regularly check and rely on the up-to-date certifications maintained by the cloud provider. Audit can then focus almost entirely on the internal security control environment. In reality, though, for cloud security to be robust, auditors also need to keep up with changing laws, rules, and regulator expectations. </p><p>“Those can change very quickly,” he says. In 2016 when OakNorth migrated to the cloud, the U.K. financial regulator was happy with the decision and with the company’s cloud provider — because it was big, safe, and secure. 
But when other banks followed suit by 2017, the regulator decided it was a potential concentration risk. If AWS went down, it would take a huge slice of the U.K. financial services sector with it. As a result, OakNorth moved to a multi-cloud solution for all of its client-facing technology.</p><p>From the outset, OakNorth used cloud data centers, provided by AWS, in several locations in Ireland, with an additional fail-safe elsewhere in Europe. “That one is like a bouncy castle,” Garcia-Alvarez says. “The shell is there, but the engine is off. Turn on the engine and it will be fully blown up and working in a matter of hours.” Just to be sure, the IT team rebuilds the core banking platform from scratch at a new location in Europe once a year, with internal audit providing independent assurance over the exercises. “It is time-consuming and expensive, but at least we know that the bank is safe.”</p><h2>Getting in Early</h2><p>Cloud downtime is not a fantasy risk. In February 2017, for instance, AWS services on the U.S. East Coast experienced failure. While reports on technology news site <em>The Register</em> suggested the servers were down only about half an hour, some customers reportedly could not get their data back because of hardware failure. Another outage in March 2018 affected companies such as GitHub, MongoDB, NewVoiceMedia, Slack, and Zillow, according to CNBC.</p><p>James Bone, a lecturer at Columbia University and president of Global Compliance Associates in Lincoln, R.I., says that is just one of many reasons internal auditors should be involved early in any cloud deployment. “I don’t believe that internal auditors should be deciding which products to use, but I do think they should be very much involved in the selection process,” he says. “They need to understand the service model, what is being deployed, and how they are planning to use the services. 
The platform that they use will determine, in large part, the risk exposure to the firm.”</p><p>That is because the choice of platform governs what data will be transitioned, if any will stay on the premises, access administration, business continuity plans, data breach response, ransomware strategy and response, the frameworks the service provider uses for cloud security, the frequency of monitoring, contractual agreements, and many other factors. Auditors need to be on top of the situation to raise red flags before security risks crystallize. Bone says, for instance, that he has heard stories of service providers failing during a transition to the cloud, without a backup in place from which to restore the client’s data. In this example, organizations need to know what the recovery plan is and, crucially, who is responsible for it. </p><h2>Sharing Responsibility</h2><p>“These are shared security and operational relationships between the cloud provider and the business,” Bone says. “So it is about clearly separating the different lines of accountability and responsibility at an early stage.” That includes sharing operational performance metrics and having clear escalation processes for data breaches, outages, and other security issues where the responsibilities are set out clearly between the cloud provider and the business. The internal audit team must have a realistic understanding of its own and the business’s capabilities if those measures are to be effective. “If the firm and the audit team are not particularly agile, can they use the vendor to take up some of that role?” he asks. </p><p>The opaque nature of what goes on in the cloud service provider’s business is a particular worry for internal auditors. “The biggest problem in these virtual environments is that the distance between control and assurance gets wider,” he says. Bone has been researching this idea for about four years. 
In digital environments, he says, risk and audit professionals have been used to testing applications because in most cases the physical hardware and data are available to see, touch, and analyze. </p><p>“As we move to a boundaryless environment, we are creating a distance between our ability to recognize a problem and having to rely on others to tell us there is a problem,” he says. “That distance impacts response time, and our ability to develop and put in place even more robust controls, because we are further away from the problem. This is an underappreciated risk and is getting larger because firms that are providing these services are getting better at managing their own risk, while as businesses go further into the cloud and have multiple cloud providers, they are becoming more removed from core processes.”</p><h2>Potential Headaches</h2><p>For Fred Brown, head of the critical asset management protection program at HP in Houston and former head of IT audit at the firm, dealing with cloud security while working with such shared services can create “rather large challenges.” </p><p>“The more you open your environment, the more you have to stay on top of security,” he says. Over the last couple of years, HP has been working toward being a top quartile security organization, he explains. And Brown’s cyber team has grown 70% during that time. The business has been aggressively moving to cloud services — including infrastructure as a service, platform as a service, and software as a service. Implementing a 100% review of all suppliers that would include all cloud instances throughout the business means doing a detailed security check of more than 2,000 suppliers across the enterprise. </p><p>To speed up the process, HP has contracted with a third-party assessment exchange, CyberGRX, which describes itself as supplying “risk-assessment-as-a-service.” Any subscriber can have a supplier risk assessed — once the results are in, users can view them via an exchange. 
The process is integrated into HP’s inherent risk-scoring program, so that all vendors except those with the highest inherent risk score are assessed by CyberGRX. The vendors with the highest inherent risk are assessed by internal resources. This process represents a new initiative at HP, and so far it has produced useful reports and helped the company tackle a backlog of risk assessments.</p><p>“This is removing an entire blind spot when it comes to risk,” Brown says. “Even if you have 100 suppliers who you haven’t assessed, with many connected to your company’s critical assets, whether it is employee data, or something else — if you haven’t assessed them, you have no idea what their risk profile really looks like.”</p><p>Brown says one problem is that whether a cloud-based supplier is AWS or a small online education provider, if it is managing critical data, the threat to the business is the same. With many cloud providers now outsourcing parts of their own operations, HP is putting in extra effort on fourth- and fifth-party risk management. That is why having someone track the cloud supplier landscape is critical to managing security risk, he says, enabling the organization to identify what is going on and maintain control over the process. This challenge is amplified in a company such as HP that was already complex when it began outsourcing to cloud service providers.</p><h2>Working Across the Business</h2><p>New suppliers need to have up-to-date, formal self-attestation certificates that follow recognized standards, such as Service Organization Controls 2 reports and the International Organization for Standardization’s ISO 27001. To make sure a business division or manager does not randomly contract with a new cloud provider, Brown’s team has what he calls a “cast-iron interlock” with procurement. Procurement knows what HP’s cloud security requirements are, and they must be included in any new contractual arrangements. 
In fact, Brown describes the contracts as “living,” because they point to the security requirements, which HP can update without changing the actual contract itself.</p><p>Working with AWS, HP has created a way of centralizing group security policies through the IT infrastructure. The main cloud instance has all of the group policies established — any new instance sits beneath this “parent” and effectively inherits its security policies automatically. “Every time you make a change to the group policy, it cascades to all the instances that are underneath that,” Brown explains. Non-AWS cloud instances go through the new procurement system as described earlier.</p><p>As cloud computing becomes synonymous with organizations’ IT infrastructures, internal auditors need to work more collaboratively and strategically, according to Scott Shinners, partner of Risk Advisory Services at RSM in Chicago. That will mean audit working increasingly not just with IT and IT security, but with procurement, legal, risk management, and the board.</p><p>“The audit committee has to see cloud security in the audit plan, and it also has to be present in the nature of the additional conversations you’re having with management,” he says. “It should come up not just after implementation, but before in strategy setting and so on.” Moreover, if internal audit discovers cloud instances in parts of the business that are not meant to have them, it can feed back to IT and risk management.</p><p>Internal audit also needs to work closely with the audit committee as cloud migration, almost inevitably, leads to abandoning a large percentage of the audit plan. “That is where the really good engagement with the audit committee comes through,” Shinners says. 
“How willing is the audit committee to support a trade-off to reduce assurance on moderate risk areas in order to have internal audit spend more of its resources on some of the cutting-edge stuff that is emerging?”</p><p>Performing third-party, independent assessments of cloud security and thinking about the underlying controls on data security, access management, breach response plans, and so on, is just the minimum internal audit can do, he says, because that only provides a snapshot in time in a fast-moving area. “The No. 1 way that internal audit can be successful is working with the second line of defense to build a culture around data protection that is pervasive enough to be successful in an environment that is so fast moving,” he says. “Making sure risk management gets feedback to know the culture is working is right up internal auditors’ alley.”</p><h2>Skills and Expertise</h2><p>CAEs may also need to reach outside of their organizations to secure audit staff with the right level of skills and qualifications, says Ruth Doreen Mutebe, head of internal audit at Umeme, Uganda’s largest electricity distributor. She recommends building partnerships with technology and information security institutes, such as ISACA, and universities to help identify good candidates.</p><p>“Cloud auditing involves rare skills that take time to build,” she says, especially because it requires people with a good grasp of technical issues who can also communicate those concepts at a basic level to management. In addition to attracting and training staff, a CAE has to be able to retain them after that initial investment has been made.</p><p>Mutebe’s approach is to recruit a competent IT security auditor — even if a premium price has to be paid — who can effectively audit and guide management on aspects of cloud security. 
In addition, she encourages her technical staff members to pass on their knowledge to the entire audit team.</p><p>“That could include embedding cloud security procedures into what would have been non-IT audits to build capacity and, where resources allow, attaching nontechnical internal auditors to support basic tests on cloud security audits,” she says. Where gaps remain, outsourcing and co-sourcing arrangements with clearly established service level agreements can be used. “Even there, CAEs should encourage the outsourced service provider to train the internal audit staff,” she says.</p><h2>Keeping Up With Change</h2><p>Cloud security is moving at a rapid pace, much like other technological changes in businesses today. For internal auditors, that means a focus on critical thinking, learning how to stay current in their industries, and developing a willingness to team up across the business and beyond to form effective alliances. While such an open approach to providing assurance may be new to many auditors working in more traditional environments, it is likely to be a crucial step to take if organizations are to deal with the growing complexity of their cloud initiatives. </p>Arthur Piper
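The tiered assessment process Brown describes earlier in the story — routing only the highest-inherent-risk vendors to in-house reviewers and sending the rest to a third-party exchange — can be sketched as a simple routing rule. This is an illustrative sketch only: the 0–100 scale, the cutoff, and the field names are assumptions, not HP’s actual scoring model.

```python
# Illustrative tiered vendor-assessment routing. The threshold and the
# 0-100 inherent-risk scale are assumptions for this sketch.
HIGH_RISK_THRESHOLD = 80

def route_assessment(vendors, threshold=HIGH_RISK_THRESHOLD):
    """Split vendors: highest inherent risk -> internal review,
    everything else -> the external assessment exchange."""
    internal = [v["name"] for v in vendors if v["inherent_risk"] >= threshold]
    exchange = [v["name"] for v in vendors if v["inherent_risk"] < threshold]
    return internal, exchange

vendors = [
    {"name": "CloudCo", "inherent_risk": 92},
    {"name": "EduSoft", "inherent_risk": 35},
    {"name": "PayrollPlus", "inherent_risk": 67},
]
internal, exchange = route_assessment(vendors)
print(internal)   # → ['CloudCo']
print(exchange)   # → ['EduSoft', 'PayrollPlus']
```

The value of the rule is less in the code than in the triage it encodes: scarce internal security resources go only where inherent risk is highest, while the long tail of suppliers still gets assessed rather than ignored.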
The Ever-expanding Cloud<p>Do internal auditors know what's in their organization's cloud? There's probably more to it than they or their IT security colleagues realize, according to volume one of Symantec Corp.'s <a href="" target="_blank">2019 Cloud Security Threat Report</a>.</p><p>Dependence on the cloud is growing, the report notes. More than half of the 1,250 security decision-makers who responded to the global survey say their organizations have moved their computing workload to the cloud. And 93% say their organizations store data in multiple environments, distributed relatively evenly among private cloud, public cloud, on-premises, and hybrid cloud setups. </p><p>That complexity is making it hard for organizations to keep track of how much data they are storing in the cloud — and where. On average, respondents say their organizations' employees are using 452 cloud applications. However, Symantec estimates that organizations actually have an average of 1,807 shadow IT apps.</p><p>And if organizations can't see their cloud apps and data, they can't secure them. More than half of respondents say their organization's cloud security practices aren't keeping pace with the proliferation of cloud apps. </p><p>"The [security] gap created by cloud computing poses a greater risk than we realize, given the troves of sensitive and business-critical data stored in the cloud," says Nico Popp, senior vice president, Cloud and Information Protection, at Mountain View, Calif.-based Symantec. </p><h2>Security Challenged</h2><p>Popp says the cloud, itself, isn't increasing the problem with data breaches. A bigger problem is immature security practices, which nearly three-fourths of respondents blame for at least one cloud security incident in their organizations. More than 80% say their organizations lack processes to respond to cloud security incidents successfully. Just one in 10 say their organizations can analyze cloud traffic effectively. 
</p><p>Cloud security is a capacity problem, as well. More than 90% say their IT security teams can't keep up with all the cloud workloads in their organizations. Most don't have the cloud security manpower to deal with all alerts — organizations respond to just one-fourth of alerts, respondents say. In addition, 93% say their organizations need to enhance cloud security skills.</p><p>The report notes a third culprit: risky employee behavior such as using personal accounts and having weak passwords. This behavior sets the stage for attacks using "camouflaged" files or aimed at taking over user accounts. Another behavior problem is oversharing of data. Respondents estimate that one-third of files in the cloud shouldn't be there. </p><h2>Threat Watch</h2><p>The Symantec report lists several threats to cloud systems, including a recent trend of cross-cloud and malware injection attacks. Still, unauthorized access accounts for nearly two-thirds of cloud security incidents. "Digging deeper, companies are underestimating the scale and complexity of cloud attacks," the report notes. </p><p>For example, only 7% of respondents say account takeover is among their biggest cloud risks, yet Symantec says its data reveals that 42% of risky behavior can be attributed to a compromised cloud account.</p><p>Respondents say they know criminals are taking advantage. Nearly 70% say they have found evidence that their organization's data has been for sale on the Dark Web. </p><h2>Clearing Skies</h2><p>Despite looming threats, organizations can act to ensure a better forecast for their cloud operations. 
These actions include:</p><ul><li>Developing a cloud governance strategy to enforce security policies across on-premises and cloud environments.</li><li>Adopting a "zero-trust" model that protects all data and implements controls at all points of access.</li><li>Promoting shared responsibility encompassing not only the cloud provider and IT security department, but also executives and all employees.</li><li>Leveraging automation and artificial intelligence to analyze potential threats and respond to incidents.</li><li>Moving to a DevSecOps approach in which security practices are embedded into all application development. </li></ul><p><br></p><p>With cloud reliance expanding and business processes becoming digitized, organizations "need to re-evaluate their actual versus perceived risks," the report advises. To address these risks, the report recommends complementing technology solutions by adopting security best practices "at the human level" to confront cloud threats.</p><p><em>To learn more, read </em>Internal Auditor<em>'s August issue cover story, <a href="/2019/Pages/Security-in-the-Cloud.aspx">"Security in the Cloud."</a></em></p>Tim McCollum
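The visibility gap the report describes (452 known cloud apps versus an estimated 1,807 in actual use) is, at its core, a set-difference problem: compare the apps observed in network traffic against the sanctioned inventory. The sketch below assumes a simplified proxy-log format; the app names and log lines are invented for illustration only.

```python
# Minimal shadow-IT check: which cloud apps appear in traffic but are
# not in the sanctioned inventory? Log format and names are hypothetical.

sanctioned = {"office365", "salesforce", "box"}

# Simplified proxy log: "<user> <app>" per line.
proxy_log = [
    "alice office365", "bob pastebin", "carol salesforce",
    "dave box", "erin filesharer-x", "frank pastebin",
]

def shadow_apps(log_lines, sanctioned):
    """Return apps seen in traffic that are missing from the inventory."""
    observed = {line.split()[1] for line in log_lines}
    return sorted(observed - sanctioned)

print(shadow_apps(proxy_log, sanctioned))  # ['filesharer-x', 'pastebin']
```

A real inventory comparison would pull the observed set from a cloud access security broker or proxy export, but the reconciliation step is the same.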
Peace in Our Time<p>Too many organizations use internal audit results to drive priorities for the IT function, which can have a devastating effect on morale. This approach sets an example for the entire organization about how to get systems-related objectives met. Initially, this can be benign as leaders try to do the right thing and help uncover systems issues that need attention. Eventually, pointing the auditors to real or suspected issues allows them to elevate any project to the highest priority, whether it is strategic or not.</p><p>For example, a software company starved back-office systems in favor of product development. As a result, IT fell seriously behind in patching internal production systems. Because the organization was audit-driven, at the next opportunity, management pointed auditors at patching, and the inevitable findings in patch management became the flag around which any desired project was wrapped to secure new funding. Step one: Hold IT accountable for not patching that system. Step two: Secure funding to “fix IT’s mess.”</p><p>Allowing audits to drive strategy wastes time and money, and robs management of the audit’s real value — helping management validate that it is appropriately addressing risks to business processes. When the audit becomes the key objective, performing audits becomes an essential business process on its own. This mistake creates the potential for a wildly inappropriate scope that gives the IT staff the sense that audits are never-ending and self-serving. </p><h2>Fear and Loathing</h2><p>These issues can lead to audit fatigue and poorly executed audit activities. Before long, management is spending its time and attention fixing problems with audits instead of fixing problems found by audits.</p><p>In another example, a large financial services company purchased a much smaller company in an adjacent but highly regulated space. 
As is often the case, the smaller company had a much lower profile than the larger company, but that changed once it was part of a larger organization. The new management, lacking experience as a highly regulated entity, began to ramp up audits to get ahead of the regulators. As operational requirements competed with audit requests, “just get it done” replaced “do it right.” At some point in this dysfunctional downward spiral, “do whatever the auditor says to get this over with” became the strategy to end the pain. </p><p>This example provides context for the skepticism, distrust, and outright fear senior executives and IT staff members have about audits. Some worry about getting in trouble for doing something wrong. Many view the time spent on audit requests as wasted time or busy work. This fear and distrust of audits naturally extends to the auditors, leading to an “us versus them” mentality. Both sides dig in and spend more time protecting their flank than solving their problems. </p><p>Some IT departments assign auditors “handlers” to choreograph activity, coach process owners to provide guarded answers, and quickly escalate issues, causing a bottleneck within leadership. Inexperienced auditors bring poor time management skills, poorly thought-out evidence requests, and negative attitudes to audits that put everyone on guard. Auditors then spend extra time gathering overwhelming evidence of control failure, and IT staff fabricates control evidence.</p><p>In addition to driving poor decision-making when used unwisely, audits often veer off track. In such cases, people too close to the situation sometimes focus on the audit as the key objective rather than managing the business process under audit. Besides these strategic mistakes, scope creep, poor communication, distrust among teams, and inexperience can plague any project and amplify any problems with an audit because of the extra scrutiny on the outcome. 
</p><p>In some organizations, IT may be severely underfunded and so far behind in resolving previous audit findings that the department gets accustomed to adding the next set to its ever-expanding project list. This forces leadership to spend so much time prioritizing and re-prioritizing work that audit failure becomes the de facto driver for funding. This, more than control failures, may be the finding that the audit should reveal.</p><h2>The Path to Peace</h2><p>It doesn’t have to be like this. When used appropriately to validate assumptions and uncover blind spots, the audit program is a crucial asset for management and plays an essential role in governance. Here are 10 tips to help internal auditors, management, and IT employees get on the right track.</p><p><strong>Audit team</strong> The audit team can become better partners to IT by taking these steps:<br></p><ul><li><em>Agree with senior leadership on the strategy and priorities of the audit program.</em> Establish priorities and understand where to focus audits based on the risks presented by the critical business processes.</li><li><em>Ensure each audit focuses on making the business process better, not finding problems</em>. Internal audit should keep this goal in mind as it sets audit objectives, determines scope, and frames findings. Always solicit recommendations for improvement from management. </li><li><em>Help the organization navigate audits and examinations by external organizations (within the limits of independence).</em> This is particularly important as it pertains to audit scope. For example, it’s not helpful to have nonregulated businesses examined by regulators. It wastes time and exposes the organization to inappropriate jeopardy. Auditors should make sure all parties agree to the scope before the audit starts. </li><li><em>Agree up front on the criteria for identifying the required evidence. 
</em>These criteria include sample selection criteria, the duration of the assessment, and the amount of evidence required to validate each test objective.</li><li><em>Agree on the process and tools to be used for requesting and receiving the evidence. </em>Agree on how quickly evidence is to be gathered once requested.<br><br></li></ul><p><strong>Management</strong> IT management can demonstrate transparency and respect for the audit process by:</p><ul><li><em>Avoiding assigning junior people to handle examiners or auditors.</em> When management tries to offload audit responsibility to the least useful resource, it almost always has a negative impact.</li><li><em>Not coaching employees on how to be coy with auditors. </em>Internal auditors are trained to spot inconsistency and lack of transparency. Trying to hide details from auditors is unprofessional and causes them to dig deeper in that area.<br><br></li></ul><p><strong>Employees</strong> IT staff members who are asked to support audit activities can establish trust by taking these steps:</p><ul><li><em>Don’t assume your competence is being questioned.</em> “I don’t know, but let me find out for you” is a better answer than guessing.</li><li><em>Don’t try to sound like a lawyer. </em>The best way to be understood is for employees to use the language and style that is comfortable to them. The surest way to get management’s attention — and not in a good way — is to call a minor testing deviation a “material weakness.”</li><li><em>The auditor is not a whistleblower hotline. </em>Managers should remind employees to bring internal issues to their manager or a neutral member of the management team.</li></ul><h2>Look in the Mirror<br></h2><p>Internal auditors should ensure their organization doesn’t take a dysfunctional audit approach. 
They should review their audit strategy to make sure it addresses business process risk, provides the necessary governance assistance to management and the board, and addresses the organization’s regulatory requirements. They shouldn’t let audits drive the business. <br></p>Bill Bonney
The Threat Hunters<p>They're on the hunt, in companies around the world. Combining technology tools with detective skills, they are hunting for hidden adversaries on their networks. And their numbers are growing.</p><p>More than four in ten organizations responding to the <a href="" target="_blank">SANS 2018 Threat Hunting Survey</a> (PDF) say they conduct continuous threat hunts, up from 35% in information security training firm SANS Institute's 2017 study. More than one-third commence such hunts to look for underlying problems in response to a security event.</p><p>Their aim is to root out intruders, who can dwell on a network for an average of more than 90 days before they are detected. "Most of the organizations that are hunting tend to be larger enterprises or those that have been heavily targeted in the past," according to co-authors Robert M. Lee, a SANS instructor, and Rob Lee, curriculum lead at the institute. SANS surveyed 600 organizations for the report.</p><p>Threat hunting goes well beyond the intrusion detection most organizations rely on to discover security breaches. The SANS report defines it as an iterative approach for searching for and identifying adversaries on an organization's network. It's about combining threat intelligence and hypothesis generation to home in on the most likely locations that intruders will target. </p><p>Threat hunting can be effective, the report notes. For example, 21% found four to 10 threats during threat hunts. Nearly 17% found as many as 50 such threats.</p><h2>Intelligence Is Key</h2><p>One reason for threat hunting's effectiveness is that hunters are harnessing better threat intelligence, the report finds. Most respondents (58%) say they rely on intelligence generated internally based on previous incidents. 
Moreover, 70% tap into intelligence from third-party sources such as anti-virus signatures.</p><p>"Nothing is more valuable than correctly self-generated intelligence to feed hunting operations," the authors say. However, organizations without such capabilities may need to turn to third parties. In fact, they recommend blending the two forms of intelligence as a way to reduce adversary dwell times.</p><h2>People and Technology</h2><p>Still, respondents depend most on alerts from network monitoring tools for their threat intelligence, which the authors point out isn't really threat hunting — a common misconception. This reliance on sensors may indicate that organizations still see threat hunting as a technology solution. The survey results bear this out, with more than 40% prioritizing technology investments for threat hunts versus 30% for qualified personnel. </p><p>The emphasis on technology is misplaced, the authors say. Yes, threat hunters depend on automation to do things faster, more accurately, and at greater scale. "However, by its definition, hunting is best suited for finding the threats that surpass what automation alone can uncover," they stress. Instead, technology and people must be intertwined.</p><p>The authors recommend that organizations prioritize recruiting and training skilled staff for threat hunts. In particular, they say such professionals are more likely to detect threats and create tools they will need to be effective. </p><p>Respondents say the baseline skills for threat hunters are network, endpoint, threat intelligence, and analytics. More advanced capabilities include digital forensics and incident response.</p><h2>Hunting Tools</h2><p>Hunters need weapons, and this is where technology tools come into use. Nine out of 10 respondents say their threat hunters use the organization's existing IT infrastructure tools, while 62% have developed customized tools. 
</p><p>However, the authors question whether these tools are providing the view of the network needed for successful hunts, noting that they often are detection-based. Such tools may not find all the intruders who have breached the network, they say.</p><p>Whatever their tools, the report notes that threat hunting can be resource-intensive and requires an emphasis on analysis and developing hypotheses about adversaries. Although growing percentages of respondents are basing hunts on continuous monitoring or incident response, it may be more effective to conduct scheduled hunts. "Even a few hunts per year, when done correctly, can be highly effective for the organization," the authors say.<br></p>Tim McCollum
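The intelligence-driven search the report describes can be illustrated with a minimal sketch: sweep activity logs for matches against threat-intelligence indicators of compromise (IOCs), then estimate how long a matched adversary may have dwelled on each host. The log entries, indicator list, and field names below are invented for the example and are not from the SANS survey.

```python
# Minimal sketch of intelligence-driven hunting: match log events
# against IOC data, then derive per-host dwell time from the matches.
# All IPs, hosts, and timestamps here are hypothetical.

from datetime import datetime

iocs = {"198.51.100.7", "203.0.113.42"}  # hypothetical known-bad source IPs

logs = [
    {"ts": datetime(2019, 8, 1, 3, 12), "src": "198.51.100.7", "host": "db01"},
    {"ts": datetime(2019, 8, 1, 9, 30), "src": "192.0.2.10",   "host": "web01"},
    {"ts": datetime(2019, 8, 20, 4, 2), "src": "198.51.100.7", "host": "db01"},
]

def hunt(logs, iocs):
    """Return IOC-matching events and, per host, the time span between
    the earliest and latest matching events (a floor on dwell time)."""
    hits = [e for e in logs if e["src"] in iocs]
    span = {}
    for e in hits:
        first, last = span.get(e["host"], (e["ts"], e["ts"]))
        span[e["host"]] = (min(first, e["ts"]), max(last, e["ts"]))
    return hits, {h: last - first for h, (first, last) in span.items()}

hits, dwell = hunt(logs, iocs)
print(len(hits), dwell["db01"].days)  # 2 matching events, 19 days apart
```

This is the automatable half of a hunt; as the authors stress, the hypothesis generation and analysis around it still require skilled people.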
Bias in the Machine<p>Can artificial intelligence (AI) discriminate? That is what Facebook’s AI is accused of doing. In March, the U.S. Department of Housing and Urban Development (HUD) announced it was suing the social media company for violating the Fair Housing Act. HUD alleges that Facebook’s advertising system allowed advertisers to limit housing ads based on race, gender, and other characteristics. The agency also claims Facebook’s ad system discriminates against users even when advertisers did not choose to do so.</p><p>Although it has yet to be proven whether Facebook committed any deliberate discrimination, the result is still the same. “Using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face,” HUD Secretary Ben Carson said in announcing the lawsuit.</p><p>Each day, machine learning and AI (ML/AI) models make decisions that affect the lives of millions of people. As these models become more integrated with everyday decision-making, organizations need to be increasingly vigilant of the risk created by potentially discriminatory algorithms.</p><p>But who within those organizations is responsible for ensuring the ML/AI model is making fair, unbiased decisions? The model developer should not be responsible, because internal control principles dictate that the persons who create a system cannot be impartial evaluators of that same system. The model’s users also should not be responsible, because they typically lack the expertise to evaluate an ML/AI model. Users also may not question a model that seems to be performing well. For example, if a predictive policing model leads to more arrests and less crime, users are not likely to question whether that system unfairly targets a particular group. 
</p><p>Internal audit may be best suited to provide assurance to the board and senior management that the organization is mitigating the reputational, financial, and legal risks of implementing a biased ML/AI model. However, because this is a new assurance domain for the profession, auditors need a methodology for auditing the fairness of these models. </p><h2>Why Models Need to Be Fair</h2><p>An ML/AI model is a mathematical equation that uses data to produce a calculation such as a score, ranking, classification, or prediction. It is a specific set of instructions on how to analyze data to deliver a particular result — behavior, decision, action, or cause — to support a business process. </p><p>There are three main categories of analytic models. <em>Descriptive models</em> summarize large amounts of data into small bits of information that are easier for organizations to analyze and work with. <em>Predictive models</em> are more complex models used to identify patterns and correlations in data that can be used to predict future results. <em>Prescriptive models</em> enable data analysts to see how a decision today can create multiple future scenarios. </p><p>ML/AI models need to be fair and nondiscriminatory because the decisions they support can expose organizations to substantial risk if the classification criteria they use are unethical, illegal, or publicly unacceptable. Such criteria are referred to as inappropriate classification criteria (ICCs) and include race, gender, religion, sexual orientation, and age.<br></p><table cellspacing="0" width="100%" class="ms-rteTable-default"><tbody><tr><td class="ms-rteTable-default" style="width:100%;"><strong>Controlling for Exogenous Variables</strong><br><p><br>Often, despite the best efforts to eliminate it, discrimination creeps into an organization’s analytic models through external data that has a systemic bias, thus exposing the organization to risk. 
Appropriate exogenous variables (AEV) are variables that provide appropriate classification criteria but have been subject to external systemic bias that has not been detected. Examples of AEVs include credit scores for individuals from minority communities or salary information for women.<br></p><p>Fortunately, analytic models can be used to control for this bias. For example, after controlling for gender differences in industry, occupation, education, age, job tenure, province of residence, marital status, and union status, an 8% wage gap persists between men and women in Canada, according to a February 2018 Maclean’s article. It is a relatively simple exercise to adjust the salary variable in a classification model by +8% for female subjects. <br></p></td></tr></tbody></table><p>In assurance engagements regarding bias, internal auditors primarily will be concerned with a type of predictive model known as a classification model. This model is used to separate people into groups based on certain attributes that an organization can use to support decisions. Examples of these attributes include:</p><ul><li>Identifying borrowers who are most likely to default on a loan.</li><li>Classifying employees as future high performers.</li><li>Selecting persons who are least likely to commit further crimes if granted probation.</li><li>Targeting consumers to receive special promotions or opportunities. In one case, the Communications Workers of America sued T-Mobile, Facebook, and a host of other companies, alleging that those companies discriminated by excluding older workers from seeing their job ads.</li></ul><p><br>To provide assurance to management and the audit committee that the organization’s ML/AI model does not discriminate, auditors need to assess two things: 1) that the model does not benefit or penalize a certain classification of people; and 2) that, if a classification is removed from the model, it still provides useful results. 
</p><p>Internal auditors can test for bias using a model fairness review methodology. This methodology comprises: </p><ol><li>Understanding the model’s business purpose.</li><li>Working with the audit client to determine and identify ICCs. In this step, auditors also may discuss possible appropriate exogenous variables (see “Controlling for Exogenous Variables” on this page). </li><li>Selecting a large sample — or the entire data set — of input data and classification results.</li><li>Conducting statistical analysis of the results to determine whether distribution of ICCs is within acceptable parameters.</li><li>Discussing initial results with the client.</li><li>Removing ICCs and re-running the classification model. Auditors also can replace ICCs with uniform values depending on the nature of the model.</li><li>Comparing distribution of ICCs before and after removal. </li></ol><h2>A Bias Audit</h2><p>As an example of how internal auditors can use this methodology, consider a marketing department at a credit card company that used a classification model to determine which customers should be given a discount. The data used for the model is half women and half men. Management wanted assurance that this model was not exposing the organization to potential liability by discriminating against either group.</p><p>Internal audit met with Marketing and confirmed that it used the model to select customers for preferred rates. These preferred rates are substantially lower than the rates offered to customers in general. 
After reviewing the information used by the model, internal audit noted these variables:</p><ul><li>Customer ID (metadata — not used as a variable).</li><li>Surname (ICC).</li><li>Credit score.</li><li>Geography (ICC).</li><li>Gender (ICC).</li><li>Age (ICC).</li><li>Tenure.</li><li>Balance.</li><li>Number of products.</li><li>Has credit card.</li><li>Estimated salary.</li></ul><p><br>In some cases, a variable may be an ICC for one type of model but not for another. For example, gender is an appropriate classification criterion for a clothing company promotion but not for a loan approval. Age may be appropriate in a health-care model but not in an applicant screening.</p><p>In the marketing example, internal audit analyzed the initial results of the classification model and observed that 35% of customers were classified as good candidates. However:</p><ul><li>50% of men and 20% of women were classified as good candidates.</li><li>6% of customers over 50 were classified as good candidates.</li><li>1% of women over 50 were classified as good candidates.</li></ul><p><br>Internal audit discussed the initial classification results with the marketing department to determine whether there are business reasons for the observed result and if those reasons are valid, defensible, and nondiscriminatory to mitigate the risk of legal liability. Based on this discussion, internal audit removed the identified ICC from the input data and re-ran the classification model. </p><p>In reporting the results to Marketing, internal audit noted the model was producing useful results. The results showed that 45% of customers were classified as good candidates, a finding with which Marketing concurred. 
However:</p><ul><li>50% of men and 40% of women were classified as good candidates.</li><li>21% of customers over 50 were classified as good candidates.</li><li>10% of women over 50 were classified as good candidates.</li></ul><p><br>Internal auditors noted that the model appears to be biased against groups such as women and people over 50, which is likely the result of exogenous variables. Auditors recommended that Marketing adjust its model to compensate for these variables.</p><h2>New Models, Old Risks</h2><p>Although the subject of bias in analytic models may be unfamiliar to internal auditors, their risk management role in this domain is crucial. Bias introduces an unacceptable risk to any organization regardless of where that bias originates. A decision made by an organization’s analytic model is a decision made by that entity’s senior management team. Internal audit can help management by providing risk-based and objective assurance, advice, and insight. As such, auditors should learn and adapt their methods to meet the challenges organizations face in adopting AI. <br></p>Allan Sammy
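The before-and-after comparison at the heart of the fairness review methodology (steps 4 through 7) can be sketched in a few lines of Python. The records, the two scoring rules, and the disparity-ratio check below are illustrative assumptions invented for this sketch, not taken from the article or from any real model.

```python
# Sketch of the fairness review's core comparison: each ICC group's
# selection rate, before and after the ICC is removed from the model.
# All data and scoring rules here are hypothetical.

def selection_rates(records, classify, group_key):
    """Share of each group the model classifies as a good candidate."""
    totals, selected = {}, {}
    for rec in records:
        group = rec[group_key]
        totals[group] = totals.get(group, 0) + 1
        if classify(rec):
            selected[group] = selected.get(group, 0) + 1
    return {g: selected.get(g, 0) / n for g, n in totals.items()}

def disparity_ratio(rates):
    """Lowest group rate divided by highest; 1.0 means all groups
    are selected at equal rates (an adverse-impact style check)."""
    return min(rates.values()) / max(rates.values())

# Hypothetical input data; "gender" is the ICC under review.
records = [
    {"gender": "M", "credit_score": 700},
    {"gender": "M", "credit_score": 650},
    {"gender": "F", "credit_score": 700},
    {"gender": "F", "credit_score": 650},
]

def model_with_icc(rec):
    # Flawed model: the ICC directly influences the classification.
    return rec["credit_score"] > 640 and rec["gender"] == "M"

def model_without_icc(rec):
    # Step 6: re-run the classification with the ICC removed.
    return rec["credit_score"] > 640

before = selection_rates(records, model_with_icc, "gender")
after = selection_rates(records, model_without_icc, "gender")
print(before, disparity_ratio(before))  # {'M': 1.0, 'F': 0.0} 0.0
print(after, disparity_ratio(after))    # {'M': 1.0, 'F': 1.0} 1.0
```

In practice the auditor would run this over the full input data set (step 3) and discuss any residual disparity, which may reflect exogenous variables rather than the model itself, with the audit client.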
Crime's Digital Transformation<p>An international police operation has taken down a gang that allegedly stole an estimated $100 million from more than 41,000 victims using malware, European police organization Europol announced this month. The gang allegedly infected computers with the GozNym malware, enabling its members to obtain online banking credentials and access to victims' bank accounts. They also used those accounts to launder the money they stole and transfer the funds to their own accounts, Europol alleges.</p><p>What sets the GozNym gang apart is its use of cloud and digital platforms to carry out its operations and recruit service providers, technical expertise, and other accomplices. A U.S. federal grand jury in Pittsburgh has indicted 10 gang members, while prosecutions are underway in Georgia, Moldova, and Ukraine. Law enforcement agencies in Bulgaria and Germany also were involved. </p><p>"The collaborative and simultaneous prosecution of the members of the GozNym criminal conspiracy in four countries represents a paradigm shift in how we investigate and prosecute cybercrime," says U.S. Attorney Scott Brady of the Western District of Pennsylvania. </p><p>But criminals are shifting the paradigm, too. A new wave of organized crime groups is using the tools of digital transformation to carry out crimes throughout the world. </p><p>"Digital transformation is making it easier not only for legitimate organizations to expand their reach, but also for fraudsters and other bad actors to expand theirs," notes the <a href="" target="_blank">2019 Current State of Cybercrime</a> report from cybersecurity firm RSA. The RSA study spotlights trends spanning mobile, legitimate platforms, and digital crime.</p><h2>Mobile</h2><p>Last year, mobile communications was the source of seven out of 10 fraudulent transactions, RSA notes. 
Such transactions via mobile apps have increased nearly seven-fold since 2015.</p><p>But it's not just fraud that has gone mobile. One in five cyberattacks could be attributed to rogue mobile apps. RSA reports that on average 82 rogue apps are identified each day. RSA expects that trend to continue this year, "especially as cybercriminals keep finding ways to introduce tactics and technologies such as phishing and malware to the mobile channel." </p><h2>Leveraging Legitimate Platforms</h2><p>Last year, RSA's report pointed out that criminals were using social media networks and messaging platforms such as Facebook, Instagram, and WhatsApp to communicate and to sell stolen credit card numbers and identities. That warning has been borne out by a 43% increase in social media fraud attacks, according to this year's report.</p><p>These platforms are attractive to criminals because they are free of charge and easy to use, the report notes. RSA predicts criminals will open more stores on social media platforms to trade in stolen identities and similar data. </p><p>Moreover, cybercriminals "are developing their own apps to increase their anonymity, avoid detection, and otherwise keep anti-fraud forces from tracking them down," RSA says. Another threat to watch is criminals exploiting on-demand service platforms such as Airbnb and Uber for money laundering and to commit fraud. </p><h2>Digital Crime</h2><p>Criminals are turning to digital technologies to aid and abet their crimes, RSA reports. For example, they are automating the process of verifying stolen user names and passwords, using account-checking tools. They also are targeting ever-more-ubiquitous Internet of Things devices. </p><p>Moreover, RSA warns that criminals are exploiting cross-channel vulnerabilities by combining mobile, cloud, and other digital channels to launch attacks. 
An example would be using social engineering tactics to have an organization's call center change the password on a victim's online account so that the criminal would have access.</p><h2>Crime as a Service</h2><p>The GozNym case highlights another trend not mentioned in the RSA report: leveraging underground criminal networks to recruit accomplices. According to Europol, the gang came together as a "cybercrime as a service" operation. Its ringleaders used Russian-language online criminal forums to connect with people who acted as hosts, money "mules," encryption providers, spammers, computer coders, and technical support.</p><p>For example, the gang's leader obtained online hosting services for the attacks from the Avalanche network, which provided services to more than 200 cybercriminals and hosted more than 20 malware campaigns. </p><h2>The Risk of Copycats</h2><p>The U.S. Justice Department reports that five of the accused GozNym members are still at large — complete with a Federal Bureau of Investigation wanted poster. But as with many technology advances, other criminals are likely to copy the GozNym gang's tactics and add their own innovations. </p><p>To protect themselves, organizations need to combine vigilance and technology. "In this way, digital transformation becomes both a critical contributing factor in the problem of growing cyber risks today — and a critical resource for solving it," RSA says. </p>Tim McCollum
Auditing the Smart City<p>As cities aggressively adopt “smart” technology — especially in the very public-facing transportation and safety arenas — municipal auditors will increasingly find themselves facing a new version of a familiar risk: cybersecurity. The underpinning of Internet-of-Things (IoT) connectedness that makes smart tech so smart is also its Achilles’ heel, offering hackers access, on a vast scale, to all kinds of complicated technologies — and the people they affect. And countering that risk may require new internal audit skills and tools. </p><p>When the technology works, smart sensors create massive amounts of data that trigger mechanical responses: roadways charge electric vehicles as they pass above; connected cars find the best parking spots. But cybercrime experts take smart tech risks — and their implications for municipalities — quite seriously, painting a dark future portrait in the event things go awry. What would happen, for example, if cybercriminals made every traffic light in a city green at the same time or scrambled the entire grid’s color cycles during rush hour? What if they completely shut down the city’s smart power grid? What if an attacker targeted water and sewage systems, tampering with automated meters that detect and respond to flood conditions? </p><p>Auditors take those risks seriously, too. “The benefits that smart and emerging technologies can deliver are accompanied by multiple new risks,” says Tonia Lediju, chief audit executive (CAE) for the City and County of San Francisco. “We need to ensure that cities have the right security governance, processes, and controls in place.”</p><h2>Smart City by the Bay</h2><p>In San Francisco, there’s a lot of smart tech to audit. Lediju says it’s one of the leading smart cities globally, and it’s working on even more smart mobility solutions — often in partnership with private companies or with the U.S. federal government. 
Initiatives include smart traffic signals, an electronic toll system with congestion pricing, and autonomous electric shuttles to Treasure Island in the San Francisco Bay. The city also uses smart parking meters that change prices according to the time and day of the week.</p><p>Lediju says her auditors tackle the new risks of smart tech head-on. The City Services Auditor Division assists the various city departments affected by new transportation technology, for example, in understanding the risks, monitoring the application controls designed to rein them in, and crafting preventive responses. Lediju says her team’s annual work plan includes auditing new technologies when deemed necessary, based on a risk assessment. </p><p>The division works closely day to day with the City and County of San Francisco’s Department of Technology, its Committee on Information Technology, and the departments adopting new technologies to ensure all risks are managed adequately, before adoption, Lediju says. She follows three key steps: understand the pipeline of emerging technologies being considered, identify risk trends, and help departments actively manage risks as they navigate relevant regulations. </p><p>In the cybersecurity space, the City Services Auditor Division “identifies systems’ vulnerabilities and risks through penetration and assessment tests, and recommends remediation,” Lediju explains. Testing encompasses several areas, including cybersecurity framework adoption, security awareness training, IT governance, systems and network security, and business continuity.</p><p>“We also contribute insight gleaned from our extensive scope of work to help departments evolve and improve their strategies and protocols to better prepare for cyberattacks,” Lediju adds. Her team’s work is based on the Cybersecurity Framework Core Functions outlined by the U.S. 
Department of Commerce’s National Institute of Standards and Technology (NIST): identify, protect, detect, respond, and recover. The City Services Auditor Division, she notes, also makes recommendations based on the CIS Controls and CIS Benchmarks guidance developed by the Center for Internet Security (CIS). “The CIS recommendations highlight for clients the numerous opportunities for control and process improvements or other enhancements that could ultimately increase their effectiveness in managing data security and fulfilling the organizations’ missions and goals in serving the city,” Lediju says. </p><h2>Sweden’s Smart Tech</h2><p>At Sweden’s Borlänge-based Trafikverket — the Swedish Transport Administration — the audit unit also gets involved early on, says Peter Funck, CAE. “The Agency,” as he calls it, is the national government authority responsible for public roads and railways; Funck’s office focuses on the planning and development phases, which is where he says his unit delivers the greatest added value. Audit and The Agency, he adds, have learned to manage large software and infrastructure development projects in similar ways, meaning audit is involved “several times before coding starts, as well as before the first spade is put in the ground,” Funck says. That’s been the case with two of Sweden’s key smart tech endeavors:</p><p></p><ul><li>The European Rail Traffic Management System (ERTMS) is a major industrial project underway in the European Union, Funck notes, and Sweden is one of the early adopters in developing and implementing it. ERTMS is a safety system that “enforces compliance by the train with speed restrictions and signaling status,” he says. </li><li>Sweden is also developing a national system for controlling and scheduling all trains that will integrate train operator scheduling. “It’s one of the biggest software-based projects ever in the country,” Funck says. 
“The project brings a lot of opportunities, but, of course, size and complexity imply challenges: Will it work? Is it safe?”</li></ul><p><br></p><p>Funck points out that his unit audited both the ERTMS and national integration projects several times, before they were even deployed on a test basis. “Those audits had different focuses,” he says, “but the common denominator has been whether internal controls provide prerequisites to make it work and make it safe.”</p><p>The projects aren’t yet far enough along for after-the-fact performance audits. But Funck notes that, in all of his office’s smart tech projects, health and safety, including terror attacks, are the largest risk concerns. “Information security often brings those risks down to some kind of acceptable level,” he says. Indeed, Funck emphasizes that available information security technology in general is up to the smart tech challenge; the bigger problem lies in people and their roles in keeping smart cities humming.</p><p>Funck adds: “There is always a need for some kind of security and safety risk acceptance in developing business processes to balance with productivity requirements.” At the end of the day, he points out, “railroads and roads are safer if we remove all trains and cars.” </p><h2>Data and Privacy Safeguards</h2><p>Jim Thompson, city auditor in the Albuquerque Office of Internal Audit (OIA), takes smart tech in stride, too, though he’s also well aware of the risks it poses — including those related to cybersecurity. “OIA performs an annual risk assessment of the city, which includes consideration of the city’s information technology risk,” he says. 
“As the city increases its use and reliance on information technologies, including smart technologies, the cybersecurity and data breach risk — as well as the liability risk — increases.” </p><p>The city’s Technology and Innovation Department maintains internal controls over IT and also uses outside experts for IT vulnerability risk assessments and intrusion testing. Thompson maintains in-house technology expertise on his team as well. One senior information systems auditor, he says, holds several IT certifications, including CISA, CITP, and ITIL v3 Foundation.</p><p>The City of Albuquerque, Thompson says, has implemented various smart technologies, including government document and data transparency, ride apps, enhanced wireless access, and online police services. Planned audit engagements assessing privacy concerns will target some of those enhancements. “Our annual audit plan this year includes an audit of all city systems and devices that contain personally identifiable information [PII],” Thompson notes. “Some of the city’s smart technologies will be included.”</p><p>Thompson says the audit will consider whether the city maintains a listing of all systems and devices containing PII and whether it has controls in place to classify and safeguard PII correctly, including intake points, release and data sharing points, and storage. It will also examine whether individuals with access to the city’s computer environment are trained on and aware of their responsibility to safeguard PII and what to do in the event of a data breach. OIA will consider federal, state, local, and contractual requirements for PII and compare the city’s current practices with the IT governance best practices recommended by ISACA’s COBIT framework, as well as NIST. 
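The test plan Thompson describes — confirm a complete inventory of PII systems exists, then verify each system carries the expected safeguards — is essentially an exception report, the kind of check a data-savvy audit team could script and rerun each cycle. The sketch below is illustrative only; the inventory fields and control names are hypothetical, not Albuquerque’s actual data model.

```python
# Hypothetical inventory of city systems, as an auditor might compile it.
systems = [
    {"name": "Parking payments", "contains_pii": True,
     "pii_classified": True, "encrypted_at_rest": True, "staff_trained": True},
    {"name": "Ride app feed", "contains_pii": True,
     "pii_classified": False, "encrypted_at_rest": True, "staff_trained": False},
    {"name": "Streetlight telemetry", "contains_pii": False,
     "pii_classified": False, "encrypted_at_rest": False, "staff_trained": True},
]

# Safeguards the audit expects for every PII system (assumed control set).
REQUIRED_CONTROLS = ("pii_classified", "encrypted_at_rest", "staff_trained")

def pii_exceptions(inventory):
    """Return (system name, missing controls) for each PII system
    that lacks one or more expected safeguards."""
    findings = []
    for system in inventory:
        if not system["contains_pii"]:
            continue  # non-PII systems are out of scope for this test
        missing = [c for c in REQUIRED_CONTROLS if not system[c]]
        if missing:
            findings.append((system["name"], missing))
    return findings

for name, missing in pii_exceptions(systems):
    print(f"{name}: missing {', '.join(missing)}")
```

Run against the sample inventory, only the ride app feed is flagged, for lacking a PII classification and staff training. The value of the exercise is less the code than the discipline: the inventory itself becomes the audit population, so nothing with PII escapes testing.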
</p><table class="ms-rteTable-default" width="100%" cellspacing="0"><tbody><tr><td class="ms-rteTable-default" style="width:100%;"><p><strong>​Down the Pike</strong></p><p>For municipal auditors who are not engaged to audit their city's smart tech right now, there's a good chance they will be soon. Indeed, Kansas City, Mo.'s Chief Innovation Officer Bob Bennett declared last year at the Smart Cities Connect Conference and Expo that municipalities that don't get on the smart tech bandwagon soon will find themselves part of a "digital Rust Belt." </p><ul><li>66 percent of cities say they're investing in smart tech, according to a 2017 report from the National League of Cities called Cities and the Innovation Economy: Perspectives of Local Leaders; one-fourth of the rest are looking into it. </li><li>International Data Corp. reported in January that worldwide spending on smart cities initiatives would reach $95.8 billion in 2019, an increase of 17.7 percent over 2018; by 2021, the total could hit $135 billion. Singapore, New York, Tokyo, and London are expected to invest more than $1 billion each this year, IDC added; the applications receiving the most funding are fixed visual surveillance, advanced public transit, smart outdoor lighting, and intelligent traffic management.</li><li>IoT Analytics said late last year that there were 17 billion connected devices worldwide; the number of IoT devices — excluding smartphones, tablets, laptops, and fixed line phones — was pegged at 7 billion. "The number of IoT devices is expected to grow to 10 billion by 2020," the firm points out, "and 22 billion by 2025."</li><li>Mobility is the most common area for smart tech investment, according to the National League of Cities report. Other key applications include lighting solutions, security, and utilities management, according to the McKinsey Global Institute 2018 report, Smart Cities: Digital Solutions for a More Livable Future. 
</li></ul></td></tr></tbody></table><h2>Protecting the Vision</h2><p>Chattanooga, Tenn., City Auditor Stan Sewell also points to cybersecurity risk associated with his municipality’s emerging technologies. And while it’s not the No. 1 priority, the city’s tech-focused initiatives provide ample reason to ensure online security issues are addressed. “It’s definitely a risk, but it’s more of a ‘black swan’ concern,” he says.</p><p>Chattanooga’s Smart City Division, which manages street lights and traffic signals, acknowledges that “technical challenges may result from our vision in cybersecurity, hacking, and privacy issues.” “Vision” in Chattanooga includes autonomous vehicles and robust vehicle-to-vehicle and vehicle-to-infrastructure communications. The city won a 2019 Smart Cities Connect Smart 50 Award, a global recognition of transformative smart city project work, for its Chattanooga Smart Community Collaborative research partnership.</p><p>Sewell’s primary concern is supervisory control and data acquisition (SCADA) systems, composed of computers and both wired and wireless data communications modules that provide remote access to and control of a city’s infrastructure processes. “SCADA systems are vulnerable to cyberattacks,” he says, “which are occurring with an increased frequency.” A cyberattacker could gain remote control of the city’s water treatment, for example, “commanding the release of wastewater or sending false pressure sensor data, resulting in a catastrophic failure of water pumps and controls.” Sewell adds: “The various smart technologies increase the number of potential access points to enter the city’s systems to gain access to other areas.”</p><h2>Tried and True</h2><p>In some municipalities, the audit function’s treatment of smart tech doesn’t differ much from how it handles other city initiatives. 
Smart tech constitutes a largely routine subject, for example, for the City Auditor’s Office in Kansas City, Mo.</p><p>City Auditor Douglas Jones says he is aware of many of the city’s initiatives, one of which earned Kansas City a 2019 Smart 50 Award; plus, he knows smart tech is “timely and topical” and that it poses some reputation risk, as well as risks related to IT and operations. But from his perspective, newness can work against a program’s auditability. “It often makes little sense to audit a program with no track record,” he says. “And there’s always risk with a new program.”</p><p>Indeed, smart tech, Jones emphasizes, is “just one more thing that would be in our universe of potential audit topics. We cover everything from airports to the zoo, and we don’t put a specific emphasis on one thing or the other.” </p><p>Austin, Texas, another 2019 Smart 50 Award recipient, also places high priority on leveraging tech. In fact, Assistant City Auditor Andrew Keegan says Austin is trying to use its technology to help save lives. “Austin is committed to a Vision Zero plan, which calls for zero fatalities or serious injuries resulting from vehicle collisions by 2025,” he explains. “Part of that plan is focused on implementing new technologies.”</p><p>But Keegan’s team likely won’t be involved until after those plans and programs have been implemented. “Selecting a particular technology to audit depends on the risk posed by the new technology as compared to other risks facing the city,” he says. “This is our practice regardless of the topic.” Indeed, right now, his office is conducting an audit related to motorists’ well-being. 
“While part of that project includes reviewing the implementation of new technology,” he comments, “the audit is focused on the general issue of traffic safety.”</p><p>Amanda Noble, city auditor in the City of Atlanta’s City Auditor’s Office, notes that Atlanta has implemented smart mobility tech, but she, too, says the audit function didn’t have a role in assessing risk on the front end. “As the city was implementing the technology, we became aware of it and went to a demonstration,” she says. “But we looked at the data the city was connected to and its potential uses in risk assessments and audit work. We hadn’t thought about auditing the technology itself.”</p><p>Would it help? “I think it would,” Noble says. She notes that her team has assessed controls on financial systems installations, but “possibly because smart tech is not financial data, the audit function has not been asked to play a role.” It can be frustrating when stakeholders view the profession as dealing primarily with financial information, she adds, given internal audit training that emphasizes the importance of foresight in all areas of the enterprise. </p><p>“So much of our role is looking backward,” Noble says. “There’s not really a process for emerging risk, unless we do it as one-offs. There’s nothing systematic.” She adds that resource constraints limit the audit function’s ability to tackle emerging issues, so new risks may not be audited until nearly a year has passed. She’d like to do more.</p><p>“Decision-makers value our input,” Noble emphasizes. “We need a way to assess and report on emerging technology.”</p><h2>Expanded Services, New Skills</h2><p>Lediju sees a balance between tried-and-true audit services and helping organizations see around the corner. “We’ll need to remain focused on our existing foundation of auditing standards and principles to detect internal control weaknesses and fraud risks,” she says. 
“But the profession must be ready to take on more of an advisory role and help cities keep pace with and get ahead of emerging risks, maintaining its unique perspective on people, processes, and governance when striving to strengthen its risk management programs.” </p><p>Because of the specialized knowledge required for new and smart technologies, she adds, internal auditors who possess a mix of business and technology skills will be needed. In fact, more of them will be needed. “Smart tech requires more internal audit resources because the pool of tools is constantly expanding and being used for various operations across government services,” Lediju explains. As a result, she says, information and software oversight and accountability, including human and technology resources, become more necessary.</p><p>Internal auditors will need to adopt new tools and techniques, she adds, such as artificial intelligence and blockchain auditing and reconciliations, to increase continuous audit activities, rapidly pinpoint control gaps, and identify nonconformance and process improvement opportunities in real time. She says her office “currently relies on outside contracting and consulting services to keep abreast of the rapidly evolving trends and practices in technology, governance, security, and privacy relevant to the respective technologies.” </p><p>Lediju adds: “With the requirements of continuing professional education and the goal to help businesses and government adopt best or leading practices, internal audit can remain a necessary and beneficial agent of change.” Maybe, in fact, the profession could do more when it comes to smart tech. </p><table cellspacing="0" width="100%" class="ms-rteTable-4"><tbody><tr class="ms-rteTableEvenRow-4"><td class="ms-rteTableEvenCol-4" style="width:100%;"><strong>​IoT Risks</strong><p><br>The risk issues every public entity project faces are amplified when the connectivity required for smart technology is at play. 
</p><ul><li><em>Human error.</em> Hackers are one kind of human risk; simple mistakes are another. Often overlooked as a threat, the public entity employees who read the meters and monitor the system outputs — and decide when to override — are likely inexperienced with smart city technology, <em>Risk Management</em> magazine noted recently. Their ethics and judgment may also come into play in a smart tech crisis.</li><li><em>Technical difficulties.</em> The connectedness needed for smart technology to work may require integrating powerful, cutting-edge IT infrastructures with, as Travelers calls them in its 2017 Public Safety for the Smart City report, "legacy IT infrastructures that may not be fully up to the task of handling the extreme volumes and types of data." This includes, for example, the vehicle-to-infrastructure data that smart devices generate. Plus, sometimes software fails, or lightning strikes, or the power goes out. </li><li><em>Complicated connections.</em> Many smart tech projects, especially in transportation, involve public sector entities, academia, and private industry — and each often has its own data management infrastructure already in place. Many also involve multiple — in some cases, dozens of — local, county, and state jurisdictions. The City of San Diego General Plan, for example, includes a "mobility element" that will guide implementation of the city's part in the multi-stakeholder Mobility 2030 Regional Transportation Plan prepared by the San Diego Association of Governments, an organization of 18 local and county public entities. 
In addition, Southern California is a national Intelligent Transportation System Priority Corridor Program participant; the Southern California Association of Governments represents six counties and 191 cities.</li></ul><p> <br>Even the familiar risks posed by smart tech can cause greater concern to internal auditors because of their vast scale — especially if, as <em>Risk Management</em> puts it, "policies, procedures, and training do not adequately address the new capabilities." Additional education and new tools may be required to meet the challenge.</p></td></tr></tbody></table>Russell A. Jackson
Fit for Digital<p>Internal auditors better get fit. Digitally fit, that is. They will need to be in top shape to play an active role in digital transformation initiatives.</p><p>Digital fitness means embracing new technology to gain insights that can help digital transformation. It can make risk professionals "important partners and leaders in helping their organizations get better benefits from their digital initiatives," says Jim Woods, global risk assurance leader at PwC.</p><p>But many internal audit, risk, and compliance functions are falling short, according to PwC's latest <a href="" target="_blank">Risk in Review Study</a>. The global study surveyed more than 2,000 board members, CEOs, and senior executives, as well as internal audit, compliance, and risk professionals. </p><p>Woods says internal audit and other risk functions are at a "critical juncture" in which automation, data analytics, and other technologies are transforming businesses. But that transformation is creating new risks, as well. "Digital transformation is also driving the potential for identifying risk and making smarter decisions," he adds. </p><p>Woods points to findings from PwC's most recent CEO survey in which about one in five CEOs said they receive risk exposure data that is comprehensive enough to make long-term decisions. That number hasn't increased in 10 years. </p><p>"Dynamics" is what PwC labels internal audit, compliance, and risk functions that are in the top quartile of surveyed organizations. These functions are developing digital capabilities faster, are confident in taking risks that are consistent with their strategies, manage transformation-related risks more effectively, and get better-than-expected value from digital investments compared with their peers. </p><p>The Risk in Review study outlines six components of digitally fit organizations. 
Internal audit functions might consider them a fitness regime.</p><h2>All-in on the Organization's Digital Plan</h2><p>Dynamics have aligned their function's digital strategy with that of their organization, enabling them to provide "strategic advice and assurance over the new and changing risks that digital transformation brings," the report notes. Three-fourths of dynamic functions seek specific outcomes from their digital investments, and 73% change performance metrics to support behaviors and manage against an "aspirational" digital operating model. Fewer than half of other functions do the same.</p><h2>Boost Digital Skills and Talent</h2><p>Auditors and risk professionals in dynamic functions have become data-driven and use digital tools to provide risk insights at the pace of the organization's transformation efforts, the report says. Executives say their organizations need critical thinking, technology, analytics, cybersecurity, project management, and change management skills. Eight in 10 dynamic functions use performance metrics to assess and reward new digital ways of working, and seven in 10 have created a talent management program to hire digital personnel or enhance the skills of existing people. </p><h2>Find the Right Fit for Emerging Technologies</h2><p>Overall, one-third of surveyed functions are using technologies such as artificial intelligence (AI), the Internet of Things (IoT), and robotic process automation. Dynamics are more likely than other functions to automate their activities to free people to work on more valuable analyses and to expand risk coverage. Thirty-six percent of dynamic functions use IoT sensors to respond to risks, and 39% use AI for population testing, controls, or risk modeling. Those are more than double the percentages of other functions. 
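The population testing mentioned above can be concrete and simple: rather than sampling, a script examines every record in a file and surfaces the exceptions. The sketch below illustrates the idea with a hypothetical payments layout (the fields, IDs, and matching rule are assumptions for illustration, not any surveyed function's actual method), flagging candidate duplicate payments across an entire disbursement file.

```python
from collections import defaultdict

# Hypothetical payment records: (payment_id, vendor, amount, date)
payments = [
    ("P001", "Acme Paving", 12500.00, "2019-03-01"),
    ("P002", "Acme Paving", 12500.00, "2019-03-04"),
    ("P003", "City Lights Co", 880.50, "2019-03-05"),
]

def duplicate_payment_candidates(records):
    """Test 100% of payments: group by (vendor, amount) and flag any
    group containing more than one payment as a candidate duplicate."""
    groups = defaultdict(list)
    for pid, vendor, amount, date in records:
        groups[(vendor, amount)].append(pid)
    return {key: pids for key, pids in groups.items() if len(pids) > 1}

# P001 and P002 share a vendor and amount, so they are flagged for follow-up.
print(duplicate_payment_candidates(payments))
```

Flagged pairs are leads for auditor follow-up, not findings in themselves; the point is that the test covers the whole population at negligible cost, which is what sampling can never do.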
</p><h2>Enable the Organization to Act on Risk in Real Time</h2><p>To support transformation decisions, internal audit and other risk professionals must provide insights about a fast-changing set of risks that can impact the organization quickly. Nearly three-fourths of dynamic functions are redesigning current processes to deliver services and developing new services for stakeholders. Half are using intelligent automation or machine learning to prioritize risks. </p><h2>Engage Decision-makers of Key Digital Initiatives</h2><p>Audit and risk functions that provide the most value to transformation projects are in communication with decision-makers, participating in key meetings and consulting on projects and plans. About eight in 10 dynamic functions use dashboards or visualization tools to provide more strategic risk reports to the board. A similar number influence strategic decisions about digital initiatives.</p><h2>Provide a Consolidated View of Risks</h2><p>Dynamic audit and risk functions collaborate across the lines of defense on digital projects. With this component, the numbers are small, though. One in five dynamic functions has a common policy framework and a single set of risk metrics or key performance indicators. About one-fourth provide consolidated reports to the board. </p>Tim McCollum