An Important Cyberrisk Framework<p>Perhaps the most important cyberrisk framework is that published by the U.S. National Institute of Standards and Technology (NIST). Recently, NIST shared for comment a proposed update to its framework.</p><p>You can <a href="" target="_blank">download the document and view related videos here</a>.</p><p>Here are some key excerpts from the executive summary:</p><ul><li>Similar to financial and reputational risk, cybersecurity risk affects a company's bottom line. It can drive up costs and impact revenue. It can harm an organization's ability to innovate and to gain and maintain customers.</li><li>The Framework focuses on using business drivers to guide cybersecurity activities and considering cybersecurity risks as part of the organization's risk management processes.</li><li>The Framework enables organizations — regardless of size, degree of cybersecurity risk, or cybersecurity sophistication — to apply the principles and best practices of risk management to improving the security and resilience of critical infrastructure.</li><li>The Framework is not a one-size-fits-all approach to managing cybersecurity risk for critical infrastructure. Organizations will continue to have unique risks — different threats, different vulnerabilities, different risk tolerances — and how they implement the practices in the Framework will vary. Organizations can determine activities that are important to critical service delivery and can prioritize investments to maximize the impact of each dollar spent. Ultimately, the Framework is aimed at reducing and better managing cybersecurity risks.</li></ul><p><br></p><p>Later, the authors say this:</p><p><span class="ms-rteStyle-BQ">"Enterprise risk management is the consideration of all risks to achieving a given business objective. Ensuring cybersecurity is factored into enterprise risk consideration is integral to achieving business objectives. 
This includes the positive effects of cybersecurity as well as the negative effects should cybersecurity be subverted."</span></p><p>There's a good amount of material to like.</p><ul><li>The framework is risk-based and talks about, in my words, investing in cybersecurity commensurate with the level of risk.</li><li>When it talks about risk, it is risk to the achievement of business objectives. The authors don't talk about protecting information assets, but rather drive to what is important to the success of the business.</li><li>It uses a maturity model (although it doesn't describe it as such) as a useful way to assess the effectiveness of the cyber program.</li><li>It makes the point that those responsible for the cyber program need to be at an appropriate level within the organization.</li><li>It emphasizes that the management of cyberrisk needs to be integrated within the broader enterprise risk management activity.</li></ul><p><br></p><p>However, there are a few areas where I would have liked to see more discussion.</p><ul><li>Appendix B is a list of objectives for the cyber program. However, in my opinion it is over-simplified and probably incomplete. For example, I do not see anything about protecting the organization from the effects of social engineering.</li><li>While detection is emphasized, the need for <em>timely</em> detection is not mentioned.</li><li>The framework mentions the need for continuous improvement and that cyberrisk is dynamic. However, the sea is constantly rising, and defenses have to adapt at least as fast as the risk changes. Investment needs to be in resources that enable threats to be monitored and defenses upgraded continuously.</li><li>The task of assessing the likelihood of a breach is hardly covered at all. There is general acceptance that a breach is almost inevitable, so the emphasis perhaps should be on the likelihood of different degrees of impact. 
Past experience may not be a good indicator, as prior breaches may not have been detected — leaving management with the unjustified belief that the incidence of breach is lower than it really is.</li><li>The framework suggests that the organization should have an inventory of all assets or points on the network. However, with the extended supply chain, the Internet of Things, and the fact that employees and other individuals are hacked as entry points, the problem is far more severe than is presented. I am not persuaded that an inventory can ever be considered complete.</li><li>While the framework talks about integration with the enterprise risk management program, it is important to note that cyber may be one of several risks that might affect the achievement of one or more business objectives. Decisions about acceptable levels of risk to an objective should consider all these risks, not just one. In other words, cyber and other risks to an objective may appear to be at an acceptable level individually, but the aggregate effect may be intolerable and require action.</li><li>The framework references the ISO 31000:2009 global risk management standard (curiously not the COSO ERM Integrated Framework) but defines "risk" in its own way. It also uses the term "risk tolerance" in its own way, inconsistent with that of COSO or ISO. (It is essentially the same as COSO's risk appetite.)</li></ul><p><br></p><p>A framework is simply that: a framework that any organization can build out to suit its situation and needs.</p><p>I encourage everybody to consider the document, respond with suggestions for improvement, and perhaps use it to assess and then upgrade your organization's cyber program.</p><p>Your comments?</p><p><br></p>Norman Marks
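The aggregation point above is easy to make concrete: several risks to one objective can each sit within tolerance individually while their combined exposure does not. A minimal Python sketch, with invented likelihoods, impacts, and tolerance figures (none of these numbers come from the NIST framework):

```python
# Hypothetical illustration: each risk to an objective looks acceptable on
# its own, but the aggregate expected loss breaches the objective's tolerance.

risks = {
    "cyber breach": {"likelihood": 0.10, "impact": 4_000_000},
    "supplier failure": {"likelihood": 0.05, "impact": 6_000_000},
    "regulatory change": {"likelihood": 0.20, "impact": 1_500_000},
}

PER_RISK_TOLERANCE = 500_000    # expected-loss ceiling for any single risk
OBJECTIVE_TOLERANCE = 900_000   # ceiling for all risks to the objective combined

def expected_loss(risk):
    return risk["likelihood"] * risk["impact"]

individually_ok = all(expected_loss(r) <= PER_RISK_TOLERANCE for r in risks.values())
total = sum(expected_loss(r) for r in risks.values())

print(f"Each risk within tolerance: {individually_ok}")               # True
print(f"Aggregate expected loss:    {total:,.0f}")                    # 1,000,000
print(f"Aggregate acceptable:       {total <= OBJECTIVE_TOLERANCE}")  # False
```

Every individual exposure clears its ceiling, yet the aggregate exceeds what the objective can tolerate — exactly the situation the framework does not address.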
Deloitte Shares a List of "Risk" Trends to Watch in 2017 and Beyond<p>Rather than a list of top risks, the people at Deloitte suggest that there are a number of trends "that have the potential to significantly alter the risk landscape for companies around the world and change how they respond to and manage risk."</p><p>They share 10 in <a href="" target="_blank">The Future of Risk: New Game, New Rules</a>.</p><p>I like the way they start:</p><p><span class="ms-rteStyle-BQ">The risk landscape is changing fast. Every day's headlines bring new reminders that the future is on its way, and sometimes it feels like new risks and response strategies are around every corner. The outlines of new opportunities and new challenges for risk leaders — indeed, all organizational leaders — are already visible.</span></p><p><span class="ms-rteStyle-BQ">What you'll see is that risk's onset and consequences, and the entire nature of the risk discipline, are evolving. The good news? The strategic conversation around risk is changing too. For leaders today, risk can be used as a tool to create value and achieve higher levels of performance. It's no longer something to only fear, minimize, and avoid.</span></p><p>For the moment, let's put aside our differences about the meaning of words such as "risk" and "risk source." </p><p>The 10 trends they have listed merit consideration. As Deloitte suggests, we should all consider these trends. Do we agree with the facts as presented? Will they affect us and, if so, how? How should we respond?</p><p>Please read the report, which is fairly short, before coming back to this discussion.</p><p>The first trend is <span style="text-decoration:underline;">cognitive technologies</span>, which is a fancy term that includes big data analytics, predictive analytics, AI, machine learning, and so on. 
Deloitte says it is about "using smart machines to detect, predict, and prevent risks in high-risk situations."</p><p>Broadly speaking, every organization should be watching and exploring ways to use new technologies, and advances in existing ones, for this purpose.</p><p>But more might be done.</p><p>Machine learning and similar technologies may not only detect and analyze patterns, but actually make decisions and initiate action. Smart software, as well as machines, is starting to replace humans who perform repetitive analysis and response.</p><p>The second is "<span style="text-decoration:underline;">Controls become pervasive</span>." Deloitte is not talking about internal controls here. They are talking about controls automation. They could have easily rolled this into the first trend, since it's really about the use of technology for risk monitoring.</p><p>The third is quite different: It's about advances in <span style="text-decoration:underline;">behavioral science</span>. I'm not sure what they expect to be different in 2017 and beyond, because the study of human behavior is not new at all. The key is whether the science will be <span style="text-decoration:underline;">used</span>.</p><p>Deloitte then uses the term "<span style="text-decoration:underline;">vigilance</span>" for its next trend. This is another fancy word; <strong>detection </strong>would have worked just as well, perhaps more accurately, but vigilance is more exciting and appealing to the consumer of Deloitte services.</p><p>Yes, more attention needs to be placed on risk monitoring and detection controls, especially with respect to cyber.</p><p>The next one is "<span style="text-decoration:underline;">risk transfer</span>." Arguably, risk is never transferred. It can only be shared or mitigated. Also, preventive controls do not eliminate risk; they just reduce it to hopefully acceptable levels, because there is always the possibility that the controls will fail. 
The only change in this area I am aware of is the emergence of (limited) cyber insurance.</p><p>Deloitte thinks that the fact that <span style="text-decoration:underline;">innovation outpaces regulation</span> is a trend. I am not persuaded. However, the relaxation of regulation under President Trump would be a change — but may not be in effect long-term if he is not re-elected in four years.</p><p>Using <span style="text-decoration:underline;">risk management to drive performance</span> is not a new thought. I have been pressing for it for a while myself. If it becomes a reality, that would certainly be an important trend.</p><p>"<span style="text-decoration:underline;">Collective risk management</span>" is an interesting concept. However, laws and regulations can limit the sharing of information.</p><p>"<span style="text-decoration:underline;">Disruption</span> dominates the executive agenda" is not new. I agree with Deloitte that it should be expected to increase this year and into the future.</p><p>Then Deloitte picks <span style="text-decoration:underline;">reputation </span>risk — again, not really new. The change is that new technologies can help us address it.</p><p><br></p><p>Overall, the report offers a couple of points that should stimulate some thinking. But most of this should be ho-hum for most of us.</p><p>What do you think?</p><p><br></p>Norman Marks
A Holistic Approach to IT Risk<p>With IT ingrained in most business processes, IT risk management has become a critical part of enterprise risk management. The rise of cybersecurity incidents in recent years has heightened the need for directors and executive management to understand, evaluate, and respond to IT risks. Yet, managing these risks can be daunting because of the technical complexity and far-reaching outcomes of an IT risk event.<br></p><p>Although it is tempting for the board and management to focus on cyberrisks, internal audit must consider the full range of IT risks and take a more holistic view of the business. Gaining such a view is one of the advantages of using ISACA’s COBIT framework to address risk management challenges. <br></p><p>The latest version, COBIT 5, released in 2012, can help internal auditors develop an audit plan to address IT risks, set IT audit objectives, and define the scope for IT audits. It can help simplify complex issues by giving auditors best practices and conceptual guidance on how to categorize risks, identify risk events, and understand the relationship between risk events and value creation.<br></p><p>Moreover, COBIT emphasizes the value of assessing a process from end to end, instead of auditing components of that process. In addition, the separation of governance from management highlights the need to audit IT risks related to IT governance and management, which organizations tend to overlook.<br></p><h2>COBIT Explained</h2><p>COBIT is an enterprisewide IT governance and management framework designed to enable organizations to maintain a balance between realizing benefits from IT and optimizing risk levels and resource use. It is based on five principles: meeting stakeholder needs, covering the enterprise end-to-end, applying a single integrated framework, enabling a holistic approach, and separating governance from management. 
<br></p><p>COBIT 5’s basic premise is that goals cascade in an organization — that is, stakeholder needs are translated into enterprise goals, which set the direction for IT goals and enabler goals. Further, the framework provides guidance on IT risk management from a functional perspective (i.e., what is needed to build and sustain core risk governance and management activities), and a risk management perspective (i.e., how the COBIT enablers can assist the core risk management processes of identifying, analyzing, and responding to risk). <br></p><p>COBIT 5 describes enablers as factors that “individually and collectively influence whether something will work.” They can be used in both IT risk management and IT audit planning.<br></p><h2>Enabling Audit Planning</h2><p>Whether developing an audit plan or planning for an individual audit, internal auditors need to determine the audit objectives, scope, timing, resource requirements, and process. COBIT suggests auditors take a holistic view of the business when planning an audit. <br></p><p>Auditors can use the seven COBIT enablers as the foundation for identifying IT audit objectives and defining the audit’s scope. These enablers are:<br></p><ul><li>Principles, policies, and frameworks that translate the desired behavior into practical guidance that can be managed.</li><li>Processes that support achievement of a set objective.</li><li>Organizational structures that are important for decision-making.</li><li>Culture, ethics, and behavior of individuals, which explain the human interactions that influence governance and management. </li><li>Information, including all information produced and used in the business.</li><li>Services, infrastructure, and applications, including the IT used by the organization.</li><li>People, skills, and competencies, including people who are required for successful completion of all activities. 
</li></ul><p>Because COBIT provides 36 generic risk scenarios, internal auditors should begin by working with management to prioritize those scenarios for their organization. COBIT uses a primary and secondary ranking to show the impact of each risk scenario on each type of risk. COBIT categorizes the risk types based on whether the risk is strategic (IT benefit/value enablement), operations-related (IT operations/service delivery), or project-related (IT program/project delivery). <br></p><p>Second, internal auditors can identify activities pertaining to each of the enablers for the prioritized risk scenarios. For example, organizations face IT risk when selecting IT programs (risk scenario), a risk that primarily affects the organization’s strategy and secondarily its operations. To manage this risk, management can implement a policy that indicates the types of IT investments that are a priority (policy), have a formal process to select IT projects (process), have an IT steering committee (organizational structure), communicate the importance of technology throughout the organization (culture), define IT investment selection criteria (information), have a program management application (application), and involve appropriate managers in the decision-making process (people). <br></p><p>Third, internal auditors can rank activities based on an approach that best fits the organization. For example, auditors may use high/medium/low priorities, a primary/secondary classification, or a rank order based on weights to identify the areas that need attention. 
Finally, once the activities are ranked, auditors can plan the audit by focusing first on the primary/high-priority activities before turning attention to secondary activities, given resource, time, and personnel constraints.<br></p><h2>An Eye on the Big Picture</h2><p>COBIT’s recommended best practices can establish a foundation for providing assurance on the adequacy, reliability, and integrity of an organization’s information systems, regardless of its industry, technology infrastructure, or geographic location. This foundation can help internal auditors understand how the organization operates and where it wants to go. <br></p><p>Moreover, the COBIT guidance recognizes that IT risk exposure differs among organizations based on management’s risk appetite, involvement, and risk response. Internal auditors can use the framework to understand the nature of IT risks that are unique to their organization and develop an intuition that helps them recognize red flags, internal control weaknesses, and fraud.</p><p>Further, COBIT can help internal auditors identify and organize audit findings that can be instrumental in establishing and monitoring the organization’s IT risk management practices. The framework enables auditors to work at a detailed level while also keeping the big picture in mind. <br></p>Nishani Edirisinghe Vincent
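The three-step prioritization described above lends itself to a short sketch. The Python below is a hypothetical illustration: the scenario names, enabler activities, weights, and scores are invented for the example, not taken from COBIT itself.

```python
# Hypothetical sketch of the prioritization steps described above:
# 1) prioritize risk scenarios with management, 2) identify enabler
# activities for the top scenario, 3) rank activities by weighted score
# so the audit plan tackles the highest-scoring activities first.

# Step 1: risk scenarios with primary/secondary impact classifications.
scenarios = [
    {"name": "IT program selection", "primary": "strategic", "secondary": "operations", "priority": 1},
    {"name": "Service interruption", "primary": "operations", "secondary": None, "priority": 2},
]

# Step 2: enabler activities for the top scenario, scored 1 (low) to
# 5 (high) on risk relevance, with a weight per enabler agreed with management.
activities = [
    {"enabler": "policies", "activity": "IT investment policy", "weight": 0.25, "score": 4},
    {"enabler": "processes", "activity": "project selection process", "weight": 0.30, "score": 5},
    {"enabler": "structures", "activity": "IT steering committee", "weight": 0.20, "score": 3},
    {"enabler": "people", "activity": "managers in decision-making", "weight": 0.25, "score": 2},
]

# Step 3: weighted rank order — the highest weighted score is audited first.
ranked = sorted(activities, key=lambda a: a["weight"] * a["score"], reverse=True)
for a in ranked:
    print(f'{a["weight"] * a["score"]:.2f}  {a["activity"]}')
```

Any of the ranking approaches the article mentions (high/medium/low, primary/secondary, weights) fits this shape; only the sort key changes.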
Do We Know How to Audit Technology-related Risks?<p>I just read through the latest ISACA/Protiviti survey, <a href="" target="_blank">A Global Look at IT Audit Best Practices</a>.</p><p>It has a wealth of generally useful information, and I recommend it to all internal audit leaders but not to board members — the level of detail is too much for their use. The executive summary is the most I would have a director read. But it would be better to have the CAE summarize the report for them, focusing on what lessons should be learned for their particular organization.</p><p>Some things surprised me and others disappointed me.</p><p>My most important issue is that we need to stop talking about IT audit.</p><p>We should be talking about auditing risks relating to technology!</p><p>In the days of yore, the IT department owned and ran all the technology — with the exception of minor pieces of so-called user-managed software.</p><p>But not in 2016.</p><p>A good friend of mine, Gene Kim, is co-author of <a href="" target="_blank"><em>The Phoenix Project: A Novel about IT, DevOps, and Helping Your Business Win</em></a>. I recommend it to anybody interested in technology and today's approach to running the IT function.</p><p>Recently, I read <a href="" target="_blank">a review of <em>The Phoenix Project</em> by Sara Hruska</a>. 
She makes a few pertinent points:</p><ul><li>Pretty much every business is so dependent on technology that the distinction between leading the IT function and the CEO/chief operating officer role is diminishing.</li><li>The success of any organization can depend on the ability of the IT function to deliver, at speed, technology solutions that will drive the business.</li></ul><p><br></p><p>So, my first point is that the topic should no longer be the IT function, but the development, maintenance, and use of technology across the extended enterprise.</p><p>Let's talk about <em>technology</em> auditing.</p><p>Then there's my constant drumbeat comment that there is no such thing as IT risk.</p><p>It's technology-related <em>business </em>risk.</p><p>What could go wrong when it comes to the development, maintenance, or use of technology that would significantly affect the achievement of <em>business</em> objectives?</p><p>For that reason, there should not be a separate IT audit plan. It should be, as Protiviti reports is more often than not the case, part of an integrated audit plan that is updated as often as risks change.</p><p>According to Protiviti, about half the respondents only update their (IT) audit plan annually.</p><p>That simply won't do in an era of dynamic change, especially around technology and its use.</p><p>I find it curious that, despite the point made by Sara Hruska, the ability to identify the potential for disruptive technology to drive the organization forward is not among the top technology challenges in the Protiviti report. Perhaps it is because that was not an option Protiviti allowed respondents to select. More likely, though, it is because practitioners simply don't pay enough attention to the problem.</p><p>Is that correct?</p><p>Maybe Protiviti thought that their question about auditing IT governance would cover it. But, IMHO, a single audit of IT governance is not recommended. 
The topic is broad, and practitioners should assess only those aspects of IT governance that are most critical to their business.</p><p>Other points of interest in the survey results:</p><ul><li>Nearly half believe their IT department is not aware of all of their organization's connected devices (e.g., connected thermostats, TVs, fire alarms, cars).</li><li>83 percent of respondents say cyberattacks are among the top three threats facing organizations today, and only 38 percent say they are prepared to experience one. — Comment: I wonder if they have assessed the <em>business</em> risk of a breach.</li><li>The study also found that only 29 percent of the respondents are very confident in their enterprise's ability to ensure the privacy of its sensitive data.</li><li>Only 65 percent said their CAE has sufficient knowledge to discuss IT audit matters with the audit committee. — Comment: that is dreadful.</li><li>Half or fewer of companies have their CAE or IT audit lead meet regularly with the chief information officer!</li><li>Where there is a corporate ERM framework, less than half the IT audit work is integrated with it.</li><li>Only about half are doing a significant or even a moderate amount of work on new technology initiatives.</li></ul><p>This is a disappointing state of affairs. I was an IT auditor for many years before becoming a CAE and always made sure my team was involved in every major technology initiative. The IT audit staff was generally about a third of the team — and I am talking about the period from 1990 to 2012!</p><p>Today, technology-related risk is huge and merits a lot more attention than, judging from the study, it is getting.</p><p>What do you think?</p><p>What jumps out at you from the survey?</p><p><br></p>Norman Marks
What's Your Cyber Risk Appetite?<p>While drafting the report for a client on a recent information security audit, I found nothing unexpected in the findings. The usual suspects lined up: access control, physical security, and network security. But there was something missing: the elephant in the room. There was no defined or formalized statement of the client's information security risk appetite.</p><p>Typically, organizations do not formally consider and document their information security risk appetite. Although most organizations have an information security policy framework and supporting processes and procedures, many of those policies seem to have been written without an end goal in mind. Specifically, they don't state that the policy is based on an information security risk appetite position or statement. Organizations spend significant resources on information security, but if they do not know what systems and data are to be secured, and to what extent, how do they go about securing them?</p><p>A first step toward drafting a risk appetite statement should be undertaking an internal information security risk assessment to determine where the organization is and where it needs to be. This assessment will involve facing some truths that may not be palatable to senior management, but it will help identify the organization's unique risks and what it needs to do to address them.</p><h3 style="letter-spacing:normal;">Work up an Appetite</h3><p>The Committee of Sponsoring Organizations of the Treadway Commission's <em>Enterprise Risk Management–Integrated Framework</em> defines <em>risk appetite</em> as "The degree of risk on a broad-based level that a company or another entity is willing to accept in pursuit of its goals." 
A June 2009 study by insurance and risk company Marsh and the University of Nottingham, Research Into the Definition and Application of the Concept of Risk Appetite, breaks risk appetite into five categories:</p><ol><li>A limit or boundary set on the risk heat map (usually the top right-hand column).</li><li>Economic measures (including capital changes/impact, profit or loss, and tolerable levels).</li><li>Changes in credit ratings.</li><li>Changes in targets or thresholds of key indicators.</li><li>Qualitative statements (e.g., zero tolerance for license breaches or loss of life).</li></ol><p>The appetite for security risk should be based on the organization's overall risk appetite. The consequence and likelihood of the risk occurring should determine the level of acceptable risk. For example, the impact of not conducting periodic user access reviews on applications may be rated as "medium," which is within the organization's defined risk appetite. Consequently, management can prioritize resources for taking action based on the appetite it has set. In contrast, a denial-of-service risk may have the capacity to bring the organization's website down, so the rating of this risk may be outside tolerable levels and require appropriate emergency action. </p><p>The organization needs to articulate its risk thresholds and then obtain sign-off from management. A risk-mature organization may have multiple levels of risk appetite statements across platforms and technologies. The key to success is aligning these area-specific risk statements with the overall information security risk appetite and the organization's risk appetite statement. 
</p><p>Some areas where risk appetite may be considered include:</p><ul><li>Asset management.</li><li>Access control.</li><li>Cryptography.</li><li>Physical and environmental security.</li><li>Operations security.</li><li>Communications security.</li><li>System acquisition, development, and maintenance.</li><li>Supplier relationships.</li><li>Information security incident management.</li><li>Business continuity management.</li></ul><h3 style="letter-spacing:normal;">Make a Statement</h3><p>The organization's information security risk statement should be based on its overall risk statement. For example, a financial institution's information security risk appetite statement may be pitched and agreed to at a high level of detail prescribed by regulatory authorities, while a start-up company may provide less detail. Factors influencing the standard could be the number of customers, the financial impact, and the level of risk senior management and the board are willing to accept. </p><p>An example of an organization's overall risk appetite statement is:</p><p><span class="ms-rteStyle-BQ"><em>The organization has a tolerance for risk that will allow it to achieve its business objectives in a manner that is compliant with the laws and regulations in the jurisdiction in which it operates. We specifically will not tolerate any negative impact on employee and customer health and well-being.</em></span></p><p>Based on this overall risk appetite statement, the organization's information security risk appetite statement could be: </p><p><em class="ms-rteStyle-BQ">The organization has a low risk appetite for the loss of its business and customer data. </em></p><p>Moreover, information security risk appetite statements for specific areas could include:</p><ul><li>Asset Management: The organization has a medium risk appetite for physical information security assets and will track assets greater than US$2,000. 
Information assets will be protected per the organization's data classification framework.<br></li><li>Access Control: The organization has a low risk appetite for unauthorized access. All access to the organization's mission-critical systems will be controlled via biometric authentication. <br></li></ul><h3 style="letter-spacing:normal;">Defining Acceptable Risk</h3><p>Having an information security risk appetite statement ensures the organization has defined what it considers an acceptable level of risk. Without such a statement, the organization is saying either that all information is important and will be protected, or that no information is important and therefore will be freely available. Both of these scenarios could be a survival risk for the organization in the long term.</p><p>Information security risk appetite is the next step in an organization's maturing understanding of risk management. By giving information security special attention, the organization is acknowledging that this area needs to be addressed specifically.</p>Shannon Buckley
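The logic behind area-specific appetite statements can be made concrete with a short sketch. The Python below is a hypothetical illustration echoing the user-access-review and denial-of-service examples above; the appetite levels and assessed ratings are invented:

```python
# Hypothetical sketch: compare assessed risk ratings against area-specific
# appetite statements. A rating above the stated appetite triggers action;
# a rating at or below it can be accepted and deprioritized.

LEVELS = {"low": 1, "medium": 2, "high": 3}

# Area-specific appetite: the highest rating management will accept per area.
appetite = {
    "access control": "medium",
    "operations security": "low",
}

# Assessed risks: rating derived from consequence and likelihood.
assessed = [
    {"risk": "no periodic user access reviews", "area": "access control", "rating": "medium"},
    {"risk": "denial of service takes website down", "area": "operations security", "rating": "high"},
]

def needs_action(risk):
    """True when the assessed rating exceeds the area's stated appetite."""
    return LEVELS[risk["rating"]] > LEVELS[appetite[risk["area"]]]

for r in assessed:
    status = "ACTION REQUIRED" if needs_action(r) else "within appetite"
    print(f'{r["risk"]}: {status}')
```

The point of the sketch is that neither decision is possible until the appetite per area has been articulated and signed off; without the `appetite` table, every finding is an argument.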
Software Assets, Hidden Risks<p>By Huzaifa Hussain and Syed Salman<br></p><p>Most organizations today use a wide range of software to help serve their customers and manage their operations. Software includes operating systems, applications, network management programs, enterprise resource planning solutions, and time-sheet management systems. </p><p>Washington, D.C.-based software industry advocacy organization BSA's <a href="" target="_blank">Global Software Survey</a> finds that 39 percent of software installed globally in 2015 was not licensed appropriately. A 2014 Gartner survey noted that 68 percent of respondents reported having one or more software license audits within the past year. Moreover, according to a 2013 Cherwell Software <a href="" target="_blank">report</a> (PDF), 57 percent of the 178 North America-based IT professionals surveyed said their organization owed money to the vendor at the conclusion of a software audit. Of those organizations that owed money, the largest subset owed between US$50,000 and US$250,000. Nearly 60 percent of respondents said license agreements are difficult to understand or interpret.</p><p>Software license compliance is a global problem, says Tariq Ajmal, IT risk advisory leader for a large professional services firm in the Middle East, who has been involved in software asset management (SAM) reviews for large organizations. "Many large organizations in the Middle East have vendor license exposures of over US$1 million," he says.</p><p>Given these statistics, IT auditors should include SAM reviews in their audit plans to ensure all of their organization's software complies with its license agreements. 
When organizations do not abide by contractual agreements, it may result in spending on unused software licenses; failure to address controls over software procurement, asset tracking, and retirement; financial exposure for noncompliance with software agreements; and significant unrecorded liabilities.</p><p>The objectives of a SAM review are to:</p><ul><li>Provide an integrated view of installed software to allow a one-to-one reconciliation between usage and purchased/licensed records.</li><li>Review the organization's process to enable an effective software management life cycle.</li></ul><p> <br> </p><p>IT auditors can perform two broad SAM engagements: auditing the SAM process itself, and assessing compliance with software licenses. </p><h2>The SAM Process</h2><p>The software license acquisition and inventory process can be hindered by a lack of communication between the organization's procurement department and the individuals who perform SAM activities. IT auditors should assess whether communication and coordination between these parties is adequate and allows for accurate tracking of the organization's software assets.</p><p>To prepare to perform this audit, auditors can refer to ISO/IEC standard 19770-1:2012: Information Technology—Software Asset Management. One key area to review is organizational management processes such as corporate governance processes, roles and responsibilities, and the adequacy of SAM policies and procedures.</p><p>Another area to review is core SAM processes. These include identification of software assets, baseline software inventory and license compliance, security of software assets, and operational management processes and interfaces for SAM. 
</p><p>In addition, the audit should assess process interfaces for SAM, including agreement and contract management, the software acquisition process, change management, the software development process, problem and incident management, and the software retirement process.</p><h2>Software License Assessment</h2><p>The key focus of software license audits is establishing a baseline for software. Specifically, auditors should compare the deployment of software throughout the organization with the number of licenses purchased as stated in the software licensing agreement. This comparison typically will identify cases of overdeployment or underdeployment of software, usage of unauthorized or pirated software, and software contract violations. Auditors will need a deep technical understanding of the software being reviewed because the structure and licensing metrics of agreements can vary greatly.</p><p>Auditors should prioritize which software to select for such reviews using a risk-based approach. Prioritization could be based on factors such as the number of deployments and the value of the software licensing agreement. </p><h2>A Solid Foundation</h2><p>IT auditors should understand that software assets bring with them serious legal, reputational, and financial licensing risks that must be mitigated appropriately by management. Conducting SAM reviews can assure the organization that it complies with all of its legal obligations, uncover any hidden liabilities the organization might face, and ensure software vendor audits progress smoothly. </p><p>Such audits can be a cornerstone of a robust SAM program that helps organizations save costs by optimizing their deployments to better suit the licensing metrics that are most economical. Moreover, an effective SAM program can help the organization establish a solid foundation to become secure and resilient. 
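The baseline comparison described above — installed deployments reconciled against purchased entitlements — can be sketched in a few lines of code. This is an illustrative sketch only: the product names, counts, and simple per-install metric are hypothetical, and a real SAM review would work from discovery-tool exports and the specific licensing metrics defined in each agreement.

```python
from collections import Counter

def reconcile_licenses(installed, entitlements):
    """Compare installed-software counts against purchased license counts.

    installed: list of (machine, product) pairs from a discovery scan.
    entitlements: dict mapping product -> number of licenses purchased.
    Returns dict mapping product -> gap (positive = overdeployed,
    negative = licenses purchased but unused).
    """
    deployed = Counter(product for _, product in installed)
    products = set(deployed) | set(entitlements)
    return {p: deployed.get(p, 0) - entitlements.get(p, 0) for p in products}

# Hypothetical scan and entitlement data, for illustration only.
installed = [("PC01", "OfficeSuite"), ("PC02", "OfficeSuite"),
             ("PC03", "OfficeSuite"), ("PC01", "CADPro")]
entitlements = {"OfficeSuite": 2, "CADPro": 3}

gaps = reconcile_licenses(installed, entitlements)
overdeployed = {p: n for p, n in gaps.items() if n > 0}    # compliance exposure
underdeployed = {p: -n for p, n in gaps.items() if n < 0}  # unused spend
print(overdeployed)   # {'OfficeSuite': 1}
print(underdeployed)  # {'CADPro': 2}
```

A positive gap flags overdeployment (a potential compliance exposure), while a negative gap flags unused licenses (wasted spend) — the two outcomes a reconciliation typically identifies.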
</p><p> <em>Huzaifa Hussain, CISA, CISM, PMP, MCP, is a senior manager and leader of the software asset management service line at a large professional services firm in the Middle East. <br></em></p><p> <em>Syed Salman, CISA, has 11 years of experience in professional services ranging from IT audits to IT risk advisory at a diverse set of large entities in the Middle East and South Asia.<br></em></p>
Privacy in the Workplace<p>Digital technology has changed workplace behavior — and expectations — for both employees and their employers. The ubiquitous use of smartphones and other devices, company-issued and personal, places communications and data management continually at users’ fingertips. Internet use alters the traditional dimensions of employees’ work flexibility requirements and need for expression, as well as employers’ need to monitor employees’ online activity. <br></p><p>Employee concerns have been amplified by the ever-evolving technologies and data collection methods that can seem personally intrusive. Any privacy expectations employees may have are being curtailed by privacy policies, privacy pop-up screens during computer log-ins, background checks, and other workplace measures. At the same time, governments worldwide have issued regulatory guidance to address privacy issues, but guidance often falls short when it comes to balancing employers’ needs to monitor and employees’ expectations of privacy. Both noncompliance with regulations and balancing privacy needs represent major concerns. <br></p><p>Among respondents to PricewaterhouseCoopers’ (PwC’s) Global State of Information Security Survey 2016, 32 percent say their board members review security and privacy risks — up from 25 percent in 2015. Employees remain one of the most-cited sources of compromise, with 34 percent of respondents citing current employees as sources of security incidents and 29 percent saying former employees were sources. Organizations have legitimate reasons for wanting to keep tabs on employee data, but employees also want some measure of protection from prying eyes. Evolving expectations on both sides are changing where employees, and their employers, draw the line. 
Internal auditors tasked with examining privacy in the organization should know where the risks lie, and what requirements their clients may face.<br></p><h2>Drivers of Privacy Disruptions</h2><p> </p><table width="100%" cellspacing="0" class="ms-rteTable-default"><tbody><tr><td class="ms-rteTable-default" style="width:100%;"><p>​<strong>Sound Privacy Program</strong></p><p>An effective privacy strategy comprises numerous practices. Organizations that manage privacy well typically feature several components in their approach: </p><ul><li>An organizational view of what privacy means.</li><li>An understanding of how privacy and data protection fit into the organization’s overall business strategy.</li><li>Complete knowledge of what data is held, where it is, and who has access to it.</li><li>A clear understanding of data ownership and of circumstances under which data is protected and under which it is not. </li><li>Understanding and management of the risks introduced to the data by third parties.</li><li>Data governance that ensures data is being used for the purpose that the organization has committed to, and nothing more.</li><li>A privacy model with agility in mind, given the ever-changing privacy landscape.</li><li>Thorough familiarity with legal obligations in the U.S. and abroad, and tracking of developments in regulatory enforcement actions and case law.</li></ul></td></tr></tbody></table><p>Historically, employee monitoring has been limited to checking internet and email usage. Today, digital disruption trends powered by mobile devices, social media, analytics, big data, and the Internet of Things have opened up a host of additional channels for employee activity. Plus, increased competition has fueled mergers and acquisitions, as well as use of offshoring models and reliance on third parties, resulting in constantly changing privacy expectations in the workplace. 
Organizations are also starting to apply data analytics to better match people to jobs and to more efficiently and cost-effectively recruit, manage, and retain talent. Employees have a need to be heard and to contribute, and they use internal messaging boards and social media sites to do that. Most organizations do not even realize how much data is being collected and analyzed — or how much that data exposes them to legal and compliance risks.<br><br><strong>Employee Expectations</strong> With the rise of a constantly mobile and fluid workforce and the consumerization of technology, trust is essential in the digital world. More and more employees expect to use their own devices and applications at work, as well as cloud services they’re familiar with, because they believe those mechanisms make them more productive. <br></p><p>As employees use these devices with greater frequency, and as they become increasingly responsible for the data they hold in their cloud accounts, trust becomes a more significant factor. For instance, who’s responsible if cloud data gets stolen or a device gets hacked? If disabling software is installed to protect the employer, what is that employer’s responsibility for any personal information that gets lost? If the company comes under investigation by the authorities, would personal devices and data have to be handed over? <br></p><p>Employees might be more inclined to use wearable technology such as a smart watch if the information collected were leveraged for managing work hours or stress levels. They may trade personal data for flexible working hours, free health screening, and fitness incentives and approach data sharing more openly if the information is anonymized and shared at an aggregate level. 
Wearable technology, GPS tracking devices, radio frequency devices, and video cameras deployed in mobile workforces have great potential to track employee movement and productivity, but at the same time, each individual will have a personal limit to what is considered shareable. <br><br><strong>Employer Expectations and Drivers</strong> Employers’ concerns generally center on the need to protect themselves from loss of confidential information, shield against cyber threats, and comply with laws and regulations. Those needs require that employers monitor employee communications on company-issued computers, cell phones, tablets, and social media sites. Employers also need to collect personal information, such as Social Security numbers and health-related information, to provide health and compensation benefits. Companies are expected to act reasonably regarding their possession of that personal information and to respect employees’ rights to privacy. E-discovery tools are now more commonly deployed to investigate suspicious behavior, and so are data loss prevention tools to monitor network traffic and secure computers. <br><br><strong>Regulatory Landscape</strong> Regulatory developments in recent years have focused mainly on the types of data that should be protected, such as personally identifiable information (PII), health information, financial information, and certain demographic information such as income and union representation. Employees in the U.S. have minimal expectations of privacy compared with their counterparts in Europe and Japan, where privacy protections are far stronger and supersede most other laws and regulations, though they vary from country to country. <br></p><p>Employee rights are protected by privacy laws such as the Constitution’s Fourth Amendment, the Electronic Communications Privacy Act, and the Health Insurance Portability and Accountability Act (HIPAA) in the U.S. and various European Union (EU) data protection laws in EU member states. 
However, outside of specific data privacy laws such as HIPAA, interpretations of those laws and regulations are based on <em>reasonable expectations</em> of privacy and refer to both an employee’s expectation and an employer’s implementation of privacy policies in the workplace. Certainly, reasonable expectation can be interpreted differently by different societies, and regulations as such have not kept pace with changing technological advancements. Each country has a multifaceted legal framework in place to govern that country’s employers globally (see “Global Privacy Laws and Regulations” at the end of this article for examples). <br></p><h2>Audit Considerations</h2><p>Organizations should consider taking a holistic approach to managing privacy in the workplace. Moreover, their privacy framework should be agile enough to accommodate changing regulations. Internal auditors should evaluate the framework and other areas of privacy management to gauge the effectiveness of organizational efforts and overall governance. <br><br><strong>Governance Framework</strong> Internal audit should evaluate the organization’s governance framework, if one exists, to verify whether roles and responsibilities for managing privacy have been identified. An adequate framework will incorporate not only a chief information security officer or chief risk officer but also cross-functional partnerships across departments and geographies. Auditors should make sure that management defines a strategic vision and framework, if one does not exist, while ensuring it meets current and long-term business objectives. <br><br><strong>Privacy Risk and Compliance</strong> Execution of a privacy risk and compliance assessment is an essential step in evaluating if the organization has translated its strategic vision and framework into practical implementation. 
This step entails a gap assessment of applicable laws and regulations within all geographies, as well as the discovery and data flow mapping of data elements that are stored, transmitted, or transferred either on organizational networks or on hard copies. Internal audit should execute such assessments periodically and perform a risk assessment on a more frequent basis to evaluate the impact of organizational and regulatory changes.<br><br><strong>Policies, Processes, and Controls </strong>Auditors should be proactive in guiding management to develop new — or enhance existing — policies, processes, and controls by incorporating privacy-by-design (i.e., embedding privacy into the design specifications of technologies, business practices, and physical infrastructures). They should, for example, evaluate the privacy impacts of new products, third parties, mergers and acquisitions, systems, and technologies; and when the organization enters new markets, auditors should make sure controls are in place to manage privacy requirements. Controls around investigations of employee behavior on an organization’s networks and computer systems should be in place and evaluated by auditors periodically. These controls might include using e-discovery tools aimed at validating internal approvals, clearly articulating the purposes for monitoring that are proportionate to the investigation underway, and involving lawyers when necessary.<br></p><p><strong>Training and Awareness</strong> While policies set the tone for data protection management and guidance, employees and third parties should be trained in their roles and responsibilities. Training and awareness should be adaptive to meet specific needs at every level: executives, management personnel, human resources personnel, supervisors, IT staff, and so on. 
Auditors can advise management on the development of such programs and then periodically assess employee participation to gauge training compliance.<br><br><strong>Monitoring and Response</strong> Monitoring the environment to ensure compliance with privacy regulations is not just about deploying e-discovery and other tools over the network. It requires ongoing communication and periodic reporting across departments and geographies to help identify and isolate privacy concerns timely. However, organizations with over-the-top monitoring practices could encounter incidents or privacy crises with no warning, forcing them to react reflexively. In their haste, decision makers could fail to consider who should be in the room making decisions, how emerging issues should be prioritized, and how to think strategically beyond the next 24 hours. Internal auditors should ensure that the business has incident management and response capabilities that align with best practices and overall business objectives. <br></p><h2>A Matter of Trust</h2><p>Trust in the digital age can be difficult for employers to navigate because it’s closely intertwined with risk, security, and privacy. Nothing is hidden in the digital world; the views and opinions of former and current employees are available for everyone to see, and employees expect a clear explanation of what they are contributing and how they’re to be rewarded for it. For these reasons, ongoing trust levels must be built between employers and employees by way of transparency in their day-to-day interactions, and a mutual interest in balancing both parties’ priorities. </p><table width="100%" cellspacing="0" class="ms-rteTable-4"><tbody><tr class="ms-rteTableEvenRow-4"><td class="ms-rteTableEvenCol-4" style="width:100%;"><h3>Global Privacy Laws and Regulations</h3><br>Organizations need to carefully consider the privacy-related legal requirements that apply to areas in which they do business. 
A sampling of the main laws and regulations affecting privacy worldwide may be helpful for internal auditors looking to assess the potential risks. <br><br><strong>EU–U.S. Privacy Shield</strong> was approved in July 2016 — in the form of a data transfer framework between the U.S. and EU member states — to replace the defunct Safe Harbor agreement after intense negotiations between the U.S. Department of Commerce and the European Commission. At first blush, the Privacy Shield seems to resemble Safe Harbor, but closer inspection reveals that it introduces increased compliance complexities for U.S. businesses. The framework includes stricter requirements for enrolling and monitoring, additional third-party risk management considerations, new avenues for data-subject complaint escalation, and further limitations on government access to personal data. Employers must decide whether to participate in the new data transfer framework or use an alternative method to establish adequacy. More importantly, the decision about a data transfer method must be viewed in consideration of the General Data Protection Regulation — a much larger compliance obligation for U.S. companies that profile or collect data from EU citizens. <br><br><strong>U.S. Securities and Exchange Commission’s Regulation Fair Disclosure</strong> requires issuers to disclose material information to the general public in a broad and nonexclusive manner. Registrants, therefore, must safeguard such information from inappropriate access and disclosure, in part through monitoring activities. <br><br><strong>Japanese Act on the Protection of Personal Information</strong> defines personally identifiable information (PII) as any information about a living individual that could identify the individual by name, date of birth, or other description contained in such information. 
The act imposes data protection requirements on PII, including securing prior consents from individuals before exchanging or disclosing PII to third parties. The act was amended in September 2015 to require organizations that employ Japanese citizens to comply with the cross-border exchange requirements for PII before September 2017. <br><br><strong>Australian Privacy Act and Australian Privacy Principles</strong> affect public and private entities in Australia as well as overseas businesses that manage the employee personal information of Australian citizens. The act and the principles specify requirements for active maintenance and notification of privacy policy and for extending liability, including the imposition of fines, to overseas businesses in cases of breaches that result in the loss of an Australian citizen’s PII.<br><br><strong>U.S. National Labor Relations Act</strong> protects the rights of employees to organize and bargain collectively with their employers and to engage in other protected concerted activity. Employers are prohibited from restricting employees from acting together, with or without a union, to address work conditions that affect their personal lives. The provisions extend to conversations carried out in personal email accounts and social media sites.<br><br><strong>General Data Protection Regulation (GDPR)</strong> for EU members was officially adopted by the European Commission in April 2016 and goes into effect in May 2018 after a two-year transition period. The GDPR strengthens European data protection laws, giving EU citizens greater say in how their digital information gets collected and managed. This complete overhaul of EU privacy confers regulatory authority over any business that offers products or services in the EU and over any business that tracks and stores EU citizen data, as well as the authority to fine violating companies up to 4 percent of their annual global revenues. 
New compliance requirements include an appointed privacy officer, privacy by design and default in products and services, the right to be forgotten, additional privacy impact assessments, and complete inventories of personal data and third-party data processors.<br><br><strong>U.S. E-Government Act of 2002</strong> requires that a federal agency conduct a “privacy impact assessment” before developing or procuring an IT system or a project that collects, maintains, or disseminates PII about members of the public. The act also sets forth uniform confidentiality protection requirements regarding such data. <br><br></td></tr></tbody></table><p><br><br><span class="ms-rteStyle-Quote">Parthiv Sheth is a director in PwC’s Risk Assurance practice in New York.</span><br class="ms-rteStyle-Quote"><span class="ms-rteStyle-Quote">Khalid Wasti, CIA, CPA, CISA, CITP, is a partner in PwC’s Internal Technology Audit Solutions practice in New York.</span><br class="ms-rteStyle-Quote"><span class="ms-rteStyle-Quote">A. Michael Smith, CPA, CISA, CISSP, is a national partner in PwC’s Internal Technology Audit Solutions practice in the U.S.</span></p>
Internal Audit and the Internet of Things<p>Last month, <em>Compliance Week</em> published "<a href="" target="_blank" style="background-color:#ffffff;">Internet of Things' Role in Internal Audit & Compliance</a>."</p><p>I heartily agree that this is a topic that merits internal audit's (and the compliance function's) serious attention.</p><p>To quote the article, "Forbes provides a nice simple description of the concept as one of 'connecting any device with an on and off switch to the Internet (and/or to each other).'"</p><p>The Internet of Things (IoT) is not futuristic. It is here today. It will only mushroom in the future, with just about everything interconnected.</p><p>For example, I armed my home security system using my phone while on the way to the airport (I was not driving). If anybody tries to break in, I will receive an alarm on my phone wherever I happen to be.</p><p>Some people have their hearts monitored over the internet — <a href="" target="_blank">see this article from <em>Forbes</em></a>.</p><p>What should internal audit be doing about it?</p><p>Certainly, the level of work should be driven by the level of risk. But do we know what the level of risk is when it comes to IoT?</p><p>The article appears to expect internal audit to assess the risk by finding out how "IoT [is] deployed in our organization today."</p><p>I would take a different approach. I would find out whether management knows what is connected to what and why. 
If they don't know, that is a huge risk itself — how can IoT and its attendant risks be assessed and addressed if they are not known to management?</p><p>Assuming that they know the current state, I would ask for their risk assessment and how they are addressing the identified risks.</p><p>My next step would be to find out what changes are expected over the next 12 months and whether management is addressing them in its risk assessment.</p><p>These few questions would give me a "feel" for the level of risk and whether an audit engagement is merited.</p><p>I might go a step or two further and ask how they know what is connected to what, and how they have identified and addressed the risks.</p><p>That should give me sufficient confidence to know whether an audit engagement should be performed, what form of engagement it should be (assurance or advisory), and when.</p><p>Too many commentators want internal audit to identify and assess emerging risks, such as IoT.</p><p>I strongly disagree. That is management's role, not internal audit's.</p><p>Internal audit can assist by ensuring management has sound practices for identifying, assessing, and addressing risks — both emerging risks and existing risks where the level changes.</p><p>Do you agree?</p>Norman Marks
Do You Have Data Fever?<p>A new internal auditor receives his latest assignment. His manager asks, “How are you going to approach the review of this area?” The auditor responds, “I want to test this, and I want to test that, and I want to test the other thing.” The manager asks why the auditor wants to perform those tests. Excitedly, the auditor answers, “Because that’s where all the information is.”<br></p><p>This scenario illustrates a common mistake made by new auditors — seeking to jump in without considering the risks, the processes, the criteria, or even the audit objective. The auditor recognizes a testable area and says, “I am doing an audit of this department and I know they have expense reports, so I will test the expense reports.”<br></p><p>Of course, those of us with years of experience and knowledge would never fall into that trap, right? Not so fast.<br></p><p>We live in a world where systems hold more information than anyone can possibly fathom. We are awash in data — big, large, super-sized, venti. And data analytics has become a buzzword that draws auditors like fraudsters to inadequate controls. When auditors see that glorious richness of data, they fall back into that rookie mind-set: “I don’t know what I want or what I’m trying to prove or what I’m going to do with it, but I want everything you’ve got.”<br></p><p>At one time or another we’ve all caught it — data fever: The desire for more and more information without considering what that data is. We turn the fire hose on full force and what we intended to be a thirst-quenching sip of real information turns into a suffocating flood of meaningless facts, figures, and folderol. <br></p><p>More is not always better. The rules for gathering data are the same as for any audit test. First determine what you want to accomplish with the audit. Then articulate what you want to do with the data, coordinating that understanding with the already-identified risks. 
<br></p><p>It all begins by understanding what the data represents and what it might say. Before even thinking about asking for the data, auditors should talk with the data owners to understand what is available, how it is used, and how it relates to the processes under review. Then, and only then, should auditors begin to think about what data may be needed.<br></p><p>The promise of data analytics is to assist in performing audit work more efficiently. It also represents an opportunity for internal audit to provide real value by showing the organization how all that data can be helpful to everyone. But that cannot be accomplished by just gathering every scrap of data available. Just as you would stop a new auditor from barging forward with unfocused and potentially meaningless testing, stop yourself when asking for a data dump and determine what you are really trying to accomplish. <br></p>Mike Jacka
Reporting on Cyber Threats<p>Cybersecurity is at the forefront of most organizations' risk discussions, especially at the audit committee and senior executive levels. However, internal audit reporting may not reflect current cyber threats. It is time for auditors to consider revising the evaluation criteria they use to determine whether an IT finding is reportable.</p><p style="text-align:left;">Raising IT risk concerns may clash with the audit committee's threshold for materiality. For example, data breaches often involve reputation risks more so than financial risks. This is the existential question with cybersecurity: What is costly versus what makes the organization look bad. Overall, internal audit should consider whether outdated reporting criteria have created an expectation gap between what the audit committee expects to be reported and what internal audit considers worth reporting.</p><h2>The Current State of Reporting</h2><p>CAEs use multiple criteria to determine whether a finding is reportable to the audit committee and senior executive levels. In a survey of 163 CAEs conducted in July by The IIA's Audit Executive Center, 81 percent say their reporting criteria do not differ among different types of audits, such as fraud, compliance, and IT. </p><p>The survey reveals minimal differences in criteria used to report to the audit committee and senior management. Forty percent of respondents use a combination of criteria or additional criteria, including all internal control weaknesses, judgment, and risks to the organization, to determine what to report to senior executives. That percentage rises to 45 percent who use those criteria as a basis for reporting to the audit committee. Thirty-nine percent use pervasive internal control weakness as their criterion for reporting to both reporting levels. 
Overall, just 7 percent consider dollar threshold a reporting indicator for both senior executives and audit committees. </p><p style="text-align:justify;"> <img src="/2016/PublishingImages/gen-report-exec.jpg" alt="" style="margin:5px;width:425px;height:317px;" /> <em>Source: IIA Audit Executive Center</em><br> </p><p> <br> </p><p style="text-align:justify;"> <img src="/2016/PublishingImages/Gen-report-ac.jpg" alt="" style="margin:5px;width:425px;height:323px;" /> <em>Source: IIA Audit Executive Center</em><br></p><p> <br> </p><p>When asked about specific IT findings, CAEs overwhelmingly focus on whether the findings affect more than one business segment or department, or have an organizationwide impact (49 percent to senior executives and 51 percent to audit committees). Additionally, 42 percent use a combination of criteria that includes other factors such as business and reputational impact in determining which issues to report to senior executives and the audit committee. Only 5 percent of respondents consider dollar threshold a reporting criterion for either level.</p><p style="text-align:justify;"> <img src="/2016/PublishingImages/IT-report-exec.jpg" alt="" style="margin:5px;width:425px;height:329px;" /> <em>Source: IIA Audit Executive Center</em><br></p><p> <br> </p><p style="text-align:justify;"> <img src="/2016/PublishingImages/IT-report-ac.jpg" alt="" style="margin:5px;width:425px;height:339px;" /> <em>Source: IIA Audit Executive Center</em><br></p><p> <br> </p><h2>Are the Criteria Still Appropriate?</h2><table width="100%" cellspacing="0" class="ms-rteTable-0"><tbody><tr class="ms-rteTableEvenRow-0" style="text-align:center;"><td class="ms-rteTableEvenCol-0" colspan="2" style="width:50%;"><strong>Audit Committee and Senior Executive <br>Reportable IT Findings</strong> </td></tr><tr class="ms-rteTableOddRow-0"><td class="ms-rteTableEvenCol-0"><strong>Reportable</strong> </td><td class="ms-rteTableOddCol-0"><strong>Not Reportable</strong> </td></tr><tr 
class="ms-rteTableEvenRow-0"><td class="ms-rteTableEvenCol-0">User account activation process findings that impact the organization's ability to appropriately assign user access.</td><td class="ms-rteTableOddCol-0">User access typically is assigned appropriately, but a current audit noted a couple of users whose access was assigned incorrectly. </td></tr><tr class="ms-rteTableOddRow-0"><td class="ms-rteTableEvenCol-0">User account deactivation process findings that impact the organization's ability to disable user access timely upon termination. </td><td class="ms-rteTableOddCol-0">User account deactivation process works correctly, but a current audit noted one or two contractors or employees whose access was not disabled timely. </td></tr><tr class="ms-rteTableEvenRow-0"><td class="ms-rteTableEvenCol-0">User transfer process findings where access is not removed when employees transfer to other departments.</td><td class="ms-rteTableOddCol-0">User access is typically adjusted upon transfer, but a current audit identified one or two users whose access was not adjusted. </td></tr><tr class="ms-rteTableOddRow-0"><td class="ms-rteTableEvenCol-0">Patching process findings where patching does not occur timely organizationwide. </td><td class="ms-rteTableOddCol-0">Most servers are patched timely except for a few. </td></tr><tr class="ms-rteTableEvenRow-0"><td class="ms-rteTableEvenCol-0">Two-factor authentication findings where the authentication system has organizationwide issues (i.e., does not work all the time). </td><td class="ms-rteTableOddCol-0">Most servers have two-factor authentication enabled for interactive login, but one or two do not. </td></tr></tbody></table><p>Although organizationwide impact is the criterion survey respondents consider most impactful in deciding to report IT findings, this may cause internal audit to not report seemingly lesser findings that could potentially be big cyber threats. 
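A "not reportable" item such as one or two untimely account deactivations is nonetheless straightforward to test for. The sketch below is hypothetical — the user names, dates, and one-day grace period are assumptions — but it shows the kind of reconciliation between HR termination records and directory disable dates that surfaces these findings:

```python
from datetime import date

# Hypothetical HR terminations and directory-account disable dates;
# a real test would join exports from HR and the directory service.
terminations = {"jdoe": date(2016, 9, 1), "asmith": date(2016, 9, 5)}
disabled_on  = {"jdoe": date(2016, 9, 1)}  # asmith's account never disabled

def untimely_deactivations(terms, disabled, grace_days=1):
    """Flag accounts not disabled within grace_days of termination."""
    findings = []
    for user, term_date in terms.items():
        off = disabled.get(user)
        if off is None or (off - term_date).days > grace_days:
            findings.append(user)
    return findings

print(untimely_deactivations(terminations, disabled_on))  # ['asmith']
```

Even a single flagged account is the kind of small detail that, as the article argues, can matter organizationwide.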
Findings such as having one or two untimely user account terminations or users who have been assigned incorrect access would most likely not be considered reportable under current generally used criteria (see "Audit Committee and Senior Executive Reportable IT Findings" at right). </p><p>Yet, these are similar to the causes of some of the largest data breaches reported to the <a href="" target="_blank">Identity Theft Resource Center</a> both this year and historically. These include:</p><ul><li>Stolen third-party or employee credentials.</li><li>Stolen mobile device.</li><li>Unsecure wireless network.</li><li>Two-factor authentication disabled on a few servers.</li></ul><p> </p><p>These data breach trends suggest the current reportable criteria may not reflect cyber threat reality. Although only a few items, or even one item, could be found during an audit, such items may open the door for a hacker or general user to allow data theft to occur. In the world of cybersecurity, the small details matter. Failing to perform an appropriate activity for a single user or server could have an organizationwide impact.</p><p>Questions to consider include:</p><ul><li>In today's world of cyber threats, are the criteria used to decide when to report an IT finding to the audit committee and senior executives still relevant? </li><li>Should the criteria be revised so that other IT findings currently deemed to be lesser risk would be considered reportable?</li><li>Are the board or senior executives sufficiently educated about cybersecurity to understand the impact of such findings?</li></ul><h2>Modifying Expectations</h2><p> <a href="" target="_blank">Internal Audit's Role in Cyber Preparedness</a>, a 2015 white paper from The IIA's Internal Audit Foundation, discusses the importance of taking a holistic approach to an organization's cybersecurity practices and how internal audit can assist in this endeavor. 
The white paper cites a <a href="">National Association of Corporate Directors (NACD) publication</a> in which 87 percent of respondents to the 2013-2014 NACD Public Company Governance Survey reported that their board's understanding of IT risk needs improvement. The IIA white paper says boards could gain access to cybersecurity expertise by adding members with technology industry expertise. Other suggestions include:</p><ul><li>Scheduling "deep dive" briefings from third-party experts, including specialist cybersecurity firms, government agencies, and industry associations.</li><li>Leveraging the board's existing independent advisers, such as external auditors and outside counsel, who will have a multiclient and industrywide perspective on cyberrisk trends.</li><li>Participating in relevant director education programs, whether provided in-house or externally.</li></ul><p><br></p><p>As boards increase their cyber awareness, internal audit is complementing this awareness by becoming more technology-savvy and providing services such as helping the board understand IT risks and the impact of new technology initiatives. Becoming more adept at using technology is helping internal audit provide these services, according to a recent Protiviti report, <a href="" target="_blank">Internal Auditing Around the World</a>. </p><table width="100%" cellspacing="0" class="ms-rteTable-default"><tbody><tr><td class="ms-rteTable-default" style="width:100%;"> <p> <strong>Summary and Analyses of Findings That Did Not Meet the Required Reporting Threshold</strong></p><p>Fifteen users with excess or incorrect access were noted among seven audits. Evaluating these findings collectively, internal auditors noted an increasing failure by business owners to ensure an adequate access review is performed periodically. 
Further follow-up revealed that five of the seven business owners were relatively new to the company and had not received the appropriate training. As management is aware, excess or incorrect access rights increase the organization's cyber threat level.</p><p>Untimely disabling of a user's application account occurred in eight out of 10 audits. While none of these incidents met the reporting criteria on its own, internal audit noted an upward trend in untimely removal of user access. Notably, five of the eight users were contractors for whom the business areas did not provide prompt notification of the need to disable access. Management is now considering alternatives for managing contractor access. </p></td></tr></tbody></table><p>This growing cyber awareness creates an opportunity for internal audit to report on IT findings once considered lower risk and less impactful organizationwide. Much as external auditors commonly report individual financial adjustments, whether material or immaterial, reporting on these IT events can further educate the board and senior executives on cyberrisks. </p><h2>Reporting Alternatives</h2><p>In a world where a single IT event can pose an organizationwide threat, internal audit needs to engage audit committees and senior executives in discussions about single detailed events and their impacts. Yet it takes time for perspectives to change and education to occur. In the meantime, there are alternatives auditors can use to retain the current reporting criteria while emphasizing these singular IT findings, including:</p><ul><li>Modifying the reporting narrative of each audit distributed to the audit committee and senior executives, including elaborating on the cyber threats encompassed by the audit. 
Alternatively, during the audit committee presentation, auditors can spend a few moments discussing the cyber threats detailed in the audit.</li><li>Educating senior executives and audit committees on the finer cyber threat details.</li><li>Maintaining the current reporting criteria and providing an annual summarized report noting the major themes from all unreported IT issues identified (for an example, see "Summary and Analyses of Findings That Did Not Meet the Required Reporting Threshold" at right).</li></ul><p><br></p><p>Although revising reporting criteria to better reflect the current cyberrisk environment should occur, suddenly changing long-established reporting practices may not be the best solution. Using the suggested reporting alternatives and easing into new criteria will give the audit committee and senior executives time to adjust their perspectives. Moreover, a gradual shift will allow for additional training and for building the understanding that a single IT finding could do as much harm as a pervasive IT finding. </p><p> <br> </p>James Reinhard
