“There is no business today that is not driven by data,” says Dominique Vincenti, vice president, Internal Audit and Financial Controls at Nordstrom in Seattle. “The continuous high-speed evolution of technology is the No. 1 challenge for businesses and internal auditors today. There is not an hour you can rest.”
Vincenti says that businesses need to fundamentally reassess what data means to the success of their organizations going forward. Not only must they be able to successfully protect their data from external threats, but a new law is sparking a trend that will require many to have much more detailed control over what data can be held and how it can be used — the General Data Protection Regulation (GDPR) that goes into effect in Europe in spring 2018. Add to that data processing developments in data analytics, robotics, and artificial intelligence, and organizations that are unable to leverage their most business-critical asset effectively are in danger of being left behind, or worse.
“There needs to be a huge wake-up call,” Vincenti says. “Businesses need a clear answer to the question, what does data mean to the success of our company both today and tomorrow?”
The conjunction of GDPR and advanced data processing technologies is pushing organizations into new territory. For businesses operating in Europe, or any business using or holding data on European citizens, the tougher new data laws will substantially alter the way organizations need to seek consent and keep data records (see “Main Provisions of GDPR” below). “GDPR is a more stringent regime than those it replaces, and has a low risk appetite built into it,” Vincenti says. “Since Europe tends to lead the way in legislation, it would be wise for U.S. businesses that are not affected today to at least consider how they might meet those requirements in the future.”
GDPR’s heavy fines have caught the media’s attention — the maximum is the greater of €20 million or 4 percent of the organization’s global annual revenues. For example, telecom and broadband provider TalkTalk’s 2016 fine of £400,000 from the U.K.’s Information Commissioner’s Office for security failings that allowed hackers to access customer data could have rocketed to £59 million under GDPR.
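To see how quickly the upper tier of GDPR fines scales with company size, the calculation can be sketched in a few lines. This is an illustration of the headline maximum only (the greater of €20 million or 4 percent of global annual revenue); GDPR also has a lower fine tier for less serious infringements, which is not modeled here, and the revenue figure used is a hypothetical example, not TalkTalk's actual turnover.

```python
def gdpr_max_fine(annual_revenue_eur: float) -> float:
    """Upper-tier GDPR maximum: the greater of EUR 20 million
    or 4 percent of global annual revenue."""
    return max(20_000_000, 0.04 * annual_revenue_eur)

# A hypothetical company with EUR 1.5 billion in revenue faces
# a maximum of EUR 60 million, because 4 percent exceeds the floor.
print(gdpr_max_fine(1_500_000_000))

# A smaller company with EUR 100 million in revenue is still exposed
# to the EUR 20 million floor, since 4 percent would only be 4 million.
print(gdpr_max_fine(100_000_000))
```

The point the sketch makes is that for large organizations the 4 percent term dominates, which is why fines that were previously capped in the hundreds of thousands could grow by two orders of magnitude.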
Yet having the right controls over how data is used and retained will present a challenge. For example, businesses will no longer be able to request a blanket consent to use data collected from individuals in any way they choose. Consent will need to be obtained for a specific and detailed use — otherwise fresh consent will be required. This provision is diametrically opposed to how data can be leveraged by artificial intelligence and data analytics programs. Such programs are best used to find new patterns in data and novel applications of information to improve the business’ products and services. Without free license to experiment with customer data on the business’ servers, it may not be possible to achieve the full potential these technologies promise.
For internal auditors, these pressures could mean going back to the drawing board on the controls needed to strike the right balance between delivering value to stakeholders from these new technological possibilities and protecting the enhanced rights many customers will enjoy under GDPR. A compliance-based approach may no longer be feasible because it is unlikely to capture the nuances needed to deal with this ethically sensitive area. In fact, many are arguing that successfully handling the new data landscape will require auditors to develop ethical principles and soft skills that have been undervalued in this area.
The Challenge of Consent
Main Provisions of GDPR
Article 5 of the General Data Protection Regulation requires that personal data shall be:
(a) Processed lawfully, fairly, and in a transparent manner in relation to individuals.
(b) Collected for specified, explicit, and legitimate purposes and not further processed in a manner that is incompatible with those purposes; further processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes shall not be considered to be incompatible with the initial purposes.
(c) Adequate, relevant, and limited to what is necessary in relation to the purposes for which they are processed.
(d) Accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay.
(e) Kept in a form that permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed; personal data may be stored for longer periods insofar as the personal data will be processed solely for archiving purposes in the public interest, scientific, or historical research purposes, or statistical purposes subject to implementation of the appropriate technical and organizational measures required by the GDPR in order to safeguard the rights and freedoms of individuals.
(f) Processed in a manner that ensures appropriate security of the personal data, including protection against unauthorized or unlawful processing and against accidental loss, destruction, or damage, using appropriate technical or organizational measures.
Article 5(2) requires that “the controller shall be responsible for, and be able to demonstrate, compliance with the principles.”
Source: U.K. Information Commissioner’s Office
“If you don’t know what you are going to discover from a big data project, how can you possibly explain to the data subject how you will use it and get consent?” Henry Chang, an adjunct associate professor at the Department of Law at the University of Hong Kong, says. Chang is one of several academics and business organizations arguing that new regulations such as GDPR coupled with new technologies require a paradigm shift when it comes to personal data use and protection. Chang and Vincenti agree, for example, that organizations pursuing a compliance-based approach to data privacy and protection are going to run into a brick wall when trying to leverage their data innovatively.
“When you look at a compliance-based approach, you have to decide where the pass-mark is legally,” he says. “That tends to cause businesses to aim low and achieve low, and businesses can spend a lot of time on trivial areas because they think they need to comply in every part of their business equally, rather than where they are most at risk.”
He says what is required instead is a more holistic, accountable approach that has privacy controls engineered into business processes, which themselves are underpinned by ethical principles. While there is no magic solution, he urges organizations to try a cocktail of approaches to see what works best. For example, he says that data privacy is built on the notion that one has respect for the individual’s right to have a say over how that information is used. Compliance cannot address how those rights might change over time if the systems used to comply with regulations do not have some elasticity built into them.
“Respecting someone’s privacy rights is actually a soft skill and needs a soft approach,” he says. “Putting in an ethical boundary as an extra element into your compliance processes could help deal with shifts in the way that personal data can be analyzed and used.”
In practice, that could mean that if a company is using automated processes, some part of those systems could include a right for decisions to be made by a human. Or where mistakes are made with the use of data, there is a human at the end of the process and effective redress mechanisms in place.
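The human-in-the-loop arrangement described above can be sketched as a simple routing rule: an automated decision stands only when the system's confidence is high and the data subject has not invoked a right to human review. The function and threshold below are hypothetical illustrations, not a prescribed design; real systems would also need the redress mechanisms mentioned above.

```python
def route_decision(automated_result: str,
                   confidence: float,
                   subject_requested_review: bool,
                   threshold: float = 0.9) -> str:
    """Return the automated result only when confidence is high and the
    data subject has not asked for a human decision; otherwise escalate.
    The 0.9 threshold is an arbitrary illustrative value."""
    if subject_requested_review or confidence < threshold:
        return "human_review"
    return automated_result

# A confident decision with no review request stands on its own.
print(route_decision("approve", confidence=0.95,
                     subject_requested_review=False))

# A low-confidence decision, or any explicit request, goes to a person.
print(route_decision("deny", confidence=0.55,
                     subject_requested_review=False))
print(route_decision("approve", confidence=0.99,
                     subject_requested_review=True))
```

The design choice worth noting is that escalation is the default whenever either condition triggers, so the human reviewer sits at the end of the process rather than as an optional add-on.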
“The head of audit’s role could be to bring these debates to the attention of the board,” he says. “You obviously cannot prescribe a set of ethics to the board, but you can ensure that the board has the opportunity to think ethically about personal data.”
A Balancing Act
While obtaining consent for the use of data may seem reasonable, what happens if the potential uses are beyond the understanding of the individuals involved? According to the Information Accountability Foundation (IAF), a global research nonprofit, there is growing agreement that consent is not fully effective in governing such data collection and use. Many national laws include limited exemptions for processing when consent is unavailable, while others, notably European law, provide legal justification based on the legitimate interest of an organization when it is not overridden by the interests of the individuals. But such exemptions tend to be limited, unclear, or outdated, and those legitimate interests require a balancing procedure that has yet to be developed.
“Companies are meant to balance the legitimate interests of individuals, organizations, and shareholders,” Martin Abrams, executive director of IAF in Plano, Texas, says. “That means not only looking at the potential negative impacts on individuals, but on stakeholders, too, if you do not process that data.”
For example, Abrams says, next-generation clinical research by pharmaceutical companies could draw data from multiple devices — smartphones and watches, genomics, location-sensitive information, and clicks on webpages — into the data pool in a way that could be difficult to describe to people who are asked to consent because it is unclear how the various interests at play can be balanced. If some of that data is European, a difficult problem could become intractable. “It’s not clear how one could do data analytics under GDPR,” he says.
The IAF has been working with the Canadian government to test an ethical assessment framework it has created to help organizations develop accountability processes that go beyond the consent model. It aims to provide a common framework for developing systems of accountability and for ranking the importance of potentially conflicting interests for each project.
Internal auditors, he says, should be asking their boards to think about how the business is balancing the various interests at stake in its use of data. How those decisions and processes are documented and assessed, and whether the business has the right skill sets to implement such an approach, could all be the topic of audit assignments.
Transparency and Communication
One approach to addressing data concerns is for businesses to become as transparent as possible about their aims and objectives and how those interests are balanced. “It is very important for the business to tell a clear story about what its intentions are, how it is going to use the data, and how that will be for the betterment of society,” says Lisa Lee, vice president, Audit, at Google in Mountain View, Calif.
She says that innovation requires research and having too many rules around how data can be used could stifle developments that could benefit the community. Too many checklist-style controls are unlikely to keep pace with the speed at which technology is developing. That is why Lee says that companies need to engage people in dialogue about their ethics and articulate the benefits to society they are attempting to deliver.
Not everyone will align with a story. Lee says that people often have different tolerances to technology notifications, for example, and what one person would find useful, another might find intrusive. Business units need to have thought through those issues and communicate how they approach such risks and what the controls are for doing so. She says Google sets the tone for its values from the top of the company and those values inform its protocols, how it operates, and how it attempts to manage risks.
This approach impacts how internal audit works. “Internal audit has to have a very in-depth grasp of the business,” she says. Unlike organizations that tend to pool auditors into one team, Google has some dedicated audit teams attached to particular areas — such as data security and privacy — where a deep understanding of the systems is necessary. In addition, auditors focus on what the business objectives of the product or service are during an audit and spend time listening to how the business is attempting to approach risk and control.
“We work in a very dynamic environment and need to keep an open mind when we are thinking about controls and their impact or effectiveness,” she says.
Grasping the Data
Few companies are as advanced in their handling of data as Google. One of the most common problems organizations face is that they do not know where their data comes from, how it is used, and in many cases, what data they hold. Mark Brown, vice president of Software Solutions and Services at the risk management software company Sword Active Risk in Maidenhead, U.K., recently estimated that only about 1 percent of businesses could pull in and analyze internal and external data in a meaningful way.
“One of the biggest challenges when it comes to data is knowing what you have,” says Shannon Urban, executive director with EY in Boston and 2017–2018 chairman of The IIA’s North American Board. “As businesses have grown through expansion and acquisition, they have continued to accumulate data with no formal inventory.” In addition, it is easy for data to move around the organization via enterprise resource planning systems, email, and mobile devices, making it possible for it to be used in unintended ways.
“If you don’t have an identification and classification process that can identify what is sensitive, then using it effectively, never mind ethically, is going to be impossible,” she says. “The models internal auditors use can sometimes be a bit upside down — we make sure the data is accurate and complete, but spend less time on whether it is appropriately sourced and accessed. That could mean rethinking our audit plan and checking that we properly source the competencies to deal with these issues,” she adds.
Urban says it is important not to get overwhelmed. If auditors find their organization’s data is unstructured, she advises them to take a risk-based approach and start with the information that is most critical to the business, including intellectual property, employee, and customer data. “It is completely within internal audit’s purview to connect the dots and think about data across business lines,” she says.
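The risk-based starting point Urban describes — inventory what you hold, flag what is sensitive, and work on the most critical data first — can be illustrated with a minimal classification pass over field names. The categories, keyword lists, and field names below are hypothetical assumptions for the sketch; a real identification and classification process would inspect data content and context, not just names.

```python
def classify_field(field_name: str) -> str:
    """Assign an illustrative sensitivity tier based on the field name.
    Tiers and keywords are assumptions, not a standard taxonomy."""
    name = field_name.lower()
    if any(k in name for k in ("ssn", "passport", "card_number")):
        return "restricted"       # identity and payment data
    if any(k in name for k in ("name", "email", "phone", "address")):
        return "personal"         # personal data in the GDPR sense
    return "internal"             # everything else, pending review

# A toy inventory: classify each field, then review the most
# sensitive tiers first, per the risk-based approach.
inventory = ["customer_email", "card_number", "warehouse_id",
             "home_address", "order_total"]
classified = {f: classify_field(f) for f in inventory}
priority_order = {"restricted": 0, "personal": 1, "internal": 2}
for field in sorted(classified, key=lambda f: priority_order[classified[f]]):
    print(field, "->", classified[field])
```

Even a rough first pass like this gives an auditor a defensible ordering for where to start, rather than trying to cover every data store equally.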
Center of Excellence
Internal audit can take the lead in bringing the organization up to speed with these new challenges around the nature of data and technological innovations in data processing. “Internal auditors need to be well-versed in these developments and be able to educate management through our audits,” Nordstrom’s Vincenti says. She says internal auditors should make their function a center of excellence not only in data protection and privacy practices but also in data governance and rapidly evolving enterprise information management approaches and capabilities. “Internal audit can be a role model. Let’s show the business how we are using data in innovative and ethical ways,” she says.
For more information on protecting organizational data, see the IIA Practice Guide, Auditing Privacy Risks, 2nd Edition.