Evolving FIPPs: Proactive Approaches to Privacy, Not Privacy Paternalism
© Springer Science+Business Media Dordrecht 2015
Serge Gutwirth, Ronald Leenes and Paul de Hert (eds.), Reforming European Data Protection Law, Law, Governance and Technology Series 20, DOI 10.1007/978-94-017-9385-8_12
Information and Privacy Commissioner, Toronto, ON, Canada
Abstract
Privacy and data protection are at times contrasted with other legitimate societal values and goals, with the suggestion that one must yield to the other. But is it really necessary to weaken existing privacy measures in the name of pursuing greater efficiencies, innovation and economic growth? The goal of reconciling privacy rights with the free flow of data was reaffirmed by the OECD in a multi-year review of the 1980 OECD Guidelines – all eight of the original principles were left intact. This paper examines proposals to abridge these fundamental FIPPs in order to allow for Big Data and other technological and socially beneficial innovations. It suggests that the future of privacy depends on informational self-determination, as embodied in a holistic approach to the FIPPs. Moreover, it suggests that the FIPPs be further enhanced through the application of Privacy by Design, which supplements them with new elements such as proactively embedding privacy into information technologies, business practices and networked infrastructures. Transparency and accountability are also key features of this framework.
12.1 Introduction
Information and communications technologies are transforming our world, challenging our ideas of privacy and data protection. Since technological innovations, and how we use them, often form the basis of tomorrow’s standards, it is natural that important questions be asked regularly about the Fair Information Practice Principles (FIPPs). The enduring tension in these discussions is whether we should create technologies that respect our current understanding of privacy, or whether our understanding of privacy must change to allow for new technologies and other developments.
Some privacy professionals, academics and public policy makers believe that the venerable FIPPs should give way in an era of cloud, social and mobile computing, and the Internet of Things. They argue that informational self-determination is largely a fiction today and that systems of notice and choice have, in practice, become a pointless burden in an era of passive collection of personal information and dense privacy statements written in legalese. Purpose specification requirements, they contend, are out of step with exciting new Big Data applications and insights, prescribing unjustifiable limits on collecting and using personal data and blocking innovation, societal benefits and progress. Better, they argue, to focus on punishing misuses of personal information and on strengthening the accountability of data processors/users.1
While the intent of these calls may be to shift the burden of privacy protection away from individuals and towards data users/controllers, the effect of such a proposal would be to weaken the fundamental privacy rights of individuals while strengthening the power of data users/controllers to decide what personal data to collect and process, whenever and however they see fit. It would also place greater burdens on both individuals and regulators in seeking effective redress. I consider this a paternalistic approach to privacy.2
This paper argues against diminishing the FIPPs – or selectively applying some principles over others – and in support of a proactive approach to privacy that supplements privacy principles in a manner that promotes innovation, privacy, data protection and trust in the twenty-first century.3 This is consistent with a recent OECD Council recommendation, in which it noted that, “These Guidelines should be regarded as minimum standards which can be supplemented by additional measures for the protection of privacy and individual liberties, which may impact transborder flows of personal data.”4 The author agrees that accountability should be strengthened, and there are many ways to achieve this using a proactive approach to privacy rather than diminishing the FIPPs.5
It is clear the world is changing. The expectation placed on individuals to navigate dense and lengthy privacy notices and policies in order to protect their privacy is unsustainable. However, as stated by the Article 29 Data Protection Working Party (WP29) in its opinion on purpose limitation, “[w]hen we share personal data with others, we usually have an expectation about the purposes for which the data will be used.”6 Big Data analytics, the Internet of Things, and Cloud Computing are all trends that may yield remarkable new correlations, insights, and benefits for society at large. While there is no intention that privacy should stand in the way of progress, it is essential that privacy practitioners participate in these efforts to shape trends in a way that is truly constructive, enabling both privacy and these technological advances to develop in tandem.7 As a privacy community, we must explore how to reconcile, on the one hand, the practical challenges of implementing FIPPs in new technological environments with, on the other hand, the individual’s expectation of privacy. However, we will have already failed in our endeavours if we begin these discussions by adopting a zero-sum perspective.
12.2 Privacy Paternalism: Removing Limits and Obligations Related to the Collection Principle
Removing obligations to obtain informed consent when collecting personal information could have sweeping impacts on the privacy of individuals. In many contexts, providing effective “notice and choice” to individuals about data processing operations may seem like an unnecessary, pointless burden.8, 9 Notices can be long, complicated, hard to understand and inconvenient for individuals, and practical options may be limited. In emerging Big Data, Internet of Things, and Cloud Computing environments, the individual is often unaware that data collection is taking place, or is completely absent from the transaction and processing.10 Nevertheless, the problems with “notice and choice” should not be used as a simple justification for diminishing consent.
Despite the difficulties with the concept,11 and despite there being other legitimate bases for the collection of personal information,12 informed consent – explicit or implicit – remains the cornerstone of modern FIPPs13 and is foundational to modern private-sector privacy laws in force around the world. Diminishing the central role of consent diminishes the individual’s right and ability to participate in the management of their personal data by others, undermining the application of the other FIPPs, which are complementary and intended to be applied holistically.14
Consent is multi-dimensional. It is much more than permission for a one-time collection of personal information. Lacking the opportunity to provide informed consent, the individual is effectively disempowered. Consent empowers individuals to exercise additional privacy rights and freedoms (illustrated in a brief sketch following this list), such as the ability to:
make consent conditional;
revoke consent;
deny consent for new purposes and uses;
be advised of the existence of personal data record-keeping systems;
access personal data held by others;
verify the accuracy and completeness of one’s personal data;
obtain explanation(s) of the uses and disclosures of one’s personal data; and
challenge the compliance of data users/controllers.
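To make the multi-dimensional nature of consent concrete, the following is a minimal, hypothetical sketch (in Python) of a consent record supporting conditional consent, revocation, and the denial of new purposes. The class, field names and behaviour are illustrative assumptions only, not drawn from any statute, standard or existing library:

```python
# Hypothetical sketch of a multi-dimensional consent record.
# All names and fields are illustrative assumptions only.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ConsentRecord:
    data_subject: str
    purposes: set[str]                                    # purposes consented to
    conditions: list[str] = field(default_factory=list)   # conditional consent
    revoked_at: datetime | None = None                    # set when consent is revoked

    def revoke(self) -> None:
        """Exercise the right to revoke consent at any time."""
        self.revoked_at = datetime.now(timezone.utc)

    def permits(self, purpose: str) -> bool:
        """A new or secondary purpose is denied unless explicitly consented to."""
        return self.revoked_at is None and purpose in self.purposes


# Usage: consent given for billing does not extend to marketing, and
# revocation withdraws all permissions at once.
record = ConsentRecord(data_subject="alice", purposes={"billing"})
assert record.permits("billing")
assert not record.permits("marketing")  # a new purpose requires new consent
record.revoke()
assert not record.permits("billing")
```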
Informed and empowered individuals serve as essential checks on the uses and misuses of personal data, holding data processors accountable in a way no law, regulation or oversight authority could ever do. In Germany, the concept of informational self-determination was established over 30 years ago, when the Federal Constitutional Court derived it from the Constitution in 1983.15 The concept captures the central role that the individual is expected to play in determining the uses of his or her personal data. Individuals are intended to feature prominently in considering the acceptable secondary uses of their personal data. Central to this determination is context – context is key to determining what may be considered an appropriate secondary use, and it is often lacking without the involvement of the data subject.
Removing consent from the data collection equation risks undermining fundamental individual rights, protections and freedoms far beyond “notice and choice” systems. Instead of doing away with consent, we should work to strengthen it, not weaken it, by improving transparency and individual control mechanisms – addressing the challenges head-on.
Another criticism of consent, which focuses on the practicality of providing notice, should not foreclose the possibility of moving beyond our current system of notices, and beyond existing enhancements (e.g., tables, icons, and layers), towards a new generation of notices, such as notices based on the experience of technology (i.e., ‘visceral notice’).16
In sum, many would agree that individual responses to notice and choice options are poorly understood. Until these questions are resolved, they should not be used to support arguments for removing or limiting obligations under the collection principle.
12.3 Why Eliminate Purpose Specification?
Another foundational FIPP is Purpose Specification: “The purposes for which personal data are collected should be specified not later than at the time of data collection and the subsequent use limited to the fulfilment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.”17 Purposes are the basis for setting and evaluating limits on the collection and use of personal information, and for determining necessity and proportionality.
Some argue that with a diminished need to notify data subjects and obtain their consent, there should be less duty to specify purposes in advance. They argue that some purposes are implied or do not require consent, such as fulfilling an order, improving service quality or cooperating with law enforcement authorities. Others argue that it is impossible to know in advance all of the possible uses and benefits of personal information, and that specifying or limiting purposes unnecessarily constrains those future uses and benefits. Non-existent or excessively broad purposes permit a wide spectrum of compatible uses and allow indefinite data retention “just in case.”
The central problem here is that eliminating purpose limitation gives an unprecedented free hand to data users/controllers – public or private, large or small, wherever in the world they may be located – to unilaterally decide why, what, or when personal data should be collected, used and disclosed, with little input from data subjects or oversight authorities.
Lacking sufficient restraints, such a paternalistic approach could lead to what privacy advocates fear most – ubiquitous mass surveillance, facilitated by more extensive and detailed profiling, sharpened information asymmetries and power imbalances, ultimately leading to various forms of discrimination.18 A greater burden would be placed upon both individuals and regulators to prove harms, establish causation, and seek effective redress – the exact opposite of a proactive approach to privacy, which emphasizes prevention and the taking of proactive measures.19
If the history of privacy has taught us anything, it is that an individual’s loss of control over their personal data leads to more and greater privacy abuses, not fewer and smaller. It is not difficult to imagine how this proposal to eliminate purpose specification, if implemented, could lead to a “collect the entire haystack” mentality, and to overbroad or unspecified and undesirable secondary uses – “fishing expedition” methods of data processing. When making decisions affecting individuals, out-of-date or incomplete data, incorrect inferences, and automated decision-making processes can have profoundly negative consequences.
The Purpose Specification principle is even more critical when individual participation and consent have been diminished. Whether or not consent is informed or explicit, individuals will always have basic expectations about how their personal data is to be used, namely, that it will be used for the purpose(s) for which they provided it. There is a natural expectation that there will be some basic limitations when an individual provides his/her personal data. The individual does not hand over his/her information to the government or a business to do with it whatever it wants.
On April 2, 2013, the WP29 provided an opinion on the principle of purpose limitation. In particular, the WP29 discussed the principle of purpose limitation under the current European Union (“EU”) Directive 95/46/EC and provided recommendations for the proposed EU General Data Protection Regulation.
In the WP29 Opinion, the WP29 stated that purpose limitation protects individuals by restricting how data controllers use personal information, while also providing a degree of flexibility. The WP29 further described purpose limitation as being comprised of two elements: (1) purpose specification; and (2) compatible use. The WP29 explained the relationship between these two elements by referencing Article 6(1)(b) of the EU Directive which states that personal information must only be collected for “specified, explicit and legitimate purposes” (purpose specification) and not be “further processed in a way incompatible” with those purposes (compatible use).20
The WP29 also stated the following: “The prohibition of ‘incompatibility’ in Article 6(1)(b) does not altogether rule out new, different uses of the data – provided that this takes place within the parameters of compatibility.”21 The WP29 went on to state that compatibility needs to be assessed on a case-by-case basis (a schematic encoding of this assessment follows the list below), with the following factors taken into account:
the relationship between the purposes for which the personal data have been collected and the purposes of further processing;
the context in which the personal data have been collected and the reasonable expectations of the data subjects as to their further use;
the nature of the personal data and the impact of the further processing on the data subjects;
the safeguards adopted by the controller to ensure fair processing and to prevent any undue impact on the data subjects.22
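As a rough illustration only – the WP29 Opinion calls for a qualitative, case-by-case assessment, not a mechanical test – these four factors could be recorded as a simple checklist. All names below are hypothetical:

```python
# Hypothetical encoding of the WP29's four compatibility factors.
# Real assessments are qualitative; the boolean fields are a simplification.
from dataclasses import dataclass


@dataclass
class CompatibilityAssessment:
    related_purposes: bool      # relationship between original and further purposes
    within_expectations: bool   # context and reasonable expectations of data subjects
    acceptable_impact: bool     # nature of the data and impact of further processing
    adequate_safeguards: bool   # safeguards ensuring fair processing, no undue impact

    def is_compatible(self) -> bool:
        # In this sketch, every factor must weigh in favour of the further use.
        return all((self.related_purposes, self.within_expectations,
                    self.acceptable_impact, self.adequate_safeguards))


# Usage: a further use falling outside data subjects' reasonable expectations
# fails the assessment, regardless of the other factors.
assessment = CompatibilityAssessment(
    related_purposes=True,
    within_expectations=False,
    acceptable_impact=True,
    adequate_safeguards=True,
)
assert not assessment.is_compatible()
```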
Similarly, from a public sector viewpoint, in the jurisdiction of Ontario, Canada, the Freedom of Information and Protection of Privacy Act (FIPPA) and its municipal equivalent (MFIPPA) limit an institution’s ability to use information in its custody and control.23
In determining whether the individual might reasonably have expected such a use or disclosure, the practice of the Information and Privacy Commissioner of Ontario has been to impose a “reasonable person” test. Therefore, the question that must be asked is whether an individual would have reasonably expected the use of their personal information for the identified purposes. Investigation reports issued by the Commissioner have found that there must be a rational connection between the purpose of the collection and the purpose of the use, in order to meet the “reasonable person” test. In applying the “reasonable person” test and determining whether there is a rational connection, the Commissioner considers many factors, including the factors listed by the WP29 when assessing compatibility.
It is important to note that section 43 of FIPPA and section 33 of MFIPPA define “consistent” purpose in relation to personal information that has been collected directly from the individual. Where information has been collected indirectly, a consistent purpose would be one that is “reasonably compatible” with the purpose for which the personal information had been obtained. Note that Ontario’s “reasonably compatible” language is virtually identical to the EU WP29’s “compatible use” language. The Commissioner’s practice when assessing “reasonably compatible” purposes is not to apply an “identical purpose” test; rather, the Commissioner looks to what the wording and intent behind the indirect collection of the information indicate.
It should also be noted that when a consistent purpose cannot be established, Ontario institutions may still use the personal information in their custody or control if the person to whom the information relates has identified that information and consented to its use.24
As evidenced above, privacy legislation in both the EU and Ontario, Canada places justifiable limits on – while also providing flexibility in – a data user/controller’s collection, use and disclosure of personal information.
12.4 Concerns with a New Use Principle of Balancing Benefits with Harms to Individuals
The call to substantially revise the Use Limitation principle25 – introducing the notion that the data user/controller should balance the benefits of each intended data use against the harms to the individual, with harm-mitigation tools in place – is of great concern. This approach would lead to a “race to the bottom” scenario. In this new Use Principle, there is the concept that harms to the individual “should be permitted with protections.” By what standards should benefits and harms be evaluated – against the interests of the individual, society, or a company’s bottom line? It is difficult to support any principle that would allow foreseeable harms to individuals even if safeguards are employed. In addition, such safeguards are chosen not by the individual but by the data user/controller, and may or may not include the “protection” of consent.
Even if a harms-based approach to privacy were feasible, we are a long way from achieving meaningful national, let alone international, consensus on defining “harms,” or on broadening their scope. We are far from… “put[ting] in place practical frameworks and processes for identifying, balancing, and mitigating those harms.”26 And who would do this? U.S. courts have been reluctant to step in on behalf of affected individuals.27
Absent clearly defined and agreed standards for privacy-related “harms,” any call to liberalize the market for using personal data should be viewed with skepticism. As noted above, individuals would be significantly disadvantaged by the lack of notice and consent, and the minimization of their ability to participate in the process. Any significant loss of individual autonomy in relation to one’s personal data should be viewed as harmful.
Greater accountability for the uses of personal data is critical.28 However, the call to diminish the Use Limitation principle shifts the burden of proof onto individuals, who must demonstrate the existence of harm, with regulators officiating such cases – documenting the harms, proving causality, and then seeking redress. Proving the causality of harms is notoriously difficult to do, and will likely become even more so in the current era of complex, interconnected global information systems and networks that are increasingly opaque to both individuals and oversight authorities.
Even today, harms arising from cases of identity theft due to a security breach are difficult to prove. Similarly, establishing links between poor organizational data-handling practices and the negative effects of individuals being erroneously placed on a watchlist or other similar blacklist, losing an employment opportunity, paying a higher insurance premium, being denied health coverage, or suffering a damaged reputation or the inability to travel, can be a Kafkaesque experience.
While superficially appealing in theory, in practice harms tests are far too narrow a basis for effectively protecting privacy in this day and age.29 As the name implies, harms tests are fundamentally reactive, allowing harms to arise rather than proactively preventing them from the outset. The effect of such a proposal would be to retard the development and application of real, effective preventative remedies.30 In the meantime, a mountain of unnecessary harms will have occurred, responsibility for which will most likely go undetected and unchallenged. A flexible, robust set of FIPPs, ideally embedded into design, remains the best bulwark against future harms (material or immaterial). There should be greater emphasis on preventative methods, such as conducting comprehensive privacy impact assessments (PIAs). Moreover, regulators’ resources are already stretched to the limit, and it is highly unlikely that additional staffing will be provided to absorb the additional burdens imposed by such a proposal. The opposite is happening – resources are shrinking, not expanding.
12.5 Privacy Does Not Stand in the Way of Innovation
Some suggest that rigid adherence to general privacy principles inhibits innovation and interferes with economic and social progress, and that these limits should therefore be relaxed. We should be wary of such good intentions and instead seek ways to achieve positive-sum outcomes. Many of the perceived barriers associated with obtaining informed consent, specifying and limiting purposes, and restricting the collection and use of personal information can be obviated by applying innovative methods and widely available data processing techniques. Many Big Data applications may be achieved using de-identified data in place of identifiable personal information. For example, Dr. Khaled El Emam, Professor at the University of Ottawa and Canada Research Chair in Electronic Health Information, has developed a tool that de-identifies personal information in a manner that simultaneously minimizes both the risk of re-identification and the degree of distortion to the original database.31 The European Data Protection Commissioners have developed criteria and practical guidelines on open data and public-sector information re-use,32 as has the Office of the Information and Privacy Commissioner of Ontario, Canada.33
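By way of a toy example only – this is not Dr. El Emam’s tool, which quantitatively balances measured re-identification risk against data distortion – one common de-identification step is to drop direct identifiers and generalize quasi-identifiers such as age and postal code. All field names below are hypothetical:

```python
# Toy illustration of de-identification by generalizing quasi-identifiers.
# Not Dr. El Emam's tool; real methods measure and minimize re-identification
# risk while limiting distortion of the original data.
def generalize(record: dict) -> dict:
    """Return a copy of the record with identifiers removed or coarsened."""
    out = dict(record)
    out.pop("name", None)                            # drop the direct identifier
    out["age"] = f"{(record['age'] // 10) * 10}s"    # e.g., 37 -> "30s"
    out["postal_code"] = record["postal_code"][:3]   # keep only the prefix
    return out


# Usage: the generalized record still supports aggregate analysis, but it no
# longer points to a single, identifiable individual.
patient = {"name": "J. Doe", "age": 37, "postal_code": "M5S 2B1", "dx": "flu"}
print(generalize(patient))  # {'age': '30s', 'postal_code': 'M5S', 'dx': 'flu'}
```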
Privacy and data protection are at times contrasted with other legitimate societal values and goals, with the suggestion that one area must yield to the other. It is not necessary to weaken existing privacy measures in the name of pursuing greater efficiencies, innovation and economic growth. Further, there is a long and growing list of public and private-sector authorities in the United States, the EU, and elsewhere, who unequivocally endorse a proactive approach to privacy as a more robust application of FIPPs, and as a critical means by which to establish sufficient, necessary trust in the evolving information economy.34