Public consultation on the Privacy Act – Submission – Lisa M. Austin and Andrea Slane

This submission is available in English only.

The Honourable David Lametti
Minister of Justice and Attorney General of Canada
284 Wellington Street
Ottawa, Ontario
Canada K1A 0H8

via online submission portal

February 14, 2021

Dear Minister Lametti,

We are both Canadian academics who work in the field of privacy law and policy, and we welcome this opportunity to provide written submissions responding to the Department of Justice discussion paper “Modernizing Canada’s Privacy Act: Online Public Consultation” (“the discussion paper”). Many of the suggestions for reform put forward in the paper hold promise for a more privacy-protective Privacy Act. In these submissions, we focus on the areas where we have the strongest suggestions for further adjustments, drawing on our established research expertise at the intersection of data protection legislation and the Charter. Many of our comments are framed by our interest in the role of this intersection in supporting the rule of law, especially with regard to law enforcement, although some apply more broadly.

1. Summary of recommendations

Recommendation 1: The title of the legislation should be “The Data Protection Act”.

Recommendation 2: The Act should update the current collection threshold to what is “necessary and proportional” for a federal public body’s functions or activities, and include a list of factors that must be considered, ones that better reflect the constitutional dimensions of collection.

Recommendation 3: The Act should impose a general obligation to mitigate privacy risks in all collection, use and disclosure of personal information and de-identified personal information.

Recommendation 4: “Privacy risks” should be defined to mean the processing of information about persons in a manner that increases the risk of identifying an individual or inferring information about a specific individual.

Recommendation 5: More work needs to be done on the definition of “de-identified personal information” including more consultation with the technical community regarding best practices for privacy-protective data analysis.

Recommendation 6: The scope of obligations captured under the accuracy principle should be expanded to include not only requirements to ensure the accuracy of personal information, including de-identified personal information, but also requirements to evaluate the accuracy of the means of collecting and analyzing that information and to set accuracy requirements for data analysis tools and procedures. The principle should also clarify whether uses of personal information and de-identified personal information for law enforcement and national security investigations are “administrative purposes”.

Recommendation 7: Obligations to implement privacy-risk minimization practices should apply to the collection, use and analysis of “publicly available” personal information. Specific exceptions, such as relaxing the requirement to collect information directly from the individual, may be appropriate for such information, but this does not mean that other privacy-risk minimization practices are not required.

Recommendation 8: There should be no law enforcement or national security exception from transparency and accountability requirements in relation to the use of automated decision-making systems.

2. Title and purpose

The discussion paper suggests that the title of the legislation should change to reflect the underlying aim of protecting the privacy of personal information rather than a broader understanding of privacy. We agree that the title should change, but suggest that the new title be the “Data Protection Act”. This would address the concern regarding the kind of privacy interest at issue, while also accommodating the fact that privacy is not the only interest at stake: the Act also guards against misuse that affects other important interests, and it ensures accountability.

The title of the Act and the purposes of the Act should be consistent. The discussion paper suggests that the purpose clause could “reflect the important underlying public objectives” of the legislation and offers a list of these objectives. However, we note that these are not all about “privacy”, let alone “personal information”. For example, while privacy is thought to protect “human dignity, personal autonomy, and self-determination”, these values are also much broader than privacy. Similarly, reconciliation with Indigenous communities, public trust, responsible use, and ethical and accountable public decision-making extend beyond individual privacy. Other jurisdictions, such as Europe, as well as the academic research community more generally, recognize that the interests protected by data protection law go beyond individual privacy and can include freedom of expression, freedom of association, and equality, as well as the protection of democratic governance. Moreover, there can be collective interests in data, including in “de-identified” personal information. Framing the legislation in terms of the protection of privacy in relation to personal information is too narrow a framework for 21st-century data protection law.

Although the discussion paper mentions the Charter, it does so only to point to the many different sources of privacy protection in Canada. However, the Charter is not a separate and parallel regime to the Privacy Act; it sets out the basic ground rules to which all government data practices must conform. The re-drafting of the Act’s purpose clause should reflect this reality, and a valuable step in this direction would be to expressly include the list of broader objectives in the purpose clause (p. 7). As we further outline below, these constitutional dimensions are particularly important in relation to the proposed new “Limiting collection” principle.

Recommendation 1: The title of the legislation should be “The Data Protection Act”.

3. Rethinking the “limiting collection” principle

3.1. The “reasonably required” threshold

One of the basic Fair Information Practice Principles (FIPPs) is the “collection limitation” principle: “There should be limits to the collection of personal data and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.”Footnote 1 Currently, the Privacy Act implements this principle by requiring that the collection of personal information “relates directly to an operating program or activity of the institution” (s. 4). This is a very weak threshold, and the discussion paper proposes a new “Limiting collection” principle with a “reasonably required” threshold (p. 13). We agree that a stricter threshold is needed. However, this threshold should be even stronger than “reasonably required” and should more clearly reflect the government’s Charter obligations that the revised purpose clause would acknowledge.

The collection of personal information by law enforcement and national security agencies governed by the Privacy Act could engage the right to be secure against unreasonable search or seizure, as protected by s. 8 of the Charter, if it involves information that meets the constitutional test for a “reasonable expectation of privacy.” But even outside of the s. 8 context, and even if individuals voluntarily provide their personal information to the state, s. 7 rights could be at stake. Jurisprudence supports the argument that s. 7 protects privacy as an aspect of liberty or security of the person.Footnote 2 The collection of personal information that attracts a “reasonable expectation of privacy” could affect an individual’s liberty and security of the person, and such collection therefore needs to be consistent with the principles of fundamental justice. The Supreme Court of Canada has also held that even where there is judicial deference to administrative decision-making, the “reasonableness” of such decisions “centres on proportionality, that is, on ensuring that the decision interferes with the relevant Charter guarantee no more than is necessary given the statutory objectives.”Footnote 3

Canadian courts have held that the Privacy Act is “quasi-constitutional” legislation. It should therefore be drafted in a way that harmonizes with the government’s constitutional obligations and does not leave the relationship between the Privacy Act and the Charter to be worked out in the courts through costly litigation. The proposed “Limiting collection” principle falls short of these obligations. We acknowledge the concern raised in the discussion paper that a necessity requirement might “unduly hamper the ability of federal public bodies to carry out their mandates effectively” (p. 13). However, we note that a necessity requirement can be informed by factors to be considered, just as the proportionality component will be. We recommend that all collection of personal information be subject to a “necessary and proportional” requirement, and that the interpretation of this requirement take into account the impact of the collection on the full spectrum of an individual’s Charter rights and freedoms (privacy, freedom of association, freedom of expression, and equality).

The current proposal is to include a list of factors that a federal public body must take into account in determining whether collection is “reasonably required”. In our opinion, these factors do not properly reflect the constitutional dimensions of data collection. For example, one factor is to consider the purpose of the collection and “whether it was for law enforcement or national security purposes”. However, there is no indication that this should lead to a stricter application of ideas of necessity and proportionality, given the coercive context. Another factor is whether there are less intrusive means to achieve the purpose, but this is then qualified by “at a comparable cost and with comparable benefits to the public” (p. 13). With this qualification, the proposed legislation would effectively allow federal public bodies to collect more personal information than is necessary so long as doing so is cheaper than collecting only what is necessary. It cannot be that the protection of important rights and interests is required only when it is relatively costless. Finally, these factors use the language of “intrusiveness” to capture the impact of the data collection, which reflects too narrow a focus on individual privacy. A better formulation would be to consider the impact of collection on individual and group interests bearing on fundamental rights and freedoms.

The discussion paper raises the concern that a “necessity” test would not provide federal public bodies with enough flexibility in adopting new technologies, such as AI, that require access to significant amounts of data and where it is not always possible to specify in advance what data will be relevant. In response to this concern, we make three observations. First, the current proposal for revamping our federal private sector privacy law, Bill C-11, uses the threshold of “necessity” in its “limiting collection” principle. It would be detrimental to public trust for the government to assert that federal public bodies should be able to collect more information with fewer safeguards than the private sector – especially when the government can do so without consent. Second, even if the collection threshold were “necessary”, this would not apply to all collections, as the discussion paper also proposes that federal public bodies can collect personal information where “expressly authorized” (p. 13). This places a democratic check on information collection. Third, the concern with “big data” or the use of AI is often with the secondary use of data rather than with its initial collection. The government is already proposing an expanded regime for “de-identified personal information” that would potentially govern this possibility.

Recommendation 2: The Act should update the current collection threshold to what is “necessary and proportional” for a federal public body’s functions or activities, and include a list of factors that must be considered, ones that better reflect the constitutional dimensions of collection.

3.2. Mitigating privacy risks of collection/use/disclosure rather than only limiting collection

The need to rethink collection limitation goes further than the discussion paper acknowledges. One of the problems with the FIPPs framework is that it assumes that the way to minimize privacy risks is to minimize the amount of data collected, and then restrict the purposes for which it is used and/or disclosed and ensure that there are adequate safeguards against unauthorized uses and/or disclosures. However, there are privacy risks associated with the use and/or disclosure of personal information that are not caught through a focus on purposes or safeguards. In addition, as we describe in the following section, there are privacy risks associated with processing “de-identified personal information” as well.

We propose that the legislation adopt a general principle of privacy risk mitigation that applies to all collection, use and disclosure of personal information and de-identified personal information.

It is also important that privacy risks be understood more broadly than simply the risk of identifiability. Many new approaches to managing privacy risks – such as differential privacy – are premised on the idea that increasing the likelihood of inferring information about a specific individual when analyzing datasets is itself an increase in privacy risk. We agree with this approach: the more information that is linked to a specific individual, the more likely it is that the individual can be identified at some point. Waiting until there is a “serious possibility” of re-identificationFootnote 4 before recognizing a privacy risk is to implement responsible data practices in the federal public sector too late in the data pipeline.
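For illustration only – this is the standard formulation from the technical literature, not proposed statutory language – differential privacy captures this intuition by requiring that the result of a randomized analysis $\mathcal{M}$ be nearly unchanged whether or not any one individual’s record is included in the dataset:

\[
\Pr[\mathcal{M}(D) \in S] \;\le\; e^{\varepsilon}\,\Pr[\mathcal{M}(D') \in S]
\]

for every set of possible outputs $S$ and every pair of datasets $D$ and $D'$ that differ in a single individual’s record, where the parameter $\varepsilon \ge 0$ bounds how much anyone can infer about a specific individual from the analysis.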

One of us has been involved in separate work commenting on Bill C-11, including proposing a general risk mitigation principle and some definitions. This work is publicly available here: https://srinstitute.utoronto.ca/news/nikolov-papernot-lie-austin-information-about-persons.

While we appreciate the discussion paper’s endorsement of the use of “privacy by design” and Privacy Impact Assessments (PIAs) to identify and mitigate privacy risks, such measures will only be effective if the underlying statutory obligations and definitions regarding privacy risks are adequate. Currently they are not.

Recommendation 3: The Act should impose a general obligation to mitigate privacy risks in all collection, use and disclosure of personal information and de-identified personal information.

Recommendation 4: “Privacy risks” should be defined to mean the processing of information about persons in a manner that increases the risk of identifying an individual or inferring information about a specific individual.

4. Role of “de-identified” personal information

The discussion paper defines “de-identified personal information” as referring “to personal information that has been modified so that it can no longer be attributed to a specific individual without the use of additional information.”

We strongly recommend that this definition be rethought. There are three main problems with it. The first is that it focuses on modifying the data rather than on the range of approaches that can mitigate the risks of re-identification. For example, as some of our colleagues at the University of Toronto have pointed out, current computer science approaches to privacy focus on the analysis of the data rather than the data itself – including extremely robust approaches like differential privacy.Footnote 5 If the point of defining “de-identified personal information” is to enable the analysis of large datasets in a privacy-preserving manner, then the definition should reflect current best practices in the field.

The second problem is that this definition fails to take into account the ways in which the computing environment can also be designed to mitigate privacy risks. For example, instead of “de-identified” data being shared or disclosed, data can be held in a secure computing environment where others are permitted to run computations on it, but never to access the underlying “raw” data, and where all access is auditable.Footnote 6 From a privacy perspective this offers far better protection than the more prevalent model of de-identify-and-release. It also does not necessarily require that the “raw” data be “de-identified” according to the proposed definition. The idea is that a combination of privacy-protective techniques and safeguards can together reduce the risk of re-identification to the desired level.
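To make this concrete, the following is a minimal, purely hypothetical sketch of such a “query-only” environment (the names and structure are our own, invented for illustration, and are not drawn from the discussion paper or from any particular system): analysts can obtain approved aggregate results, but never the underlying raw records, and every access is logged for audit.

```python
# Hypothetical, minimal sketch of a "query-only" secure computing environment:
# analysts obtain approved aggregate results, never the underlying raw records,
# and every access is recorded in an audit log.
import datetime
import statistics
from typing import List


class AuditedEnclave:
    """Holds raw records privately and exposes only whitelisted aggregate queries."""

    ALLOWED_AGGREGATES = {"count": len, "mean": statistics.mean}

    def __init__(self, records: List[dict]):
        self._records = records          # raw data never leaves this object
        self.audit_log: List[str] = []   # auditable trail of every request

    def query(self, analyst: str, field: str, aggregate: str) -> float:
        if aggregate not in self.ALLOWED_AGGREGATES:
            raise ValueError(f"Aggregate '{aggregate}' is not permitted")
        # Record who asked for what, and when, so that access is auditable.
        self.audit_log.append(
            f"{datetime.datetime.now(datetime.timezone.utc).isoformat()} "
            f"{analyst} ran {aggregate} over '{field}'"
        )
        values = [record[field] for record in self._records]
        return float(self.ALLOWED_AGGREGATES[aggregate](values))


# Example use: the analyst learns an aggregate, but never sees the raw rows.
enclave = AuditedEnclave([{"age": 34}, {"age": 51}, {"age": 29}])
print(enclave.query(analyst="analyst-1", field="age", aggregate="mean"))
print(enclave.audit_log)
```

A production system would of course add access controls, output checks (such as differential-privacy noise), and independent review of the audit trail; the point of the sketch is simply that privacy protection can be built into how data is accessed and analyzed, not only into how the data itself is modified.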

The third problem is that the definition assumes that “de-identification” is something that is done to “personal information”. This could create a strange situation in which a federal public body collects information that is not “personal information” because there is no serious possibility of identifiability, but then takes other personal information and de-identifies it, making it “de-identified personal information”. The risk of re-identification could be identical as between these two collections of data, but would the legislation treat them differently? We note that this is a weakness of the proposed new federal private sector privacy legislation.Footnote 7

The discussion paper indicates that there are “some well-known anecdotes” of successful re-identification attacks on released de-identified personal information. These are not “anecdotes”. The computer science research community largely considers “de-identification” to be a failed paradigm for privacy protection, which is why other, more robust approaches have been developed and continue to be developed. We would be happy to facilitate further consultation with our computer science colleagues to better inform the way the legislation can embrace these new directions in privacy-protective techniques.

One of us has been involved in separate work proposing an alternative definition of “functionally non-identifying information”. This work is publicly available here: https://srinstitute.utoronto.ca/news/nikolov-papernot-lie-austin-information-about-persons.

Recommendation 5: More work needs to be done on the definition of “de-identified personal information” including more consultation with the technical community regarding best practices for privacy-protective data analysis.

5. Accuracy

The discussion paper indicates that the current accuracy provision in the Privacy Act already substantially implements the “Accuracy” principle from the FIPPs framework, as government institutions must “take all reasonable steps to ensure that personal information that is used for an administrative purpose by the institution is as accurate, up-to-date and complete as possible” (subsection 6(2)). The discussion paper notes that by updating the definition of “administrative purpose”, the accuracy provision is expanded to capture “personal information that could have a direct impact on an individual, consistent with an expanded definition of administrative purpose.” (p. 26) This is an improvement over the existing definition, which applies only to the use of that information in a decision-making process. However, having a “direct impact on an individual” is a loose standard and requires greater precision to capture the full scope of impact that government handling of information about people can have. Further, while a revised definition of “administrative purposes” is welcome, we note that the term “administrative” is not defined. It is therefore unclear what counts as “administrative” with regard to some of the more potentially privacy-invasive government practices, namely those of law enforcement and national security organizations. We note that privacy commissioners, both federal and provincial, have increasingly been called upon to provide guidelines for police use of investigative and pre-investigative technologies, such as body-worn cameras and facial recognition technologies. While it therefore appears that privacy commissioners understand public sector data protection legislation to apply to such activities to some extent, the application of specific obligations and exceptions is often unclear; the term “administrative” is vague and leaves the scope of police and national security activities captured by it open to interpretation.

Further, in light of the development of artificial intelligence systems that rely heavily on de-identified data to determine population, community, and demographic-level trends, many administrative purposes that have a “direct impact” on individuals will manifest at the collective rather than the individual level. Concerns about the accuracy, bias, and potentially discriminatory impact of such tools should be addressed via expanded accuracy requirements that are specifically designed to assess the possibility of inaccurate and biased analysis on a collective basis. Assessment of compliance with accuracy requirements should be integrated into accountability and transparency requirements. This should include requiring annual reporting on accuracy and bias in information processing arising from the use of artificial intelligence tools by government entities covered by the Act.

Recommendation 6: The scope of obligations captured under the accuracy principle should be expanded to include not only requirements to ensure the accuracy of personal information, including de-identified personal information, but also requirements to evaluate the accuracy of the means of collecting and analyzing that information and to set accuracy requirements for data analysis tools and procedures. The principle should also clarify whether uses of personal information and de-identified personal information for law enforcement and national security investigations are “administrative purposes”.

6. Publicly available information

In the current technology-permeated information landscape, it is misguided to characterize some information as “publicly available” and hence not subject to the same privacy-risk minimization requirements. Any definition of “publicly available” should strictly limit the circumstances under which information is removed from these requirements, and it will need to be precise and narrow. The discussion paper proposes to define personal information as being “publicly available” in three instances: “first, when it has been made manifestly public by the individual the information relates to; second, when it is broadly and continuously available to all members of the public and the individual has no reasonable expectation of privacy in the information; and third, when another act of Parliament or a regulation requires the information to be publicly available.” (p. 9) These instances are not sufficiently clear to justify reducing the privacy-risk minimization requirements: for instance, what counts as making information “manifestly public” in the context of social media? While we welcome the connection between the description of such information as “broadly and continuously available to all members of the public” and the requirement that the individual have no reasonable expectation of privacy in the information, we are concerned that the former description may foreclose analysis of the latter. A modernized Privacy Act should be clear that even information that is “broadly and continuously available to all members of the public” may still require privacy-risk minimization: for instance, footage from public video surveillance cameras that captures a person’s movements in a public space should not be deemed to wholly negate that person’s reasonable expectation of privacy, and so it continues to require privacy-risk minimization practices. While this already appears to be the approach taken by the OPC, the Act should make clear that this interpretation of “publicly available” is required.

We further note that the discussion paper suggests that “[t]he Act could add specialized rules for using or sharing ‘publicly available’ personal information to clarify and support the application of new personal information protection principles to publicly available personal information, and to align public sector use and disclosure of publicly available personal information with individuals’ reasonable expectations of privacy.” (p. 34) There is an opportunity to specify more precise and narrow parameters for publicly available information in such rules. But unless the central Charter-based understanding of “reasonable expectation of privacy” is integrated into the Act’s treatment of “publicly available” personal information, there is a real risk that public trust will be damaged – for instance, if the Act were to fail to limit surveillance of social media by police and border control agencies.

We caution that removing “publicly available” personal information from the purview of privacy-risk minimization requirements would potentially exempt government organizations from accountability and transparency requirements related to their collection and use of such information. This would leave a large gap in oversight of such practices, particularly with regard to our proposed amendment to the accuracy requirements set out above.

Finally, we also note that there can be uses of publicly available information that result in deleterious impacts on other fundamental rights and freedoms. For example, a narrow focus on individual privacy is not adequate to capture how forms of group surveillance can affect freedom of association and freedom of expression, or how inaccuracies in this information can lead to the creation of biased models.

Recommendation 7: Obligations to implement privacy-risk minimization practices should apply to the collection, use and analysis of “publicly available” personal information. Specific exceptions, such as relaxing the requirement to collect information directly from the individual, may be appropriate for such information, but this does not mean that other privacy-risk minimization practices are not required.

7. Transparency and accountability

The discussion paper suggests that the transparency and accountability requirements of the Privacy Act could be updated to better respond to the use of automated decision-making systems. We think that this is important. However, the discussion paper also indicates that there should be exceptions for contexts “such as law enforcement and national security, where providing details on such information could harm the public interest.” (p. 12) To the contrary, we think that an exception to transparency and accountability requirements for law enforcement and national security would significantly harm the public interest by undermining public trust. As Justice Iacobucci stated in R v Mentuck, “A fundamental belief pervades our political and legal system that the police should remain under civilian control and supervision by our democratically elected officials; our country is not a police state. The tactics used by police, along with other aspects of their operations, is a matter that is presumptively of public concern.”Footnote 8

In fact, we are generally concerned that the number of exceptions proposed for law enforcement and national security – in relation to publicly available information, direct collection, and transparency – would render the Privacy Act largely ineffective as a means of protecting Canadians’ privacy in those contexts. At a time when law enforcement and national security agencies are seeking to use new and controversial technologies while retaining public trust, this would be irresponsible.

Recommendation 8: There should be no law enforcement or national security exception from transparency and accountability requirements in relation to the use of automated decision-making systems.

We appreciate this opportunity to offer the foregoing eight recommendations for improving the suggested reforms to the Privacy Act. We believe that implementing these recommendations will ensure that the revised Act truly protects the rights and interests of Canadians and instills public trust in government organizations, now and in the future.

Sincerely,

[Information was severed]

Lisa M. Austin
Professor and Chair in Law and Technology
University of Toronto Faculty of Law
lisa.austin@utoronto.ca

[Information was severed]

Andrea Slane
Professor, Legal Studies
Ontario Tech University Faculty of Social Science and Humanities
andrea.slane@ontariotechu.ca