Public consultation on the Privacy Act – Submission – Susannah M. Ward

This submission is available in English only.

“If we ever lose our freedoms in this country, it will not be as a result of invasion from without, but erosion from within. The job is going to be done not by malevolent autocrats seeking to do bad but by parochial bureaucrats seeking to do good - but who are so fixated on what they are trying to accomplish that they neglect the price that inevitably has to be paid for it. What is going to be lost on the other side if we create such broad powers?” [Alan Borovoy]

Many expressed similar sentiments when Bill C-51, the Anti-Terrorism Act, was first introduced. In my view, modernizing the Privacy Act provides an opportunity to rein in the broad powers embedded in the Anti-Terrorism Act, and elsewhere.

With the proposed changes to modernize the Privacy Act, a comparative analysis should be made in reference to the revisions made to Bill C-51 and the Access to Information Act. The final changes to Bill C-51 did not receive a thorough review by civil liberties groups. The revised document was released on the last day before Parliament closed for the summer, leaving a gap of inactive time for legislative review and debate. When Parliament began its fall session the revisions were quickly approved and ‘swept under the rug’, with virtually no mention from civil liberties watch groups or the legal community. Those who did comment were concerned that not enough changes had been made. This was concerning given the public attention and resistance when the Bill was first introduced.

Slight changes were made to the Access to Information Act that fell far short of the promises made during the 2015 election. The changes were so modest that the outgoing Information Commissioner, Suzanne Legault, held a press conference expressing her concern and disappointment. The Privacy Act directly impacts these two pieces of legislation, since the Act ensures heightened accountability and oversight.

Sharing information with government departments, agencies, third-party institutions and the private sector

Inconsistencies can exist between what the previous (federal and provincial) governments agreed to and how projects are handled when transitioning to a new government. For example, unauthorized or quasi-authorized projects, projects operating well past their expiration, and/or projects operating well beyond the scope of what was approved. A breach would continue if the present government knows about the project and does not stop it. A scenario such as this usually happens with projects done for research purposes with universities, academic institutions, and/or national security projects, perhaps with some profit gain by a commercial sponsor, because “it’s in the public interest”. This raises issues of informed consent, especially for “public interest projects”, and of extensions granted on top of project extensions.

Even when the state exercises public authority over a person, sharing personal information without the consent of the individual can cause harm. Public interest projects should use de-identified information.

To maximize oversight, I strongly agree that Federal Government departments and agencies should enter into a formal agreement before sharing personal information with each other, and especially when third-party and private-sector partnerships are made. The onus seems to stay with the individual, who in most cases is kept in the dark so the project can continue (exploitation) while damage is being caused, as in the case of Maher Arar and the dangers of information sharing. If the federal government still employs the Anns test for Crown liability (where the government is concerned), perhaps this could be part of the ‘privacy by design’ function in the risk assessment process prior to beginning [and extending] a project.

Digital innovation

Cyber-dependency is a rising concern with the convergence of new technologies, from AI to big data, supercomputing and ultrafast networks. Digitization can work against us unless law and technology are crafted to respect the “Properties of Identity”. We recognize the impact of technology on the government’s service delivery in terms of agility, capacity and responsiveness; however, failure of critical infrastructure and information breakdown can also occur, as we have seen with the Phoenix pay system. Citizens and federal employees should have the power to choose how their personal information is shared; therefore an opt-out mechanism should be made available. Not everyone wants to be digitized.

AI technology

“Some of our greatest challenges are yet to come. We are generating more data about ourselves than ever before, more than we realize. But data is also being generated about us beyond our control, collected and shared by countless entities. Decisions are being based on data we don’t even know we are generating.” [Mathew Rice]

There are many ways an AI system can be tricked, poisoned and attacked. When gathering information about a person through their social media websites, how can an AI system confirm the authenticity of the information it is gathering? False social media profiles and videos, altered photographs, voice-mimicking software, misinformation and disinformation are already prevalent, and an AI program cannot reliably determine what is authentic and what is not.

People’s critical thinking and analysis skills tend to relax when using technology; people develop a dependency on its quick computation over masses of data and overly rely on what the system suggests. Yet, “the government is using technology for automated decision-making that can have a major impact on people’s lives, such as denial of consequential services, risks to human rights and risks of harm; human attention and cognition are still required to monitor algorithmic attention-hacking and abilities of persuasion.” [Decision Points on AI Governance, UC Berkeley White Paper Series, page 30; G20’s OECD AI Principles, page #] Also, “Given that AI systems are built on top of software and hardware, errors in the underlying systems can still cause failures at the level of AI systems.” [The State of AI Ethics, Montreal AI Ethics Institute, June 2020: Bias and Algorithmic Injustice; Learning to Diversify from Human Judgments; AI Governance: A Holistic Approach to Implement Ethics in AI, pages 12 and 63] Until technology improves, governments should be wary of using AI to gather information and to make decisions, even if it reduces effort, time, and money.

PIPEDA and the Privacy Act

Obstruction and illegal erasure of data are also concerns. This holds true if personal data is kept on servers physically located outside our province or country, on foreign soil: which municipal, provincial, federal or international jurisdictional rules will be followed? The US has regulations for health data and some rules around financial and credit information, but few rules on how your data is used by other sectors. This may be a point for PIPEDA, but it also concerns federal government partnerships with private-sector initiatives (health, banking, education, etc.) where state authority may be used for commercial gain.

“As cities are becoming smart and generating data about us, and with the Internet of Things (IoT), our understanding of what generates data will change dramatically. We are seeing a convergence of technology where our houses and cars are becoming smart like our phones, with leaky security, and we measure and record our health differently.” [Mathew Rice]

Contact tracing is another matter, as we’ve seen with COVID, and so is expanding its scope to non-COVID-related [health and surveillance] matters through the use of 5G technology, especially as more “smart cities” are created with this technology. In fact, the danger of the various clandestine misuses of personal information by 5G and the Internet of Things (IoT) led former Ontario Information and Privacy Commissioner Ann Cavoukian to resign from Google’s Smart City project.

I fully support ‘privacy by design’. If the federal government still employs the Anns test for Crown liability (on projects where the government is concerned), perhaps it could be part of the ‘privacy by design’ function embedded in the risk assessment process prior to beginning and extending a project. Telecoms will play a big part in protecting personal privacy; I also support amending the Telecommunications Act to respond to our changing digital realities.

Privacy and Bill C-51

Privacy, unauthorized use and erasure of data are very concerning. So is clandestine surveillance using unmanned aerial vehicles (drones), facial recognition and spying technologies to intercept communications on cell phones and emails; psychological profiling mechanisms on social media sites, SMS, texts, Zoom calls and email communications; and other forms of voyeurism that intrude on the privacy of people inside their homes and private spaces without consent.

These actions are often justified by de-radicalization and research projects, pilot programs, and social experiments “in the public’s interest”, where an individual unknowingly becomes a participant through “deportation by clairvoyance”: it is not simply what an individual has done, but what authorities believe they, or the group they belong to, are going to do. This is fear-based decision-making, and it interferes with freedom of expression, freedom of association, personal autonomy, self-determination, enjoyment of life and the right to trial, since the individual is most often kept in the dark, as in the case of Maher Arar and the dangers of information sharing, so the project can continue (exploitation). We must also refrain from using state authority to justify personal interests and gain.

“Intelligence activities” performed through the Anti-Terrorism Act require the oversight provided through the Privacy Act and Access to Information for the protection of human dignity. The private sector’s objective is to increase profits; therefore, the private sector should have stronger rules to protect against exploitation. PIPEDA and the Privacy Act shouldn’t be subject to the same rules.

To maximize oversight, I strongly agree that Federal Government departments and agencies should enter into a formal agreement before sharing personal information with each other, and especially when third-party and private-sector partnerships are made.

EU’s GDPR and International Standards

In these circumstances the government should harmonize protection of personal data with the GDPR’s data erasure procedures. The EU is enforcing stronger compliance measures. Some countries focus on protecting some categories of data but not others, and there is a lack of general protection in a few world regions, for example the Gulf countries under MENA, where privacy protection emanates from constitutional protections or other areas of law.

Privacy Commissioner

[Q. 28-34: I strongly agree] There is general confidence in the Office of the Privacy Commissioner, and many would welcome increased powers to minimize fraud and abuse. But the office will only function as well as the person who occupies the position. I believe in the role of the Privacy Commissioner; I have faith in Mr. Therrien.

Consulting all those who have completed an Access to Information request and/or filed a privacy complaint with the Commissioner would be another source of enriching feedback. The average citizen would arbitrarily give positive feedback, since we generally trust the government. That is, until something happens. In fact, the average citizen does not have a full understanding of the nature and various circumstances in which their personal information can be misused. They only think in terms of extraneous circumstances such as intelligence gathering, or perhaps border crossings and health matters. They cannot imagine their government would give unauthorized access to their personal information to third parties, the private sector, and research institutes operating outside the government. A person who has taken the time to request access to their personal information and/or filed a privacy complaint will have more knowledge about the process and how it can be improved. “This new world brings new opportunities and risk. Our challenge is to ensure our right to privacy is respected, protected and fulfilled.”

Prepared by: Susannah M. Ward