Bill C-27: An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts
Tabled in the House of Commons, November 4, 2022
Explanatory Note
Section 4.2 of the Department of Justice Act requires the Minister of Justice to prepare a Charter Statement for every government bill to help inform public and Parliamentary debate on government bills. One of the Minister of Justice’s most important responsibilities is to examine legislation for inconsistency with the Canadian Charter of Rights and Freedoms (“the Charter”). By tabling a Charter Statement, the Minister is sharing some of the key considerations that informed the review of a bill for inconsistency with the Charter. A Statement identifies Charter rights and freedoms that may potentially be engaged by a bill and provides a brief explanation of the nature of any engagement, in light of the measures being proposed.
A Charter Statement also identifies potential justifications for any limits a bill may impose on Charter rights and freedoms. Section 1 of the Charter provides that rights and freedoms may be subject to reasonable limits if those limits are prescribed by law and demonstrably justified in a free and democratic society. This means that Parliament may enact laws that limit Charter rights and freedoms. The Charter will be violated only where a limit is not demonstrably justifiable in a free and democratic society.
A Charter Statement is intended to provide legal information to the public and Parliament on a bill’s potential effects on rights and freedoms that are neither trivial nor too speculative. It is not intended to be a comprehensive overview of all conceivable Charter considerations. Additional considerations relevant to the constitutionality of a bill may also arise in the course of Parliamentary study and amendment of a bill. A Statement is not a legal opinion on the constitutionality of a bill.
Charter Considerations
The Minister of Justice has examined Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts, for any inconsistency with the Charter pursuant to his obligation under section 4.1 of the Department of Justice Act. This review involved consideration of the objectives and features of the Bill.
What follows is a non-exhaustive discussion of the ways in which Bill C-27 potentially engages the rights and freedoms guaranteed by the Charter. It is presented to assist in informing the public and Parliamentary debate on the Bill.
Overview
Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts, is also known as the Digital Charter Implementation Act, 2022.
The Consumer Privacy Protection Act is Part 1 of the Digital Charter Implementation Act, 2022. The Consumer Privacy Protection Act would repeal parts of the Personal Information Protection and Electronic Documents Act and replace them with a new legislative regime governing the collection, use, and disclosure of personal information for commercial activity in Canada. This would maintain, modernize, and extend existing rules and impose new rules on private sector organizations for the protection of personal information. The Consumer Privacy Protection Act would also continue and enhance the role of the Privacy Commissioner in overseeing organizations’ compliance with these measures. Provisions of the Personal Information Protection and Electronic Documents Act addressing electronic alternatives to paper records would be retained under the new title of the Electronic Documents Act.
Part 2 of the Digital Charter Implementation Act, 2022 contains the Personal Information and Data Protection Tribunal Act. It would create a new administrative tribunal to hear appeals of orders issued by the Privacy Commissioner and apply a new administrative monetary penalty regime created under the Consumer Privacy Protection Act.
Part 3 of the Digital Charter Implementation Act, 2022, the Artificial Intelligence and Data Act, sets out new measures to regulate international and interprovincial trade and commerce in artificial intelligence systems. It would establish common requirements for the design, development, and use of artificial intelligence systems, including measures to mitigate risks of harm and biased output. It would also prohibit specific practices with data and artificial intelligence systems that may result in serious harm to individuals or their interests.
Search and Seizure (section 8 of the Charter)
Section 8 of the Charter protects against unreasonable searches and seizures. The purpose of section 8 is to protect individuals from an unreasonable intrusion into a reasonable expectation of privacy by the state. A search or seizure will be reasonable if it is authorized by law, the law itself is reasonable (in the sense of striking an appropriate balance between privacy interests and the state interest being advanced), and the search is carried out in a reasonable manner.
The Digital Charter Implementation Act, 2022 has the potential to implicate rights under section 8 of the Charter because the Privacy Commissioner’s powers and certain provisions allowing government institutions access to personal information may implicate information subject to a reasonable expectation of privacy.
First, the Consumer Privacy Protection Act would re-enact and add to the Privacy Commissioner’s existing investigation and audit powers. These powers, including the authority to compel the production of records and enter private places, other than dwelling-houses, to examine records and to talk with individuals in the place entered, would be extended to the Privacy Commissioner’s new authority to conduct inquiries into alleged violations of the Consumer Privacy Protection Act. The Privacy Commissioner would be allowed to share information with certain other federal regulatory bodies if certain conditions are met. The Privacy Commissioner would also maintain the authority to share information with provincial counterparts and foreign states.
The following considerations support the consistency of the Privacy Commissioner’s investigative, inquiry, audit, and information-sharing powers with section 8. Where information is subject to a reasonable expectation of privacy, the Privacy Commissioner’s legal authority to access and share the information would be clearly set out in the Consumer Privacy Protection Act. This legal authority would support the objectives of the Consumer Privacy Protection Act through tailored powers similar to those found in other regulatory contexts and subject to similar restrictions governing their use.
Second, the Consumer Privacy Protection Act may implicate rights under section 8 by re-enacting a range of existing provisions in the Personal Information Protection and Electronic Documents Act that allow organizations to disclose an individual’s personal information to a government institution without their knowledge or consent in certain circumstances. For example, the Consumer Privacy Protection Act would re-enact a provision allowing an organization to disclose personal information without the knowledge or consent of the individual if a government institution has made a request for the information, identified its lawful authority to obtain the information and indicated that it suspects the information relates to national security. In addition, it would re-enact related provisions allowing an organization to collect personal information for the purposes of certain disclosures to government institutions, such as in the situation described above or where a disclosure is required by law. Existing rules that may limit the information an organization is allowed to provide to an individual about these disclosures would be kept in the bill. In each case, the rules that applied under the Personal Information Protection and Electronic Documents Act would remain the same under the Consumer Privacy Protection Act.
In the case of re-enacted provisions allowing organizations to disclose personal information to a government institution in certain prescribed circumstances, consistency with section 8 is supported by the following considerations. The focus of the measures is on ensuring the Consumer Privacy Protection Act does not prevent an organization from disclosing information where a disclosure is authorized by another source of legal authority, such as another law, a warrant, or a subpoena, which allows the government to collect the information. The re-enacted provisions also permit an organization to disclose personal information to a government institution when the disclosure is at the initiative of a private sector organization, made voluntarily by the organization, and the relevant statutory requirements have been met. These provisions apply in only limited and specific circumstances where the disclosure is in the public interest. To limit impacts on privacy interests, a broad range of transparency, accountability, and oversight provisions exist under other measures and the broader legal context governing public institutions. For example, the Privacy Act and other specialized legal regimes impose rules and safeguards that apply to government institutions collecting personal information; the Privacy Commissioner has broad powers to oversee institutions’ compliance with the Consumer Privacy Protection Act and the Federal Court is also available to resolve complaints. In addition, information about these disclosures to government institutions may be shared with the individual affected when it would not compromise sensitive government operations.
Similar considerations support consistency with section 8 in relation to provisions permitting organizations to collect personal information without an individual’s knowledge or consent for the purposes of specific disclosures to government institutions. Like the disclosure measures discussed above, these collection provisions enable organizations to respond to government requests for information where government access to the information is authorized by another source of legal authority, like another law, a warrant or a subpoena. They also support the collection of personal information for voluntary disclosures at the initiative of private sector organizations. Collection of personal information without an individual’s knowledge or consent for this purpose is permitted only in limited situations of significant public importance, such as where the information potentially implicates national security, the defence of Canada or the conduct of international affairs or where a disclosure of information is required by law.
Third, measures under the Artificial Intelligence and Data Act may also implicate rights under section 8. For example, under the Artificial Intelligence and Data Act, the Minister responsible may compel the production of certain information from persons subject to the Act for the purpose of verifying compliance with the Act. Regulated persons may also need to provide auditors with records. In addition, the Artificial Intelligence and Data Act authorizes the Minister to publish information about artificial intelligence systems posing a serious risk of harm; to order a person to publish information related to their compliance with the Act; and to share information with other government entities specified in the Act.
The following considerations support the consistency of the Minister’s authorities to collect, disclose, and share information with section 8. Privacy interests are diminished in contexts where government access to business information and records is essential to ensuring compliance with the regulatory obligations a regulated entity must meet. Under the Artificial Intelligence and Data Act, the Minister’s powers to gather, compel the production of, or disclose relevant information for regulatory or administrative purposes are consistent with similar powers upheld as reasonable in other regulatory contexts. They are tailored and subject to limits. They authorize access to information that is reasonable and necessary to advance the Act’s important regulatory objectives. In addition, there are restrictions on the uses to which information gathered under the Act may be put and protections for personal and confidential business information. These protections include limits on the circumstances in which personal and confidential business information may be disclosed by the Minister.
Freedom of Expression (section 2(b) of the Charter)
Section 2(b) of the Charter protects freedom of thought, belief, opinion and expression. Section 2(b) has been interpreted broadly to encompass any activity or communication, aside from violence or threats of violence, which conveys or attempts to convey meaning. It protects the rights of those seeking to exercise expressive freedoms and those who would receive this expression.
Restrictions on the collection, use, and disclosure of personal information in the Consumer Privacy Protection Act could impact regulated organizations’ commercial expressive activities where these activities would involve a collection, use, or disclosure of personal information that is restricted or prohibited under the Act. In addition, regulatory and other legal standards under the Artificial Intelligence and Data Act could potentially impact freedom of expression to the extent they may limit expressive uses of artificial intelligence systems or restrict access to any expressive content these systems may generate.
To the extent the Consumer Privacy Protection Act would interfere with protected commercial expression, consistency with the Charter is supported by the following considerations. The Consumer Privacy Protection Act applies only to personal information. It advances the important regulatory purpose of protecting personal information in accordance with individuals’ privacy rights and organizations’ legitimate needs. It balances individuals’ interests in the protection of their personal information with organizations’ needs. It is consent-based and recognizes a range of other circumstances in which personal information may be legitimately used for commercial purposes. The provisions of the Consumer Privacy Protection Act may be viewed as proportionate to the objectives of supporting and promoting commerce and the right of privacy of individuals through the protection of personal information.
To the extent measures under the Artificial Intelligence and Data Act could restrict Charter-protected expression, consistency with the Charter is supported by the following considerations. The Artificial Intelligence and Data Act aims to protect individuals against a range of serious risks associated with the use of artificial intelligence systems, including risks of physical or psychological harm or biased output with adverse impacts on individuals. This important purpose will be supported by regulatory requirements to be set out in future regulations that must satisfy Charter standards. Offences under Part 2 of the Artificial Intelligence and Data Act address limited and specific acts associated with either the creation of artificial intelligence systems or making them available for use. Any potential impacts on protected expression may be viewed as proportionate to the pressing public interest in addressing the exploitation of unlawful practices with personal information in the development of artificial intelligence systems, or the making available of artificial intelligence systems that are likely to, and do, cause serious harms to individuals, their property or other economic interests.
Open courts principle (section 2(b) of the Charter)
The open courts principle receives protection under section 2(b) of the Charter. Under the open courts principle, court and tribunal proceedings are presumptively open to both the public and the media.
The Consumer Privacy Protection Act would maintain the Privacy Commissioner’s general obligation of confidentiality with some exceptions, including an ability to make information public when the Commissioner considers that it is in the public interest to do so. While the adjudicative functions of the Personal Information and Data Protection Tribunal would, under the Personal Information and Data Protection Tribunal Act, be public by default, there are exceptions allowing for the protection of confidential information and holding proceedings in private. In addition, unless the complainant consents, the Tribunal would be required not to disclose the complainant’s name or identifying information and would have discretion with respect to naming organizations in its decisions.
The following considerations support the consistency with the Charter of provisions in the Consumer Privacy Protection Act and the Personal Information and Data Protection Tribunal Act that could limit the openness of proceedings implicating the open courts principle or restrict the disclosure of related information. Both the Privacy Commissioner and the Tribunal would retain discretion over the openness of their respective proceedings. This discretion, which must be exercised in accordance with the Charter, allows a proper balance to be struck between openness and any competing considerations such as privacy. There is a presumption that Tribunal proceedings will be open.
Offence Rights (section 11 of the Charter)
Section 11 of the Charter guarantees certain rights to persons who have been charged with an offence, including the right to a fair and public hearing before an independent and impartial adjudicator. Its protections apply only to persons “charged with an offence”. For the purposes of section 11, this occurs when a person is subject either to proceedings that are criminal in nature, or that result in “true penal consequences”. True penal consequences include imprisonment and fines with a punitive purpose or effect, such as when a fine or penalty is out of proportion to the amount required to achieve regulatory purposes.
The Consumer Privacy Protection Act would authorize the Personal Information and Data Protection Tribunal to impose an administrative monetary penalty on an organization that has contravened the Act. It would keep existing offences for when organizations knowingly contravene specific obligations they have under the Act. The offence of obstructing the investigation of a complaint or the conduct of an audit would be extended to the Commissioner’s new inquiry function. The Consumer Privacy Protection Act would also create new offences for re-identifying personal information that has been de-identified, subject to specified exceptions such as for security testing, and contravening an order issued by the Privacy Commissioner following an inquiry. These offences would be punishable by fine.
The Artificial Intelligence and Data Act would authorize the imposition of administrative monetary penalties further to rules to be set out in regulation. It would also create new offences for failing to comply with certain regulatory requirements (sections 6 to 12 of the Artificial Intelligence and Data Act) or obstructing or providing false or misleading information to the Minister or their delegate or an independent auditor. These offences would be punishable by way of fine. The new general offences set out in Part 2 of the Artificial Intelligence and Data Act would be punishable by way of fine or imprisonment.
The following considerations support the consistency of the administrative monetary penalty and offence provisions in Parts 1 and 3 of the Digital Charter Implementation Act, 2022 with section 11.
The proposed administrative monetary penalty provisions would not involve criminal charges, prosecution, or sentencing. Both the Consumer Privacy Protection Act and the Artificial Intelligence and Data Act would expressly provide that the purpose of administrative monetary penalties is to promote compliance with the relevant legislative regime, not punish. Under the Consumer Privacy Protection Act, administrative monetary penalties would be subject to a legislated cap with no mandatory minimum fine. The Personal Information and Data Protection Tribunal would have discretion to determine and impose administrative monetary penalties. The exercise of this discretion would be governed by statutory criteria. In the case of the Artificial Intelligence and Data Act, the Governor in Council would be authorized to make regulations prescribing the relevant rules, including rules for the process of imposing an administrative monetary penalty and the amount that may be imposed. These future rules must meet Charter standards.
The proposed offence provisions under both the Consumer Privacy Protection Act and the Artificial Intelligence and Data Act would provide for criminal charges, prosecution, and sentencing that could engage rights under section 11 of the Charter. In reviewing the relevant measures, no potential inconsistencies between the offence provisions and rights under section 11 have been identified.
Right to liberty (section 7 of the Charter)
Section 7 protects against the deprivation of an individual’s life, liberty and security of the person unless done in accordance with the principles of fundamental justice. These include the principles against arbitrariness, overbreadth and gross disproportionality. An arbitrary law is one that impacts section 7 rights in a way that is not rationally connected to the law’s purpose. An overbroad law is one that impacts section 7 rights in a way that, while generally rational, goes too far by capturing some conduct that bears no relation to the law’s purpose. A grossly disproportionate law is one whose effects on section 7 rights are so severe as to be “completely out of sync” with the law’s purpose.
Part 2 of the Artificial Intelligence and Data Act would create a new general offence that applies to personal information that is known or believed to have been acquired by way of conduct constituting an offence that is recognized in Canada. It would prohibit possessing or using such personal information for the purpose of designing, developing, using, or making an artificial intelligence system available for use. Part 2 of the Artificial Intelligence and Data Act would also prohibit knowingly or recklessly making an artificial intelligence system available for use either if the system is likely to cause, and does cause, serious specified harms or with intent to defraud the public and cause substantial economic loss, where the artificial intelligence system’s use causes such loss. Because the offences proposed under Part 2 of the Artificial Intelligence and Data Act allow for the possibility of imprisonment, they engage the right to liberty.
The following considerations support the consistency of the proposed offences with the Charter. The offences seek to protect the public from blameworthy conduct capable of causing serious harm to individuals and their interests. The offences have been tailored to exclude conduct unrelated to this objective. One offence targets very specific actions with personal information that would exploit breaches of Canadian legal standards in the design, development, use, or making available for use of an artificial intelligence system. This offence applies when an individual knows or believes that personal information has been obtained or derived as a result of the commission of an offence but nevertheless uses or possesses it for the purposes of designing, developing, using, or making available for use an artificial intelligence system. The other offence targets the specific act of making an artificial intelligence system available for use where this would introduce – knowingly, recklessly, or intentionally – significant threats to individuals or important individual interests that then occur. In reviewing the relevant measures, the Minister of Justice has not identified any potential inconsistencies between the offence provisions and the principles of fundamental justice under section 7.