Bill C-63: An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts
Tabled in the House of Commons, May 30, 2024
Explanatory Note
Section 4.2 of the Department of Justice Act requires the Minister of Justice to prepare a Charter Statement for every government bill to help inform public and Parliamentary debate on government bills. One of the Minister of Justice’s most important responsibilities is to examine legislation for inconsistency with the Canadian Charter of Rights and Freedoms (“the Charter”). By tabling a Charter Statement, the Minister is sharing some of the key considerations that informed the review of a bill for inconsistency with the Charter. A Statement identifies Charter rights and freedoms that may potentially be engaged by a bill and provides a brief explanation of the nature of any engagement, in light of the measures being proposed.
A Charter Statement also identifies potential justifications for any limits a bill may impose on Charter rights and freedoms. Section 1 of the Charter provides that rights and freedoms may be subject to reasonable limits if those limits are prescribed by law and demonstrably justified in a free and democratic society. This means that Parliament may enact laws that limit Charter rights and freedoms. The Charter will be violated only where a limit is not demonstrably justifiable in a free and democratic society.
A Charter Statement is intended to provide legal information to the public and Parliament on a bill’s potential effects on rights and freedoms that are neither trivial nor too speculative. It is not intended to be a comprehensive overview of all conceivable Charter considerations. Additional considerations relevant to the constitutionality of a bill may also arise in the course of Parliamentary study and amendment of a bill. A Statement is not a legal opinion on the constitutionality of a bill.
Charter Considerations
The Minister of Justice has examined Bill C-63, An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts (“Online Harms Act”), for any inconsistency with the Charter pursuant to his obligation under section 4.1 of the Department of Justice Act. This review involved consideration of the objectives and features of the bill.
What follows is a non-exhaustive discussion of the ways in which Bill C-63 potentially engages the rights and freedoms guaranteed by the Charter. It is presented to assist in informing the public and Parliamentary debate on the bill. It does not include an exhaustive description of the entire bill, but rather focuses on those elements relevant for the purposes of a Charter Statement.
Overview
Bill C-63 contains a variety of measures to address a range of harmful content online as well as hate speech and hate crimes both online and offline.
In order to combat harmful content online, Part 1 of the Bill proposes a new Online Harms Act to create a regulatory regime to hold social media services accountable for reducing exposure to harmful content on their platforms. Harmful content is defined as content that sexually victimizes a child or revictimizes a survivor, intimate content communicated without consent, content used to bully a child, content that induces a child to harm themselves, content that foments hatred, content that incites violence, and content that incites violent extremism or terrorism. The Online Harms Act would set out baseline safety requirements that could evolve over time and would increase transparency about the strategies that platforms are using to protect users and the effectiveness of those strategies. The Online Harms Act would establish a Digital Safety Commission of Canada to administer the framework, a Digital Safety Ombudsperson of Canada to be a resource for users and victims and to advocate for online safety, and a Digital Safety Office to support the Commission and the Ombudsperson.
In order to combat hate speech and hate crimes both online and offline, Part 2 of the Bill would amend the Criminal Code to create a new peace bond to help prevent and deter hate crimes and hate propaganda offences from being committed, increase the maximum punishments for hate propaganda offences, and create a new general hate crime offence.
Part 3 of the Bill would amend the Canadian Human Rights Act to complement the regulation of social media services by allowing recourse to the Canadian Human Rights Commission against individual users who post hate speech on those services and elsewhere online.
Finally, Part 4 of the Bill would improve the mandatory reporting of child pornography online by persons who provide an internet service.
The main Charter-protected rights and freedoms potentially engaged by the proposed measures include:
- Freedom of religion (section 2(a)): Section 2(a) provides that everyone has freedom of conscience and religion. This guarantee protects a person’s sincerely held practices or beliefs that have a connection with religion. A law or government action that interferes with the ability to act in accordance with such practices or beliefs may engage section 2(a) if the interference is more than trivial or insubstantial.
- Freedom of expression (section 2(b)): Section 2(b) of the Charter provides that everyone has freedom of thought, belief, opinion and expression, and includes freedom of the press and other media of communication. Section 2(b) has been broadly interpreted as encompassing any activity or communication, aside from violence or threats of violence, which conveys or attempts to convey meaning. Section 2(b) protects both a freedom to express oneself, as well as a freedom from being compelled to express oneself. Section 2(b) includes the “open court principle” whereby members of the public have a right to access information about court and other adjudicative proceedings.
- Right to liberty (section 7): Section 7 of the Charter protects against the deprivation of an individual’s life, liberty and security of the person unless done in accordance with the principles of fundamental justice. These include the principles against arbitrariness, overbreadth and gross disproportionality. An arbitrary law is one that impacts section 7 rights in a way that is not rationally connected to the law’s purpose. An overbroad law is one that impacts section 7 rights in a way that, while generally rational, goes too far by capturing some conduct that bears no relation to the law’s purpose. A grossly disproportionate law is one whose effects on section 7 rights are so severe as to be “completely out of sync” with the law’s purpose.
- Right to be secure against unreasonable search or seizure (section 8): Section 8 of the Charter protects against unreasonable searches and seizures. The purpose of section 8 is to protect individuals from an unreasonable intrusion into a reasonable expectation of privacy by the state. To comply with section 8, a law authorizing a search or seizure that intrudes upon a reasonable expectation of privacy must be reasonable, including striking an appropriate balance between affected privacy interests and the state interest being pursued, and intruding upon privacy no more than reasonably necessary to further its objective.
- Rights of persons charged with an offence (section 11): Section 11 of the Charter guarantees certain rights to persons who have been charged with an offence, including the right to a fair and public hearing before an independent and impartial adjudicator. Its protections apply only to persons “charged with an offence.” For the purposes of section 11, this occurs when a person is subject either to proceedings that are criminal in nature or to proceedings that result in “true penal consequences.” True penal consequences include imprisonment and fines with a punitive purpose or effect, such as when a fine or penalty is out of proportion to the amount required to achieve regulatory purposes.
- Cruel and unusual treatment or punishment (section 12): Section 12 of the Charter guarantees that everyone has the right not to be subjected to any cruel and unusual treatment or punishment. In the context of sentencing, section 12 prohibits grossly disproportionate punishments.
- Equality rights (section 15(1)): Section 15(1) of the Charter provides that every individual is equal before and under the law and has the right to the equal protection and equal benefit of the law without discrimination, including on the basis of race, national or ethnic origin, colour, religion, sex, age or mental or physical disability.
Part 1 - Online Harms Act
Duty to act responsibly
Part 1 of the Bill would enact a new statute called the Online Harms Act. Other Parts of the Bill would amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service.
The Online Harms Act would impose a duty to act responsibly on operators of regulated services, requiring them to adopt measures that are adequate to mitigate the risk that users of their services will be exposed to harmful content online. Harmful content would be defined as intimate content communicated without consent; content that sexually victimizes a child or revictimizes a survivor; content that induces a child to harm themselves; content used to bully a child; content that foments hatred; content that incites violence; and content that incites violent extremism or terrorism. Generally, regulated services would have some flexibility to determine how to mitigate the risk, but in certain instances there would be stronger directives, including in relation to content that sexually victimizes a child or revictimizes a survivor, and intimate content communicated without consent.
The Commission would be mandated to determine the adequacy of the measures adopted by operators of regulated services, including by taking into account whether the measures are designed or implemented in a discriminatory manner within the meaning of the Canadian Human Rights Act. It would also be open to the Commission to adopt regulations prescribing more specific measures for operators to follow. Where an operator does not comply, the Commission could take action, up to and including the issuance of an administrative monetary penalty.
As fulfilment of the duty to act responsibly by operators may affect how, and even whether, the specified content appears on a service, the imposition of the duty has the potential to engage freedom of expression in section 2(b) of the Charter.
The following considerations support the consistency of the operators’ duty to act responsibly with the Charter. The protection of freedom of expression is premised upon fundamental principles and values that promote the search for and attainment of truth, participation in social and political decision-making and the opportunity for individual self-fulfillment through expression. Some of the harmful content, namely content taking the form of violence or directed towards violence or terror, or threatening violence, generally is not protected by the Charter. The other harmful content generally is expression that lies far from the core of the guarantee of free expression, meaning it receives less protection under the Charter. Much of it undermines rather than furthers the purposes for which freedom of expression is guaranteed in the Charter. It poses risks of harm to other users, in particular members of more vulnerable groups, and jeopardizes other users’ own full participation in expressive activities and civil discourse online.
The objectives of imposing the duty include reducing the risk that users will be exposed to harmful content, in order to protect their safety, and reducing the harms caused to persons in Canada by harmful content online. The Online Harms Act would generally not take a prescriptive approach to the measures that operators should adopt, in the expectation that a more flexible approach would enable operators to use their knowledge, expertise and innovation to effectively mitigate risks. The definitions of the several categories of harmful content are intended to capture only content that poses a risk of harm, while leaving out lower- or no-risk content. The Act would not require operators to proactively search for harmful content, although the Commission could make regulations requiring operators to adopt technology to prevent the uploading of content that sexually victimizes a child or revictimizes a survivor. The Act also would not apply to social media services that do not enable users to communicate content to the public, and would not require operators to address any harmful content that may be disseminated via any private messaging features of their regulated services. Nor would the Act apply to websites that do not host user-generated content. The Online Harms Act would not impose direct consequences on users for posting harmful content, beyond whatever measures operators may take to reduce the risks posed by their content. In other words, users would not be penalized (that is, convicted, fined or found liable in any way) under the Online Harms Act for posting content that poses a risk of harm. Any such consequences could be imposed separately under other statutes, such as the Criminal Code or the Canadian Human Rights Act, if content contravenes those statutes and proceedings are pursued under them. As for the measures that an operator may adopt, the Act provides that operators are not required to implement measures that unreasonably or disproportionately limit users’ expression on their services.
Duty to make certain content inaccessible
In addition to the general duty to act responsibly, under the Online Harms Act operators would have the more specific duty to make inaccessible to all persons in Canada content that sexually victimizes a child or revictimizes a survivor, and intimate content communicated without consent, including deepfake images that fall under this category of content. Under this duty, operators of regulated services would have to make the content inaccessible if they become aware of it themselves in the course of operations or if it is flagged by users. When content is flagged, operators would need to make it inaccessible on a preliminary basis within 24 hours (or a different period that could be set out in regulation) unless the flag is trivial, frivolous, vexatious or made in bad faith, or duplicative of another flag. Those who post affected content as well as those who flag it would be provided the opportunity to make representations as to the character of the content before a final decision is made to make the content permanently inaccessible. Additional aspects of procedural fairness, such as notice of actions and decisions, and the right to request reconsideration of a decision, would also be provided. The requirement to make the specified content inaccessible has the potential to engage freedom of expression in section 2(b) of the Charter.
The following considerations support the consistency with the Charter of the duty to make certain content inaccessible to persons in Canada. Content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent lie far from the core of the guarantee of free expression, and expression that lies far from the core generally receives less protection under the Charter. Dissemination of some of this content already constitutes an offence under the Criminal Code. The objective of the duty to make this content inaccessible when it has been flagged or otherwise comes to an operator’s attention would be both to mitigate the risk of harm to users and to the public caused by the dissemination of such harmful content online, and more specifically to protect identifiable individuals who are the victims of such content. Expression that degrades or dehumanizes is harmful in and of itself, and members of society suffer when harmful attitudes are reinforced by the proliferation of such content. The dissemination of such content is also specifically harmful to the persons depicted, including to their privacy, mental health and dignity. The definitions of these two types of content are intended to capture only what poses a risk of the identified harms, while leaving out lower- or no-risk content. The opportunity for interested parties to make submissions on a decision to make content inaccessible would provide a safeguard against errors in the interpretation or application of the definitions in less obvious circumstances. The Act would not require operators to proactively search for harmful content, nor to mitigate the risks posed by any content that may be disseminated via any private messaging features of their regulated services. The transmission of child pornography by private messaging is already prohibited under the Criminal Code.
Labelling of bot-disseminated harmful content
To address the specific role of bots in the proliferation of harmful content, under the Online Harms Act an operator of a regulated service would be required to attach a label to bot-disseminated harmful content identifying it as such, if it has reasonable grounds to believe that the content is both the subject of multiple instances of automated communication on the service by a computer program, and is more prominent on the service than it would have been had it not been disseminated in this way. A requirement that operators attach labels to content has the potential to engage freedom of expression in section 2(b) of the Charter.
The following considerations support the consistency of the labelling requirement with the Charter. The objective of the measure would be to inform users that they are engaging with content whose dissemination is automated and amplified, and so not a product of popular and genuine human interest. Users could then choose to value the content accordingly, which could mitigate the manipulative effects of bots. The obligation on operators would be limited to communicating factual information and would not require them to remove or otherwise address automated communications.
Commission complaint process
Under the Online Harms Act, persons in Canada would be able to make a complaint to the Digital Safety Commission that content that sexually victimizes a child or revictimizes a survivor, or intimate content communicated without consent, is accessible on a regulated service. When a complaint is made, the Commission would have to make a preliminary order to the operator to make the content inaccessible unless it determines that the complaint is trivial, frivolous, vexatious or made in bad faith, or duplicative of another complaint. The Commission would then proceed to make a final decision on whether there are reasonable grounds to believe that the content is as alleged. Users who posted the content and complainants would be given the opportunity to make representations before a final decision is made, and would be given notice of actions and decisions. The Act would also require that the operator be given notice of the complaint and of decisions. If the Commission makes a final decision that the content in question is content that sexually victimizes a child or revictimizes a survivor, or is intimate content communicated without consent, it would have to give notice of its decision to the operator of the regulated service and would have to make an order requiring the operator to make the content permanently inaccessible to persons in Canada. Orders of the Commission would be subject to judicial review. The requirement to make this content inaccessible on a service has the potential to engage freedom of expression in section 2(b) of the Charter.
The following considerations support the consistency of the complaint process with the Charter. Content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent lie far from the core of the guarantee of free expression. Generally, expression that lies far from the core receives less protection under the Charter. As with the duty to make such content inaccessible to persons in Canada when it has been flagged or otherwise comes to an operator’s attention, the purpose of the complaint mechanism would be both to mitigate the risk of harm to users and to the public caused by the dissemination of such harmful content online, and more specifically to protect identifiable individuals who are the victims of such content. Content that degrades or dehumanizes other people is inherently harmful, and individuals and groups suffer when harmful attitudes are promoted or reinforced by the spreading of it. The dissemination of such content is also specifically harmful to the persons depicted, including to their privacy, mental health and dignity. The complaint mechanism would provide an additional safeguard leading to an enforceable order from the Commission requiring that specific content be made inaccessible. It would provide an independent process for the determination of whether particular content must be made inaccessible. Under the Online Harms Act, the only consequence to the person posting relevant content would be that their content would be made inaccessible. There would be no obligation on the poster to be identified or to participate in the complaints process. There could be other consequences for a poster under other statutes, including criminal liability under the Criminal Code for distributing child pornography.
Power to hold hearings in private
Any hearing that the Digital Safety Commission decides to hold, whether in relation to an operator’s compliance with the Online Harms Act or in relation to a complaint that intimate content communicated without consent or content that sexually victimizes a child or revictimizes a survivor is accessible on a regulated service, would be open to the public by default. However, a hearing could be held in private, in whole or in part, to protect important public and private interests, including national security, the interests of victims of harmful content, privacy and confidential business information. The authority to close adjudicative proceedings to the public in limited circumstances to protect sensitive information is well established in the courts and is a routine feature of administrative tribunals that deal with sensitive matters. The authority to close hearings to the public has the potential to engage the “open court principle” that is part of the guarantee of freedom of expression, which includes freedom of the press.
The following considerations support the consistency with the Charter of the Commission’s discretion to close hearings to the public. Like other Charter rights, the open court principle is not absolute and may be limited in order to advance pressing objectives. The proposed power to allow a closed hearing is limited to circumstances where the Commission considers that the protection of other interests outweighs the public interest in open proceedings. This could include circumstances where a person with a complaint about content that revictimizes them as a survivor of child sexual exploitation would be giving evidence. The exercise of the discretion to close a proceeding to the public would have to take into account the relevant interests at stake and strike an appropriate balance, including by limiting the use of closed proceedings to what is necessary to protect the countervailing interest. This could include keeping confidential the identity of a victim of child sexual exploitation, and hearing behind closed doors any of their relevant evidence that might reveal their identity while holding the remainder of the hearing in public.
Protection of identities of employees
Where an employee of an operator makes submissions to the Digital Safety Commission respecting harmful content that is accessible on a regulated service or the measures taken by the operator of a regulated service to comply with its duties under the Online Harms Act, they would be able to request that their identity and any potentially identifying information be kept confidential by relevant Commission officials. Disclosure of the confidential information would be an offence. Prohibitions on persons’ liberty to communicate information may engage freedom of expression in section 2(b) of the Charter.
The following considerations support the consistency with the Charter of the prohibitions on the disclosure of confidential information. The purpose of the prohibition would be to protect the identity of employees of operators, such as whistleblowers, from potential negative consequences at the hands of their employers, and thereby to encourage employees to assist the Commission in its mandate to further the purposes of the Act. Making it an offence to disclose identifying information would provide some assurance to persons well-positioned to witness and understand operators’ non-compliance with the Act that they can come forward without risking their livelihoods or other retribution by their employers. The offence of disclosing identifying information would be punishable only by a fine, without the possibility of imprisonment.
Protection of confidential information
The Online Harms Act would allow information to be designated as confidential by persons who submit it to the Digital Safety Commission or the Digital Safety Ombudsperson or by the person to whose business or affairs it relates if the information is a trade secret or if it is financial, commercial, scientific or technical information that is confidential and that is treated consistently in a confidential manner by the person who submitted it or the person to whose business or affairs it relates. Relevant officials would be required to keep the information confidential, and its improper disclosure would be an offence. Prohibitions on persons’ liberty to communicate information may engage freedom of expression in section 2(b) of the Charter.
The following considerations support the consistency with the Charter of the prohibitions on the disclosure of confidential information. The purpose of the prohibition is to protect confidential business information from disclosure that could cause economic harm to its owners. The protection of confidentiality is a means to mitigate the commercial risk to owners of such information, and the impact on their privacy, that is caused by its necessary disclosure to the Digital Safety Commission or Digital Safety Ombudsperson to advance the purposes of the Online Harms Act. The prohibition applies only to a disclosure that is likely to make confidential information available for the use of a person who may benefit from it or use it to the detriment of any other person to whose business or affairs the information relates. The offence would be punishable only by a fine, without the possibility of imprisonment.
Publication of notice of violation or undertaking
Where an operator of a regulated service is found to have committed a violation of the Online Harms Act, the Digital Safety Commission would be able to order that operator to publish a notice setting out the facts of the violation. The same could be done in relation to any undertaking that an operator enters into with the Commission related to acts or omissions under the Act. A requirement that operators communicate information they might not otherwise choose to communicate has the potential to engage freedom of expression in section 2(b) of the Charter.
The following considerations support the consistency with the Charter of requiring operators to publish a notice of a violation or of an undertaking. Whether an operator would be ordered to publish would be at the discretion of the Digital Safety Commission. In exercising its discretion, the Commission would have to take into account any freedom of expression of an operator that may be affected and weigh it against the purposes of the Online Harms Act in order to make a reasonable and proportionate decision. The only information that could be required to be published is factual in nature and relates to an operator’s non-compliance with statutory requirements. Relevant purposes of the Online Harms Act in this context would include ensuring that operators are transparent and accountable with respect to their duties under the Act, including by informing users of the duties operators are obliged to fulfil and the consequences for any failures to do so as a way of enabling and facilitating greater accountability.
Regulatory inspection and enforcement powers
Generally, the Online Harms Act would regulate and impose legal obligations on operators of social media services, not on users of those services. To facilitate the monitoring of operators’ compliance with the Act, including the investigation of complaints that certain harmful content is accessible on a regulated service and of other administrative violations, the Commission would be granted routine regulatory powers, such as the powers to summon persons to give evidence, to compel the production of information and documents from operators, and to enter and inspect business premises and anything relevant found in them. The exercise of such powers has the potential to intrude upon a reasonable expectation of privacy so as to engage section 8 of the Charter. The power to compel an individual, for example an employee of an operator, to attend at a specified place for oral examination under oath potentially engages the liberty interest under section 7 of the Charter.
The following considerations support the consistency of the compliance monitoring authorities with the Charter. The purpose of these authorities would be to monitor operators for compliance with the regulatory regime and to support administrative enforcement of the Act through orders and administrative monetary penalties, not to investigate offences with a view to imposing criminal sanctions. These authorities would not be available for the investigation of offences, which would need to be done by police relying on traditional criminal investigative tools such as a warrant. Generally, persons have a reduced expectation of privacy in the regulatory context and compliance monitoring activities such as inspections and demands for the production of documents relevant to regulating a business can be done without prior approval from a judge. Statutory powers to enter business premises, demand the production of business records, and compel information from business operators and their employees are routine in regulatory statutes and have been upheld as reasonable under the Charter.
Power to grant access to data sets and information
In order to generate more information, transparency and arm’s-length scrutiny of matters related to the identification, management and mitigation of risk on regulated services, the Commission would be authorized to give accredited persons access to an operator’s data sets and information. Access could be given to information that operators have already provided to the Commission, and the Commission could also order operators to give access to other relevant data. Enabling the Commission to give access to an operator’s data and information has the potential to intrude upon operators’ reasonable expectations of privacy so as to engage section 8 of the Charter.
The following considerations support the consistency with the Charter of the proposal to give access to accredited persons. The objective would be to contribute to government and public understanding of regulated services, to better enable holding them to account for the risks they pose. Organizations seeking such access could only be accredited if their primary purpose is to conduct research or engage in education, advocacy or awareness activities, and their research or engagement in such activities is related to the purposes of the Act. The Act contemplates the making of regulations by the Commission specifying conditions, with respect to confidentiality, intellectual property, data security and the protection of personal information, under which access to electronic data is granted. In exercising the discretion whether to grant access to an accredited person, including the nature and extent of any access, the Commission would have to take into account any privacy interest an operator of a service may have and weigh it against the purposes of the Act in order to make a reasonable and proportionate decision. An order granting access could also include conditions as to confidentiality to protect affected privacy interests.
Preservation of data
Where content that incites violence or content that incites violent extremism or terrorism is accessible on a regulated service, and is then made inaccessible by the operator, the Act would require the operator to preserve that content as well as related computer data in its possession or control for one year. Related computer data could include data that would help locate or identify the user who posted the content. This information would not be automatically provided to the police. However, during the one-year period, it would be open to law enforcement and national security agencies to seek judicial authorization to obtain relevant information, relying on existing authorities such as a warrant. The requirement to preserve a user’s content and related data to facilitate access to it by law enforcement and national security agencies has the potential to intrude upon a reasonable expectation of privacy so as to engage section 8 of the Charter.
The following considerations support the consistency of the preservation requirement with the Charter. Some of the affected content, namely content taking the form of violence or directed towards violence or terror, or threatening violence, generally is not protected by the Charter. Actions by operators to address content that incites violence or content that incites violent extremism or terrorism by removing it from their platforms could make it more challenging for law enforcement and national security agencies to investigate users or activities related to such content. An obligation to preserve relevant content that operators make inaccessible would help mitigate the negative impact on these agencies’ ability to pursue their important mandates to protect the public and address threats to national security. Operators would not be required to preserve content that was never made accessible on the regulated service. The only content that would be preserved would be content that users made public, and data related to that content. The Act would not authorize law enforcement and national security agencies to automatically access preserved content and data; the agencies could only access it pursuant to existing authorities such as a warrant. Operators would be obligated to preserve the information for one year only and would then be required to destroy it, unless it is subject to a judicial order or would be retained in the ordinary course of business.
Administrative monetary penalties
The Commission would have the authority to inquire into the compliance of regulated services with the Act and regulations, and to impose administrative monetary penalties on an operator where there are reasonable grounds to believe that it has committed a violation. The Commission could impose a penalty in an amount up to a maximum of 6% of an operator’s global revenues or $10 million, whichever is higher. The provisions could result in the imposition of substantial monetary penalties for violations of the law, and therefore have the potential to engage section 11 rights.
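By way of illustration only (the figures here are hypothetical and are not drawn from the Bill): for an operator with global revenues of $500 million, the ceiling would be the greater of 6% of $500 million ($30 million) and $10 million, that is, $30 million; for an operator with global revenues of $100 million, 6% would amount to only $6 million, so the $10 million figure would set the ceiling instead.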
The following considerations support the consistency of these provisions with the Charter. The proceedings leading to the imposition of an administrative monetary penalty would be administrative in nature. The purpose of any penalty would be to promote compliance with the Act rather than to “punish”, as that concept is defined for the purposes of section 11 of the Charter. There is no minimum amount that must be assessed against a violator, allowing the Commission to appropriately tailor the amount of any penalty to the purpose of promoting compliance with the Act. The Act, properly interpreted and applied, would not authorize the imposition of a penalty that could give rise to “true penal consequences” and so would avoid engaging the fair trial rights guaranteed in section 11. Finally, the civil enforcement of a penalty would be in the Federal Court of Canada and non-payment could not result in imprisonment.
Part 2 - Criminal Code amendments
Increase to maximum terms of imprisonment and new hate crime offence
The Bill would amend the Criminal Code to increase the maximum punishments for the four hate propaganda offences in sections 318 and 319 of the Criminal Code. It would raise the maximum sentence for the offence of advocating or promoting genocide against an identifiable group in section 318, which is an indictable offence, from five years to imprisonment for life. It would raise the maximum sentence for the hate propaganda offences in section 319, when prosecuted as indictable offences, from two years to five years of imprisonment. There would be no change to the maximum punishments for these offences when prosecuted as summary conviction offences. The offences in section 319 are inciting hatred against an identifiable group in a public place where it is likely to lead to a breach of the peace; wilfully promoting hatred against an identifiable group; and wilfully promoting antisemitism by condoning, denying or downplaying the Holocaust.
The Bill would also create a new hate crime offence of committing an offence under the Criminal Code or any other Act of Parliament that is motivated by hatred based on race, national or ethnic origin, language, colour, religion, sex, age, mental or physical disability, sexual orientation, or gender identity or expression. The new offence would carry a maximum punishment of imprisonment for life.
The proposal to increase the maximum terms of imprisonment for the hate propaganda offences and the proposed new hate crime offence, which would provide for criminal charges and the possibility of imprisonment, engage the right to liberty in section 7 of the Charter and must be consistent with the principles of fundamental justice. Potential inconsistencies between these proposals and the principles of fundamental justice in section 7 have not been identified. The new hate crime offence is tailored to its objective of combating crimes motivated by hatred, which the Bill defines. The higher maximum punishments, including the possibility of a life sentence for the offence of promoting or advocating genocide and the new hate crime offence, reflect the seriousness of these offences while preserving the discretion of judges upon conviction to impose a fit and appropriate sentence. The Supreme Court of Canada has held that a maximum penalty of any kind will, by its very nature, be imposed only rarely and in light of the general principles of the Criminal Code, applied in an individualized context. Section 718.1 of the Criminal Code articulates the fundamental principle of sentencing, which provides that a sentence must be proportionate to both the gravity of the offence and the degree of responsibility of the offender. Accordingly, it is expected that the maximum sentence of life imprisonment would be reserved for conduct that falls to the most serious end of the spectrum of offending captured by these offences. A right of appeal also exists to guard against disproportionate sentences.
Because the proposed amendments do not include a mandatory minimum penalty, they do not engage section 12 of the Charter. The Supreme Court of Canada has held that a sentencing provision establishing a high maximum penalty, but no mandatory minimum, will not itself limit section 12. As discussed above in relation to section 7, the inclusion of a high maximum penalty does not alter a trial judge’s obligation to craft individual sentences that are proportionate to both the gravity of the offence and the degree of responsibility of the offender. For this reason, the maximum sentence of life imprisonment would be reserved for the most serious forms of conduct captured by these offences. In all cases, judges are obligated to impose proportionate penalties that reflect the seriousness of the offence and the degree of responsibility of the offender; any sentence imposed that would be grossly disproportionate would amount to a legal error that could be corrected on appeal of the individual sentence and would not call into question the validity of the sentencing provision itself.
Definition of hatred
The Bill would create a definition of “hatred” for two of the hate propaganda offences in section 319 (publicly inciting hatred where it is likely to lead to a breach of the peace and wilfully promoting hatred), for the new hate crime offence, and for the new peace bond authority. The definition would specify that “hatred” involves detestation or vilification and does not mean disdain or dislike. The Bill would further specify that the communication of a statement does not incite or promote hatred solely because it discredits, humiliates, hurts or offends. This would codify decisions of the Supreme Court of Canada.
The proposed definition of “hatred” has the potential to engage freedom of expression in section 2(b) of the Charter. The following considerations support the consistency of the proposed amendment with section 2(b). The proposed definition reflects the way “hatred” has been defined by the Supreme Court of Canada. The Supreme Court articulated a definition of “hatred” in Saskatchewan (Human Rights Commission) v. Whatcott (2013) that characterizes the concept as connoting extreme manifestations of detestation or vilification, which go beyond mere dislike or causing humiliation or offence. The Supreme Court defined “hatred” in similar terms in R. v. Keegstra (1990), while considering the constitutionality of the offence of wilful promotion of hatred against an identifiable group in section 319 of the Criminal Code. The definition would focus on vilification and detestation, and it would also clarify what does not constitute hatred. The proposed amendment would thus codify a definition settled in the leading jurisprudence of the Supreme Court of Canada.
New peace bond authority
The Bill would create a new recognizance, often referred to as a “peace bond”, that would seek to prevent the commission of hate propaganda offences and hate crimes, whether online or offline. This provision would be modelled on existing peace bond provisions in the Criminal Code. It would allow a judge to impose an order requiring a person to keep the peace and be of good behaviour if the judge is satisfied by the evidence that there are reasonable grounds to fear that the person will commit a hate propaganda offence in section 318 or section 319, or the proposed new hate crime offence under section 320.1001. A judge would be able to impose the peace bond for a period of up to one year, or two years if the defendant has previously been convicted of the above-noted hate offences.
While peace bonds are a preventative tool, they are frequently used as a means of resolving less serious criminal matters where the parties agree, and where there is a need to maintain some further supervision over the person in order to prevent future offending. The new peace bond could be imposed as an alternative to pursuing criminal charges for certain types of hate crime offences such as hate-motivated mischief, if the parties agree and there is a need to deter further offending.
The new authority would also provide that a judge may impose any reasonable condition as part of the peace bond in order to secure the defendant’s good behaviour and prevent the potential offences from occurring. These conditions would be optional and could only be imposed if considered reasonable in the circumstances and necessary to address the specific threat posed by the defendant. Conditions that a judge could impose, if warranted by the specific circumstances of the case, include (non-exhaustively): prohibitions on communicating with any person identified in the peace bond and on going to any place specified in the peace bond, except in accordance with conditions that the judge considers necessary; requirements to return to and remain at their place of residence at specified times; drug or alcohol prohibitions; and weapons prohibitions. These conditions are similar to those available for existing peace bonds, for example, peace bonds that may be imposed under the Criminal Code in relation to fear of personal injury or damage to property, fear of a criminal organization offence, and fear of serious personal injury, among others. A defendant who failed or refused to enter into the recognizance could be subject to imprisonment for up to one year. In addition, a defendant who violated the terms of the order could be charged with the offence of breaching a recognizance under section 811 of the Criminal Code and, if prosecuted as an indictable offence, face imprisonment for up to four years. The breach offence could also be prosecuted as a summary conviction offence.
In addition, the Bill would amend the Youth Criminal Justice Act to give youth justice courts exclusive jurisdiction to impose this new peace bond on young persons, a jurisdiction that youth courts already have for other peace bonds.
The existing peace bonds in the Criminal Code include provisions enabling their enforcement. If a judge decides it is appropriate in a particular case to impose a peace bond condition prohibiting drug or alcohol consumption, these existing provisions allow for the collection of bodily substances to monitor compliance with the prohibition. The Bill would extend these existing rules for the collection of bodily substances, unchanged, to the new peace bond.
Because the peace bond authority would allow a judge to impose an order restricting behaviour and liberty of movement based on the likely prospect of a specific offence being committed, and because a defendant may face imprisonment for not entering into the peace bond or for breaching one of its conditions, the proposed authority affects liberty interests under section 7 of the Charter and must be consistent with the principles of fundamental justice. Freedom of expression in section 2(b) of the Charter may also be engaged because the peace bond authority is linked to the hate propaganda offences and because it provides for the potential imposition of a condition prohibiting communication with an identified person.
The following considerations support the consistency of the peace bond authority with section 7 and section 2(b). The provisions serve the important objective of preventing hate propaganda and hate crime offences and protecting potential victims from harm. The proposed authority, including the maximum duration of the order and the conditions that could be imposed, is structured along the lines of existing recognizance provisions in the Criminal Code that have been found to comply with the Charter. Conditions that may be imposed under existing peace bonds include a requirement not to communicate with a specified person (such as in cases of criminal harassment) and/or not to communicate using particular means (such as via text message or by posting on social media). These conditions must be linked to the conduct that is feared, which is a safeguard to ensure liberty interests and expressive freedom are impaired as little as necessary. As with existing recognizance provisions, a hearing would be held during which the parties can make submissions before an order is imposed. The authority to impose an order would be subject to a standard of “fear on reasonable grounds” that one of the offences at issue will be committed, which courts have interpreted as meaning more than mere suspicion. Judges are required to look objectively at the fear and determine whether a reasonable person in the same situation would have the same fear or belief before imposing the order. Even where the “fear on reasonable grounds” standard is met, a judge would retain discretion on whether to impose the peace bond and any conditions, which would need to be exercised in accordance with the Charter. Any order imposed would be time limited. A person subject to an order could appeal the order and could at any time seek to vary the conditions. In addition, applying for the peace bond would require the consent of the Attorney General, which has been recognized by the Supreme Court as providing a degree of procedural protection against misuse.
If a judge decides it is appropriate in a particular case to impose a drug or alcohol prohibition in a peace bond, the proposed amendments would allow for the collection of bodily substances to enforce such a condition. These amendments have the potential to engage the protection against unreasonable searches and seizures in section 8 of the Charter. The following considerations support the consistency of these amendments with section 8. The provisions, which are the same as those that apply to five existing peace bonds in the Criminal Code, provide lawful authority for the collection of bodily substances for the limited purpose of enforcing compliance with a peace bond condition that the defendant abstain from consuming drugs, alcohol or other intoxicating substances. The imposition of such a condition would be subject to all the safeguards applicable to the new peace bond authority that are described above. The defendant’s privacy interest would be protected by the existing Criminal Code provisions that require regulatory standards and safeguards to be in place for the collection, storage, analysis and destruction of the samples and related records.
Part 3 - Canadian Human Rights Act amendments
Prohibition on hate speech
The Bill would amend the Canadian Human Rights Act (CHRA) to provide that it is a discriminatory practice to communicate, or cause to be communicated, hate speech by means of the internet or other telecommunications in a context in which the hate speech is likely to foment detestation or vilification of an individual or a group of individuals on the basis of a prohibited ground of discrimination. This prohibition would not apply to private communication. The Bill would empower the Canadian Human Rights Commission to accept complaints alleging this discriminatory practice and the Canadian Human Rights Tribunal to adjudicate complaints and order remedies.
“Hate speech” would be defined, following guidance from the Supreme Court of Canada, as the content of a communication that expresses detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. The prohibited grounds of discrimination in the CHRA are race, national or ethnic origin, colour, religion, age, sex, sexual orientation, gender identity or expression, marital status, family status, genetic characteristics, disability, and conviction for an offence for which a pardon has been granted or in respect of which a record suspension has been ordered. The Bill would provide, for greater certainty, that content of a communication does not express detestation or vilification solely because it expresses mere dislike or disdain or it discredits, humiliates, hurts or offends. The Commission would be required to dismiss at an early stage complaints that do not involve hate speech as defined.
A person would be deemed not to communicate, or cause to be communicated, hate speech merely by hyperlinking to it, or by hosting or caching it. Telecommunications service providers and broadcasters would be excepted. Complaints could not be made against the operators of social media services as defined in the Bill.
As noted, the Canadian Human Rights Commission would receive complaints alleging instances of this discriminatory practice and would be required to dismiss any complaint that does not indicate hate speech as defined. The Canadian Human Rights Tribunal would adjudicate complaints forwarded to it by the Commission and could award remedies where a complaint is substantiated. The new provision would not create an offence. The primary remedy, and the one most likely to be ordered in most cases, would be an order to cease the communication of the hate speech. Monetary remedies would be available only in special circumstances. In general, the available remedies would be:
- an order that the person cease the discriminatory practice and take measures, in consultation with the Commission on the general purposes of the measures, to redress the practice or to prevent the same or a similar practice from occurring in the future;
- an order to pay compensation of not more than $20,000 to any victim personally identified in the communication that constituted the discriminatory practice, for any pain and suffering that the victim experienced as a result of that discriminatory practice, so long as the person created or developed, in whole or in part, the hate speech indicated in the complaint;
- an order to pay a penalty of not more than $50,000 to the Receiver General if appropriate having regard to the nature, circumstances, extent and gravity of the discriminatory practice, the wilfulness or intent of the person who is engaging or has engaged in the discriminatory practice, any prior discriminatory practices that the person has engaged in, and the person’s ability to pay the penalty.
The proposed amendments to the CHRA engage freedom of expression in section 2(b) of the Charter. The Bill also has the potential to engage section 2(a) of the Charter. Section 2(a) is potentially engaged if the expression of religious or conscientious beliefs involves the communication of hate speech that would constitute a discriminatory practice.
Courts have upheld earlier versions of this kind of prohibition, both in the CHRA and in a similar provincial law, as justified limits on fundamental freedoms.
The following considerations support the consistency of the Bill with section 2(b) and section 2(a) of the Charter.
The Bill would advance the objective of the CHRA that all individuals should have an equal opportunity to make a life for themselves without being hindered by discrimination based on the prohibited grounds. The Bill would seek to protect the social standing of members of groups identified by these prohibited grounds of discrimination from the discriminatory effects of hate speech.
The limits on hate speech are linked to the above objectives. The Supreme Court of Canada has found in cases such as Saskatchewan (Human Rights Commission) v. Whatcott (2013) that hate speech communicated on the internet and through other telecommunications can hinder equal opportunity and undermine social standing by inflicting grave psychological and social consequences on members of these groups. It can lay the groundwork for later attacks on group members, including discrimination, ostracism, segregation and violence. By marginalizing the group and forcing its members to argue for their basic humanity or social standing, hate speech creates barriers to their full participation in our democracy.
The definition of hate speech is limited to content that expresses detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. The Bill would clarify that hate speech does not include expression of mere dislike or disdain nor expression that merely discredits, humiliates, hurts or offends. Private communications would be excluded. The Bill targets only an extreme and marginal type of expression and leaves untouched almost the entirety of political and other discourse.
The effects of the Bill on freedom of expression are outweighed by the benefits of protecting members of vulnerable groups identified by a prohibited ground of discrimination. Hate speech falls far from the core values underlying freedom of expression and can in fact impede those values. Hate speech detracts from the search for truth and can distort or limit the robust and free exchange of ideas.
The CHRA is a complaint-based regime that seeks to remedy discriminatory practices rather than to punish persons who engage in them. The discriminatory practice defined in the Bill is not an offence. The available remedies are limited to those set out in the Bill.
Confidentiality provisions
The Bill would amend the CHRA to enable the Canadian Human Rights Commission and Canadian Human Rights Tribunal to make confidentiality orders where needed to protect hate speech complainants, witnesses or alleged victims from retaliation. The proposed amendments would empower the Commission, in dealing with a hate speech complaint, to order a person not to disclose the identity of a complainant, witness or alleged victim, where there is a real and substantial risk that those persons could be subject to threats, intimidation or discrimination. The Commission would also be able to process the complaint anonymously if necessary to prevent these risks. The Bill would give the Tribunal expanded powers to take measures or make orders to ensure the confidentiality of the proceedings and protect the identities of complainants, witnesses or alleged victims where there is a real and substantial risk that those persons could be subject to threats, intimidation or discrimination. Failure to comply with confidentiality orders would constitute an offence under the CHRA and the offence would be punishable by a fine.
Permitting the Commission and Tribunal to prohibit disclosure of the identity of hate speech complainants, witnesses or alleged victims, and to ensure the confidentiality of proceedings, potentially engages the “open court principle” in section 2(b) of the Charter. The open court principle applies to courts and to administrative tribunals that have a quasi-judicial role, such as the Canadian Human Rights Tribunal.
The following considerations support the consistency of the proposed amendments with section 2(b) of the Charter. The proposed amendments do not themselves limit freedom of expression. Rather, they empower the Commission and Tribunal to develop and require case-specific limits on freedom of expression only if it would be reasonable for them to do so. Confidentiality decisions by the Commission and Tribunal will be discretionary. The Commission and Tribunal are subject to the Charter and must exercise their discretion in a way that complies with the Charter. The Commission and Tribunal will be required to assess the risk of harm and to balance that risk against any limits on freedom of expression that result from the confidentiality measures or orders and to make decisions in accordance with their duty of procedural fairness. Commission and Tribunal decisions are subject to review by the courts.
Promoting equality rights
The proposed amendments to the Canadian Human Rights Act and the Criminal Code promote the aims of equality rights protected by section 15 of the Charter. Equality entails the promotion of a society in which all are secure in the knowledge that they are recognized at law as human beings equally deserving of concern, respect and consideration. The Bill furthers the core values that underpin section 15 by helping to combat prejudice and by discouraging, denouncing and remedying discrimination against groups targeted by hate speech, hate propaganda and hate crimes. In so doing, the Bill also promotes the ability of targeted groups to exercise their freedom of expression online without being undermined or silenced by hate speech.
Part 4 - Amendments to An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service
Transmission data
An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service requires internet service providers to notify law enforcement when they have reasonable grounds to believe that their service is being or has been used to commit a child pornography offence. The Bill would amend the Act to require service providers to include transmission data in their reports in circumstances in which the content is manifestly child pornography. Transmission data is defined in the Criminal Code as encompassing data related to the routing of a communication (for example, telephone number, Internet Protocol address, port number, date and time) and information that allows service providers to authenticate the user and their devices (for example, the International Mobile Equipment Identity (IMEI) number or Subscriber Identity Module (SIM) card). Transmission data does not reveal the substance, meaning or purpose of the communication.
Because mandatory reports under the Act have the potential to interfere with privacy interests, the amendment may engage rights under section 8 of the Charter. The following considerations support the consistency of the amendment with section 8. The Act would provide lawful authority for interferences with privacy interests that may result from the mandatory disclosure of transmission data. The amendment serves the compelling purpose of protecting children against child exploitation offences, which are facilitated by the power and anonymity of the internet. The amendment would enable law enforcement to act more expeditiously in protecting children, a highly vulnerable group, from serious harm. It would do so by providing police, in the initial report, with more of the information that they need to apply to a court for a production order to identify the individual responsible for the child pornography offence. Transmission data does not include or directly reveal the substance, meaning or purpose of the communication. Internet service providers are not obligated to look for child pornography under the Act and would only be required to include transmission data in their mandatory reports where the content is manifestly child pornography. This is a high threshold that narrows the state’s online reach to situations and information that are clearly relevant to the investigation of serious child exploitation offences.