Standing Committee on Access to Information, Privacy and Ethics - Protection of Privacy and Reputation on Platforms such as Pornhub - April 12, 2021

Tab 3 Summary of committee study to date

Summary of ETHI Meetings

To date, ETHI has held four meetings, heard testimony from 21 witnesses and received 43 written briefs. Witnesses to date have included three executives from MindGeek; Cybertip.ca and its American equivalent, the National Center for Missing and Exploited Children; the RCMP; Justice Canada; and representatives of other organizations, as well as individuals such as subject matter experts and victims. The following summarizes some key submissions received by ETHI.

MindGeek Canada

MindGeek is one of the largest brands in online adult entertainment, and Pornhub, their flagship video-sharing platform, is one of the top five most visited websites on the internet. Their Chief Executive Officer (CEO) said that MindGeek shares the Committee’s concern about the spread of unlawful content online and the non-consensual sharing of intimate images; such images are contrary to their values and business model. He also said that the vast majority of attempts by criminals to use their platforms for illicit material are stopped, and that there should be no child abuse material on their website because all content is reviewed by human moderators and screened by software. MindGeek’s evidence indicated that they are partnered with the U.S.‑based National Center for Missing and Exploited Children (NCMEC), although NCMEC subsequently rejected the characterization of “partners”. The CEO said that MindGeek reports every case of Child Sexual Abuse Material (CSAM) of which they are aware, so that authorities can investigate. He also stated that MindGeek shares the objectives set out in the Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse, developed in consultation with the digital industry, following discussions at the Five Country Ministerial and Quintet of Attorneys General meetings, to combat child sexual exploitation and abuse on industry networks. However, when asked whether the company had made any reports to NCMEC in 2019, the Chief Operating Officer (COO) said that he would have to verify.

The company testified that it has taken measures to ensure that all material is traceable and verifiable, and that it is more vigilant in moderating than “almost any other platform”. Content can only be uploaded by professional studios and verified users and creators, and the user’s or creator’s personal identity and date of birth are confirmed by MindGeek. In questioning, the COO said that 9 to 10 million videos were removed last year because they had been uploaded by unverified users. To ensure that removed content cannot make its way back to their platform, MindGeek is (1) training their staff to remove such material upon request, and (2) digitally fingerprinting any content removed from their website so that it cannot be re-uploaded to their platform. MindGeek is also implementing a new tool called “SafeGuard” to help fight the distribution of non-consensual intimate images.

Canadian Centre for Child Protection (C3P)

This charity operates Cybertip.ca, a tip line for reporting online sexual exploitation of children. C3P discussed how the proliferation of illegal pornographic material harms children. One tool that they use to combat this is Project Arachnid, which processes tens of thousands of images per second to detect known CSAM—allowing them to quickly identify and trigger the removal of the content. Arachnid has processed more than 126 billion images and has issued over 6.7 million takedown notices to providers around the world. In particular, Arachnid has confirmed 193 instances of CSAM on MindGeek’s platforms in the past three years. C3P does not believe MindGeek’s claim that moderators manually review all content that is uploaded to their services. It believes that self-regulation by the industry results in harm to children.

Friends of Canadian Broadcasting

The witness discussed “Platform for Harm”, a legal analysis that shows that platforms like Pornhub are already liable for the illegal user-generated content they promote. Platforms promoting illegal content are liable if they know about the illegal content in advance and publish it anyway, or if they are made aware of it post-publication and neglect to remove it. Pornhub employs software to find offending content because human moderation is expensive, yet claims to be blameless when that software does not work. If widespread distribution of illegal content is an unavoidable side effect of a business, then that business should not exist. In questioning, the witness argued that if entities like the CBC and CTV are capable of ensuring that all their material is lawful, adult content sites should have the same responsibility.

National Center for Missing and Exploited Children (NCMEC)

NCMEC clarified that it is not a partner with Pornhub, contrary to MindGeek’s evidence. Pornhub has registered to voluntarily report instances of child sexual abuse material on its website to NCMEC; that does not create a partnership. MindGeek has signed agreements with NCMEC to access its hash-sharing databases; however, Pornhub has not yet taken steps to access these databases or use these hashes. NCMEC also described several cases in which victims contacted Pornhub to remove illegal content, but the content was removed only when NCMEC asked Pornhub to do so. NCMEC argued that it is essential for websites like Pornhub to have effective means to review content before it is posted; to remove content when it is reported as child sexual exploitation; to give the benefit of the doubt to the child, parent or lawyer reporting the content; and to block the recirculation of abusive content once it has been removed.

RCMP

The RCMP was represented by Chief Superintendent Marie-Claude Arsenault; Deputy Commissioner Stephen White, Specialized Policing Services; and Paul Boudreau, Executive Director, Technical Operations, Specialized Policing Services. The RCMP discussed online child sexual exploitation generally and how they try to address it. There has been a dramatic increase in reports of online child sexual exploitation in recent years. Children are revictimized when photos and videos of their abuse are shared on the internet. However, illegal online content goes beyond child sexual exploitation: the content at issue also includes material depicting the sexual assault of adults and the non-consensual distribution of intimate images of adults. Large international entities like MindGeek pose investigative challenges. Where a company is housed could differ from where it is incorporated, and both could differ from where it maintains the servers holding its data. These companies may also hold data in cloud storage.
