Public consultation on the Privacy Act – Submission – Digital ID & Authentication Council of Canada
This submission is available in English only.
DIACC
Disclaimer: The recommendations expressed in this publication reflect a consolidated view of many of Canada’s digital identity leaders including representatives from financial institutions, telecommunications firms, start-ups, security developers, insurance providers, and more. See all DIACC members. The opinions expressed do not purport to reflect the recommendations or views of individual public and private sector DIACC members. Federal Government of Canada representatives recused themselves from providing feedback on this publication.
- Introduction
- Privacy and Trust Built Upon Standards and Digital Identity
- Data Empowerment of Subjects
- Accountability of Data Holders and Handlers
- Conclusion
Introduction
The Government of Canada is taking an essential step in modernizing the country’s regulations for privacy and digital technology. Leveraging the expertise of Canadians who have been leading in these areas and collaborating with communities that are committed to creating a shared, beneficial vision for Canada’s digital economy will be crucial to accelerate the implementation of new standards and avoid working in parallel.
The Digital Identity and Authentication Council of Canada (DIACC) worked with members across the country and from both private and public sectors to prepare this consultation submission. The DIACC has long advocated for a secure, interoperable digital identity for Canadians to access the advantages of digital technology and economic opportunities, without sacrificing privacy and security.
When implementing new federal privacy regulations, proven, real-world technology and interoperability are required to make the vision of a digitally enabled, globally competitive Canada a reality. These qualities are also crucial for ensuring that these capabilities work across the country and across the economy. It is worthwhile to consider amending the proposed Bill C-11 to create a clear regulatory framework.
In reviewing the proposed changes, it is crucial to consider risk-based models wherever possible, to account for the wide variety of digital activities Canadians partake in hourly, and the wide range of risks associated with each. Critically, transparency and notice should be reciprocal and proportionate to the context and purpose of use. A risk-based approach would also support subject education, making it possible for individuals to receive clear explanations when sharing or inputting data that is more likely to cause harm, be exposed, be misused, or be subject to automated decision making.
Throughout the proposed legislation, many DIACC members also highlighted areas where specific and clearer language is needed to make enforcement possible. More detailed recommendations can be found in subsequent sections of this submission.
As these sweeping changes are made, on behalf of the DIACC membership, DIACC offers recommendations prioritizing:
- Privacy and Trust Built Upon Standards and Digital Identity
- Data Empowerment of Subjects
- Accountability of Data Holders and Handlers
Privacy and Trust Built Upon Standards and Digital Identity
Foundations for Success
While strong regulation is crucial for Canada’s ongoing security and economic strength, it must be built upon a foundation of privacy and trust. The 2021 Edelman Trust Barometer has declared a state of “information bankruptcy”, citing disinformation and distrust as growing global threats. Addressing these challenges locally will be crucial for Canada’s enduring economic strength and social cohesion. Trust is vital to ensuring digital identity solutions protect consumers’ privacy while also supporting these technologies’ use in conveniently and securely meeting regulatory requirements (for example, anti-money laundering [AML]).
It is important to establish a solid foundation by relying upon existing standards and best practices for significant privacy regulation. For example, digital privacy is first dependent on the strength of the security and assurances of the technology involved, which are dependent on the strength and nuanced consideration of the regulatory infrastructure. Meaningful consent first requires meaningful security and assurance in the technologies used to track people. Without sufficient security and transparency, the entire Canadian ecosystem is put at risk.
Many existing standards, like DIACC’s consensus-based Pan-Canadian Trust Framework, can be leveraged to accelerate adoption. DIACC Member Mastercard has developed a series of data responsibility principles that may also provide guidance in driving consumer trust in the digital era. Emphasizing the values they have identified in a policy framework can give Canadians greater trust and certainty in the digital age. They are as follows:
- Security & Privacy: Companies must uphold best-in-class security and privacy practices.
- Transparency & Control: Companies should clearly and simply explain how they collect, use, and share an individual’s data and give individuals the ability to control its use.
- Accountability: Companies must keep consumer interests at the centre of their data practices.
- Integrity: Companies must be deliberate in how they use data to minimize biases, inaccuracies, and unintended consequences.
- Innovation: Companies should constantly be innovating to ensure individuals benefit from the use of their data through better experiences, products and services.
- Social Impact: Companies should use data to identify needs and opportunities to impact society positively.
While there are very material differences in the private and public sectors, establishing a collaborative and synergistic approach is essential for successful adoption, enforcement, and maintaining global competitiveness. Standards should address parallel efforts happening across regulatory bodies and make progress in unifying these approaches.
Privacy legislation should also be built to remain under federal jurisdiction, without “extras” added at the provincial or territorial level. In what should be a single mandated piece of legislation, provincial requirements add unnecessary complexity and expense to operations. Canada’s market is too small to sustain multiple, divergent requirements.
There is a clear opportunity to take leadership by recognizing identity as a key area that addresses cyber-resiliency and economic prosperity issues. Identity is a foundation required for Bill C-11 (‘the bill’) to become a practical reality across sectors. Implementing an interoperable digital identity for Canadians is a key enabler for all proposed policies and strategies.
Data Empowerment of Subjects
Subject Rights
“Consumer” policy is, in essence, identity policy. Identity policy and verification are two of the foundations of trust. Individual identity rights should be enhanced to encompass changing consumer expectations in a rapidly evolving, global technology space.
The Copyright Act allows individuals to maintain ownership over their likeness, and the same should hold true for their digital likeness. Individuals should give their explicit consent to any use of their data, identifiable or not.
Many issues arise surrounding the potential use of data without explicit consent, particularly identifiable data but also non-identifiable data. Industry groups have raised concerns that restrictions on the outsourcing of personal information (that is, any use of personal information without explicit consent) could become a real challenge for the ongoing creation of Digital Government and the Digital Economy. Using a standards- and risk-based approach can help alleviate these concerns, as consent may be designed to encompass a variety of lower-risk or de-identified data uses.
Subject Education
Standardization supports trust by empowering Canadians with an understanding of how Canadian companies and companies operating in Canada will be required to maintain responsibility for their data. Clear, actionable, and well-designed standards take the onus off consumers to read extensive legal agreements or keep up with opaque product updates.
In the Section 8(1) “Designated individual” Amendment of Bill C-11: Individuals should not bear the burden of being aware that this designation exists nor should they be required to seek out contact information. The designee’s role and contact information should be prominently displayed on the organization’s website, associated materials, and readily available to individuals.
Upholding an individual’s right to know how their information is being processed in relation to automated decision systems is laudable. However, organizations must be required to provide explanations that may be easily understood by the individual. Additional guidance is recommended for how this type of information should be communicated to individuals in the spirit of transparency.
The right to algorithmic transparency regarding the use of automated decision systems is also important to individuals’ understanding of how their data is used. The government should regulate these communications based upon the type of personal data involved. The aforementioned risk-based approach would ensure that individuals receive explanations when data that is more likely to cause harm if misused is subject to automated decision making.
Clarity in language and communications strategies (for example, trust seals) should be used to convey confidence in security systems to everyone. These approaches are also useful in creating a shared visual language of privacy and trust, and supporting consistency across (cultural, language, location-based) communities.
The DIACC Pan-Canadian Trust Framework Trustmark program is currently being used in-market as a clear method to ensure Canadians understand their rights and the safety standards governing their data across platforms and places.
Accountability of Data Holders and Handlers
Many DIACC Members are data holders and handlers and, as such, much of the feedback we received falls under this category.
The following changes are recommended:
De-identification of Data
The bill should include a more specific definition of ‘de-identification’. In addition to this greater specificity, a review process could be incorporated, conducted by an organization’s committee charged with ensuring the appropriate administrative, physical, and technical safeguards are in place and with documenting and monitoring the solution. If these specifications are met, organizations should face no further obligations.
A risk-based model may also help support Canadian innovation and economic prosperity. For example, Section 22(1), Prospective business transaction, requires that organizations de-identify personal information before sharing it with parties in the context of a proposed business transaction. This could inhibit innovation in digital identity solutions and may conflict with other legislative requirements (such as AML).
For de-identified personal information, disclosure can be made without consent if made to a government institution, health care institution, or educational institution. Further exemptions to include some non-public corporations should be considered. The right to be forgotten is also an important consideration that should be clearly outlined in relation to de-identified data.
Defining and Restricting Relationships
Section 18(2)(e), List of Activities, states: “an activity in the course of which obtaining the individual’s consent would be impracticable because the organization does not have a direct relationship with the individual.” Consider adding agreement clauses with third parties, requiring that they notify the collector of the information, who holds the obligation to notify the individual.
Concerning the above, consent is needed where an organization does not have a direct relationship with the individual before sharing in a proposed business transaction. There should be further exemptions.
Some verbiage in the bill offers blatant loopholes that will make enforcement extremely challenging, if not impossible. For example, the aforementioned Section 18(2)(e), List of activities, states that businesses may use personal information without consent. This implies that an organization processing unconsented personal data is either non-compliant, a sub-processor, or covered by a legal exception. The law needs to be usable to hold organizations to account, and should be more specific and nuanced to make this possible.
To begin to address that nuance and implement a risk-based model, in Section 15(3)(e), “The names of any third parties or types of third parties to which the organization may disclose the personal information”, consider adding: “(f) In a proportional and reciprocal manner and according to privacy risk.”
Data Portability
The concept of portability could play a role in digital identification but seems limited to only “digital framework members.” This should be further defined to make it more widely applicable. As is the case throughout, a standards- and risk-based approach is advised. These approaches offer more nuance and flexibility without sacrificing enforceability. To illustrate this, the quality of the information should be taken into consideration. For example, porting of raw data is fine; porting of insights could be considered proprietary.
The importance of “data portability” cannot be overstated, as it will be a key to supporting innovative products and services for the benefit of Canadians. The financial sector is expected to be the first area of focus for the implementation of these rights. Given the sensitivity of this information, and the potential harm to consumers, the responsibilities of all parties involved in the transfer, custody, and use of data under this framework must be clearly articulated. It must be supported by a strong and enforceable liability framework that provides certainty for all parties.
Section 12(4), New purpose, has been identified as a major cause for concern. “If the organization determines that the personal information it has collected is to be used or disclosed for a new purpose, the organization must record that new purpose before using or disclosing that information for the new purpose.” This approach undermines data’s security and integrity, its provenance, and people’s ability to control their own data personally, and may violate other fundamental legal precepts.
Disposal and Disclosure of Personal Information
The right of individuals to request organizations provide access to, and deletion of, their information is a key element of an individual’s ability to exercise control over their personal information. Concurrently, organizations should not be compelled to release or delete personal information where there is a substantial, articulable, and unreasonable risk of fraud or risk to the security of the business’s systems or networks.
Section 55(1), Disposal at the individual’s request, should be amended to limit this obligation to personal information that remains under the organization’s control. Although this may be implied in the current form, as it is currently drafted, this clause may be interpreted too broadly. For example, the following subsection (c) could be added to this section: (c) the information is no longer under the organization’s control.
In Section 72, Disclosure under data mobility framework, amendments should be made to limit the disclosure to personal information under the regulated organization’s control. As drafted, this section requires that an organization disclose all personal information it has collected - even if that information is no longer in the organization’s possession or control. This may prove onerous and contrary to the principle that personal information should only be retained for as long as required to meet the purpose for which it was collected.
Enforcement of Regulations
Tracking the impact and consequences of algorithms and AI decision-making is an ambitious task. Standards must be established in a way so that the law can reasonably cope with a wide variety of feeds (or the absence thereof) into individual user profiles based on browsing behaviour, history, previous purchases, and more.
Consider adopting standardized or generalized categories for the legal justifications of processing. For example, Explicit Consent, Contract, Legitimate Service Request (or specified by business interest), Public Interest (for example, COVID tracking), Safety of the Person, and Legal Obligation could support a broader understanding of subject rights and make enforcement a realistic undertaking.
Several recommendations that encompass evolving technical capabilities and organizational accountability are suggested:
- In Section 2(1), Interpretation, several definitions could be amended to reflect new technologies, such as blockchain:
- Add the definition of ‘publicly available information’ - Include information stored on decentralized public electronic networks.
- Add definition of ‘control’ - We support a definition that defines control as having the authority or lawful ability to modify or remove personal information directly or through a service provider. This would help mitigate potential issues arising from the immutable nature of decentralized blockchain networks.
- Amend the definition of ‘organization’ - This definition should be amended to expressly exclude decentralized electronic networks without centralized ownership or control. This amendment would have the effect of modifying the definition of service provider to also exclude decentralized electronic networks.
- In Section 51, Information specified by regulations, the regulations should include express authorization for the collection, use and disclosure of publicly available personal information found on decentralized electronic networks, subject to reasonable limitations and protections.
- In Section 9(1), Privacy management program, we recommend adding “(e) how the organization will monitor and audit its compliance with the policies and procedures."
Technologies will be required to deliver and continually enhance the notice-and-consent practices and record keeping defined by the bill. The adequacy, international interoperability, and integrity of data flows in Canada require privacy laws that protect the individual as well as society by providing needed safeguards, like consent, rather than removing them. In this regard, we depend on the Consumer Privacy Protection Act to address gaps in enforcement, and gaps that challenge data governance interoperability beyond Canada’s borders. In this way, Canada can compete in the emerging global market on privacy infrastructure already in development in the PCTF Privacy components.
Offering Canadians a Competitive Edge
Maintaining the competitiveness of Canadian organizations and solutions means the same legislation must apply to all participants in the ecosystem. Compliance for Canadian companies becomes easier when foreign companies are required to meet the same obligations. It is important to avoid mandating Canadian companies with legal obligations while letting foreign companies skirt the legislation.
Exceptions must be carefully considered, especially those that may allow a ‘FANG’ provider to collect data through a subsidiary or an app. A more practical limitation would be that the world’s largest technology companies must ensure compliance of their subsidiaries (whether direct or operating as part of their app ecosystem). For example, the out-of-the-box configuration for Google was to allow for the collection of data on all apps, even where it was not required.
The legislation must also account for and support innovation in new and developing technologies. As currently drafted, the proposed Bill C-11 fails to clarify: (i) how the Office of the Privacy Commissioner (Canada) and the CPPA will interpret the relationship between regulated entities and decentralized blockchain networks; (ii) whether the Office of the Privacy Commissioner (Canada) will consider personal information stored on decentralized blockchain networks to be under the direct or indirect control of regulated entities and subject to ongoing statutory obligations; and (iii) whether personal information stored on decentralized public blockchain networks will be considered ‘publicly available information’ that may be collected, used, and/or disclosed without consent subject to specific regulatory constraints.
As decentralized public blockchain networks generally act as decentralized public ledgers (or registries) without a defined legal structure or central control, we believe that information published on these networks should be treated as publicly available information for the purpose of CPPA enforcement. The treatment of information on decentralized public blockchains as ‘publicly available information’ is technically and conceptually sound and allows for the development of concise standards for collecting, using, and disclosing any personal information on these networks, and the regulation of how this information is used after publication.
Conclusion
Public sector leadership is needed to make authoritative sources of data available to residents and empower them with the ability to use data for their transactions. Policy changes may be required to make this possible but the potential impact would be staggering. There will of course be an adjustment period to achieve compliance as systems are built or modified. As these changes are implemented, a competitiveness and integration plan at provincial/territorial, national, and global levels must be included.
Changes need to be accompanied by strong education and communications that convey how Canadians will use and benefit from these new standards and policies. The DIACC 5 Year Strategy offers an overview of future scenarios for Canada’s digital ecosystem, including four distinct models and an analysis of possible benefits and drawbacks for each.
The DIACC welcomes further conversation with the Minister of Digital Government and the Minister of Innovation, Science, and Industry to expand on recommendations.