Public consultation on the Privacy Act – Submission – Anonymous #10

This submission is only available in English.

Dear Sir/Madam,

I am pleased to hereby submit a proposal on the modernization of the Privacy Act. I submitted it online, but I am not sure whether it was considered, given the deadline and time-zone differences, so I am also submitting it by email. I apologize: I only saw the announcement about today's submission deadline this morning, and I was somewhat rushed in finalizing my proposal.

Any law or act is like a security fence: no matter how comprehensive it has become, once there is any gap in it, it risks being compromised by anyone who is aware of those gaps. When it comes to privacy and the misuse of data, I have a somewhat relevant negative experience I can reflect on. A case was sent for inquiry to Canada's Attorney General and will soon be submitted in detail for relevant measures.

The proposal

In this proposal I will introduce three parts:

  • A- General points that may enhance the act if considered.
  • B- Remarks on point-by-point articles of the proposed act and relevant amendments to make it more resilient.
  • C- A proposal of practical solutions to relevant aspects of the act.

A- General points that may enhance the act if considered.

The context for my points is the following:

1- Current technological advancement has created a significant gap between data as recorded (or even unrecorded) and the utility of that data. Intuitively, one might expect to control the utilization of data by protecting the data itself, but that is not quite accurate. If an AI model is trained on specific data, the model may capture specific "effects" of that data in its parameters, even if the data is later erased or discarded. Some companies in the past, such as Facebook, considered eventually deleting users' data. Thus, the proposal should cover not only data collection but also data processing, whether the resulting inferences are applied to the same group of users whose data is being processed or to another group.

2- Although the concept of a "flexible principle-based approach" does make good sense along with other considerations, restricting the law's principles to general notions, such as the one provided in article two ("modernizing the purpose clause to better protect the act's broader objectives"), may not prevent data from being misused for unjustified interests that are unrelated to its functional use. For example, suppose the data collected about an individual is used unofficially and indirectly to advance the chances of competitors. Where are the investigative measures that capture the governance aspects of such a situation defined? A possible answer is a self-declaration checklist, such as: "the collected data should not result in disruption to civil rights or free-market principles, or in official or unofficial gains beyond the scope of the functional, communicated purposes".

3- The utility of some data modeling and profiling can extend far beyond the existence of the data itself. For example, if individual data is used to model electoral patterns (as in the Cambridge Analytica scandal), it can be used to manipulate individuals indirectly even after the original data is erased: the effect of such profiling and inference may result in "rules" that can be applied beyond the lifetime of the users' data. These rules can then be applied to other groups based on profile similarity, even though their personal data was never used to create the inferences (as in the article considering inferences about personal data as collection). Such rules can then affect conditional key predictors. The result, in essence, is a misuse of data, but in contexts that detach the conclusions built on the data from the data itself. That should also be taken into consideration.

4- The exception given to law enforcement and national security on data processing should be conditional, not absolute (beyond questioning or outside the scope of the justice system), and tied to a clear, relevant scope of their functions and what they address. Otherwise, if those capabilities are used, for example, to advance civil contextual interests that are not relevant to public safety, it will pose a challenge that the justice system will find difficult to address through investigative and accountability measures. If such a situation happens in any country, justice will be systematically compromised.

For example, harvesting intelligence data to generate gains for individuals, corporations, or systems of interest, whether in business ideas or strategies, scientific research, or political advantage. Equally important is turning a blind eye to such practices by any source of data (e.g., individual hackers, third-party providers, or private businesses). The scope of those actions is not relevant to the objectives of the regulatory bodies (e.g., law enforcement or national security); it concerns contextual violations and exploitation which, when done by such entities, make the violators, regardless of whatever title or capacity they hold, no less "criminal" than any other individual or entity misusing data-collection capabilities for other contextual interests. Certainly, law enforcement and national security will have the upper hand on issues of data collection due to their specialty, and the importance and necessity of their scope.

The answer to this issue is:

  • (A)- Opening a reporting channel on compliance for violations or suspected violations.
  • (B)- Severe accountability and investigative measures that set no one above the law, with whistle-blowing policies and protections in place, even when investigative measures, accountability, and justice consequences are carried out through the proper channels.
  • (C)- Following self-declaration and abiding by a general code of ethics and conduct, i.e., "abstaining from violating the laws with respect to the utility or use of data".
  • (D)- Requiring national security and law enforcement to obtain pre-approval from the justice system and the Privacy Commissioner for any data-collection practice or use that goes beyond the predefined objectives and practices (e.g., preventing the theft of business or research ideas or their use for civil advantage, and requiring transparent investigative measures and hierarchical reporting on such issues).
  • (E)- Penalizing implicit or explicit blackmail through misuse of data-collection or utilization capabilities beyond the authorized scope.
  • (F)- Requiring the same conditions to apply to all formal, semi-formal, or external contractors, including local and foreign ones.
  • (G)- Requiring proactive whistle-blowing and official reporting of violations or relevant ethical controversies, including those involving any foreign or official entities or individuals that violate civil rights by using or misusing data, its utility, or its reporting (such as economic, political, or research-related interests).
  • (H)- Establishing committees, by Parliament and the Ministry of Justice, to investigate such violations of civil rights by such entities and their allies, under the governing laws and within national territory, to hold whoever is found guilty accountable.

Clearly, one can assess that the gap created by excluding such entities from accountability by the justice system will eventually converge on an illegal marriage between interests and the entities that hold such capabilities, or those who might be tempted to use them (as is the case in many developing countries), not for neutral or generally good deeds but for exploitative practices, whereby the misuse of capabilities for non-objective advantages (e.g., affecting individuals' fair chances or securing economic gains) becomes the new norm.

5- Regarding the exception given to the Privacy Commissioner in closing complaints: technically speaking, when any individual is given the authority to discard a case, that authority itself becomes a vulnerability. Individuals, no matter how experienced, may be subject to biases, or to direct or indirect pressure from any interest whatsoever. To address this concern through a democratic tool of checks and balances, a decision to close a privacy-relevant complaint should not close it entirely; it should suspend the case and move it into an open archive where current and future committees (or even civil society, the media, or journalists) have the authority to review the details (along with the rejection reasons) and request reassessment when necessary. Such an appeal mechanism will prevent any individual from having the final word on any case. In essence, the cyber world (including cyber manipulation, for example) may not present clear, obvious lines to draw when threats are reported (despite being a genuine risk for democracies, as in the Cambridge Analytica scandal); moreover, assessing the effect of those threats requires having been exposed to them, a context of experience that may not be accessible to the Privacy Commissioner in the first place. Thus, such claims could easily be discarded, covered up, or rejected by any authorized individual, given an excuse, before the case ever surfaces; even a genuine and critical claim could be neglected due to a mere perception of it. Whereas, when a complaint is archived for a specific duration (e.g., seven years), those who made it may appeal through a mechanism that imposes a cost on them or requires them to substantiate their claim, or they may contact members of Parliament to raise their concern; meanwhile, successive committees may mine or sample such archived claims.
This way, there will be a balance between the practicality of not exhausting the privacy-complaint system, on one side, and closing any gap of misunderstanding, bias, or misuse of power, on the other.

6- When an inference is built about a user from their data (by a public or even a private provider), the model should allow the user to request being "forgotten" within a specific grace period: not by deleting the data, but by adjusting the parameters of the model itself, for example by training the model on counter-examples (this could be an AI research area), and by continually reassessing the AI model against ongoing changes in trends.
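The "forgetting by counter-example" idea above can be sketched in miniature. This is purely illustrative (machine unlearning is an open research area, and all data and function names here are hypothetical): a tiny logistic model is trained on data that includes one user's record, and is then fine-tuned so its output for that user's profile reverts toward an uninformative 0.5, without touching the stored data at all.

```python
import math

def train(weights, samples, lr=0.1, epochs=200):
    """Fit a one-feature logistic model p = sigmoid(w*x + b) by gradient descent.
    Targets may be soft (e.g., 0.5 meaning 'no inference')."""
    w, b = weights
    for _ in range(epochs):
        for x, y in samples:
            p = 1 / (1 + math.exp(-(w * x + b)))
            grad = p - y          # derivative of log-loss w.r.t. the logit
            w -= lr * grad * x
            b -= lr * grad
    return w, b

def predict(weights, x):
    w, b = weights
    return 1 / (1 + math.exp(-(w * x + b)))

# 1) Train on data that includes one user's record (x=2.0, label 1).
data = [(0.5, 0), (1.0, 0), (2.0, 1), (2.5, 1)]
model = train((0.0, 0.0), data)
before = predict(model, 2.0)    # a confident inference about that user

# 2) "Forget": fine-tune on a counter-example whose soft target 0.5
#    pushes the model's output for this profile back toward no inference.
counter = [(2.0, 0.5)]
model = train(model, counter, lr=0.05, epochs=100)
after = predict(model, 2.0)

print(f"inference before: {before:.2f}, after forgetting: {after:.2f}")
```

The point of the sketch is that the inference lives in the parameters, so the remedy must act on the parameters; deleting the underlying record alone would leave `before` unchanged.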

7- Introducing the concepts of "parametric discrimination" and "parametric violations" to capture cases where the parameters of an AI system result in intentional or unintentional discriminatory measures; criminalizing the intentional tuning of a model toward such ends; and establishing verification, investigative, and accountability measures.
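One common screening technique a verifier could apply for parametric discrimination is a disparate-impact check on a model's decision log. The sketch below is illustrative only (the group names, data, and the 0.8 threshold, borrowed from the US "four-fifths rule", are assumptions, not part of the proposed act):

```python
def approval_rate(decisions, group):
    """Share of positive decisions the system gave to one group."""
    outcomes = [d for g, d in decisions if g == group]
    return sum(outcomes) / len(outcomes)

def disparate_impact(decisions, group_a, group_b):
    """Ratio of group B's approval rate to group A's.
    Values well below 1.0 flag possible parametric discrimination."""
    return approval_rate(decisions, group_b) / approval_rate(decisions, group_a)

# (group, automated_decision) pairs - an illustrative audit log, not real data
log = [("A", 1), ("A", 1), ("A", 1), ("A", 0),
       ("B", 1), ("B", 0), ("B", 0), ("B", 0)]

ratio = disparate_impact(log, "A", "B")
flagged = ratio < 0.8   # screening threshold; a flag triggers investigation
print(f"impact ratio = {ratio:.2f}, flagged = {flagged}")
```

A flag from such a check would not prove intent; it would only justify the investigative step of examining how the parameters were tuned.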

8- The integration of medical and healthcare data, particularly as it relates to the cost of health and life insurance, should require special considerations and assessment.

B- Remarks on point-by-point articles of the proposed act and relevant amendments to make it more resilient.

1- In article 4 ("Clarifying concepts"), subsection "Applying the Act to federal public bodies".
An additional proposed amendment: include official and unofficial contractual parties of the federal government under the same act. Omitting this may allow reliance on third contractual parties to escape accountability and its scope, whether those third parties are individuals, private businesses, or local or foreign entities.

2- In article 4 ("Clarifying concepts"), subsection "Including unrecorded personal information".
An additional proposed amendment: unrecorded information is likely to depend on its context. In general, if the context pertains to making "conclusions" or "inferences" about individuals, for example via AI or individual reports, the documentation of those conclusions should be required to be maintained for as long as the claim is held, including the parametric algorithms used. (This requires versioning the parameters and training on batches of data rather than online, case-by-case training.) The rationale is to prevent "fake claims" from being covered by erasing their justification or evidence.
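The versioning requirement above can be made concrete with a small sketch (all names and fields here are hypothetical, not drawn from the act): each automated conclusion is recorded together with a content-hash "version" of the batch-trained parameters that produced it, so the justification remains auditable even if the training data is later erased.

```python
import hashlib
import json
from datetime import datetime, timezone

def parameter_version(params):
    """Stable fingerprint of the model parameters used for an inference."""
    blob = json.dumps(params, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()[:16]

def record_inference(registry, subject_id, conclusion, params):
    """Append an auditable record tying a conclusion to a parameter version."""
    entry = {
        "subject": subject_id,
        "conclusion": conclusion,
        "model_version": parameter_version(params),
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
    registry.append(entry)
    return entry

registry = []
params_v1 = {"w": 0.42, "b": -1.3}   # snapshot of a batch-trained model
entry = record_inference(registry, "user-123", "high churn risk", params_v1)

# Retraining on a new batch yields a new, distinct version fingerprint,
# so each archived conclusion stays tied to the parameters that made it.
params_v2 = {"w": 0.48, "b": -1.1}
print(entry["model_version"] != parameter_version(params_v2))  # True
```

Batch training matters here because a model updated online, case by case, has no stable parameter snapshot for a record like this to point at.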

3- In article 4 ("Clarifying concepts"), subsection "Clarifying when an individual is 'identifiable'".
An additional proposed amendment: as people may be subject to different circumstances, the contextual circumstances that identify individuals and the claims made about them should be part of the data captured about those individuals. The rationale is to avoid both misidentifying individuals and collecting data relevant only to the specific context in which they were identified, and to avoid selectiveness in such measures. For example, data captured on an online shopping website may identify individuals, yet their other online activities may clarify the justification for their purchases (e.g., over-purchasing while booking a party); when those contexts are excluded, the conclusions could be intentionally or unintentionally distorted or misleading.

4- In article 4 ("Clarifying concepts"), subsection "Introducing a balancing approach where personal information reflects the views and opinions of one individual regarding another".
An additional proposed amendment: individual-to-individual opinions and views should be given only after signing an official checklist and self-declaration of professional attitude, capturing the context, considerations, and information to the best of one's knowledge, in order to avoid using "reporting" as an intentional tool for exploitation, corruption, or blackmail. In addition, when the reporting results in harmful measures, there should be: (A) accountability measures; (B) a channel for reported individuals to access the reported information (regardless of the actor); (C) a preventive channel of proactive communication for individuals, if and when they feel something could be wrong; and (D) explicit clarity on the rights of individuals and the mechanisms for preserving and maintaining those rights. The relevant exceptions should be narrowed in inverse relation to how significant the reporting and its consequences are. Otherwise, the reporting mechanism will eventually turn into a "dirty business" full of bribery and interests, as is claimed to be the case in many third-world countries.

5- In article 4 ("Clarifying concepts"), subsection "Broadening the concept of administrative purpose".
An additional proposed amendment: while the purpose of this change was stated as ensuring "that the full suite of protections in the Act applies to the design and development of artificial intelligence systems", two aspects need to be considered: the context of the data (including what is relevant and what is irrelevant) is crucial, and so are the intentional or unintentional implied biases of AI systems in making decisions.

The proposed amendment would use different arrangements of data and AI systems to assess the context of relevant and irrelevant data, reduce biases, and involve human review of outliers and of systematic samples of the data (matching that against the system's performance on an ongoing basis, as the training data itself may become invalid due to changing trends or major events, e.g., COVID-19). It would allow individuals to use external, equivalent assessment providers at their own expense (expected to give similar outcomes given the evaluation criteria at hand) and to access a channel for correcting relevant or irrelevant incorrect data (e.g., a credit score whose context is not reflected by the numbers). It would also acknowledge the limitations and biases of AI systems and communicate them to the public, with proper corrective and appeal measures (potentially with an additional cost for verification by humans).

6- In article 5 ("Updating rights and obligations…"), subsection "A right for the individual to be notified when his or her personal information is collected by a federal public body, unless an exception applies".
An additional proposed amendment: grant individuals the right to know what their data is being used for, including secondary research. This right should also extend to national security and law enforcement under the category of "purposes other than national security". In other words, the typical use of personal data by law enforcement and national security, for the legal purposes defined by their scope, remains unaffected; but if and when any law enforcement or national security agency uses the data of any individual for any other purpose unrelated to its main function (e.g., using data to pursue economic, political, or scientific gains), then it is no longer a "national security or law enforcement measure", and the user should have the right to know what their data is being used for in such a near-civil, out-of-scope context.

7- In article 5 ("Updating rights and obligations…"), subsection "Certain rights relating to enhanced public awareness of interactions with automated decision-making systems (such as artificial intelligence tools)".

An additional proposed amendment: the development and accuracy measurement of automated decision systems should be subject to: (1) a self-declaration of avoiding embedded intentional biases or adjusting the parameters to favor or disfavor specific individuals or groups of individuals (i.e., parametric discrimination); (2) ongoing performance evaluation by external observers from civil society and other governmental authorities; (3) human participation in processing outliers, ongoing validation of samples, and improvement of the system; (4) similar mechanisms for law enforcement and national security, with measures appropriate to the nature and confidentiality of their scope (e.g., relying on a parliamentary intelligence committee, non-disclosure agreements and penalties, and independent technical system evaluators from civil parties, sworn before Parliament and selected on merit).

C- A proposal of practical solutions to relevant aspects of the act.

Privacy Justice App:

The Ministry of Justice will administer the development of an app (website and mobile app) for the following:

1- To notify users about the primary and secondary uses of their personal data.

2- To allow users to authorize or deny the use of their data in specific contexts.

3- Potentially, to offer specific tokens or prizes to motivate users to allow the use of their data.

4- To show users the outcomes of the projects in which their data has been used (i.e., for motivation).

5- To educate users, through videos, messages, and other content, about their data rights.
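The per-context authorization in point 2 could rest on a simple consent record kept per user. The sketch below is a hypothetical illustration (all class, field, and context names are mine, not a specification of the app); unknown contexts default to "denied", matching a privacy-by-default posture:

```python
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Illustrative per-user record of context-level data-use consent."""
    user_id: str
    # context -> True (authorized) / False (denied)
    contexts: dict = field(default_factory=dict)

    def authorize(self, context):
        self.contexts[context] = True

    def deny(self, context):
        self.contexts[context] = False

    def may_use(self, context):
        # Contexts the user was never asked about default to denied.
        return self.contexts.get(context, False)

record = ConsentRecord("user-123")
record.authorize("primary:service-delivery")
record.deny("secondary:research")

print(record.may_use("primary:service-delivery"))  # True
print(record.may_use("secondary:research"))        # False
print(record.may_use("secondary:marketing"))       # False (never asked)
```

Defaulting unknown contexts to denial is the design choice that makes the notification duty in point 1 meaningful: a data user must ask before a new context can ever return True.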

C-B- The National Cyber Identity project:

The federal government will develop a mechanism that allows users to collect their data from third-party providers (e.g., social media providers) and host it on national servers with proper measures (cost, encryption, etc.), and will develop an interface (another app, or one integrated with the Privacy Justice App) for the following:

1- Potentially, the app will allow users to encrypt their data and sell it (for tokens or money) to third parties, who may use it for research, developing products, or providing services (e.g., search engines, social media services, etc.).

2- The app will match data seekers (e.g., potential businesses, researchers, etc.) with data providers, within the norms and scope of the modernized Privacy Act.

3- Potentially, the app will allow users to dedicate the proceeds from the use of their data to any cause they choose (e.g., nonprofit causes).

4- The app will provide data seekers with general statistics on the available data, informing them that the data will be subject to the relevant regulations.

5- Potentially, the government will administer, directly or through a third party, the seeking of users' consent to collectively provide or sell the data to third parties, with proper regulatory and legislative measures, including future privacy measures.

6- Potentially, this process can use tokens and blockchain (through the federal government or a third party) to manage both the accessibility and the encryption of the data, integrated into a preset dynamic that is not subject to changes in government decisions.