Canada’s Privacy Overhaul: Deep Dive into Key Topics of Cross-Border Transfers, Service Provider Obligations, AI, Employment Considerations and Class Action Developments

On March 3, 2021, McCarthy Tétrault LLP hosted the second session in its two-part series Canada’s Privacy Overhaul: Deep Dive into Key Topics. This article summarizes that second session (a summary of the first session is available at this link). Both sessions are available for viewing online.

As a result of ongoing developments in technology, how and why organizations collect, use and disclose personal information has grown increasingly complex. In response, the Government of Canada introduced Bill C-11, the Digital Charter Implementation Act, 2020 (the “Act”), on November 17, 2020. If the Act comes into force, it would replace the privacy component of the Personal Information Protection and Electronic Documents Act (“PIPEDA”) with the Consumer Privacy Protection Act (“CPPA”) and would enact the Personal Information and Data Protection Tribunal Act.

Continuing from Part I, Part II covered the following topics:

Cross-border data transfers

An issue that PIPEDA did not expressly address was whether separate consent was required for cross-border transfers. For many years, the Office of the Privacy Commissioner of Canada (the “OPC”) had consistently taken the position that cross-border transfers of data were “uses” rather than “disclosures”, and thus did not require an additional consent for the transfer itself. This long-standing status quo was thrown into question in 2019 by a series of consultations and reports from the OPC (see related blog posts here and here). Ultimately the OPC returned to its status quo position (see related blog post here), but the uncertainty these events created underscored the importance of legislative clarity on this issue.

The CPPA seeks to address PIPEDA’s lack of clarity regarding cross-border data transfers through several key provisions. First, Section 19 of the CPPA expressly states that organizations may transfer an individual’s personal information to a service provider without the individual’s knowledge or consent. Notably, this provision does not distinguish between transfers within a province, inter-provincially, or internationally. The permissiveness of this default position (that organizations can transfer data internationally without separate consent) is distinct from other jurisdictions, such as those regulated by the European Union’s General Data Protection Regulation (“GDPR”), where the default position is that transfers outside the EEA are not allowed, subject to exceptions. While Section 19 of the CPPA is permissive, Section 62(2) of the CPPA requires that organizations’ privacy policies state whether or not the organization carries out any international or interprovincial transfer or disclosure of personal information that could have a significant impact on the individual.

Organizations seeking to comply with the cross-border transfer provisions of the CPPA should:

  • Understand the location and flow of their data;
  • Ensure that their privacy policies disclose in plain language any interprovincial and international data transfers; and
  • Ensure that, in arrangements with third parties involving the transfer of data across borders, the underlying contracts address what is required under PIPEDA and the CPPA, including data use limitations and transfers to other third parties.

For more information, see our article: CPPA: Welcome Clarification on Contractual and Other Duties on Cross-Border Transfers of Personal Information

Service providers

The CPPA helps clarify the role of service providers. As noted above, Section 19 of the CPPA explicitly permits the transfer of personal information to service providers without a separate consent requirement. Although it is unclear why the CPPA does not use the term “processor”, as other privacy laws do, “service providers” are defined under the CPPA as organizations that provide services for or on behalf of another organization to assist that organization in fulfilling its purposes. Despite these provisions, the CPPA makes clear that the controlling organization remains accountable for personal information under its control. Organizations are required to ensure, by contract or otherwise, that the service provider provides substantially the same protection of the personal information that the organization is required to provide under the CPPA (s. 11(1)). Additionally, under Section 57 of the CPPA, organizations are required to protect personal information through physical, organizational and technological security safeguards. What remains unclear is why the CPPA is structured so as to impose potentially two different safeguard standards on service providers, under Section 11(1) and Section 57.

Other new obligations in the CPPA that would directly affect service providers include:

  • Section 61 of the CPPA, which provides that if a service provider determines that any breach of security safeguards involving personal information has occurred, it must, as soon as feasible, notify the organization that controls the personal information; and
  • Section 55(3) of the CPPA, which provides that if an organization disposes of personal information (e.g. in response to an individual’s request to dispose of their personal information), it must, as soon as feasible, inform any service provider to which it has transferred the information of the individual’s request and obtain confirmation from the service provider that the information has been disposed of.

For more information, see our article: CPPA: transfers of personal information to service providers

AI & automated decision systems

One area of the CPPA with no equivalent under PIPEDA is the introduction of provisions related to automated decision systems. “Automated decision systems” are defined under Section 2 of the CPPA as “any technology that assists or replaces the judgment of human decision makers using techniques such as rules-based systems, regression analysis, predictive analytics, machine learning, deep learning and neural nets”. These provisions seem to stem from an increased awareness of the impact of AI and organizations’ increased reliance on automated decision systems in a wide range of contexts, perhaps in response to public anxiety relating to their use. The AI and automated decision system provisions of the CPPA seem to draw inspiration from Article 22 of the GDPR, which gives a data subject the right not to be subject to a decision based solely on automated processing where the decision significantly impacts them, subject to several exceptions. The provisions of the CPPA, however, do not go as far as their GDPR counterparts and are somewhat less burdensome.

The Quebec government was the first to propose a fundamentally revised privacy law with specific provisions on automated decision-making. Under Section 12.1 of Quebec’s Bill 64, when a business uses personal information to render an entirely automated decision, it must inform the person concerned at the time of or before the decision. In addition, Section 12.1 of Bill 64 requires that, if requested by the individual, the business inform them of the personal information used to render the decision, the reasons and the principal factors and parameters that led to the decision, and their right to have the personal information used to render the decision corrected. In summary, Bill 64 establishes an obligation to be transparent, together with obligations to provide an explanation and to allow the individual to contest the outcome. The CPPA is similar in several ways: Section 62(2) establishes a comparable transparency obligation, while Section 63(3) imposes an explanation obligation. Under Section 62(2)(c), organizations must provide a general account of their use of any automated decision system to make predictions, recommendations or decisions about individuals that could have significant impacts on them. Section 63(3) further requires that where an organization has used an automated decision system to make a prediction, recommendation or decision about an individual, the organization must, on request by the individual, provide them with an explanation of the prediction, recommendation or decision and of how the personal information used was obtained.

It is noteworthy that different standards apply under Section 62(2) versus Section 63(3) of the CPPA. The transparency obligation in 62(2) is subject to the qualifier that it only applies where automated decision systems are used in relation to “predictions, recommendations or decisions about individuals that could have significant impacts on them”. By comparison, the explanation obligation in 63(3) applies to an automated decision system used “to make a prediction, recommendation or decision about the individual” (whether or not there is any significant impact on the individual). This drafting discrepancy could pose a challenge for organizations, as it suggests that companies would have to explain the results of their automated decision systems even when they are used for trivial and insignificant purposes.

These automated decision-making provisions, under both Bill 64 and the CPPA, could have substantial impacts on organizations. In addition to likely increasing the administrative costs of meeting the disclosure requirements, companies will likely need to consider these transparency provisions at the design phase of automated decision-making tools. Additionally, as drafted, there is no exception to the general accountability provisions for confidential information or trade secrets, and it remains unclear how the final iterations of the CPPA and Bill 64 will work in unison.

Employment considerations

Similar to PIPEDA, the proposed CPPA will apply to federally regulated employers in relation to the employee personal information that they collect (i.e. employers engaged in a federal work, undertaking or business, which includes interprovincial transportation, telecommunications, banking, radio and broadcasting, certain atomic energy employers and federal Crown employers).

Under the CPPA, control of personal information will rest with the organization, even when transferred to service providers. In the employment context, service providers are commonly used by employers to assist with managing their workforce, including for example, the administration of benefits and payroll services. The concerns addressed above regarding service providers should be considered in this context. 

As under PIPEDA, the CPPA requires that an organization designate an individual to be responsible for matters relating to its obligations under the CPPA, and that individual’s business contact information must be provided to any person who requests it. The CPPA does not specify who within the organization must fulfil this role.

The CPPA further imposes a requirement to have a privacy management program, building on the requirements set out in PIPEDA. All organizations, irrespective of size, will be required to implement such a program. In practice, this should include making privacy policies available to employees and having employees sign off on them, as well as maintaining an escalation and reporting process in the event of a privacy complaint or a privacy breach. The CPPA outlines that the privacy management program must, among other things, address the protection of personal information, how requests for information and complaints are handled, and the training and information provided to the organization’s staff relating to those policies, practices and procedures.

The training requirements, including training about the privacy management program, will apply not only to federally regulated employers, but also to provincially regulated employers who must comply with the CPPA in the course of commercial activities, particularly where employees are engaged in those activities. Compliance with the training requirements should be well documented, as employers will want records confirming that the training has taken place. Such records should include employee sign-offs or attendance sheets, retained where they are readily accessible in the event of a future privacy investigation or if employers need to discipline or terminate employees for reasons including breach of privacy policies or procedures. As a best practice, employers should consider delivering training in person or by a virtual presentation that allows for engagement, so that employees have an opportunity to absorb the information and seek clarification. Employers requiring employees to sign off on training materials will want to ensure that employees have an adequate opportunity to read and engage with the materials, and to ask clarifying questions as needed, before they are required to return the signed form.

Under the CPPA, the requirement to obtain express consent will not apply where the collection, use or disclosure of personal information is for the purpose of establishing, managing or terminating an employment relationship and the individual is informed that their information will be collected, used or disclosed for such purposes. Nonetheless, some employers still use consents when gathering certain information. Because the CPPA introduces a plain-language requirement for consent, despite the exemption, employers should review any consents they use to confirm that they are in plain language and update them where appropriate.

Finally, similar to PIPEDA, the CPPA prohibits employers from engaging in reprisal against an employee who raises a complaint against the organization. In general, employers should ensure that their processes, procedures and reporting structures for addressing privacy complaints include anti-reprisal protections.

Class action developments

Privacy class actions are still a relatively new and evolving field. Some recent decisions from Canadian courts have shed light on factors that may bar privacy class actions, particularly where the class has suffered only nominal harm.

In Setoguchi v Uber BV, the Court of Queen’s Bench of Alberta denied certification to a class where an unauthorized third-party actor allegedly hacked the defendant’s storage of the proposed class members’ personal information. The plaintiffs in Setoguchi claimed negligence and breach of contract. In denying certification, the court followed Kaplan v. Casino Rama, noting that there was no evidence of harm or loss from the disclosure of non-private personal information. This line of cases is consistent with Bourbonniere c. Yahoo Inc, where the Superior Court of Quebec noted that the transient embarrassment of a data breach is an ordinary annoyance and is not enough to merit damages where there is no proof of loss.

These cases emphasize that not all data breaches should result in class actions, and that plaintiffs claiming only nominal damages will have difficulty certifying class actions on negligence or contractual theories. To the extent that some class members have suffered compensable damages and can make out claims, those claims should be litigated individually.

It bears noting, however, that the recent decision of the Ontario Superior Court of Justice in Stewart v Demme suggests that a class proceeding may be certified on a theory of vicarious liability for intrusion upon seclusion despite only nominal or incidental harm, particularly where the breach concerns private personal information, such as health records.

Another recent example of the courts’ refusal to certify privacy class actions is Simpson v. Facebook, where the plaintiffs challenged the defendant’s data management practices, specifically alleging indirect data sharing with a third party. In denying certification, the Ontario Superior Court of Justice noted that there was not a shred of evidence that the data was shared with the third party. There was therefore no basis in fact for the existence of the proposed common issues around the purported sharing of data, and the case collapsed. This decision emphasizes that bald allegations of misconduct will not support certification under the double-barreled approach to the common issues criterion, which asks whether an issue exists before considering its commonality.

In class actions, a cause of action under PIPEDA is less commonly pleaded than negligence, intrusion upon seclusion, breach of provincial privacy legislation or breach of contract. Section 14 of PIPEDA likely offers an explanation, insofar as it limits access to the courts by requiring that complaints first be assessed by the Office of the Privacy Commissioner of Canada before an application may be made to the Federal Court. Thus, while PIPEDA does get pleaded, it is more often in the context of negligence, as setting a purported standard of care.

The CPPA includes a private right of action that may be brought by a wider class of affected parties, rather than just the complainant, and that allows actions to be brought in the provincial superior courts in addition to the Federal Court. It still requires, however, a finding that the organization has breached the CPPA. This requirement creates potential limitations issues for follow-on civil proceedings if litigants wait for a finding. Conversely, litigants who bring suit without establishing the statutory preconditions may face a motion to strike under Section 4.1 of the Class Proceedings Act, which gives defendants a presumptive right to bring such motions prior to certification.

_____________________________

For more information on any of the topics above, please contact the authors and visit our Technology and Cyber/Data pages.
