
Exceptions from consent in PIPEDA: facial recognition, privacy and Clearview

PIPEDA requires consent for the collection, use and disclosure of personal information. PIPEDA also has many exceptions where consent is not required. These exceptions are part of the balance in PIPEDA and enable uses of public information for beneficial purposes that are in the public interest. The collection of photos from online sources by Clearview AI, Inc. (“Clearview”) using facial recognition software to facilitate use by law enforcement was recently found to be illegal following a joint investigation by the federal Office of the Privacy Commissioner of Canada (“OPC” or the “Commissioner”) and the privacy commissioners of Quebec (“CAI”), Alberta (“OIPC AB”), and British Columbia (“OIPC BC”) (collectively, the “Offices”). In making the findings in PIPEDA Report of Findings #2021-001 (the “Findings”), the Offices concluded that exceptions from consent in PIPEDA must be narrowly construed and that the publicly available exception set out in the Regulations Specifying Publicly Available Information under PIPEDA (the “Regulations”) does not apply to websites and social media sites.

The conclusions that exceptions from consent must be construed narrowly and that the Regulations do not apply to websites and social media sites have significant implications for the interpretation of PIPEDA and provincial privacy laws as a whole and the Regulations in particular. These conclusions in the Findings are not supported by the authorities relied on by the Offices and deserve close scrutiny.

Background

The Offices conducted a joint investigation to examine whether Clearview’s collection, use and disclosure of personal information using its facial recognition tool complied with federal and provincial privacy laws applicable to the private sector.

Clearview’s facial recognition tool “scrapes” images of faces and associated data from publicly accessible online sources, including public websites and social media sites such as Facebook, YouTube, Instagram, Twitter and Venmo. It stores that information in its database, which contains over 3 billion images, including images of Canadians and of children. It creates biometric identifiers for each image and allows users to upload an image to match against images in the database. If there is a match, Clearview provides a list of results containing matching images and metadata. If a user clicks on a result, the user is directed to the original source page of the image.
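To make the workflow described above concrete, the following is a minimal, purely illustrative sketch of an image-matching lookup of the kind the Findings describe: a probe image’s biometric vector is compared against stored vectors, and matches link back to their original source pages. This is not Clearview’s implementation; the embedding step is omitted, and all names, data structures and the similarity threshold are hypothetical.

```python
# Illustrative sketch only; not Clearview's actual system.
# The face-to-vector embedding step is assumed to happen elsewhere.
from dataclasses import dataclass
import math

@dataclass
class IndexedImage:
    source_url: str          # original page the image was scraped from
    embedding: list[float]   # biometric vector derived from the face

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Similarity between two embedding vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def search(probe_embedding: list[float],
           database: list[IndexedImage],
           threshold: float = 0.9) -> list[IndexedImage]:
    """Return indexed images whose vectors match the probe above a threshold."""
    return [img for img in database
            if cosine_similarity(probe_embedding, img.embedding) >= threshold]

# Toy usage: vectors stand in for real face embeddings.
db = [IndexedImage("https://example.com/profile/123", [0.1, 0.9, 0.4]),
      IndexedImage("https://example.com/photo/456", [0.8, 0.1, 0.2])]
for match in search([0.11, 0.88, 0.41], db):
    print(match.source_url)  # user would be directed to the original source page
```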

Clearview’s tool is intended for law enforcement and investigative purposes. A variety of organizations, including private sector entities, used the service through free trial accounts. These included the RCMP and other Canadian police forces.

Following the commencement of the joint investigation, Clearview left the Canadian market. Clearview did not, however, promise to delete images of Canadians from its database on a permanent basis.

Findings of the Offices

Clearview did not seek consent from the individuals whose images were collected. Nor did Clearview obtain authorization to scrape the images from the owners of the major sites from which the images were collected. It relied on the “publicly available” exception in the Regulations and in the corresponding provincial privacy laws. Clearview claimed that neither PIPEDA nor the provincial privacy acts applied to it because it did not have servers in Canada. The Offices rejected both of these arguments. They also found that the collection of individuals’ images was for a purpose that reasonable persons would find to be inappropriate and for that reason also contravened applicable privacy laws.

The Findings of the OPC and other Offices are summarized below and are followed by our comments. For the sake of brevity, we focus primarily on the holdings made under PIPEDA.

The jurisdictional challenge

The Offices had little trouble concluding that PIPEDA and the provincial privacy laws applied to Clearview. Relying on prior cases, they applied the real and substantial connection test finding that PIPEDA applied to Clearview’s collection, use and disclosure of the images even though its servers were outside of Canada. In coming to this conclusion, they considered the relevant connecting factors “including the factors set out in A.T. v. Globe24h: (1) the location of the target audience of the website, (2) the source of the content on the website, (3) the location of the website operator, and (4) the location of the host server.”[1]

Did Clearview obtain requisite consents?

Clearview did not obtain any express consents from individuals for its collection, use and disclosure of the images. It took the position that no consent was necessary for its facial recognition harvesting of images in reliance on the “publicly available” exception prescribed by Section 1(e) of the Regulations. This exception from consent applies to the following class of information: “personal information that appears in a publication, including a magazine, book or newspaper, in printed or electronic form, that is available to the public, where the individual has provided the information.”

The Offices, relying on the OPC’s prior finding in PIPEDA Report of Findings #2018-002 (“Profile Technology”), rejected the argument that information on public websites and social media sites was publicly available within the meaning of the Regulations. The Findings were founded on two interpretations of the Regulations. First, public websites and social media sites are not eligible “publications” within the meaning of the Regulations. Second, Clearview could not rely on Section 1(e) because its use of the images was not for the purpose for which they were made publicly available.[2] According to the Findings, “Information from sources such as social media or professional profiles, collected from public websites and then used for an unrelated purpose, does not fall under the ‘publicly available’ exception of PIPEDA.”

The Findings contain several reasons to support these conclusions.

First, the Offices concluded that the Regulations, as exceptions from consent, should be interpreted narrowly. The Findings state:

When interpreting the Regulations, we note that as privacy legislation is considered by the courts to be quasi-constitutional, the rights accorded under them should be given a broad, purposive and liberal interpretation, and restrictions on those rights should be interpreted narrowly.

Since the Regulations create an exemption to a core privacy protection – the requirement for collection, use and disclosure of personal information to be with consent – they should be interpreted narrowly. With this in mind, we do not accept Clearview’s arguments in favour of a wider “plain language” interpretation.

Second, social media sites are not expressly specified as a publication within the meaning of Section 1(e) of the Regulations:

… social media, from which Clearview obtained a significant proportion of the images in its database, is not specified as a “publication” in the language of the PIPEDA regulations.

Third, social media web pages differ from the specific examples of the types of publications that are listed in the Regulations:

It is the OPC’s view that social media web pages differ substantially from the sources identified in the PIPEDA regulations. As the OPC previously found in the matter of Profile Technology, there are a number of key differences between online information sources such as social media, and the examples of “publications” included in 1(e):

  1. social media web pages contain dynamic content, with new information being added, changed or deleted in real-time; and
  2. individuals exercise a level of direct control, a fundamental component of privacy protection, over their social media accounts, and over accessibility to associated content over time – for example, via privacy settings.

Fourth, the acceptance of Clearview’s interpretation of PIPEDA would create too broad an exception to consent under PIPEDA:

Ultimately, Clearview’s assertions that publication necessarily includes “public blogs, public social media or any other public websites,” taken to their natural conclusion, imply that all publicly accessible content on the Internet is a publication in some form or other. This would create an extremely broad exemption that undermines the control users may otherwise maintain over their information at the source. In this regard, it has been noted that control is a fundamental component of privacy protection.

Fifth, for at least some of the images, the individuals may not have provided the information:

Even if such web pages were to be considered “publications” in the meaning of the Regulations, which we do not accept, s. 1(e) of the PIPEDA Regulations and s. 7(e) of the PIPA AB Regulations specify that the exception only applies “where the individual has provided the information,” or where “it is reasonable to assume that the individual that the information is about provided that information,” respectively. As Clearview engages in mass collection of images through automated tools, it is inevitable that in many instances, the images would have instead been uploaded by a third party.

Was Clearview collecting, using or disclosing personal information for an appropriate purpose?

Under Section 5(3) of PIPEDA, “An organization may collect, use or disclose personal information only for purposes that a reasonable person would consider are appropriate in the circumstances.” The Offices found that “Clearview’s purpose for collecting, using or disclosing personal information was neither appropriate nor legitimate.”

The conclusion was summarized as follows:

We find that the collection of images and creation of biometric facial recognition arrays by Clearview, for its stated purpose of providing a service to law enforcement personnel, and use by others via trial accounts, represents the mass identification and surveillance of individuals by a private entity in the course of commercial activity.

In our view, for the reasons outlined below, a reasonable person would not consider this purpose to be appropriate, reasonable, or legitimate in the circumstances…

The Findings gave several reasons to support the conclusions.

First, facial biometric data is particularly sensitive given that it is a key to an individual’s identity, supporting the ability to identify and surveil individuals.

Second, the collection of the images (and associated metadata) using facial recognition tools including from minors was indiscriminate.

Third, Clearview’s purpose was a commercial one and was not an appropriate purpose:

It is our view that Clearview does not, in the circumstances, have an appropriate purpose, for:

  1. the mass and indiscriminate scraping of images from millions of individuals across Canada, including children, amongst over 3 billion images scraped world-wide;
  2. the development of biometric facial recognition arrays based on these images, and the retention of this information even after the source image or link has been removed from the Internet; or
  3. the subsequent use and disclosure of that information for its own commercial purposes;

where such purposes:

  1. are unrelated to the purposes for which the images were originally posted (for example, social media or professional networking);
  2. are often to the detriment of the individual (for example, investigation, potential prosecution, embarrassment, etc.); and
  3. create the risk of significant harm to individuals whose images are captured by Clearview (including harms associated with misidentification or exposure to potential data breaches), where the vast majority of those individuals have never been and will never be implicated in a crime, or identified to assist in the resolution of a serious crime.

Fourth, Clearview’s collection of sensitive biometric personal information was not carried out in a legal manner as it was done without consent and in apparent violation of at least some of the social media websites’ terms.

Fifth, “[w]hereas law enforcement agencies rely on the broad collection authority for their operations found in public-sector privacy legislation, these actions are circumscribed by the Charter and Clearview enjoys no such collection authority as a private organization.” The Offices added: “Although some of the information collected may have ultimately been used for law enforcement, Clearview’s real purpose for the collection is a commercial for-profit enterprise and not law enforcement.”

Comments on the Findings

Should the Exceptions from Consent Be Interpreted Narrowly?

The reasons in the Findings raise a number of issues of statutory interpretation of importance that go beyond the facts of the Clearview case. In particular, we focus on the key holding in the Findings that exceptions from consent must be narrowly construed and that the publicly available exception in Section 1(e) of the Regulations does not apply to public websites or social media sites. These Findings do not appear to be supported by the leading authorities on the interpretation of PIPEDA or other privacy statutes relied on by the Offices.

The Findings concluded that the Regulations should be interpreted narrowly. In reaching this conclusion, the Findings followed the findings of the OPC in the Profile Technology case. However, neither the Clearview case nor the Profile Technology case considered the Regulations in light of the overarching purpose clause in Section 3 of PIPEDA, which requires the privacy Part of PIPEDA to be interpreted in a balanced way given its dual intent to protect individuals and facilitate reasonable uses by organizations:

The purpose of this Part is to establish, in an era in which technology increasingly facilitates the circulation and exchange of information, rules to govern the collection, use and disclosure of personal information in a manner that recognizes the right of privacy of individuals with respect to their personal information and the need of organizations to collect, use or disclose personal information for purposes that a reasonable person would consider appropriate in the circumstances.

Section 3 of PIPEDA is a clear direction to balance the rights of individuals and the needs of organizations. The balance in PIPEDA was described in Englander v. Telus Communications Inc., 2004 FCA 387 as follows:

The purpose of the PIPED Act is altogether different. It is undoubtedly directed at the protection of an individual’s privacy; but it is also directed at the collection, use and disclosure of personal information by commercial organizations. It seeks to ensure that such collection, use and disclosure are made in a manner that reconciles, to the best possible extent, an individual’s privacy with the needs of the organization. There are, therefore, two competing interests within the purpose of the PIPED Act: an individual’s right to privacy on the one hand, and the commercial need for access to personal information on the other. However, there is also an express recognition, by the use of the words “reasonable person,” “appropriate” and “in the circumstances” (repeated in subsection 5(3)), that the right of privacy is not absolute.

The PIPED Act is a compromise both as to substance and as to form.

With respect to the compromise on substance, Michael Geist, in Internet Law in Canada, 3rd ed., Concord, Ont.: Captus Press, 2002, at page 303, puts it as follows:

The subject of intense negotiation between business, consumer groups, and government in the early and mid-1990s, the Code represents a compromise between the need to protect individual privacy and the desire of organizations to collect personal data for marketing and other commercial purposes. This compromise remains intact in the new law, and is reflected in its purpose clause, which explicitly refers to the balance between the competing interests of individuals and business. (An early version of the bill referred only to personal privacy.) …

All of this to say that, even though Part 1 and Schedule 1 of the Act purport to protect the right of privacy, they also purport to facilitate the collection, use and disclosure of personal information by the private sector. In interpreting this legislation, the Court must strike a balance between two competing interests…

Like all provisions in PIPEDA, exceptions from consent must be construed as being remedial and be given such fair, large and liberal construction and interpretation as best ensures the attainment of its objects.[3]

The exceptions from consent might well be understood as “users’ rights”, as are exceptions to infringement of copyright. In order to maintain the proper balance between the rights of individuals and organizations’ interests in using public domain information, all PIPEDA exceptions including those in the Regulations must not be interpreted restrictively.[4]

Exceptions from consent serve public purposes, purposes established by Parliament that reflect the balance in the privacy law. The scope of the publicly available exception, for example, as the Commissioner told Parliament in the 2016-2017 Annual Report, “raises fundamental questions of freedom of expression and the right to access information in the public interest”.

Some of the exceptions from consent are necessary to offset the deleterious effects of absolute restrictions against using personal information that is public. The Supreme Court adverted to the complexity of regulating publicly discernable personal information in the United Food case[5] when striking down Alberta’s privacy law, PIPA, because it unreasonably interfered with freedom of expression rights in the labour context:

It goes without saying that by appearing in public, an individual does not automatically forfeit his or her interest in retaining control over the personal information which is thereby exposed. This is especially true given the developments in technology that make it possible for personal information to be recorded with ease, distributed to an almost infinite audience, and stored indefinitely. Nevertheless, PIPA’s restrictions operate in the context of a case like this one to impede the formulation and expression of views on matters of significant public interest and importance.

PIPA’s deleterious effects weigh heavily in the balance. What is of the utmost significance in our view is that PIPA prohibits the collection, use, or disclosure of personal information for many legitimate, expressive purposes related to labour relations. These purposes include ensuring the safety of union members, attempting to persuade the public not to do business with an employer and bringing debate on the labour conditions with an employer into the public realm. These objectives are at the core of protected expressive activity under s. 2(b).

The Findings risk circumventing PIPEDA’s purpose clause and the persuasive guidance of the Supreme Court in United Foods with a kind of standalone syllogism: privacy legislation is quasi-constitutional in nature; human rights laws including Charter rights are constitutional or quasi-constitutional rights; constitutional human rights are broadly construed and exceptions to those human rights are narrowly interpreted; PIPEDA is a privacy law; therefore the rights in PIPEDA must be broadly construed and exceptions from consent in PIPEDA must be narrowly construed.

This syllogism can be seen by reviewing the reasons and authorities relied on in paras 61 and 62 of the Findings (set out below, with the authorities cited in the footnotes inserted):

61        When interpreting the Regulations, we note that as privacy legislation is considered by the courts to be quasi-constitutional,Footnote 37 [Footnote 37: For example in: Nammo v. Transunion of Canada Inc., 2010 FC 1284 at paragraphs 74 and 75; Bertucci v. Royal Bank of Canada, 2016 FC 332 at para. 34; Alberta (Information and Privacy Commissioner) v. United Food and Commercial Workers, Local 401, 2013 SCC 62 paragraphs 19 and 22; Cash Converters Canada Inc. v. Oshawa (City), 2007 ONCA 502 (CanLII) at para 29 citing Lavigne v. Canada (Office of the Commissioner of Official Languages), 2002 SCC 53 (CanLII), [2002] 2 S.C.R. 773 and Dagg v. Canada (Minister of Finance), 1997 CanLII 358 (SCC), [1997] 2 S.C.R. 403.] the rights accorded under them should be given a broad, purposive and liberal interpretation, and restrictions on those rights should be interpreted narrowly.Footnote 38 [Footnote 38: Québec (Commission des droits de la personne et des droits de la jeunesse) v. Montréal (City); Québec (Commission des droits de la personne et des droits de la jeunesse) v. Boisbriand (City), 2000 SCC 27 (CanLII), paras. 28-30; New Brunswick (Human Rights Commission) v. Potash Corporation of Saskatchewan Inc., [2008] 2 SCR 604, 2008 SCC 45 (CanLII), paragraphs 19, 65-67.]

62        Since the Regulations create an exemption to a core privacy protection – the requirement for collection, use and disclosure of personal information to be with consent - they should be interpreted narrowly. With this in mind, we do not accept Clearview’s arguments in favour of a wider “plain language” interpretation.

This syllogism and the conclusions in the Findings that rely on it are open to doubt.

First, it is starkly at odds with the OPC’s own public submissions on the balance established by PIPEDA. Commissioner Therrien has consistently advocated for Parliament to recognize privacy protection in our federal privacy law as a human right, on the basis that PIPEDA is “narrowly framed” as a data protection statute that merely codifies a set of rules. His claim is that PIPEDA is defective as framed because it merely sets out data protection rules rather than treating privacy as a human right. This view is clearly seen, for example, in his Remarks at the University of Ottawa’s Centre for Law, Technology and Society, made in January of 2020:

My predecessors and I have for a long time called for reform of privacy laws. This was ignored for an equally long time, I think because privacy is such an abstract concept. Now it has become real. And government has promised to act.

The question is no longer whether privacy laws should be modernized, but how.

Because data-driven technologies have been shown to be harmful to privacy and other rights, I think the starting point to law reform should be to give privacy laws a right-based foundation.

A central purpose of the law should be to protect privacy as a human right in and of itself, and as an essential element to the realization and protection of other human rights.

Currently, Canada’s federal privacy laws are narrowly framed as data protection statutes.

As such, PIPEDA and the Privacy Act codify a set of rules for how organizations and federal government institutions are required to handle an individual’s personal information.

Privacy is much broader than data protection – although data protection seeks to participate in the protection of privacy.

Neither of the statutes I cited formally recognizes privacy as a right in and of itself…

Second, although the cases referred to in the Findings do support the proposition that PIPEDA has been construed as being quasi-constitutional, none of these cases go on to conclude that exceptions from consent in PIPEDA should be narrowly construed. In fact, the authorities re-iterate that privacy legislation is to be construed in a balanced manner consistent with the objectives in Section 3 and the general principles of statutory interpretation.

The Nammo decision of the Federal Court used the quasi-constitutional status of PIPEDA in awarding damages for violating the Act. But in so awarding damages, the Federal Court accepted that “such an award should only be made ‘in the most egregious situations’” and limited its award to $5,000.

The Bertucci decision held that while PIPEDA is quasi-constitutional legislation “in interpreting this legislation, the Court must strike a balance between protecting the right of privacy and facilitating the collection, use and disclosure of personal information (Englander at para 46)… A balanced approach must be adopted in light of the quasi-constitutional nature of the Act”.

The United Food case, though recognizing the quasi-constitutional status of Alberta’s PIPA, struck that Act down as unconstitutional because its exceptions were not broad enough to avoid infringing the Charter right to freedom of expression. In doing so, the Supreme Court found PIPA to be deficient because “the Act does not include any mechanisms by which a union’s constitutional right to freedom of expression may be balanced with the interests protected by the legislation”.

The Cash Converters case addressed whether a municipal bylaw conflicted with Section 28(2) of the Municipal Freedom of Information and Protection of Privacy Act. As the Findings note, it merely cited the Supreme Court decisions in Lavigne and Dagg.

The Lavigne decision confirmed that the federal Privacy Act is quasi-constitutional in nature and that a purposive approach to its interpretation is justified. However, the Supreme Court explained that a tool of construction should never trump a full purposive approach in interpreting legislation:

that status does not operate to alter the traditional approach to the interpretation of legislation, defined by E. A. Driedger in Construction of Statutes (2nd ed. 1983), at p. 87:

Today there is only one principle or approach, namely, the words of an Act are to be read in their entire context and in their grammatical and ordinary sense harmoniously with the scheme of the Act, the object of the Act, and the intention of Parliament.

The quasi-constitutional status of the Official Languages Act and the Privacy Act is one indicator to be considered in interpreting them, but it is not conclusive in itself. The only effect of this Court’s use of the expression “quasi-constitutional” to describe these two Acts is to recognize their special purpose.

Put differently, statutory interpretation should never begin and end with a heuristic shortcut (“quasi-constitutional”) that supplants the harder and deeper work of interpreting the text, context and purpose of a given provision.

The Lavigne decision addressed circumstances in which a government institution may or must refuse to disclose personal information to the individual to whom the information relates (Sections 18 to 28). Given that one of the objectives of the Privacy Act is to provide individuals with access to personal information about themselves, the courts have generally interpreted the exceptions to the right of access narrowly.

A principal rationale for this narrow construction, provided at para. 30 of Lavigne, is that the Privacy Act’s access provisions must be interpreted to be consistent with the recitals to the Access to Information Act. The recitals set out in Section 2(2)(a) of the latter Act clearly state that it:

extends the present laws of Canada to provide a right of access to information in records under the control of a government institution in accordance with the principles that government information should be available to the public, that necessary exceptions to the right of access should be limited and specific and that decisions on the disclosure of government information should be reviewed independently of government. (emphasis added)

In other words, the contextual purpose clauses employed by Parliament dictated the outcome (as opposed to a de-contextualized rule of construction).

The Dagg decision addressed how to construe the exception in the Access to Information Act for personal information under the Privacy Act. The Supreme Court held that the exception should not receive “a cramped interpretation”. Rather, it had to be construed in accordance with the modern approach to statutory interpretation:

It is clear, therefore, that Parliament did not intend access to be given preeminence over privacy. The appellant correctly points out that under the Access to Information Act, access is the general rule. It is also true that exceptions to that rule must be confined to those specifically set out in the statute and that the government has the burden of showing that information falls into one of these exceptions. It does not follow, however, that the “personal information” exemption should receive a cramped interpretation. To do so would effectively read the Privacy Act as subordinate to the Access to Information Act.  As stated in s. 12 of the Interpretation Act, R.S.C., 1985, c. I-21, every enactment is to be given “such fair, large and liberal construction and interpretation as best ensures the attainment of its objects”. A court may not disregard, “in an effort to give effect to what is taken to be the purpose of the statute . . . certain provisions of the Act”; see St. Peter’s Evangelical Lutheran Church, Ottawa v. City of Ottawa, 1982 CanLII 60 (SCC), [1982] 2 S.C.R. 616, at p. 626. The Access to Information Act expressly incorporates the definition of “personal information” from the Privacy Act. Consequently, the underlying purposes of both statutes must be given equal effect.

This conclusion is consistent with the contextual approach set out at para. 27 of the Supreme Court’s decision in Bell ExpressVu Limited Partnership v. Rex, 2002 SCC 42, which explained that:

The preferred approach recognizes the important role that context must inevitably play when a court construes the written words of a statute: as Professor John Willis incisively noted in his seminal article “Statute Interpretation in a Nutshell” (1938), 16 Can. Bar Rev. 1, at p. 6, “words, like people, take their colour from their surroundings”. This being the case, where the provision under consideration is found in an Act that is itself a component of a larger statutory scheme, the surroundings that colour the words and the scheme of the Act are more expansive. In such an instance, the application of Driedger’s principle gives rise to what was described in R. v. Ulybel Enterprises Ltd., [2001] 2 S.C.R. 867, 2001 SCC 56, at para. 52, as “the principle of interpretation that presumes a harmony, coherence, and consistency between statutes dealing with the same subject matter”.

The cases cited in the Findings do not support the proposition that privacy laws must be construed using the same principles of statutory interpretation that are used to construe constitutionally-protected human rights. They instead point to the necessity of engaging in the difficult work of statutory construction in gleaning the purposes of Parliament in instituting a given scheme. The quasi-constitutional status of privacy rights may help guide the analysis, but it is not the end of the analysis.

Third, even in this regard, the authorities referred to in the Findings do not go as far as the Findings suggest with respect to the interpretation of laws that protect human rights.

In the New Brunswick and Boisbriand decisions referred to in the Findings, the Supreme Court confirmed that protections conferred by human rights legislation should be interpreted broadly and that the exceptions to the prohibitions against discrimination are to be construed narrowly. However, in New Brunswick it also made clear that “ambiguous language must be interpreted in a way that best reflects the remedial goals of the statute. It does not, however, permit interpretations which are inconsistent with the wording of the legislation.” In Boisbriand, the Court also confirmed even human rights laws that have a constitutional status “must be interpreted contextually” which includes “the other provisions of the law, related statutes, the objective of both the law and the specific provision, as well as the circumstances which led to the drafting of the text.”

The interpretation of “publication”

The Findings read the term “publication” narrowly. In part, this reading was influenced by the prior holding that exceptions in privacy laws should be narrowly construed. The balance of the reasons in the Findings on this issue must be examined with this in mind.

The construction of the exception expressly disregards the natural and ordinary meaning of the term “publication”, a term whose ordinary meaning might well apply to content, or at least some content, that is made publicly available (published) including on websites and social media sites.

The interpretation of the term “publication” does not take into account the clear intentions of Parliament which, according to the Regulatory Impact Statement that accompanied the Regulations, was to include the phrase “in printed or electronic form” precisely to include Internet media:

Several organizations questioned why the examples of publications “a magazine, book, or newspaper” were drawn from traditional rather than electronic media and whether “publication” included internet media. To clarify this point, the words “in printed or electronic form” have been added to the term “publication”. (Emphasis added.)

The interpretation ignores the use of the word “includes” within the definition of the term publication. As a matter of statutory interpretation, the word “includes” suggests that the Regulations intended the term publication to be given “its ordinary, popular, and natural sense whenever that would be properly applicable” and also to expressly encompass those things that the Regulations state are to be comprehended: here, a publication, including the specifically enumerated examples of publications, in printed or electronic form.[6] The Offices did not engage with the meanings of those terms or explain why websites and social media were excluded, other than in the limited manner described below.

The holding did not take into account the principles of technological neutrality and media neutrality that the Supreme Court has repeatedly used to construe legislation so that it does not favour or discriminate between traditional and new media.[7]

In particular, the Findings purport to exclude from the term publication “social media web pages because they contain dynamic content”. There is no explanation given in the Findings as to why this non-technologically neutral interpretation should be accepted, particularly in an era in which both magazines and newspapers are dynamically updated in real time as events unfold. The “dynamic content” limitation in the Findings thereby even puts at risk the specific examples of publications provided in the Regulations.

The Findings also refer to individuals being able to “exercise a level of direct control… over their social media accounts, and over accessibility to associated content over time – for example, via privacy settings”. As the exception does not apply unless “the individual has provided the information”, one might have argued the social media accounts would be included because of this level of control, subject potentially to where the personal information is scraped or harvested in violation of social media site terms.

The Findings also suggest that, for Section 1(e) to apply, the information must be used for the purpose for which it was made publicly available. Such an interpretation would add words not contained in the exception and would conflict with the OPC’s own interpretation bulletin and with PIPEDA Case Summary #2009-13, Publisher collected and used e-mail addresses for marketing without consent, on which it relied; namely, that “Regulation 1(e) does not require that the collection or use of the information relate directly to the purpose for which the information appears in the publication.”

It may well be, as the Findings suggested, that many of the images were not provided by the individuals, and that the exemption relied on by Clearview would not apply to such images under the plain language of the Regulations, which require that “the individual has provided the information”. However, the Regulations may have applied to permit the without-consent collection of other images including those posted on personal websites.

Had the Findings found that at least some of the images could be collected, used, and disclosed without consent based on the Regulations, the OPC would have had to consider the further question as to whether the use of such images, though exempt from the consent requirements, was still prohibited by Section 5(3) of PIPEDA.

The Findings on Section 5(3) focused on the commercial objectives of Clearview in creating the database using its facial recognition tools. The Findings did not expressly consider whether, for the purposes of Section 5(3), the Offices should have also (or exclusively) focused on the perspective of the users of the database and Clearview’s stated goal of facilitating their uses including by the law enforcement officials using the Clearview database for investigative purposes.[8]

Properly construed, the Regulations may extend to online publications, but the potentially detrimental uses of facial recognition surveillance that concerned the Offices might still be curtailed because the collection, use and disclosure of personal information that is publicly available must still be “only for purposes that a reasonable person would consider are appropriate in the circumstances.”

Such guidance would have been valuable, as the OPC would have needed to evaluate Clearview’s highest and best case in respect of a set of images for which consent was not required under PIPEDA. The OPC could then have evaluated difficult questions, including whether an individual publishing images on a platform can reasonably rely on the protections of “no scraping” clauses in that platform’s terms of use and privacy policy.

Clearview has publicly stated its intention to challenge the Findings. It has not stated how it intends to do that. Clearview has no direct appeal rights under PIPEDA to challenge the Findings. It is possible that it could apply for judicial review of the Findings.

This blog post Exceptions from consent in PIPEDA: facial recognition, privacy and Clearview is being simultaneously published on barrysookman.com.

_____________________________

[1] The Finding states:

Regarding the location of Clearview’s target audience:

  1. While Clearview claims that its activity in Canada was limited, this is at odds with the fact that it actively marketed its services to Canadian organizations through promotional material, testimonials from Canadian law enforcement professionals, and agency-specific presentations and trials. Furthermore, Clearview publicly declared Canada to be part of its core market in statements to the media and its own promotional materials.
  2. The fact that only one agency became a paying customer is, in our view, immaterial. The colour and character of Clearview’s activities were commercial in nature, with trials existing for the express purpose of enticing the purchase of accounts. Clearview’s representations confirmed that 48 accounts (trial or otherwise) were created for law enforcement agencies and organizations across Canada, and thousands of searches were conducted through these accounts. In particular, we note that various provincial law enforcement agencies used trial accounts of the App for several months, with the number of searches conducted per trial account ranging from tens, to hundreds, or in one case, thousands. Furthermore, dismissing the RCMP as only “one Canadian entity” ignores the fact that the RCMP is Canada’s national law enforcement agency, operating all over Canada with national, federal, provincial, and municipal policing mandates.

Regarding the source of Clearview’s content:

  1. It is not a requirement that Clearview’s content be exclusively derived from Canadian sources for there to be a real and substantial connection to Canada.
  2. As set out in Lawson v. Accusearch Inc., it is not necessary to identify specific Canadian sources of content to determine we have jurisdiction.
  3. Clearview’s assertion that it collects images without regard to geography or source does not preclude our jurisdiction when a substantial amount of its content is sourced from Canada. The exact number of images derived from individuals in Canada is unknown due to the fact that Clearview does not retain the national source. However, the indiscriminate nature of Clearview’s scraping renders it a relative certainty that it collected millions of images of individuals in Canada, and used them to derive biometric image vectors for its database, including to market to Canadian law enforcement agencies.

Finally, regarding the location of Clearview’s website operations and host server:

  1. We note that Clearview’s activities take place exclusively through a website or app. As referenced in paragraph 54 of A.T. v. Globe24h.com, a physical presence in Canada is not required to establish a real and substantial connection when considering websites under PIPEDA, as telecommunications occur “both here and there.”
  2. Clearview’s operations necessitate the transmission and receipt of personal information between Canada and the USA, both when collecting information and disclosing it through its software.
  3. As set out by the Supreme Court of Canada: “Receipt may be no less “significant” a connecting factor than the point of origin (not to mention the physical location of the host server, which may be in a third country).”

[2] The Offices also held that Clearview’s creation of biometric information in the form of vectors constituted a distinct and additional collection and use of personal information for which there was no consent.

[3] See, Royal Bank of Canada v. Trang, 2016 SCC 50 at paras. 37-38; Bell ExpressVu Limited Partnership v. Rex, 2002 SCC 42 at para. 28.

[4] See, CCH Canadian Ltd. v. Law Society of Upper Canada, 2004 SCC 13 at para. 48.

[5] Alberta (Information and Privacy Commissioner) v. United Food and Commercial Workers, Local 401, 2013 SCC 62 (CanLII), [2013] 3 SCR 733

[6] Canadian Pacific Ltd. v. A.G. (Can.), 1986 CanLII 69 at paras 21-23.

[7] Entertainment Software Association v. Society of Composers, Authors and Music Publishers of Canada, 2012 SCC 34 at paras. 5-12, including at para. 8: “The traditional balance between authors and users should be preserved in the digital environment”; Keatley Surveying Ltd. v. Teranet Inc., 2019 SCC 43 at para. 86: “Ontario’s reliance on new technologies post-digitization does not, in my view, change the assessment of whether the Crown had copyright by virtue of s. 12 of the Act. In fact, the principle of technological neutrality demands that it does not.”; and Canadian Broadcasting Corp. v. SODRAC 2003 Inc., 2015 SCC 57 at para. 66: “The principle of technological neutrality is recognition that, absent parliamentary intent to the contrary, the Copyright Act should not be interpreted or applied to favour or discriminate against any particular form of technology”, and at para. 85: “[t]echnological neutrality is determined by functional equivalence”.

[8] See, Alberta (Education) v. Canadian Copyright Licensing Agency (Access Copyright), 2012 SCC 37 “When considering the first stage of CCH — whether the dealing is for an allowable purpose — the relevant perspective is that of the user. This does not mean, however, that the copier’s purpose is irrelevant at the second stage. The copier’s purpose will be relevant to the fairness analysis if the copier hides behind the shield of the user’s allowable purpose in order to engage in a separate purpose — such as a commercial one — that can make the dealing unfair. There is no separate purpose on the part of the teachers in this case. They have no ulterior or commercial motive when providing copies to students. They are there to facilitate the students’ research and private study and to enable the students to have the material they need for the purpose of studying. The teacher/copier shares a symbiotic purpose with the student/user who is engaging in research or private study.”
