Privacy - PIPEDA (3). Canada (Privacy Commissioner) v. Facebook, Inc. [failed to safeguard its users’ information]
In Canada (Privacy Commissioner) v. Facebook, Inc. (Fed CA, 2024) the Federal Court of Appeal allowed an appeal by the Privacy Commissioner from a decision of the Federal Court which dismissed an application alleging that the respondent Facebook "breached the Personal Information Protection and Electronic Documents Act, S.C. 2000, c. 5 (PIPEDA) through its practice of sharing Facebook users’ personal information with third-party applications (apps) hosted on the Facebook platform".
Here the court considers whether Facebook "failed to safeguard its users’ information":
The safeguarding obligation
[109] An organization can be perfectly compliant with PIPEDA and still suffer a data breach. However, the unauthorized disclosures here were a direct result of Facebook’s policy and user design choices. Facebook invited millions of apps onto its platform and failed to adequately supervise them. The Federal Court failed to engage with the relevant evidence on this point, and this was an error of law.
[110] Facebook did not review the content of third-party apps’ privacy policies, despite these apps having access to downloading users’ data and the data of their friends. Since Facebook never reviewed these privacy policies, and since friends of downloading users could not have reviewed these privacy policies either, the policing of an app’s data use and disclosure was left in the hands of a small number of downloading users who may never have read the policies themselves.
[111] Facebook also did not act on TYDL’s 2014 request for unnecessary user information, despite this request being described as a "red flag" by Facebook’s affiant. While Facebook’s failure to review third-party apps’ privacy policies was a failure to take sufficient preventative action against unauthorized disclosure of user data, Facebook’s failure to take action upon seeing red flags amounted to Facebook turning a blind eye to its obligation to adequately safeguard user data.
[112] I would add that Facebook’s inaction here was part of a larger pattern: in December 2015, when Facebook became aware that TYDL had scraped and sold the data of users and users’ friends, contrary to Facebook’s own policies, it did not notify affected users and it did not ban Cambridge Analytica or Dr. Kogan from Platform. Facebook only banned Dr. Kogan and Cambridge Analytica in March 2018—two and a half years after the media reports emerged about TYDL’s scraping and selling of user data—when Facebook found out that Dr. Kogan and Cambridge Analytica may not have actually deleted the improperly obtained user data (Federal Court decision at para. 39; see also Facebook’s 2018 Partial Response to the Commissioner).
[113] To be clear, Facebook’s conduct following its disclosure of data to TYDL is not legally relevant. As held by the Federal Court, the safeguarding principle deals with an organization’s "internal handling" of data, not its post-disclosure monitoring of data. However, Facebook’s post-disclosure actions contextually support the finding that it did not take sufficient care to ensure the data in its possession prior to disclosure was safeguarded.
[114] Facebook argues that it would have been practically impossible to read all third-party apps’ privacy policies to ensure compliance, and that Facebook was entitled to rely on the good faith performance of the contracts it had in place.
[115] It may be true that reading all third-party apps’ privacy policies would have been practically impossible. But this is a problem of Facebook’s own making. It invited the apps onto its website and cannot limit the scope of its responsibilities under section 6.1 and Principle 3 of PIPEDA by a claim of impossibility.
[116] Despite its obvious limitations, there is a loose analogy here to the commercial host liability line of cases (beginning with Jordan House Ltd. v. Menow, [1974] S.C.R. 239, 1973 CanLII 16 (SCC) at 248): having invited customers in with a clear profit motive, the host cannot now argue that too many came and some behaved badly for it to meet its obligations. Admittedly, the question before this court is not one of negligence—but it is one of whether Facebook took reasonable steps to protect the data of users that it invited onto its site. This observation has even greater resonance when considered in the context of Facebook’s business model: the more apps, the more users, the more traffic, the more revenue. Having created the opportunity for the data breach, Facebook cannot contract itself out of its statutory obligations.
[117] Facebook is entitled to rely on the good faith performance of contracts, but only to a point. As discussed above, Mark Zuckerberg admitted that it would be difficult to guarantee that there were no "bad actors" using its Platform. It is incongruent to expect a bad actor to carry out a contract in good faith. Facebook therefore should have taken further measures to monitor third-party contractual compliance.
[118] I conclude that Facebook breached its safeguarding obligations during the relevant period by failing to adequately monitor and enforce the privacy practices of third-party apps operating on Platform.
. Canada (Privacy Commissioner) v. Facebook, Inc. [consent assessed on objective evidence]
In Canada (Privacy Commissioner) v. Facebook, Inc. (Fed CA, 2024) the Federal Court of Appeal allowed an appeal by the Privacy Commissioner from a decision of the Federal Court which dismissed an application alleging that the respondent Facebook "breached the Personal Information Protection and Electronic Documents Act, S.C. 2000, c. 5 (PIPEDA) through its practice of sharing Facebook users’ personal information with third-party applications (apps) hosted on the Facebook platform".
Here the court considers whether Facebook "failed to obtain valid and meaningful consent for its disclosures to apps [SS: third party users]", and holds that that question is assessed with objective evidence inferred from a notional "reasonable person":
The Federal Court’s call for subjective or expert evidence
[59] In assessing whether Facebook users gave meaningful consent to have their data disclosed, the Federal Court lamented the lack of both expert evidence, as to what Facebook could have done differently, and subjective evidence from Facebook users as to their expectations of privacy. While the Court acknowledged that "such evidence may not be strictly necessary", it, in the end, predicated its decision on the "absence of evidence" which forced the Court to "speculate and draw unsupported inferences from pictures of Facebook’s various policies and resources as to what a user would or would not read; what they may find discouraging; and what they would or would not understand" (Federal Court decision at paras. 71 and 77-78). Therefore, while subjective evidence was not necessary, the Federal Court considered it critical in determining whether a user provided meaningful consent.
[60] Subjective evidence does not play a role in an analysis focused on the perspective of the reasonable person.
[61] The meaningful consent clauses of PIPEDA, along with PIPEDA’s purpose, pivot on the perspective of the reasonable person. Section 6.1 of PIPEDA protects an organization’s collection, use, or disclosure of information only to the extent that a reasonable person would consider appropriate in the circumstances. Clause 4.3.2 of PIPEDA asks whether an individual could have "reasonably underst[ood]" how their information would be used or disclosed. (See also section 3 and clause 4.3.5 of PIPEDA).
[62] Importantly, the perspective of the reasonable person is framed by the legislation, which speaks of a corporation’s need for information. It does not speak of a corporation’s right to information. This is critical. The legislation requires a balance, not between competing rights, but between a need and a right.
[63] The reasonable person is a fictional person. They do not exist as a matter of fact. The reasonable person is a construct of the judicial mind, representing an objective standard, not a subjective standard. Accordingly, a court cannot arbitrarily ascribe the status of "reasonable person" to one or two individuals who testify as to their particular, subjective perspective on the question. As Evans J.A. wrote for this Court: "determining the characteristics of the ‘reasonable person’ presents difficulties in a situation where reasonable people may view a matter differently, depending, in part, on their perspective… However, the view of the reasonable person in legal tests represents a normative standard constructed by the courts, not an actuality that can be empirically verified" (Taylor v. Canada (Attorney General) (C.A.), 2003 FCA 55, [2003] 3 F.C. 3 at para. 95).
[64] Truer words cannot be said in the context of Facebook, with millions of Canadian users comprising the broadest possible sweep of age, gender, social, and economic demographics.
[65] Facebook argues that "[c]ourts assess objective standards by reference to evidence", including "expert evidence about standard practices and knowledge in the field", "the availability of alternative designs" when assessing product safety, or "surrounding circumstances" when assessing a party’s due diligence (citing Ter Neuzen v. Korn, [1995] 3 S.C.R. 674, 1995 CanLII 72 (SCC) [Ter Neuzen], Kreutner v. Waterloo Oxford Co-Operative, 50 O.R. (3d) 140, 2000 CanLII 16813 (ONCA) [Kreutner], and Canada (Superintendent of Bankruptcy) v. MacLeod, 2011 FCA 4, 330 D.L.R. (4th) 311 [MacLeod]). However, the cases relied upon by Facebook are patently irrelevant or otherwise distinguishable on the facts.
[66] Ter Neuzen and Kreutner deal with professional vocations and specialized industries. A court would, of course, need expert evidence to determine the standards applied to reasonable doctors (as in Ter Neuzen) or safely designed products (as in Kreutner); a judge is neither a practicing doctor nor a licensed engineer. The same cannot be said for the judge charged with the responsibility of determining the views of the reasonable person, who is both fictitious and yet informed by everyday life experience.
[67] It is true, of course, that in developing the perspective of a reasonable person a court benefits from evidence of the surrounding circumstances. This assists in framing the perspective a reasonable person would have on the situation. Here, there was evidence of surrounding circumstances; it came from the facts of the Cambridge Analytica disclosure itself and in the form of Facebook’s policies and practices. There was evidence before the Court which enabled the determination of whether the obligations under Principle 3 and section 6.1 of PIPEDA had been met.
[68] Facebook also argues that courts consider subjective expectations of privacy in assessing whether a reasonable expectation of privacy exists under section 8 of the Canadian Charter of Rights and Freedoms, Part I of the Constitution Act, 1982, being Schedule B to the Canada Act 1982 (U.K.), 1982, c. 11 [Charter] (citing R. v. Edwards, [1996] 1 S.C.R. 128, 1996 CanLII 255 (SCC) [Edwards]).
[69] In the context of criminal law and the protections against unreasonable search and seizure under section 8 of the Charter, the evidence of the accused, should they testify, as to their expectations of privacy can be received. This is because an assessment of the reasonableness of a search may be informed, in part, by subjective expectations. Nevertheless, the inquiry under section 8 is ultimately normative, with a person’s subjective expectation of privacy being but one factor considered by the courts (R. v. Tessling, 2004 SCC 67, [2004] 3 S.C.R. 432 at para. 42 [Tessling]; Edwards at para. 45). Indeed, and contrary to Facebook’s argument, the Supreme Court cautioned against reliance on subjective expectations of privacy in assessing a reasonable expectation of privacy (Tessling at para. 42).
[70] It was the responsibility of the Court to define an objective, reasonable expectation of meaningful consent. To decline to do so in the absence of subjective and expert evidence was an error.
[71] Before leaving this section, there remains the question of the curious double reasonableness test in clause 4.3.2. This clause sets out that an organization must "make a reasonable effort to ensure that the individual is advised of the purposes for which the information will be used", and that for consent to be meaningful, "the purposes must be stated in such a manner that the individual can reasonably understand how the information will be used or disclosed". In other words, both the efforts of the organization, and the form in which consent is sought, must apparently be reasonable.
[72] This double reasonableness requirement does not affect this Court’s analysis. If a reasonable individual were unable to understand how their information would be used or disclosed—as here—this ends the inquiry. An organization cannot exercise reasonable efforts while still seeking consent in a manner that is itself inherently unreasonable. If the reasonable efforts of an organization could trump the reasonable person’s ability to understand what they are consenting to, the requirement for knowledge and consent would be meaningless. Put more simply, if the reasonable person would not have understood what they consented to, no amount of reasonable efforts on the part of the corporation can change that conclusion. Having regard to the purpose of PIPEDA, the consent of the individual, objectively determined, prevails.
[73] This conclusion is reinforced by both legal and practical considerations. Legally, the requirement for valid consent set out in section 6.1 of PIPEDA makes clear that the validity of an individual’s consent depends on that individual’s understanding of what they are consenting to. Practically, given the complexity of the issues, requiring a litigant to lead sufficient evidence demonstrating what an organization could have or should have done could present an insurmountable evidential burden.
. Canada (Privacy Commissioner) v. Facebook, Inc.
In Canada (Privacy Commissioner) v. Facebook, Inc. (Fed CA, 2024) the Federal Court of Appeal considered PIPEDA penalties, here in a high-profile Facebook case involving practices regarding personal information:
[135] Facebook’s practices between 2013-2015 breached Principle 3, Principle 7, and section 6.1 of PIPEDA and a declaration should issue to that effect.
[136] The Commissioner also seeks, among other things, an order requiring Facebook to comply with PIPEDA by implementing "effective, specific and easily accessible measures to obtain, and ensure it maintains, meaningful consent" for the disclosure of users’ personal information to third parties. The Commissioner suggests specific steps to be taken by Facebook in implementing this order, including "clearly informing Users about the nature, purposes and consequences of disclosure of their personal information to Third Parties"; "obtaining express consent from Users when Facebook uses and discloses sensitive personal information"; "ensuring that Users can determine, at any time, which Third Parties have access to their personal information" and "can alter their preferences so as to terminate or disallow some or all access by such Third Parties"; and "ensuring ongoing monitoring and enforcement of all Third Parties’ privacy communications and practices".
[137] The Commissioner also requests an order that Facebook "particularize the specific technical revisions, modifications and amendments to be made to its practices and to the operation and functions of the Facebook service to comply with the relief sought" to the Commissioner’s satisfaction, and subject to the Court’s later approval.
[138] The Commissioner asks the Court to retain ongoing supervisory jurisdiction for the purposes of monitoring and enforcing the orders requested and determining disputes arising between the parties in the implementation of the orders.
[139] The Commissioner’s requested relief comes in the context of the legal and regulatory responses in other jurisdictions to the Cambridge Analytica disclosure.
[140] In the U.S., the Federal Trade Commission (FTC), among other things: imposed a fine of $5 billion on Facebook; prohibited Facebook from misrepresenting the extent of its privacy and security practices; required Facebook to adopt more explicit and clear consent practices; required Facebook to undertake compliance reporting to the FTC; and required Facebook to adopt a privacy program by which Facebook must document the content, implementation, and maintenance of the program, assess privacy risks and corresponding safeguards, establish an independent privacy committee, and obtain ongoing independent privacy program assessments (Settlement Decision and Order, USA v. Facebook, Case 1:19-cv-02184).
[141] In the United Kingdom (U.K.), the Information Commissioner’s Office (ICO) issued a £500,000 fine against Facebook in 2018 for breaches of data privacy laws (namely, for a lack of transparency and a failure to keep user data secure due to insufficient checks on apps using its platform) (ICO News Release dated October 25, 2018).
[142] I note that Facebook settled with U.S. and U.K. regulatory authorities without admitting to any alleged wrongdoing (Settlement Decision and Order, USA v. Facebook; ICO News Release dated October 30, 2019).
[143] Facebook submits that there is no basis for the "sweeping… remedies" requested by the Commissioner, emphasizing the inadequate evidentiary foundation and the extraordinary nature of the remedies sought.
[144] Facebook also claims that the Commissioner’s application is effectively moot, as its "privacy practices have evolved significantly since the events in question took place"; for example, it has eliminated apps’ ability to request access to information about installing users’ friends, further strengthened its App Review process, continued to refine Graph API, and made its Terms of Service and Data Policy clearer. I note, parenthetically, that this argument is inconsistent with its argument that the 2008-2010 investigation and related communications are determinative of the current application. Facebook cannot have it both ways.
[145] I do not accept that the nature of the remedies sought constitutes a cogent ground for refusing a remedy. If there is a legal and evidentiary basis for the remedy, whether it is "extraordinary" or "sweeping" is of no moment. However, whether this Court should issue a remedial order in light of the assertion that the evidentiary record has shifted since the filing of the application is a different question, potentially one of mootness. The Court will not issue orders which would be of no force or effect.
[146] The events that gave rise to this application transpired a decade ago. Facebook claims that there have been many changes in its privacy practices since then, such that there may no longer be any nexus between the underlying breaches and the question of remedies sought. Further, the extent to which the evidentiary record in the Federal Court is sufficient to allow this Court to fairly adjudicate this question was not explored in argument before us. Absent further submissions or, potentially, fresh evidence, this Court is not in a position to decide whether any of the Commissioner’s requests related to Facebook’s current conduct are reasonable, useful, and legally warranted.
. Canada (Privacy Commissioner) v. Facebook, Inc. [purposive approach to PIPEDA interpretation]
In Canada (Privacy Commissioner) v. Facebook, Inc. (Fed CA, 2024) the Federal Court of Appeal comments on a contextual approach to PIPEDA interpretation:
Purposive balancing under PIPEDA
[119] In rejecting the Commissioner’s application, the Federal Court noted that the parties were merely providing "thoughtful pleas for well-thought-out and balanced legislation from Parliament that tackles the challenges raised by social media companies and the digital sharing of personal information", and that to find a breach of PIPEDA would be "an unprincipled interpretation from this Court of existing legislation that applies equally to a social media giant as it may apply to the local bank or car dealership" (Federal Court decision at para. 90).
[120] This denies the importance of context. While it is true that the normative law applies equally to all, its application varies with the context. Facebook’s business model centres around aggregating information and maintaining users on its platform for the purposes of selling advertising. The raison d’être of Facebook shapes the content and contours of its obligations to safeguard information and to obtain meaningful consent. There are no internal limits or brakes on Facebook’s "need" for information, given what it does with information, the demographic of its clientele, and the direct link between its use of information and its profit as an organization. A car dealership’s "need" for information is entirely different; the nature of the information and its uses are reasonably understandable, predictable and limited. The analogy to a car dealership is inapt.
[121] I note in passing that the Federal Court referred to an organization’s "right to reasonably collect, use or disclose personal information" (at para. 50, emphasis added). However, PIPEDA’s purpose, as set out in section 3, refers to an individual’s right of privacy, and an organization’s need to collect, use or disclose personal information. An organization has no inherent right to data, and its need must be measured against the nature of the organization itself. This distinction between the "rights" which are vested in the individual, and an organization’s "need" is an important conceptual foundation in the application of PIPEDA.
[122] The disposition of this case aligns with the purpose of PIPEDA as set out in section 3. It does not accord with the purpose of PIPEDA to find that Facebook users who downloaded TYDL (or other apps) agreed to a risk of mass data disclosure at an unknown time to unknown parties upon being presented with a generic policy, in digital form, which deemed them to have read a second policy containing a clause alerting the user to the potential disclosure, all in the interest of Facebook increasing its bottom line.
[123] Parliament inserted the word "meaningful" into clause 4.3.2 of PIPEDA, and when reading legislation it is understood that each word has to be given meaning. If pure contractual consent alone were the criterion, then the outcome of this case might be different. But that is not what Parliament has prescribed. Put otherwise, the question is not whether there is a provision buried in the terms of service whereby a user can be said to have consented. There will almost always be a provision to this effect on which a respondent can rely. This question is relevant, but not determinative of compliance with the twin obligations of PIPEDA; rather the inquiry is broader and contextual.
[124] Whether consent is meaningful takes into account all relevant contextual factors: the demographics of the users, the nature of the information, the manner in which the user and the holder of the information interact, whether the contract at issue is one of adhesion, the clarity and length of the contract and its terms, and the nature of the default privacy settings. The doctrines of unconscionability and inequality of bargaining power may also be in play. All of these considerations form the backdrop to the perspective of the reasonable person and whether they can be said to have consented to the disclosure.
. Canada (Privacy Commissioner) v. Facebook, Inc. [application to Federal Court under PIPEDA s.15]
In Canada (Privacy Commissioner) v. Facebook, Inc. (Fed CA, 2024) the Federal Court of Appeal considered applications under PIPEDA s.15, initiated by the Commissioner after a complaint by third parties:
[32] The Federal Court began its analysis by noting that applications under paragraph 15(a) of PIPEDA are de novo proceedings, with the basic question being whether Facebook breached PIPEDA, and if so, what remedy should flow. The Court observed that the purpose of Part 1 of PIPEDA (which governs the use of personal information in the private sector) is to balance a user’s right to protect their information and "an organizations’ [sic] right to reasonably collect, use or disclose personal information" (Federal Court decision at para. 50). The Court acknowledged that while PIPEDA is quasi-constitutional legislation, the ordinary exercise of statutory interpretation still applies, and the Court must interpret PIPEDA in a flexible and common-sense manner.
. James v. Amazon.com.ca, Inc.
In James v. Amazon.com.ca, Inc. (Fed CA, 2023) the Federal Court of Appeal considered an appeal from a Federal Court application decision "that dismissed her application pursuant to section 14 ['Hearing by Court'] of the Personal Information Protection and Electronic Documents Act, S.C. 2000, c. 5 (PIPEDA)". These quotes illustrate some procedures involved in such applications, and the involvement of the PIPEDA 'Principles' [Schedule 1: "Principles Set Out in the National Standard of Canada Entitled Model Code for the Protection of Personal Information, CAN/CSA-Q830-96"]:
[2] Subsection 14(1) of PIPEDA contemplates an application concerning any matter in respect of which a complaint was made to the Office of the Privacy Commissioner (the Commissioner) pursuant to certain provisions of PIPEDA once the Commissioner has issued a report regarding the complaint or has indicated that the investigation of the complaint has been discontinued.
[3] In this case, Ms. James made a complaint pursuant to PIPEDA against the respondent, Amazon.com.ca, Inc. (Amazon) for denying her access to her personal information in its possession following her unsuccessful attempts to access the information. The Commissioner subsequently indicated that the investigation of the complaint would be discontinued because Amazon’s denial of access to personal information was due to an inability to verify her identity. The Commissioner found Amazon’s response to be fair and reasonable. Ms. James was unable to provide the password that Amazon had associated with the relevant information. Ms. James was also unwilling to take the steps required to reset the password.
[4] The Federal Court agreed with the Commissioner that Amazon had not been shown to have violated Ms. James’ right to access her personal information (pursuant to Principle 9 set out in Schedule 1 of PIPEDA) where it could not verify her identity. To the contrary, the Federal Court found that Amazon could have been faulted for disclosing such information without proper authorization.
[5] The Federal Court rejected Ms. James’ allegation that her inability to gain access to the information in question was because of some inaccuracy in such information, in contravention of Principle 6. The Federal Court noted that this allegation had not been raised in the complaint to the Commissioner, and further that there was no evidence to support the allegation.
[6] The Federal Court also rejected Ms. James’ argument that Amazon had failed to respond in a timely manner to her request for access to personal information. The Federal Court found that the timeframe for a response (as contemplated in subsection 8(3) of PIPEDA) would not begin until Amazon was able to confirm Ms. James’ identity.
[7] Ms. James raises several issues on appeal, which can be summarized as follows:
That the Federal Court erred in raising new issues;
That the Federal Court erred in finding no violation of Principle 9 relating to individual access to information; and
That the Federal Court erred in limiting the scope of its jurisdiction under Principle 6 relating to accuracy of information.
[8] Because the Federal Court hears an application under section 14 of PIPEDA de novo (without deference to the Commissioner), the normal appellate standard of review described in Housen v. Nikolaisen, 2002 SCC 33, [2002] 2 S.C.R. 235, applies. Questions of law are reviewed on a standard of correctness, and only in cases of palpable and overriding error will this Court intervene on questions of fact or of mixed fact and law from which no issue of law is extricable. Palpable and overriding error means an error that is obvious and goes to the very core of the outcome of the case: Canada v. South Yukon Forest Corporation, 2012 FCA 165, 431 N.R. 286 at para. 46.
....
[13] With regard to the Federal Court’s finding that Amazon did not violate Principle 9, we see no reviewable error. The Federal Court relied on the evidence before it to conclude that there was sufficient doubt as to Ms. James’ identity to justify Amazon seeking further information before providing the requested personal information. The Federal Court was entitled to reach such a conclusion. Although it would have been preferable to do so, its failure to mention explicitly subsection 8(7) of PIPEDA was not an error since it clearly considered the provision.