The Federal Court of Appeal Clarifies the Requirements for Consent under PIPEDA in Canada (Privacy Commissioner) v. Facebook, Inc.
May 7, 2025
published in
Appeals
Jennifer L. Hunter
Partner
Also authored by:
Haoran Wang
This article was originally published by Law360 Canada.
In 2018, during Donald Trump’s first presidency, news media reported that the UK-based company Cambridge Analytica had used personal information obtained from Facebook users without authorization. Cambridge Analytica used this information to build a system that profiled individual U.S. voters and targeted them with personalized political advertisements.[1] The scandal led to a hearing before the U.S. Senate, at which Facebook’s chief executive, Mark Zuckerberg, testified.[2] Following the media reports, the Privacy Commissioner of Canada (the “Commissioner”) received a complaint regarding Facebook’s compliance with the Canadian Personal Information Protection and Electronic Documents Act (“PIPEDA”). After investigating, the Commissioner concluded that Facebook had failed to obtain valid and meaningful consent for disclosing users’ information to third-party applications and had not adequately safeguarded users’ data. As a result, in February 2020, the Commissioner commenced an application in the Federal Court pursuant to paragraph 15(a) of PIPEDA, asking the court to determine whether Facebook had breached the Act.
The underlying facts, lower court decision and decision of the Court of Appeal are summarized below, but here are the key takeaways:
Meaningful consent under PIPEDA requires that a reasonable person would understand the nature, purpose, and consequences of data collection, use, and disclosure, not just that users clicked “agree” to lengthy policies.
Organizations must take active steps to ensure users are properly informed and must not rely on complex, lengthy, or obscure policies to claim consent.
Safeguarding obligations require more than contractual assurances; platforms must actively monitor and enforce privacy practices of third parties with access to user data.
The context of digital platforms and consumer contracts of adhesion means courts will scrutinize claims of consent and safeguarding more closely, especially where default settings favour disclosure.
The decision clarifies that privacy obligations under PIPEDA are to be interpreted from the perspective of a reasonable person, not based on subjective or expert evidence about individual users.
Background
In 2007, Facebook launched its Platform technology, which allowed third parties to build and operate applications on Facebook. By installing these applications, users could enjoy personalized social and entertainment experiences on Facebook, such as sharing photos and listening to music.[3] To enable third-party applications to receive information from users, Facebook provided a communication protocol called the “Graph API,” which went through two phases of revisions. Under the first version (“v1”), third-party applications could obtain information about both the installing users and their friends; under the second version (“v2”), applications were prohibited from accessing information about friends of the installing users, with a few limited exceptions.[4] However, before switching to v2, Facebook gave existing applications a one-year grace period to continue operating under v1 and obtaining information about the installing users’ Facebook friends.[5]
The third-party application at issue, “thisisyourdigitallife” (“TYDL”), was developed by Dr. Aleksandr Kogan, a former professor at the University of Cambridge. TYDL, presented as a personality quiz, accessed information including the Facebook profiles of the installing users as well as information from their Facebook friends. Through approximately 272 Canadian installing users, TYDL obtained data from over 600,000 Canadian Facebook users. The user data collected by TYDL was later sold to Cambridge Analytica to develop the psychographic models used to target political messages at Facebook users during the 2016 U.S. presidential election.[6]
The alleged breaches of PIPEDA occurred during the v1 phase, between November 2013 and December 2015.[7] During this time, Facebook had two platform-wide policies in place: the Terms of Service and the Data Policy. Users were required to consent to these policies in order to sign up for Facebook. The Terms of Service, approximately 4,500 words in length, provided that third-party applications may ask the user for permission to access their information. The Data Policy, approximately 9,100 words in length, explained how information was shared on Facebook.[8]
The Federal Court Decision
The Federal Court dismissed the Commissioner’s application because the Commissioner had failed to meet the standard of proof. In addressing the first allegation, that Facebook had not obtained meaningful consent from users and their Facebook friends, Justice Manson held that the evidentiary vacuum was detrimental to the Commissioner’s case.[9] The Commissioner did not compel evidence from Facebook pursuant to section 12.1 of PIPEDA, nor did the Commissioner provide any expert evidence indicating what Facebook could have done differently. The Commissioner also did not submit any subjective evidence from users about their expectations of privacy.[10] As a result, Justice Manson rejected the Commissioner’s submissions, which would have required the court to speculate and draw unsupported inferences.[11]
Regarding the second allegation, whether Facebook had adequately safeguarded user information, Justice Manson agreed with Facebook that PIPEDA’s “safeguarding obligations end once information is disclosed to third-party applications.”[12] In the absence of evidence to the contrary, Justice Manson declined to conclude that Facebook’s contractual agreements and policies failed to provide adequate protection for users’ information.[13]
The Federal Court of Appeal Decision
The Federal Court of Appeal allowed the appeal and granted the Commissioner’s application in part.
The Meaningful Consent Analysis
The Federal Court of Appeal rejected the Federal Court’s requirement for subjective and expert evidence. The court held that the standard for meaningful consent is the reasonable person standard prescribed by the legislation, since both section 6.1 and clause 4.3.2 of PIPEDA expressly adopt the word “reasonable.”[14] Subjective evidence is therefore not necessary when applying the reasonable person standard.[15] And unlike the assessment of a reasonable professional, which may require expert evidence because judges have limited experience with a particular profession, a judge in this case could rely on everyday life experience, making expert evidence unnecessary.[16] Further, evidence of the surrounding circumstances, including disclosures from Cambridge Analytica and Facebook’s policies and practices, was sufficient for the court to assess whether the reasonable person standard had been met.[17]
The Federal Court of Appeal clarified the double reasonableness standard in clause 4.3.2 of PIPEDA. The legislation states as follows:
The principle requires “knowledge and consent”. Organizations shall make a reasonable effort to ensure that the individual is advised of the purposes for which the information will be used. To make the consent meaningful, the purposes must be stated in such a manner that the individual can reasonably understand how the information will be used or disclosed.[18] (Emphasis added)
The first prong of the test focuses on organizations requesting information, requiring them to make reasonable efforts to inform an individual about the collection and use of data. The second prong centres on individuals, which requires the form of consent to be informative enough that an individual can reasonably understand the use and disclosure of information.[19]
Whether a form of consent is sufficiently meaningful depends on the specific circumstances.[20] In this case, the court identified two distinct circumstances based on two groups of Facebook users: those who installed third-party applications and their Facebook friends. While users who installed third-party applications had the opportunity to directly review an application’s privacy policies and consent to the collection and use of their data, friends of those users did not.[21] The only policy available to them was Facebook’s high-level Data Policy, which the court found to be too broad and ineffective: it did not sufficiently inform users about the disclosure of information related to their friends’ use of third-party applications, nor did it contemplate large-scale data scraping of the kind carried out by TYDL.[22] Therefore, the court concluded that the Facebook friends of users who installed third-party applications did not meaningfully consent to the disclosure of their information.
The Court of Appeal also held that the users who installed the TYDL application did not provide meaningful consent to the disclosure of their data either. The underlying question is whether a reasonable person would have understood that, in downloading a third-party application like a personality quiz, they consented to the scraping of their data and its use in a manner contrary to Facebook’s rules, such as the development of models to target political advertisements.[23] The court considered the following contextual factors in reaching its conclusion:
Facebook obtained users’ consent to the Data Policy in a manner contrary to PIPEDA’s requirements, as users were deemed to accept the Data Policy when they accepted the Terms of Service;[24]
Mark Zuckerberg himself testified before the U.S. Senate that he imagined that most people do not read the policies;[25]
Facebook did not warn users of the possibility of bad actors on its Platform, such as TYDL;[26]
Facebook had no robust preventative measures in place which a reasonable user would have expected;[27]
When TYDL requested access to unnecessary information while transitioning to v2 in 2014, Facebook failed to act promptly despite labelling it as a “red flag;”[28] and
Facebook’s Terms of Service and Data Policy were adhesion contracts not subject to negotiation, which gave rise to heightened scrutiny.[29]
After considering these contextual factors, the Court of Appeal concluded that users who installed TYDL did not meaningfully consent to the disclosure of their information.
The Safeguarding Obligations Analysis
The Federal Court of Appeal overturned the Federal Court’s finding and held that Facebook had not met its safeguarding obligations to adequately protect user data. Although the Federal Court of Appeal agreed with the Federal Court that the safeguarding principles apply only to data in an organization’s possession, the appellate court took issue with Facebook’s inaction prior to the disclosure of data to third parties.[30] The court noted Facebook’s pattern of inaction: for instance, Facebook failed to review the content of third-party applications’ privacy policies or to act on TYDL’s suspicious request for access to information in 2014. And although Facebook discovered TYDL’s breach of its policies in December 2015, it did not notify affected users or ban Dr. Kogan and Cambridge Analytica from Facebook until March 2018, contrary to its own policies.[31] The court concluded that, having invited users onto its platform, Facebook had to ensure compliance with PIPEDA.
Balancing Interests under PIPEDA
The Federal Court of Appeal emphasized the importance of context when conducting a PIPEDA analysis, highlighting that an organization has no inherent right to information.[32] PIPEDA balances individuals’ right to privacy with organizations’ need to collect, use and disclose personal information.[33] An organization’s need for information must be considered with regard to the nature of its business. For example, Facebook’s business model is built on obtaining information and selling advertising, and the direct link between the collected information and Facebook’s profits therefore informs what is required for meaningful consent.[34] The interpretation of meaningful consent must take into account all relevant factors, including “the demographics of the users, the nature of the information, the manner in which the user and the holder of the information interact, whether the contract at issue is one of adhesion, the clarity and length of the contract and its terms and the nature of the default privacy settings.”[35]
The Estoppel and Officially Induced Error Arguments
Between 2008 and 2009, the Commissioner investigated Facebook’s privacy policies and made several recommendations. In 2010, the Commissioner informed Facebook that it had satisfied its commitments.[36] Relying on this representation, Facebook argued that the Commissioner was estopped from pursuing the application. The court rejected this argument on three grounds. First, as technology is always evolving, Facebook should be expected to adapt its privacy measures. Second, the application was a de novo hearing, and the court consequently owed no deference to the Commissioner’s earlier report. Third, estoppel has narrow application in public law, and the Commissioner cannot be estopped from exercising statutory duties based on a representation made over a decade earlier.[37]
The Implications
This case highlights the importance of obtaining meaningful consent and safeguarding user data. Courts are encouraged to consider all relevant contextual factors in determining whether PIPEDA has been complied with. The reasonable person standard developed by the Federal Court of Appeal, which does not necessarily require expert or subjective evidence, will make it easier for the Commissioner to establish breaches of PIPEDA. Organizations should be cautious when adopting non-negotiable boilerplate privacy policies, as such consumer contracts of adhesion may be subject to heightened scrutiny.
[1] Carole Cadwalladr & Emma Graham-Harrison, “Revealed: 50 Million Facebook Profiles Harvested for Cambridge Analytica in Major Data Breach”, Guardian (17 March 2018).
[2] Chloe Watson, “The Key Moments from Mark Zuckerberg's Testimony to Congress”, Guardian (11 April 2018).
[3] Canada (Privacy Commissioner) v. Facebook, Inc., 2024 FCA 140 at para 5 [Facebook].
[4] Facebook, supra at para 6.
[5] Facebook, supra at para 7.
[6] Facebook, supra at para 23.
[7] Facebook, supra at para 7.
[8] Facebook, supra at paras 9–13.
[9] Canada (Privacy Commissioner) v. Facebook, Inc., 2023 FC 533 at para 71 [Facebook FC].
[10] Facebook FC, supra at paras 71–72.
[11] Facebook FC, supra at para 78.
[12] Facebook FC, supra at para 86.
[13] Facebook FC, supra at para 91.
[14] Facebook, supra at para 61.
[15] Facebook, supra at para 60.
[16] Facebook, supra at para 66.
[17] Facebook, supra at para 65.
[18] Personal Information Protection and Electronic Documents Act, SC 2000, c 5, s 72, clause 4.3.2 [PIPEDA].
[19] Facebook, supra at paras 71–72.
[20] Facebook, supra at para 74.
[21] Facebook, supra at paras 75–76.
[22] Facebook, supra at paras 80–82.
[23] Facebook, supra at para 87.
[24] Facebook, supra at para 89.
[25] Facebook, supra at paras 89–90.
[26] Facebook, supra at para 90.
[27] Facebook, supra at para 92.
[28] Facebook, supra at para 95.
[29] Facebook, supra at para 100.
[30] Facebook, supra at para 113.
[31] Facebook, supra at paras 110–112.
[32] Facebook, supra at para 120.
[33] Facebook, supra at para 121.
[34] Facebook, supra at para 120.
[35] Facebook, supra at para 124.
[36] Facebook, supra at paras 128–130.
[37] Facebook, supra at paras 131–134.