On November 29, 2011, the Federal Trade Commission (“FTC”) announced a proposed order against Facebook that builds upon both the FTC’s recommendations from its 2010 draft privacy report and precedents set in the order that it recently imposed on Google. Any business that collects personal information from consumers should pay close attention to this action because it makes clear that:
- The FTC will remain vigilant in holding companies to their privacy-related promises to consumers. The FTC will pay particular attention when those promises involve consumers’ choices regarding their personal information, and it will continue to look for and prosecute companies that have certified their compliance with the U.S./EU Safe Harbor (allowing personal information collected in the EU to be transferred to the US) yet fail to abide by the principles underlying the Safe Harbor;
- The FTC will continue to require opt-in consent for material changes to a company’s privacy practices. This is not a new development, but it is worth repeating that the FTC has not backed away from its assertion that, when a company changes its privacy practices in a material way, it must obtain consumers’ opt-in consent to those changes before applying them retroactively (i.e., to information already collected);
- The FTC has a robust new template for privacy orders. The FTC will continue to impose onerous injunctive relief on companies that do not abide by their own privacy promises, including the obligation — even where there has been no alleged data breach — to obtain an independent privacy audit every other year for 20 years; and
- The FTC will continue to require companies subject to a privacy order to implement and maintain a comprehensive “privacy by design” program and, in fact, may begin to expect this from all companies. In its 2010 draft privacy report, the FTC proposed that businesses make privacy and data security a routine consideration by adopting a “privacy by design” approach. The report has not yet been finalized, but that has not stopped the FTC from moving this proposal closer toward becoming a legal requirement by way of its enforcement actions against Google and Facebook (the FTC often expresses its “expectations” of industry through settlement agreements). We take the inclusion of a “privacy by design” requirement in both orders to mean that the FTC thinks that all businesses should adopt such procedures and that, eventually, the FTC is likely to view a failure to adopt such procedures as deceptive or unfair, in violation of the FTC Act.
The proposed order would settle charges that a variety of Facebook’s information practices were deceptive or unfair. Highlights of the complaint and proposed order are summarized below. The proposed order was open for public comment until December 30, 2011; that period having closed, the FTC will now determine whether to make its order final or to modify its requirements.
The FTC’s Complaint
The FTC’s complaint against Facebook contains eight counts, each of which underscores the theme repeated in the FTC’s privacy enforcement actions over the years: Businesses must comply with the privacy-related promises that they make to their customers. Here, the FTC alleged that Facebook failed to comply with promises made to its users in a variety of contexts over time. Specifically:
- Facebook’s privacy settings: Access to personal information. Facebook promised its users that, through the choices that they made in their Profile Privacy Pages, they could limit the categories of people who could access their personal information. According to the FTC, however, users’ choices were meaningless because Facebook permitted third-party applications used by a user’s Facebook friends to access the user’s personal information — including marital status, birthday, town, schools, jobs, photos, and videos — regardless of the privacy settings chosen by the user. The FTC has therefore alleged that the company’s representations were deceptive.
- Facebook’s privacy settings: Overriding user choice. Two counts in the FTC’s complaint address privacy policy changes that Facebook made in December 2009 — changes that Facebook claimed would not only give users more control over their personal information but also allow them to keep their existing privacy settings. According to the FTC, contrary to those promises, some information designated by users as private (such as a friend list) was actually made public under the new policy. The FTC has charged that this was deceptive because Facebook overrode users’ existing privacy choices without adequate disclosure. The FTC has further charged that the change constituted an unfair practice because Facebook retroactively applied material changes to personal information it had already collected from users without first obtaining their consent. In the FTC’s view, the practice met the standard for unfairness because it “has caused or has been likely to cause substantial injury to consumers, was not outweighed by countervailing benefits to consumers or to competition, and was not reasonably avoidable by consumers.”
- Scope of applications’ access to user information. The FTC has alleged that, for more than three years from the debut of applications on the Facebook platform, Facebook deceived its users about the scope of the profile information accessible to apps. Specifically, Facebook told users that an app would have access to only the information “that it requires to work.” The FTC has charged that this promise was deceptive because, in many instances, Facebook gave apps unrestricted access to user profile information, including information that such apps often did not need to operate.
- Advertisers’ receipt of user information. According to the FTC’s complaint, Facebook represented to users numerous times that it would not share their information with advertisers without the users’ consent. For instance, in its Statement of Rights and Responsibilities, Facebook promised: “We don’t share your information with advertisers unless you tell us to. . . Any assertion to the contrary is false. Period . . . we never provide the advertiser any names or other information about the people who are shown, or even who click on, the ads.” The FTC has alleged that this representation and others like it were deceptive because, from at least September 2008 until the end of May 2010, Facebook’s site was designed and operated such that the User ID of a user who clicked on an advertisement was, in many cases, shared with the advertiser.
- Facebook’s “Verified Apps” program. Facebook promised its users that, under its “Verified App” program, Facebook reviewed apps so as to “offer extra assurances to help users identify applications they can trust — applications that are secure, respectful and transparent, and have demonstrated commitment to compliance with [Facebook] policies.” According to the FTC, however, because Facebook did not take any steps to verify an app in any of these ways, its promise was deceptive.
- Photo and video deletion. Facebook told users that, when they deactivated or deleted their accounts, their photos and videos would be inaccessible to others. The FTC has alleged, however, that Facebook continued to make available the photos and videos of both deactivated and deleted accounts to third parties, and, accordingly, the company’s promises were deceptive.
- Compliance with the U.S.-EU Safe Harbor Framework. The FTC has alleged that Facebook misrepresented its compliance with its Safe Harbor certification because — as described above — it failed to give its users notice and choice before using their information for a purpose different from that for which it was collected, in violation of the “Notice” and “Choice” principles required of Safe Harbor certified companies. Because Facebook’s Safe Harbor certification represented to consumers that Facebook was compliant with the principles, the FTC has charged that its failure to comply with them was unfair or deceptive.
The Proposed Settlement Agreement
No Privacy or Security Misrepresentations. Like all FTC orders settling charges of deception, the proposed order would prohibit Facebook from future misrepresentations. Specifically, the order would enjoin Facebook from express and implied misrepresentations about how it maintains the privacy or security of users’ information, including: (1) the extent to which a user can control the privacy of his or her information; (2) the extent to which Facebook makes user information available to third parties; and (3) the extent to which Facebook makes information accessible to third parties after a user has terminated his or her account.
Opt-In Consent for New Disclosures. The proposed settlement agreement would require Facebook to obtain users’ opt-in consent before sharing their information with a third party in a way that materially exceeds the restrictions imposed by the users’ privacy settings. This obligation ratifies a requirement that the FTC first imposed on Gateway Learning in 2004 and has repeated numerous times since then: A company that makes a material change to its privacy practices must obtain affected individuals’ opt-in consent to that change before applying it retroactively (i.e., to information already collected). The proposed order specifies the way in which Facebook must obtain such consent. It must: (1) clearly and conspicuously disclose to the user, separate and apart from any privacy policy or similar document, (a) the categories of information that will be disclosed, (b) the identity or categories of the recipients, and (c) the fact that such sharing exceeds the restrictions imposed by the user’s privacy settings; and (2) obtain the user’s affirmative express consent to the disclosure.
Deletion of “Deleted” Content. The proposed settlement would require Facebook to implement procedures reasonably designed to ensure that the information of a user who has deleted his or her information or deleted or terminated his or her account is not accessible by any third party.
Privacy by Design. Like the FTC’s order against Google, the proposed Facebook order includes a “privacy by design” provision that would require Facebook to implement and maintain a comprehensive privacy program that (1) addresses the privacy risks related to the development and management of both new and existing products and services and (2) protects the privacy of user information. Specifically, Facebook would have to:
- designate one or more responsible employees;
- identify reasonably foreseeable material risks that could result in the unauthorized collection, use or disclosure of user information;
- design and implement reasonable controls and procedures to address identified risks and regularly test them;
- develop and implement reasonable steps to select service providers that will adequately protect user privacy and contractually require them to maintain appropriate protections; and
- evaluate and adjust the privacy program in light of the testing required by it, any material change to Facebook’s operations, or any other circumstances that may have a material impact on the program’s effectiveness.
In its 2010 draft privacy report, the FTC proposed that businesses make privacy and data security a routine consideration by adopting a privacy by design approach. Although it has not yet finalized the report, the FTC has moved this proposal closer to becoming a legal requirement through both its proposed order and its recent order against Google. The FTC often expresses its expectations of industry through a settlement agreement. For this reason, we take the inclusion of a privacy by design requirement in both orders to mean that the FTC thinks that all businesses should adopt such procedures and that, eventually, the FTC is likely to view a failure to adopt them as deceptive or unfair, in violation of the FTC Act.
Biannual Audits for 20 Years. The proposed settlement agreement would require Facebook to obtain an independent privacy audit every other year for 20 years. In light of the fact that this is the second time that the FTC has imposed such relief this year (after the Google matter), we expect that the 20-year audit requirement, along with the privacy by design provision, will become a staple of FTC privacy settlements.
Safe Harbor Provisions. The proposed settlement marks the second time that the FTC has held a company accountable for its alleged failure to comply with substantive privacy provisions of the US/EU Safe Harbor framework. (The first was in the Google action.) The charges serve as an important reminder that Safe Harbor certification constitutes a representation to consumers that, if false, is actionable. The proposed order would bar Facebook from misrepresenting its compliance with the Safe Harbor or any other privacy or security compliance program.
Key Takeaways
The FTC’s complaint and proposed order against Facebook are noteworthy because they reinforce the precedents that the FTC set in its action against Google, thereby sending the following unmistakable signals to the market:
- The FTC will continue to hold companies to their privacy promises and apply strong injunctive relief where it finds that the promises are false;
- The FTC continues to believe that a company must obtain affected consumers’ affirmative consent to new privacy practices applied retroactively;
- The FTC will continue to look for and prosecute companies’ failures to abide by the principles underlying their US/EU Safe Harbor certifications;
- The FTC has a new template for privacy settlement agreements — one that requires a privacy by design approach to business, as well as independent biannual audits for 20 years; and
- The FTC is beginning to consider privacy by design as a requirement under Section 5 of the FTC Act, which prohibits unfair and deceptive acts and practices.