Today’s companies compete not only for dollars but also for likes, followers, views, tweets, comments and shares. “Social currency,” as some researchers call it, is becoming increasingly important, and companies are investing heavily in building their social media fan bases. In some cases, this commitment of time, money and resources has resulted in staggering success. Coca-Cola, for example, has amassed over 96 million likes on its Facebook page, and LEGO’s YouTube videos have been played over 2 billion times.
With such impressive statistics, there is no question that a company’s social media presence and the associated pages and profiles can be highly valuable business assets, providing an important means for disseminating content and connecting with customers. But how much control does a company really have over these social media assets? What recourse would be available if a social media platform decided to delete a company’s page or migrate its fans to another page?
The answer may be: not very much. Over the past few years, courts have repeatedly sided with social media platforms in cases challenging the platforms’ ability to delete or suspend accounts and to remove or relocate user content.
Legal Show-Downs on Social Media Take-Downs
In a recent California case, Lewis v. YouTube, LLC, YouTube removed the account of plaintiff Jan Lewis based on allegations that she had artificially inflated view counts in violation of YouTube’s Terms of Service. YouTube eventually restored Lewis’s account and videos, but not the view counts or comments that her videos had generated prior to the account’s suspension.
Lewis sued YouTube for breach of contract, alleging that YouTube had deprived her of her reasonable expectations under the Terms of Service that her channel would be maintained and would continue to reflect the same number of views and comments. She sought damages as well as specific performance to compel YouTube to restore her account to its original condition.
The court first held that Lewis could not show damages because the YouTube Terms of Service contained a limitation of liability provision that disclaimed liability for any omissions relating to content. The court also held that Lewis was not entitled to specific performance because nothing in the Terms of Service required YouTube to maintain particular content or to display view counts or comments. Accordingly, the court affirmed dismissal of Lewis’s complaint.
In a similar case, Darnaa LLC v. Google, Inc., Darnaa, a singer, posted a music video on YouTube. Again based on allegations of view count inflation, YouTube removed the video and relocated it to a different URL, stating on the original page that the video had been removed for violating its Terms of Service. Darnaa sued for breach of the covenant of good faith and fair dealing, interference with prospective economic advantage and defamation. In an email submitted with the complaint, Darnaa’s agent explained that she had launched several large campaigns (each costing $250,000 to $300,000) to promote the video and that the original link was already embedded in thousands of websites and blogs. Darnaa sought damages as well as an injunction to prevent YouTube from removing the video or changing its URL.
The court dismissed all of Darnaa’s claims because YouTube’s Terms of Service require lawsuits to be filed within one year and Darnaa had filed her case too late. In its discussion, however, the court made several interesting points. In considering whether YouTube’s Terms of Service were unconscionable, the court held that, although the terms are by nature a “contract of adhesion,” the level of procedural unconscionability was slight, since the plaintiff could have publicized her videos on a different website. Further, in ruling that the terms were not substantively unconscionable, the court pointed out that “[b]ecause YouTube offers its hosting services free of charge, it is reasonable for YouTube to retain broad discretion over [its] services.”
Although the court ultimately dismissed Darnaa’s claims based on the failure to timely file the suit, the decision was not a complete victory for YouTube. The court granted leave to amend to give Darnaa the opportunity to plead facts showing that she was entitled to equitable tolling of the contractual limitations period. Therefore, the court went on to consider whether Darnaa’s allegations were sufficient to state a claim. Among other things, the court held that YouTube’s Terms of Service were ambiguous regarding the platform’s rights to remove and relocate user videos in its sole discretion. Thus, the court further held that if Darnaa were able to amend the complaint to avoid the consequences of the failure to timely file, then the complaint would be sufficient to state a claim for breach of the contractual covenant of good faith and fair dealing.
By contrast, the court found no such ambiguity in Song Fi v. Google Inc., a case with facts similar to those in Darnaa. In Song Fi, the plaintiff asserted claims for, among other things, breach of contract and breach of the implied covenant of good faith and fair dealing. YouTube raised a defense under Section 230(c)(2)(A) of the Communications Decency Act (CDA), which states that no provider of an interactive computer service is liable for removing content that it considers to be obscene, violent, harassing or “otherwise objectionable.”
The Song Fi court, interpreting this provision narrowly, found that although videos with inflated view counts could be a problem for YouTube, they are not “otherwise objectionable” within the meaning of Section 230(c)(2)(A), and thus YouTube did not have immunity under that provision. Specifically, the court concluded that, in light of the CDA’s history and purpose, the phrase “otherwise objectionable” relates to “potentially offensive material, not simply any materials undesirable to a content provider or user.” Further, the requirement that the service provider subjectively find the blocked or screened material objectionable “does not mean anything or everything YouTube finds subjectively objectionable is within the scope of Section 230(c).” Accordingly, the court held that videos with inflated view counts fell outside the statutory safe harbor of Section 230(c)(2).
Despite finding Section 230(c)(2) inapplicable, the court ultimately dismissed all of Song Fi’s claims. Notably, the court dismissed the contract-based claims with prejudice, holding that, although YouTube’s Terms of Service were “inartfully drafted,” they “unambiguously” reserved to YouTube the right to remove content in its sole discretion and to discontinue any aspect of its service without liability. Therefore, the court held, the Terms of Service “unambiguously foreclose[d]” Song Fi’s claims for breach of contract and breach of the implied covenant of good faith and fair dealing.
Facebook had more luck than did Google in asserting a CDA Section 230 defense in Sikhs For Justice “SFJ”, Inc. v. Facebook, Inc., a case brought by a human rights group advocating for Sikh independence in the Indian state of Punjab. Sikhs for Justice (SFJ) alleged that Facebook had blocked its page in India at the behest of the Indian government. SFJ sued in the Northern District of California, asserting several causes of action, including race discrimination, and sought damages and injunctive relief.
The Sikhs for Justice court ruled in favor of Facebook, citing CDA Section 230(c)(1), which states that “no provider of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Based on this statutory language, Section 230(c)(1) has been interpreted to provide broad immunity for website operators against liability arising from user-generated content. In dismissing the suit, the Sikhs for Justice court explained that the content at issue was provided by SFJ, not by Facebook, and that Facebook’s refusal to publish the SFJ page in India was “clearly publisher conduct” that is immunized by Section 230(c)(1).
Notably, the court did not mention the Section 230(c)(2) safe harbor for blocking user content, which YouTube had asserted in Song Fi, as discussed above. According to some commentators, the Sikhs for Justice court’s failure to discuss Section 230(c)(2) “highlights its weakness as a safe harbor.”
In another case against Facebook, Young v. Facebook, Inc., the plaintiff, Karen Beth Young, found herself suddenly banned from Facebook after sending friend requests to strangers. She sued for breach of the implied covenant of good faith and fair dealing, among several other claims. In contrast to some of the cases discussed above, the Young court found that “it is at least conceivable that arbitrary or bad faith termination of user accounts … with no explanation at all could implicate the implied covenant of good faith and fair dealing,” particularly since Facebook had provided in its Statement of Rights and Responsibilities that users’ accounts should not be terminated for reasons other than those described in the Statement. Nonetheless, the court dismissed Young’s suit because her complaint did not sufficiently allege that the account termination was undertaken in bad faith or violated Facebook’s contractual obligations.
The cases above illustrate how difficult it is for social media users to object to deletion or suspension of accounts or to removal or relocation of content based on a platform’s contractual obligations under the applicable terms of service. Users have met with similar obstacles in asserting a property right in social media content.
For example, Mattocks v. Black Entertainment Television LLC (which we have discussed previously) involved a dispute between BET and Stacey Mattocks, whom BET had hired to help manage the unofficial Facebook fan page for one of its shows. When Mattocks restricted BET’s access to the fan page, BET asked Facebook to “migrate” the fans to another official page that BET had created and Facebook granted the request. Mattocks sued BET for conversion of her business interest in the Facebook fan page. The court, holding that Mattocks failed to establish that she owned a property interest in the page’s likes, granted BET’s motion for summary judgment. “If anyone can be deemed to own the ‘likes’ on a Facebook page,” the court stated, “it is the individual users responsible for them.” While the Mattocks case did not directly target the social media platform itself, it does demonstrate how difficult it can be for a plaintiff to challenge social media platforms’ decisions to remove or relocate content based on purported ownership of that content.
Safeguarding Your Social Media Currency
Ultimately, the cases discussed above show that social media platforms have significant control over what is (or isn’t) published on their websites, regardless of the amount of time and effort that users have spent building up their individual pages and profiles. With all of this in mind, what can individuals and companies do to protect their social media currency? How can you help ensure that your hard-earned fans, likes, comments and views do not suddenly disappear?
A good first step is to read the applicable terms of service carefully to understand the platform’s rules and the reasons for which a platform may delete or suspend accounts or remove or relocate content. Make sure to comply with the platform’s rules, including those regarding contests, the collection and use of user information, and content guidelines. Users should err on the side of caution and avoid posting anything that could be deemed offensive or obscene or that might infringe upon other parties’ intellectual property rights. And it goes without saying that users should avoid fraudulent practices, such as artificially driving up view counts or posting fake comments.
Most of all, businesses and individuals should keep in mind that social media platforms have broad discretion when it comes to decisions about what to publish and where. As such, consider spreading your company’s social media marketing efforts across a number of different platforms to minimize the impact of sudden content removals or relocations on any one platform. At the end of the day, every social media account—even those with millions of likes or views—is controlled not by the user that created the account but by the platform that hosts it.