We have been monitoring a trend of cases narrowing the immunity provided to website operators under Section 230 of the Communications Decency Act (CDA). A recent decision by a state court in Georgia, however, demonstrates that Section 230 continues to be applied expansively in at least some cases.
The case, Maynard v. McGee, arose from an automobile collision in Clayton County, Georgia. Christal McGee, the defendant, had allegedly been using Snapchat’s “speed filter” feature, which tracks a car’s speed in real time and superimposes the speed on a mobile phone’s camera view. According to the plaintiffs, one of whom had been injured in the collision, McGee was using the speed filter when the accident occurred, with the intention of posting a video on Snapchat showing how fast she was driving. The plaintiffs sued McGee and Snapchat for negligence, and Snapchat moved to dismiss based on the immunity provided by Section 230.
The plaintiffs alleged that Snapchat was negligent because it knew its users would use the speed filter “in a manner that might distract them from obeying traffic or safety laws” and that “users might put themselves or others in harm’s way in order to capture a Snap.” To demonstrate that Snapchat had knowledge, the plaintiffs pointed to a previous automobile collision that also involved the use of Snapchat’s speed filter. The plaintiffs claimed that “[d]espite Snapchat’s actual knowledge of the danger from using its product’s speed filter while driving at excessive speeds, Snapchat did not remove or restrict access to the speed filter.”
Section 230(c)(1) of the CDA provides that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The plaintiffs argued that Section 230 immunity did not apply to Snapchat in this case, however, because Snapchat’s negligence was based on its own content—i.e., the speed filter—rather than on content posted by McGee (note that McGee did not actually post a video to Snapchat before the collision occurred). Specifically, the plaintiffs asserted that Snapchat should have removed the speed filter after it learned of the previous accidents that the feature allegedly caused.
The court was not persuaded, however, and noted that “decisions about the structure and operation of a website—such as decisions about features that are part and parcel of the site’s overall design—reflect choices about what content can appear on the website and in what form and thus fall within the purview of traditional publisher functions” (internal quotation marks omitted).
The court also found that Snapchat’s knowledge of prior accidents allegedly caused by the speed filter was knowledge that Snapchat “would have obtained because it created the ‘speed filter’ and was aware of what was published on its application,” which further convinced the court that plaintiffs were seeking to impose a duty on Snapchat that “derives from Snapchat’s status or conduct as a publisher.” Finally, the court determined that a user’s choice to use the speed filter “is not Snapchat’s speech, but is ultimately the user’s speech using the voluntary options [of] Snapchat’s platform.”
For these reasons, the court concluded that the plaintiffs were seeking to hold Snapchat liable as a publisher, and that such liability was precluded by Section 230. Accordingly, the court dismissed the plaintiffs’ claims. But it’s worth noting that, while the court acknowledged that Section 230 applies only to third-party content and does not immunize a publisher’s own content, the actual analysis in the decision seems to treat any “publisher” activity as automatically immunized, without regard to the source of the content at issue.
For example, while it is certainly true, as the court noted, that “decisions relating to the monitoring, screening, and deletion of content” are traditionally publisher activities, that fact is relevant to Section 230 only when the content at issue is provided by a user or other third party. If the publisher is making choices about its own content—as the plaintiffs alleged Snapchat did by continuing to provide the speed filter despite knowing it had caused accidents in the past—then the mere fact that content-related decisions are traditional publisher activities does not necessarily mean that Section 230 applies.
In any event, while the court’s analysis may raise a few questions, the result is generally in line with prior cases applying Section 230 immunity to offline injuries caused by third-party defendants, as other commentators have noted. Ultimately, it seems that the court saw plaintiffs’ injuries as flowing from McGee’s choice to use the speed filter and determined that Section 230 precluded plaintiffs from holding Snapchat liable for her decision to use a feature that Snapchat, in its role as a publisher, made available. For fans of Section 230, Maynard v. McGee is a welcome indication that the statute’s “robust immunity” lives on, at least sometimes.