As we have frequently noted on Socially Aware, Section 230 of the Communications Decency Act protects social media sites and other online platforms from liability for user-generated content. Sometimes referred to as “the law that gave us the modern Internet,” Section 230 has provided robust immunity for website operators since it was enacted in 1996. As we have also written previously, however, the historically broad Section 230 immunity has come under pressure in recent years, with both courts and legislatures chipping away at this important safe harbor.
Now, some lawmakers are proposing legislation to narrow the protections that Section 230 affords website owners. They assert that such changes are necessary to protect Internet users from dangers such as sex trafficking and the doctored videos known as “deep fakes.”
The House Intelligence Committee Hearing
Recently, a low-tech fraudulent video that made House Speaker Nancy Pelosi’s speech appear slurred was widely shared on social media. The video inspired Hany Farid, a computer-science professor and digital-forensics expert at the University of California, Berkeley, to tell The Washington Post that “this type of low-tech fake shows that there is a larger threat of misinformation campaigns—too many of us are willing to believe the worst in people that we disagree with.”
After the fraudulent Pelosi video went viral, the House Intelligence Committee held a hearing to examine the risks that this type of technology poses to national and election security.
“At the outset of the hearing,” The Verge reported, “[House Intelligence Chairman Adam] Schiff came out challenging the ‘immunity’ given to platforms under Section 230 of the Communications Decency Act, asking panelists if Congress should make changes to the law that doesn’t currently hold social media companies liable for the content on their platforms.”
Laws Intended to Impede Sex Trafficking Online
Rep. Schiff’s suggestion that Section 230 should be amended isn’t the first of its kind. In April 2018, Congress enacted a package of laws referred to as SESTA-FOSTA (the “Stop Enabling Sex Traffickers Act” in the Senate and the “Allow States and Victims to Fight Online Sex Trafficking Act” in the House of Representatives), creating an exception to Section 230 immunity for the facilitation of sex trafficking.
That exception, in the words of the online publication Vox, “means website publishers would be responsible if third parties are found to be posting ads for prostitution—including consensual sex work—on their platforms. The goal of this is supposed to be that policing online prostitution rings gets easier. What FOSTA-SESTA has actually done, however, is create confusion and immediate repercussions among a range of internet sites as they grapple with the ruling’s sweeping language.”
An Answer to the Statutes’ Imperfect Attempts to Correct Internet Evils
In response to Rep. Schiff’s comments about Section 230’s protections at the House Intelligence Committee hearing, University of Maryland Carey School of Law professor Danielle Keats Citron suggested, “Federal immunity [for websites] should be amended to condition the immunity on reasonable moderation practices rather than the free pass that exists today.”
Georgetown University Law Center technology scholar Joshua Geltzer agrees that websites should enact practices to reasonably police their own content. In an article for Slate, he writes, “In reality, Section 230 empowers tech companies to experiment with new ways of imposing and enforcing norms on new sites of discourse, such as deleting extremist posts or suspending front accounts generated by foreign powers seeking to interfere with our elections. . . . The real question lawmakers now face isn’t whether companies are somehow forfeiting their protections by trying to tackle today’s online challenges. It’s whether companies are doing enough to deserve the protections that Section 230 bestows.”