Rethinking Section 230: Fostering Transparency, Accountability, and User Protection Online

https://www.pexels.com/photo/photo-of-hand-holding-a-black-smartphone-2818118/

Section 230 of the Communications Decency Act, often called the “Magna Carta of cyberspace,”[1] grants powerful legal protection to online platforms by providing that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker” of third-party content.[2] Though the statute never uses the term “immunity,” courts have interpreted Section 230 as broadly shielding internet platforms—including social media platforms, search engines, and websites—from liability for nearly all user-generated content.[3] This interpretation has effectively preempted state laws and private lawsuits that would otherwise impose liability, with the aim of preventing platforms from overly censoring user content out of fear of legal repercussions.[4]

When Section 230 was enacted in 1996, lawmakers largely underestimated the magnitude of the costs associated with social media platforms and the scope of harm they could cause.[5] The broad immunity afforded to these platforms under Section 230 has resulted in far-reaching societal damage.[6] Shielded by this legal provision, social media platforms have become incubators for various illegal and harmful activities, often with significant consequences.[7] The unchecked ability of these platforms to facilitate the coordination of violent events—such as the January 6, 2021 insurrection at the U.S. Capitol—and their role in terrorist recruitment efforts illustrate the severe risks posed by this regulatory gap.[8] Moreover, these platforms have been leveraged to perpetuate the sexual exploitation of children and the illicit trafficking of drugs, firearms, and other contraband, amplifying harms that Section 230 was never intended to shield.[9]

As the harms of social media platforms face heightened scrutiny, the various attempts to regulate the internet across different countries are also coming under increased examination.[10] Although the internet is largely free from government intervention worldwide, a handful of countries do intervene to regulate it.[11] Among the countries that regulate social media platforms, Saudi Arabia is noteworthy because its extensive censorship regime still allows platforms to operate “relatively freely.”[12]

Saudi Arabia has responded to the challenges posed by social media by implementing various legislative measures aimed at ensuring a safer online environment.[13] The Anti-Cybercrime Law, enacted in 2007, criminalizes offenses such as hacking, unauthorized access to computer systems, and the dissemination of harmful content, seeking to deter cybercrime and protect users.[14] The Saudi Communications and Information Technology Commission (“CITC”) plays a critical role in monitoring and filtering online content deemed offensive or contrary to societal norms.[15] Additionally, the government has launched campaigns to raise awareness about misinformation and promote digital literacy among citizens.[16] The introduction of the Personal Data Protection Law in 2021 further enhances the protection of individuals’ personal information, establishing clear rights and obligations for organizations regarding data handling.[17]

In conjunction with these measures, the Saudi government has established regulations to address intellectual property rights and prevent copyright infringement on social media platforms.[18] The legal framework also includes provisions targeting defamation and online harassment, emphasizing the importance of responsible online behavior.[19] While these regulations impose strict guidelines, the government’s stated goal is to balance the protection of its citizens with the preservation of freedom of expression.[20] As Saudi Arabia continues to modernize and adapt to the digital age, addressing the challenges associated with social media remains vital.[21] Through comprehensive regulation and a focus on promoting responsible online engagement, the Kingdom of Saudi Arabia aims to create a safe and dynamic digital environment that aligns with its cultural values while encouraging innovation and entrepreneurship. Yet despite this robust legislation targeting social media, Saudi Arabia has been accused of manipulating online discourse.[22]

Given the accusations Saudi Arabia has faced, any legislative approach in the United States should prioritize accountability and transparency. To achieve this, Congress should amend Section 230 to address the “moderator’s dilemma”[23] by explicitly overturning Stratton Oakmont, Inc. v. Prodigy Servs. Co.[24] and clarifying that removing content under Section 230(c)(2) or a platform’s terms of service does not render the platform liable for all content it hosts.[25] This would allow platforms to moderate responsibly without incurring broad liability, thereby fostering targeted moderation while preserving immunity for user-generated content they do not remove.[26] Additionally, with respect to Section 230(c)(2)(A), Congress should define “good faith” for moderation practices, limiting immunity to actions that align with a platform’s terms of service and are backed by reasonable justification.[27] Clear standards would increase platform accountability and transparency, encouraging fair moderation and user trust.[28] Replacing the vague “otherwise objectionable”[29] language in Section 230(c)(2)(A) with terms such as “unlawful” or “promotes terrorism” would further ensure that moderation aligns with Section 230’s purpose, reducing arbitrary content removal.[30] Together, these changes would balance platforms’ moderation rights with transparent, accountable practices, supporting user engagement and free expression.

Amending Section 230 to support responsible moderation does not, by itself, encourage platforms to address illicit content. Congress should therefore consider targeted carve-outs, such as a “Bad Samaritan” provision excluding immunity for deliberate harm,[31] exceptions for child abuse, terrorism, and cyber-stalking to enable civil remedies,[32] and an actual-knowledge carve-out removing immunity when platforms knowingly host unlawful content.[33] Together, these adjustments would promote responsible moderation and a safer online environment without discouraging user-generated content.

In conclusion, while Section 230 has fostered a vibrant online ecosystem, its broad immunity has inadvertently enabled significant societal harms, including the spread of misinformation and illegal activity. Lawmakers in 1996 could not foresee the scale of these challenges, which have intensified with the rise of social media. As regulatory approaches in countries like Saudi Arabia illustrate, a balanced framework is essential to protect public safety, uphold user rights, and ensure fair platform accountability in digital spaces. To address the “moderator’s dilemma,” Congress must amend Section 230 to clarify platform responsibilities while preserving the benefits of user-generated content. This means overturning Stratton Oakmont, defining “good faith” moderation, and implementing targeted carve-outs for harmful content. Such changes would encourage responsible platform practices, enhance accountability, and foster a safer digital environment built on transparency and trust among users and operators.


[1] Kimberly A. Fry, The Fate of Section 230, 22 Colo. Tech. L.J. 361, 362 (2024).

[2] 47 U.S.C. § 230(c)(1).

[3] Fry, supra note 1, at 362.

[4] Id.

[5] Michael D. Smith & Marshall W. Van Alstyne, It’s Time to Update Section 230, Harv. Bus. Rev. (Aug. 12, 2021), https://hbr.org/2021/08/its-time-to-update-section-230.

[6] See Robert D. Atkinson et al., A Policymaker’s Guide to the “Techlash”—What It Is and Why It’s a Threat to Growth and Progress, Info. Tech. & Innovation Found. (Oct. 28, 2019), https://itif.org/publications/2019/10/28/policymakers-guide-techlash/ (discussing various societal and economic harms).

[7] See id.

[8] Smith & Van Alstyne, supra note 5.

[9] Id.

[10] See Anshu Siripurapu & Will Merrow, Social Media and Online Speech: How Should Countries Regulate Tech Giants?, Council on Foreign Rels. (Feb. 9, 2021, 11:30 AM), https://www.cfr.org/in-brief/social-media-and-online-speech-how-should-countries-regulate-tech-giants.

[11] See id.

[12] Id.  

[13] Id.

[14] The Impact of Social Media on Saudi Arabia: A Digital Transformation, Hammad & Al-Medhar L. Firm (June 21, 2023), https://hmco.com.sa/the-impact-of-social-media-on-saudi-arabia-a-digital-transformation/.

[15] Id.

[16] Id.  

[17] Id.

[18] Id.; see Intellectual Property and Technology, DLA Piper, https://www.dlapiperintelligence.com/goingglobal/intellectual-property/index.html?t=copyrights&c=SA (last modified Apr. 19, 2023) (discussing Saudi Arabia’s intellectual property legal framework).

[19] Id.; Council of Ministers Resolution, Royal Decree No. M/96, 16 Ramadan 1439 A.H. (May 31, 2018), https://laws.boe.gov.sa/BoeLaws/Laws/LawDetails/f9de1b7f-7526-4c44-b9f3-a9f8015cf5b6/2.

[20] Id.

[21] Id.

[22] Siripurapu & Merrow, supra note 10.

[23] The moderator’s dilemma is the challenge online platforms face in balancing content moderation with free speech. Overly strict moderation risks accusations of censorship, while too little oversight can allow harmful content to spread. See infra note 24.

[24] See Stratton Oakmont, Inc. v. Prodigy Servs. Co., No. 31063/94, 1995 N.Y. Misc. LEXIS 712 (N.Y. Sup. Ct. Dec. 11, 1995) (holding that platforms that attempted to moderate or remove any objectionable content could be held liable for all user-generated content, giving rise to the “moderator’s dilemma”).

[25] Department of Justice’s Review of Section 230 of the Communications Decency Act of 1996, U.S. Dep’t of Just., https://www.justice.gov/ag/department-justice-s-review-section-230-communications-decency-act-1996 (last visited Oct. 25, 2024).

[26] Id.

[27] Id.

[28] Id.

[29] 47 U.S.C. § 230(c)(2)(A).

[30] Department of Justice’s Review of Section 230 of the Communications Decency Act of 1996, supra note 25.

[31] Id.

[32] Id.

[33] Id.