Section 230 (Social Media) reform

Summary: Section 230 should be reformed so that a social media company must operate as either a platform or a publisher. Social media must be protected from liability for user content, or these services will cease to exist, which would be detrimental to our freedom of speech. But they must be held accountable for what they promote or throttle, whether manually or algorithmically, and for any attempt to enforce hidden policies on their users. They must also stop data-mining their users. For this to happen, Section 230 needs to be updated:

Any platform that polices its user content in a manner inconsistent with, or outside of, its clearly stated user policies should lose Section 230 protections. This includes algorithmic policing of content.

Any temporary or permanent ban, any throttling (shadow ban) or removal of user content, and any refusal to post user content that is pre-reviewed by the platform should be disclosed to the user. The platform should be required by law to state clearly why the action was taken and which user policy or policies were violated; failure to comply should remove its protections under Section 230. Users should be permitted to revise the content or to appeal, and should have a legal path to sue if they believe the platform has violated these requirements.
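
As a rough illustration of what such a required disclosure could contain, here is a minimal sketch of a moderation notice as a data record. The field names and the Python form are purely hypothetical, invented for this sketch; nothing here is prescribed by the proposal or by current law.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class ModerationAction(Enum):
    TEMPORARY_BAN = "temporary_ban"
    PERMANENT_BAN = "permanent_ban"
    THROTTLE = "throttle"              # a disclosed "shadow ban"
    CONTENT_REMOVAL = "content_removal"
    PRE_REVIEW_REFUSAL = "pre_review_refusal"


@dataclass
class ModerationNotice:
    """Disclosure sent to the user for every moderation action taken."""
    action: ModerationAction
    reason: str                        # plain-language explanation of why
    policies_violated: list[str]       # the specific policy sections cited
    can_revise: bool                   # whether the user may edit and resubmit
    appeal_url: str                    # entry point to the appeal process
    issued_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )
```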

If a platform algorithmically influences which kinds of content are promoted on its site and/or which content is throttled, it should be held liable for that content. The algorithm inherently becomes part of the creation process by promoting, rewarding, or discouraging the creation of certain kinds of content. That makes the platform partly responsible for the content created, and it should therefore lose its protections under Section 230.

Better still, all algorithms for content suggestion, promotion, throttling, approval, or removal should be open-sourced in order for Section 230 protections to apply. Users should always have the option to bypass algorithmic suggestion in favor of a chronological or similarly non-algorithmic timeline, and algorithmic content suggestion/promotion should be opt-in only.
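
A minimal sketch of what that opt-in default could look like; `opted_into_ranking`, `ranker`, and the other names are hypothetical stand-ins, assuming the ranker itself is the open-sourced component described above.

```python
from operator import attrgetter


def build_feed(user, posts, ranker):
    """Return the user's timeline; algorithmic ranking is opt-in only.

    `user.opted_into_ranking` is a hypothetical flag the user must set
    explicitly. The default is a reverse-chronological timeline.
    """
    if user.opted_into_ranking:
        return ranker.rank(posts)  # the open-sourced suggestion algorithm
    return sorted(posts, key=attrgetter("created_at"), reverse=True)
```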

Every content algorithm that has ever gone live on the platform should be cataloged and retained by the platform for a defined period, available for review if legally subpoenaed.
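
One way to picture that cataloging requirement, sketched under the assumption that each deployed algorithm version is archived under a content hash; the registry structure, field names, and retention mechanics here are invented for illustration, not part of the proposal.

```python
import hashlib
from datetime import datetime, timezone


def catalog_algorithm(registry: dict, source_code: str, deployed_by: str) -> str:
    """Archive a ranking/moderation algorithm version as it goes live.

    Keys the registry by a SHA-256 hash of the source so the exact
    version running at any given time can be produced under subpoena.
    The storage backend and retention period are deliberately left open.
    """
    digest = hashlib.sha256(source_code.encode("utf-8")).hexdigest()
    registry[digest] = {
        "deployed_at": datetime.now(timezone.utc).isoformat(),
        "deployed_by": deployed_by,
        "source_code": source_code,
    }
    return digest
```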

Platforms should be protected from being sued by advertisers based on what content their advertisements appear next to. Platforms should be required to apply advertising algorithms on an opt-in basis only. However, platforms should always take user content into consideration when placing advertisements intended for children, if and where appropriate.

Any algorithm that promotes or suggests content or advertisements to a user should be based only on information voluntarily and actively given to the platform by the user, in the form of opt-in user preferences, and never on data collected automatically. This includes data gathered by parsing users' own content created for the platform. If content or advertisements are suggested or shown to users algorithmically using any automatically collected data, or the platform knowingly works with an advertiser doing the same, the platform should lose its protections.
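
A minimal sketch of the gating this paragraph describes, assuming hypothetical `opted_into_suggestions` and `declared_interests` attributes on a user object:

```python
def allowed_signals(user):
    """Return only the targeting data this proposal would permit.

    Only preferences the user actively entered may feed suggestion
    algorithms; anything collected automatically (browsing behavior,
    parsed post text, location, and so on) is excluded entirely.
    """
    if not user.opted_into_suggestions:
        return None  # user has not opted in: no algorithmic targeting at all
    return {"declared_interests": list(user.declared_interests)}
```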

All of the above should include carve-outs for platforms that are not operated for profit and have a sufficiently small number of users, so as not to overburden smaller or hobby sites.


Please see my draft revision, which does much of what is discussed here:
Section 230 Revision - Draft changes provided - Liberty - Policies for the People

Why Section 230 Should Be Repealed

The internet has undergone a radical transformation since the mid-1990s. When Section 230 of the Communications Decency Act was passed in 1996 under the Clinton Administration, it was meant to protect nascent tech companies from lawsuits over user-generated content on their platforms. The rationale was that these companies were too small and vulnerable to survive without legal protection. Today, however, social media companies like X (formerly Twitter), Meta (Facebook), and Google are some of the largest and most influential corporations in the world. Given their current power and reach, the justifications for Section 230 no longer align with reality. It is time to repeal this outdated legislation, as its continued existence creates harmful consequences for society.

1. Indemnity Should Not Be Granted to Any Industry

The most compelling reason to repeal Section 230 is that no company—whether in social media, pharmaceuticals, or automotive manufacturing—should be granted blanket immunity from legal liability. Imagine if car manufacturers had the same legal protections that social media companies enjoy: they would have no incentive to improve safety features or issue recalls because they wouldn’t be held accountable for the consequences. This lack of accountability would be unacceptable for any other industry, and it should be no different for social media companies. By granting these tech giants indemnity, Section 230 enables them to operate with impunity, absolving them of the responsibility to address the real-world harms that occur on their platforms.

2. The Original Justifications No Longer Apply

When Section 230 was enacted, online platforms were fledgling businesses that needed legal protection to grow. The internet was a new frontier, and lawmakers believed that these protections were necessary to foster innovation. Fast forward to today: social media companies have grown into some of the most powerful corporations in the world, with billions in revenue and unparalleled influence over public discourse. They no longer require the protections that Section 230 offers. In fact, these protections have allowed them to become monopolistic entities that prioritize profit over the public good. Continuing to shield them from liability only entrenches their unchecked power.

3. Government Regulation is the Inevitable Alternative

One of the main arguments in favor of maintaining Section 230 is that it prevents government overreach in regulating online content. However, by failing to act responsibly, social media companies are inviting regulation. Governments in the European Union, Australia, and other regions are already stepping in to impose laws aimed at curbing misinformation and harmful content. For instance, the EU’s Digital Services Act requires platforms to take greater responsibility for the content they host. Similarly, Australia’s attempts to regulate social media to combat disinformation indicate a trend toward government intervention.

If social media companies continue to evade responsibility for the content they allow on their platforms, governments will have no choice but to impose stricter regulations. This paradoxically creates the very situation that Section 230 proponents fear: increased government control over online speech. If social media companies are unwilling to self-regulate, the government will do it for them. Repealing Section 230 would force these platforms to take responsibility and could prevent the need for heavy-handed governmental regulation.

4. The Rise of Lawsuits and the Growing Pushback

There is mounting evidence that social media platforms can cause real harm, especially to vulnerable populations. State attorneys general in the United States are already pursuing wrongful death lawsuits related to cyberbullying and the mental health harms linked to these platforms. If these companies are distributing content and feeding news to users, they should be held to the same legal standards as traditional media outlets. By repealing Section 230, tech companies would be forced to take responsibility for the harmful effects their platforms can have on users, especially teenagers and children.

5. Social Media Companies are Now the Gatekeepers of News

One of the unintended consequences of Section 230 is that social media platforms have become the de facto news providers for millions of people. Unlike traditional media, however, these companies do not have to adhere to the same journalistic standards or face the same accountability. This discrepancy creates a dangerous situation where social media platforms can disseminate misinformation, propaganda, and harmful content without consequence.

In the current era, where foreign governments actively use social media to spread disinformation, it is imperative that these platforms be held accountable for the content they allow. By continuing to shield these companies under Section 230, we are essentially allowing them to profit from chaos and division, while abdicating any responsibility for the consequences.

6. If They Curate, They Must Take Responsibility

Social media platforms argue that they are neutral intermediaries, merely hosting content without taking an editorial stance. However, their algorithms curate what users see, meaning they do not merely host content passively; they actively shape what information users consume. By controlling the flow of information, these platforms wield significant influence over public opinion and political discourse. If they are going to act as curators of content, they must be bound by the same legal and ethical standards that apply to traditional media companies.

Conclusion: Time for Change

The repeal of Section 230 is not about stifling innovation or silencing free speech; it is about holding powerful corporations accountable for their actions. The social media landscape has evolved dramatically since 1996, and the legal framework governing it must evolve as well. The indemnity granted by Section 230 is no longer justified, given the size, influence, and resources of modern tech giants. By repealing this outdated law, we can encourage social media companies to act more responsibly and align their practices with the standards expected of other industries.

In doing so, we can ensure that social media platforms contribute positively to society rather than profiting from division, misinformation, and harm. The time has come to recognize that these platforms are not just tech companies—they are the gatekeepers of information in the digital age. It is only fair that they be held accountable for the content they distribute.