All the Bills on Section 230
“The 26 words that created the internet.” That’s how some people refer to Section 230 of the Communications Decency Act, a part of the law that says tech companies aren’t legally responsible for content posted on their sites. It’s often said that the internet as we know it today would never have existed without this all-important provision.
But not everyone thinks Section 230 is a good thing. You might believe that it lets social media companies censor conservative content, or that it lets them turn a blind eye to online hate speech. Members of Congress are wrestling with those questions too and have introduced a whopping 18 different bills to change the way tech companies operate. To help you understand them, we’ve summarized and provided links for each.
Justice Against Malicious Algorithms Act of 2021 (H.R. 5596)
What It Does: Under this bill, a tech company could be legally liable if it knowingly or recklessly used a personalized algorithm to recommend harmful content to users.
But Wait, There’s More!
- This bill targets content that contributes to “physical or severe emotional injury to any person.”
- It wouldn’t apply to small platforms or to search rankings.
A bill to repeal Section 230 of the Communications Act of 1934 (S. 2972)
What It Does: The bill completely repeals Section 230, making tech companies liable for content posted on their platforms.
Federal Big Tech Tort Act (H.R. 5449)
What It Does: This bill doesn’t actually change Section 230. It does, however, make social media companies liable if they cause “bodily injury to children or harm the mental health of children.”
But Wait, There’s More!
- This would apply to kids under age 16.
- Companies would have a defense if they could prove that they took reasonable steps to ensure that their users were 16 or older.
Accountability for Online Firearms Marketplaces Act of 2021 (S. 2725)
What It Does: Today, online marketplaces for guns are protected by Section 230. If this bill passes, they wouldn’t be. The sponsors of this bill say they want to hold online gun marketplaces accountable for illegal gun sales.
But Wait, There’s More!
- The bill would also apply if a website includes instructions on how to 3D print a gun.
Health Misinformation Act of 2021 (S. 2448)
What It Does: Under this bill, a tech company could be liable if its algorithm promotes health misinformation.
But Wait, There’s More!
- Some algorithms would still be OK under this bill, like one that orders content chronologically.
- The bill would apply only during a declared public health emergency.
- The Secretary of Health and Human Services would define what constitutes health misinformation.
Preserving Political Speech Online Act (S. 2338)
What It Does: Today, tech companies are protected by Section 230 when they remove content “in good faith” because they consider it obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable. Under this bill, “good faith” removal would only include obscene, illegal, or excessively violent content.
But Wait, There’s More!
- The bill also defines what it calls “bad faith” moderation: blocking content on the grounds of race, religion, sex, national origin, or political affiliation or speech.
- Tech platforms would be exempt from the “bad faith” clause if they are “dedicated to a specific set of issues, policies, beliefs, or viewpoints.”
Disincentivizing Internet Service Censorship of Online Users and Restrictions on Speech and Expression (DISCOURSE) Act (S. 2228)
What It Does: This bill would protect platforms from liability when they moderate content, but only certain kinds of content: obscene, lewd, lascivious, filthy, excessively violent, harassing, or unlawful, or content that promotes terrorism, violent extremism, or self-harm. Platforms would have to base their decisions to moderate content on an “objectively reasonable belief.”
But Wait, There’s More!
- The bill would also make it harder for platforms to get Section 230 protections when they moderate content in a way that “burdens” religious exercise.
- In addition, it would remove Section 230 protections from what it calls “information content providers” – those that use algorithms to determine how content shows up on their platforms; promote or suppress a “discernible viewpoint”; or solicit, comment on, fund, contribute to, or modify information provided by another person.
- Finally, the bill takes away Section 230 protections for the most popular platforms if they fail to notify customers of available parental control protections.
Protect Speech Act (H.R. 3827)
What It Does: Under this bill, a tech platform could only use Section 230 as a defense for content removal if it publicly states terms of service explaining in detail how it moderates content – and complies with those terms.
But Wait, There’s More!
- When a platform restricts content, it would need to explain why and give the user an opportunity to respond.
- The bill provides some exceptions for law enforcement and imminent safety threats.
Safeguarding Against Fraud, Exploitation, Threats, Extremism, and Consumer Harms (SAFE TECH) Act (H.R. 3421; companion bill to S. 299)
What It Does: Under this bill, tech companies couldn’t use Section 230 as a defense for issues related to ads or other content they are paid to show.
But Wait, There’s More!
- If a court requires a tech company to remove or restrict access to harmful material, the company can’t use Section 230 to fight back.
- Also under this bill, a company couldn’t use Section 230 as a defense in civil rights, antitrust, stalking and harassment, human rights, or wrongful death cases.
- Lastly, the bill says that if a company wants to use Section 230 as a defense, it has to prove that it’s incorrectly being treated as a publisher or speaker, rather than a platform.
21st Century Foundation for the Right to Express and Engage in Speech Act (21st Century FREE Speech Act) (S. 1384)
What It Does: This bill fully repeals Section 230. Companies that had protection under Section 230 would instead be considered common carriers, like a phone company or the Postal Service.
But Wait, There’s More!
- The bill calls these companies “common carrier technology companies,” and defines them as interactive computer services that offer services to the public and have at least 100 million global active monthly users.
- These companies would be required to provide their service to anyone, without discriminating against users on grounds including political or religious affiliation or region.
- Platforms would have to publicly disclose their policies on content moderation, promotion, curation, and account suspension.
- The bill says that tech platforms won’t be treated as the publisher or speaker of content from another source – but if a platform does something to change visibility of content (by recommending certain content, for example), that treatment goes out the window.
- The bill would protect good-faith moderation of content that is “obscene, lewd, lascivious, filthy, excessively violent, harassing, promoting self-harm, or unlawful, whether or not such material is constitutionally protected.” To get that protection, companies need to be operating according to their moderation policies.
- Lastly, this bill has a twist: not only could states sue under it – individuals could, too.
Protecting Americans From Dangerous Algorithms Act (H.R. 2154)
What It Does: Under this bill, a platform can’t use Section 230 as a defense in cases about civil rights violations or terrorist acts if the company uses algorithms to spread and amplify the problematic content in question.
But Wait, There’s More!
- Companies could still use Section 230 as a defense if they moderate content using methods that are “obvious, understandable, and transparent” to a reasonable user (for example, chronological order or average user rating rankings).
- Search results would be exempt.
- Unlike in an earlier version of this bill, companies that provide web hosting, domain registration, data storage, and some other infrastructure services would also be exempt.
- Small businesses would also be exempt.
Stop Shielding Culpable Platforms Act (H.R. 2000)
What It Does: This bill clarifies that under Section 230, a provider or a user of a tech platform could be treated as the distributor of information that someone else provides.
Platform Accountability and Consumer Transparency (PACT) Act (S. 797)
What It Does: Under this bill, a tech company could only get immunity under Section 230 if it published a policy explaining what content it allows, how it moderates that content, and how users can report content.
It would also require companies to provide customer service to users who have complaints about content moderation, and take those complaints by phone, email, or an online form. Companies would also have to create an online system where users could file and track complaints and appeals. (Companies wouldn’t have to respond to spam, trolls, or abusive complaints.)
But Wait, There’s More!
- This bill says that if a company wants Section 230 protections, it would have to review and remove content that’s illegal or violates its policies in a timely manner. It would also have to notify users that their content had been removed, explain why, and give them a chance to appeal.
- Tech companies would also have to issue transparency reports twice a year, covering 1) the number of content-related complaints filed by users, 2) how many times the company acted on those complaints and what the method of enforcement was, and 3) the number of appeals users filed.
- Small businesses wouldn’t have to provide a live customer service call center and would have more time to respond to complaints. If you’re an individual with a small operation (for example, an independent blogger), you’d only be required to provide users with a contact system to alert you to problematic content on your site, and to remove illegal content within four days.
- Companies that provide internet infrastructure (web hosting, for example) would be exempt.
Abandoning Online Censorship (AOC) Act (H.R. 874)
What It Does: This bill repeals Section 230 entirely.
See Something, Say Something Online Act of 2021 (S. 27)
What It Does: This bill would require tech companies to tell the U.S. Department of Justice when they detect suspicious activity indicating that users are engaged in terrorism, serious drug offenses, or violent crime.
But Wait, There’s More!
- Companies wouldn’t have to scan all content on their site to identify problems; they would have to report issues they’re aware of.
- Lastly, companies would have to take “reasonable steps” to prevent and address suspicious activity.
Curbing Abuse and Saving Expression in Technology (CASE-IT) Act (H.R. 285)
What It Does: If a company creates, develops, posts, or contributes to illegal content online, it wouldn’t be allowed to use Section 230 as a defense for a year.
But Wait, There’s More!
- The most popular tech platforms would have to follow content moderation policies that are consistent with the First Amendment.
- Lastly, this bill gives individuals the right to sue companies if their moderation policies aren’t consistent with the First Amendment.
Protecting Constitutional Rights From Online Platform Censorship Act (H.R. 83)
What It Does: This bill would make it illegal for a company to restrict the availability of content, and it would allow users to sue a company that did.
Thanks to Future Tense; the Tech, Law, & Security Program at the Washington College of Law at American University; and the Center on Technology Policy at the University of North Carolina at Chapel Hill, whose research powers this feature.