Two pending Supreme Court cases interpreting a 1996 law could drastically alter the way we interact online. That law, Section 230 of the Communications Decency Act, is often decried as a subsidy for Big Tech, but that criticism misses the point. Section 230 promotes free speech by removing strong incentives for platforms to limit what we can say and do online.
Under Section 230, platforms generally cannot be held responsible for content posted by users. Without this protection, important speech, such as communication about abortion, especially in states where abortion is banned, could be silenced. Movements like #MeToo and #BLM might never have caught on if platforms had worried about being sued, even meritlessly, for defamation or other claims. People might have found their voices censored, especially when speaking about ideas that are under political attack today: race and racism, sexuality and gender justice. The internet as we know it would be a very different place.
Prior to Section 230, companies that cultivated online communities were legally responsible for what their users posted, while those that exercised no editorial control were not. The natural consequence was that some platforms limited conversations to only the most uncontroversial topics, while others had an incentive to host spaces open to all, tolerating pornographic, offensive, or otherwise unwanted content in order to avoid any legal liability. Congress wisely recognized that the Internet could be so much more than that and passed Section 230.
While Section 230 shields online platforms from legal liability for posts, comments, and other messages contributed by their users, it does not exempt platforms from liability for content that violates federal criminal law, intellectual property rights, or certain other categories of legal obligations. Section 230 also does not apply to platform conduct that goes beyond hosting others' content, such as discriminatory targeting of housing or employment listings on the basis of race or gender.
Nor does it provide a safe haven for platforms that give advertisers tools designed to target ads to users based on gender, race, or other statuses protected by civil rights laws, or immunity from claims that a platform’s ad-serving algorithms are discriminatory. The ACLU recently explained why this conduct falls outside the scope of Section 230. In these scenarios, where the alleged basis of liability is the platform’s own discrimination, the ACLU seeks to prevent platforms from stretching or misreading Section 230’s immunity.
Today, the Internet allows people to communicate with each other on a previously impossible scale. It is a major source for learning about current events, checking job postings, speaking and listening in the modern public square, and otherwise exploring the vast realms of human thought and knowledge, as the Supreme Court recognized in Packingham v. North Carolina. At the same time, platforms are free to manage user content, removing problematic posts that contain nudity, racial slurs, spam, or fraudulent information.
This term, the Supreme Court will examine the scope of the law’s protections in Twitter v. Taamneh and Gonzalez v. Google. These cases were brought by family members of U.S. citizens killed in ISIS terrorist attacks. The lawsuits allege that platforms, including Twitter and Google’s YouTube, aided and abetted ISIS’s attacks by failing to adequately block or remove content promoting terrorism.
But Twitter and YouTube did not and do not intend to promote terrorism. The videos identified by the plaintiffs were posted by ISIS agents and, while legal, violated Twitter’s and YouTube’s terms of service; the companies would have removed them had they been reported. Nor is there any allegation that the people behind the terrorist attacks were inspired by these videos.
The ACLU’s amicus brief in Twitter v. Taamneh argues that imposing liability in these circumstances would inappropriately stifle speech. Of course, a platform could promote terrorism through its own policies and actions. But imposing liability merely for hosting content, absent malicious intent or specific knowledge that a particular post furthered a particular criminal act, would suppress online speech and association. It has happened before, as when Instagram confused a post about a historic mosque with one about a terrorist group. Such errors, relatively rare today, would become the new norm.
The Gonzalez case raises a different question: whether Section 230 immunity applies to amplified content. The plaintiffs argue that when platforms suggest content to users, through features such as “Up Next,” “You Might Like,” or “Recommended For You,” those suggestions are not protected by Section 230. On this theory, a provider would remain immune for simply hosting content but could be held liable for highlighting it.
The ACLU filed an amicus brief in the Gonzalez case to explain why online platforms have no choice but to prioritize some content over other content, and why they should be immune from liability for those choices when the material involved is third-party content. Given the enormous amount of material posted every minute, platforms must sort and organize content to display it in any usable way. There is no way to visually present information to app or web users without making editorial choices that are, at the very least, implicit recommendations.
Additionally, organizing and recommending content helps us find what we’re looking for, receive and create information, reach an audience, and build community. If Section 230 does not apply to this kind of content organization, platforms will have an incentive to present information as an unorganized mishmash, and will feel pressure to include only the most innocuous content that lawyers can be confident would not inspire anyone to sue.
Section 230 allowed public expression on the Internet to flourish. It created space for social movements; enabled platforms to host the speech of activists and organizers; and allowed users and content creators on sites like Instagram, TikTok, and Twitch to reach an audience and make a living. Without it, the Internet would be a far less hospitable place for human creativity, education, politics, and collaboration. If we lose Section 230, we risk losing the Internet as we know it.