Washington (CNN) This week, the Supreme Court will hear consecutive oral arguments in two cases that could significantly reshape online discourse and content moderation.
The outcome of the oral arguments, scheduled for Tuesday and Wednesday, could determine whether technology platforms and social media companies can be sued for recommending content to their users or for aiding acts of international terrorism by hosting terrorist content. It marks the Court's first-ever review of a federal liability shield law that largely protects websites from lawsuits over user-generated content.
The closely followed cases, known as Gonzalez v. Google and Twitter v. Taamneh, carry significant stakes for the internet at large. An expansion of the legal risk apps and websites face for hosting or promoting content could lead to major changes at sites including Facebook, Wikipedia and YouTube, to name a few.
The litigation has produced some of the most intense rhetoric in recent years from the tech industry about the potential impact on the future of the internet. US lawmakers, civil society groups and more than two dozen states have also entered the debate by submitting filings to the Court.
Central to the legal battle is Section 230 of the Communications Decency Act, a nearly 30-year-old federal law that courts have repeatedly said provides broad protections to technology platforms, but which has come under scrutiny amid mounting criticism of Big Tech's content moderation decisions.
The law has critics on both sides of the aisle. Many Republican officials say Section 230 gives social media platforms a license to censor conservative viewpoints. Prominent Democrats, including President Joe Biden, have argued that Section 230 prevents tech giants from being held accountable for spreading misinformation and hate speech.
In recent years, some in Congress have pushed for changes to Section 230 that could expose tech platforms to greater liability, along with proposals to change US antitrust rules and other bills aimed at curbing dominant tech platforms. But those efforts have largely stalled, leaving the Supreme Court as the most likely source of change in the coming months on how the US regulates digital services.
Rulings on the cases are expected by the end of June.
Gonzalez v. Google
The case involving Google focuses on whether it could be sued over its YouTube subsidiary’s algorithmic promotion of terrorist videos on its platform.
According to the plaintiffs in the case, the family of Nohemi Gonzalez, who was killed in a 2015 ISIS attack in Paris, YouTube's targeted recommendations violated a US counterterrorism law by helping to radicalize viewers and promote ISIS's worldview.
The lawsuit seeks to carve out content recommendations so they don't receive protections under Section 230, potentially exposing tech platforms to greater liability for how they run their services.
Google and other tech companies have said such an interpretation of Section 230 would increase the legal risks associated with the placement, sorting and curation of online content, a key feature of the modern internet. Google said that in such a scenario, websites would try to play it safe by either removing far more content than necessary or foregoing content moderation altogether and allowing even more harmful material onto their platforms.
Friend-of-the-court filings by Craigslist, Microsoft, Yelp and others have suggested that the stakes aren't limited to algorithms and could also end up influencing just about anything on the web that could be interpreted as a recommendation. This could mean that even average internet users who volunteer as moderators on various sites could face legal risks, according to a filing by Reddit and several Reddit volunteer moderators. Oregon Democratic Senator Ron Wyden and former California Republican Representative Chris Cox, the original co-authors of Section 230, argued to the Court that Congress's intent in passing the law was to give websites broad discretion to moderate content as they see fit.
The Biden administration also weighed in on the case. In a brief filed in December, it said Section 230 protects Google and YouTube from lawsuits "for failing to remove third-party content, including content it has recommended." But, the government said, those protections don't extend to Google's algorithms because they represent the company's own speech, not that of others.
Twitter v. Taamneh
The second case, Twitter v. Taamneh, will decide whether social media companies can be sued for aiding and abetting a specific act of international terrorism when platforms have hosted user content expressing general support for the group behind the violence without referencing the specific terrorist act in question.
The plaintiffs in the case, the family of Nawras Alassaf, who was killed in a 2017 ISIS attack in Istanbul, alleged that social media companies, including Twitter, had knowingly aided ISIS in violation of a US anti-terrorism law by allowing some of the group's content to persist on their platforms despite policies intended to restrict that type of content.
Twitter said that the mere fact that ISIS used the company's platform to promote itself does not constitute Twitter's "knowing" assistance to the terrorist group, and that in any case the company cannot be held liable under the anti-terrorism law because the content at issue in the case was not specific to the attack that killed Alassaf. The Biden administration, in its brief, agreed with that view.
Twitter had also previously claimed it was immune from the lawsuit under Section 230.
Other tech platforms like Meta and Google have argued in the case that if the Court finds tech companies can't be sued under US counterterrorism law, at least in these circumstances, it could avoid the Section 230 debate altogether, because the claims in question would be dismissed either way.
In recent years, however, several Supreme Court justices have shown an active interest in Section 230 and have appeared to invite opportunities to hear cases related to the law. Last year, Justices Samuel Alito, Clarence Thomas and Neil Gorsuch wrote that new state laws, like Texas's, that would force social media platforms to host content they'd prefer to remove raise questions of "great importance" about the "power of dominant social media corporations to shape public discussion of the important issues of the day."
Several petitions are currently pending asking the Court to review the Texas law and a similar law passed by Florida. The court last month delayed deciding whether to hear such cases, instead asking the Biden administration to present its views.