The year 1996, when the Communications Decency Act became law, was 27 years ago, "dog years" given the pace of change in the digital world. That was eight years before Mark Zuckerberg founded Facebook; Netscape was the browser of choice, and people dialed into the Internet through America Online. The well-intentioned motivation of the CDA was to encourage robust speech and the exchange of ideas.
Fast-forward to Feb. 21, 2023. That’s when the U.S. Supreme Court will hear oral arguments in a case, Gonzalez v. Google, that could have profound consequences for not only the First Amendment but also the business model of any outlet that makes use of third-party content on the Internet.
Section 230 of the act is the controversial provision that in essence exempts internet content providers from liability over third-party content. Say you believe you were libeled or slandered by someone on the Internet. As long as a platform like your favorite local news website, Google or Facebook didn’t edit or change the posts of that third party, you might win a suit against the posting party, but you can’t successfully sue the platform.
The section protects more than just posted content. Less attention has been paid to another thing some platforms do: recommend additional content that might be relevant or interesting in relation to what you're looking at. Platforms use algorithms and software automation to generate these recommendations.
Do those recommendations bring liability? That question is at the heart of Gonzalez v. Google. Nohemi Gonzalez, a 23-year-old California college student and U.S. citizen, was killed in the 2015 terror attacks in Paris. Her family sued Google over how its YouTube division served up related videos produced by ISIS terrorists, arguing that YouTube's recommendation engine helped radicalize the attackers and that YouTube therefore bears liability for her death.
Google has won in lower courts, arguing that it works to remove such material and that taking away the protections of Section 230 for recommendations would have huge, negative consequences.
The Court’s decision also may provide clarity around what, exactly, is an internet content provider.
Big tech companies would prefer that you think of them like the phone company. You can't sue the phone company for defamation if someone slanders you in a phone call; it's a completely neutral pipe, and Verizon, for example, takes no role whatsoever in what you say or don't say. At the other end of the spectrum is a typical mass-media outlet like a newspaper, TV station or cable news broadcaster. A newspaper can be held just as liable for a letter to the editor as for an article written by one of its staff members; it's expected to check everything out before publishing or broadcasting. But Section 230 provides protections when the content is online.
YouTube, Facebook, Google and others live in shades of gray. Even when they don’t create their own content, they are posting material from others while generating recommendations and creating feeds that control what you see and in what order you see it — in ways that few anticipated in 1996. Does that make them more like traditional publishers or the phone company?
The Gonzalez case has attracted dozens of “friend of the court” briefs from all sides. In one camp, there’s an argument that YouTube’s role in stoking terrorism that led to the death of Nohemi Gonzalez is the exact consequence of an overbroad interpretation of Section 230. They say that providing recommendations is more akin to the role of a traditional editor shaping content than simply posting unedited third-party content of others.
Those in favor of preserving Section 230 say that carving out recommendations from the Section 230 protections for posted content is a distinction without a difference and to decide otherwise could lead to major, undesirable consequences. Here are three of those assertions:
- Internet content providers will err on the side of great caution if everything done by third parties has to be vetted, sharply undermining what Congress intended in the CDA. It won’t just affect sites like Google and Facebook, but will ripple out to Wikipedia, Craigslist, Yelp, ZipRecruiter, Reddit and others. Smaller local content sites won’t have the resources to manage third-party content. The sheer scale and volume of internet content makes analogies to “analog media” useless.
- It will sharply limit the value of the internet as a starting point for journalists covering major breaking news, when it's often critical to share information as fast as possible to save lives and property, as well as in investigative journalism.
- It will reduce or eliminate the use of the internet among the communication tools media outlets use to connect with readers and larger communities. Facebook Live is one of many examples.
Of course, few would deny that portions of the internet have turned into cesspools of hate, bullying and misinformation, and efforts to manage it have had mixed success and sparked controversy. The question for both courts and policy makers in Congress is whether those negative consequences outweigh the need to retain the broad protections of Section 230. As the cliché goes, will the cure be worse than the disease?