Gonzalez v. Google LLC

LII note: The U.S. Supreme Court has now decided Gonzalez v. Google LLC.

Issues 

Can online platforms be held liable for algorithmically recommending harmful third-party content to users?

Oral argument: 
February 21, 2023

This case asks the Supreme Court to decide whether online platforms can be held liable for algorithmically recommending third-party content to users. Petitioner Reynaldo Gonzalez argues that Google LLC can be held liable for YouTube’s recommendations of ISIS recruitment videos because YouTube does not qualify for immunity under Section 230 of the Communications Decency Act. Respondent Google LLC argues that since YouTube did not create the ISIS videos at issue, it should be able to claim immunity under Section 230. This case will affect the availability of remedies for victims of harmful online content, internet company accountability, content moderation, and online speech.

Questions as Framed for the Court by the Parties 

Whether Section 230 of the Communications Decency Act immunizes interactive computer services when they make targeted recommendations of information provided by another information content provider, or only limits the liability of interactive computer services when they engage in traditional editorial functions (such as deciding whether to display or withdraw) with regard to such information.

Facts 

In 2015, Nohemi Gonzalez, a United States citizen studying abroad in France, was killed in a terrorist attack in Paris. Gonzalez v. Google LLC. The following day, the Islamic State of Iraq and Syria (“ISIS”) claimed responsibility by issuing a written statement and posting a YouTube video. Id. YouTube, which is owned by Respondent Google LLC (“Google”), is an online platform where users post, share, view, and comment on videos. See id. As YouTube’s owner, Google can review and remove content on the platform when a complaint is lodged against the content and the content violates Google’s policies. See id.

Reynaldo Gonzalez, the victim’s father, along with other family members, filed a complaint against Google for damages pursuant to the Antiterrorism Act (“ATA”), 18 U.S.C. § 2333. Id. at 880, 882. The ATA allows United States citizens to recover damages for injuries suffered “by reason of an act of international terrorism.” Id. at 880. According to the complaint, Google is secondarily liable for Nohemi’s death because Google “aided and abetted an act of international terrorism” in violation of § 2333(d) by allowing ISIS to “recruit members, plan terrorist attacks, issue terrorist threats, instill fear, and intimidate civilian populations” through YouTube. Id. at 881–82. Specifically, Gonzalez alleged that Google is liable because it used computer algorithms to recommend videos published by ISIS or related to ISIS to its users, thereby helping ISIS spread the organization’s message. Id. at 881. Gonzalez further claimed that Google was liable because it failed to remove, or sometimes removed only a portion of, ISIS-related content, even though it has the authority, as YouTube’s owner, to remove such content from the platform entirely. Id. at 881–82. Based on these acts and omissions, Gonzalez asserted that YouTube had become “an essential and integral part” of ISIS’s terrorist campaign and “a unique and powerful tool of communication” for the organization. Id. at 881.

Google moved to dismiss the complaint pursuant to Section 230 of the Communications Decency Act, 47 U.S.C. § 230 (“Section 230”), which provides online platforms such as YouTube with immunity from civil liability based on third-party content. See § 230(c)(1). The district court granted Google’s motion to dismiss, reasoning that Google is entitled to immunity under Section 230 because the producer of the recommended videos was ISIS, not YouTube. Gonzalez at 882–83. The Ninth Circuit affirmed, holding that Section 230 immunity applied to content recommendations so long as YouTube’s algorithm recommended harmful and non-harmful content based on the same neutral criteria. Id. at 880, 894. Gonzalez filed a timely petition for rehearing en banc, which a majority of the court of appeals voted to deny. Brief for Petitioners, Reynaldo Gonzalez, et al. at 1. The United States Supreme Court granted certiorari on October 3, 2022. Id.

Analysis 

DEFINITION OF “PUBLISHER” UNDER SECTION 230

Petitioner Reynaldo Gonzalez, et al. (“Gonzalez”), argues that YouTube, a subsidiary of Respondent Google LLC (“Google”), is not a publisher or speaker as defined under Section 230 when it recommends third-party content to users, and therefore should not be able to raise a Section 230 defense for recommending ISIS-related content on YouTube. See Brief for Petitioners, Reynaldo Gonzalez, et al. at 18–19; 47 U.S.C. § 230(c)(1). Gonzalez explains that a Section 230 defense is available when the defendant can show that (1) the plaintiff’s claim treats the defendant “as the publisher or speaker” of the third-party content; (2) the content is provided by “another information content provider”; and (3) the defendant acts as a “provider . . . or user of an interactive computer service.” Id. at 8; 47 U.S.C. § 230(c)(1).

Arguing that the first element does not apply to YouTube, Gonzalez distinguishes between the everyday and legal meanings of the term “publisher,” contending that Congress intended “publisher or speaker” in Section 230 to carry its narrower legal meaning under defamation law. Id. at 19–21. Gonzalez points out that Section 230 was originally enacted to overrule Stratton Oakmont, Inc. v. Prodigy Services Co., a New York state defamation case which found that platforms that censor or remove problematic materials can be held liable for the content they affirmatively choose to publish. Id. at 21–22. Gonzalez argues that YouTube does not qualify for the defense because the claim here is focused not on YouTube’s dissemination of harmful materials, but on YouTube’s recommendation of them. Id. at 24.

To further illustrate the distinction, Gonzalez concedes that some recommendation-based claims might qualify for a Section 230 defense; the key issue, in Gonzalez’s view, is whether the recommendation directly sends harmful third-party content to users in the form of a file, or merely sends information about that content in the form of a URL or a notification. Id. at 26–28. Gonzalez asserts that sending a file is an act of publication, whereas sending a URL or notification is not. Id. Thus, according to Gonzalez, when YouTube recommends ISIS-related videos to users, YouTube is not acting as a “publisher or speaker” under Section 230 because it merely sends users information about the video in the form of a URL, and the Section 230 defense therefore cannot shield Google from liability under the ATA. Id. at 28.

According to Gonzalez, even if Congress intended the everyday meaning of “publisher” to apply, YouTube’s recommendation function still does not fall under the protection of Section 230. Id. at 29. To illustrate this, Gonzalez explains that book clubs and movie review websites recommend third-party content but are not considered “publishers” of those books and movies in the everyday sense of the word. Id. at 30. Similarly, YouTube, like book clubs and movie review websites, connects users to content but cannot be said to be a publisher of that content. Id.

In opposition, Respondent Google LLC (“Google”) argues that YouTube is a “publisher” under Section 230. See Brief for Respondent, Google LLC at 22–23. Google contends that the Supreme Court typically reads statutory language according to its everyday meaning, and that “publisher” in Section 230 should therefore be read the same way. Id. at 23. Google explains that, in the term’s everyday sense, publishing refers to the act of selecting and ordering content; newspapers and broadcasters publish, for example, by selecting and ordering content to match a print layout or to fill specific time slots. Id. Google contends that YouTube similarly selects and orders content, using algorithms to populate its “Up Next” feed; since YouTube applies the same neutral criteria to all of its content, YouTube should qualify for immunity as a publisher under Section 230. Id. at 27.

Google also claims that regardless of whether Congress intended the legal or the everyday meaning of “publisher or speaker” to apply, YouTube should still qualify as a “publisher or speaker” under Section 230. Id. at 23. Google points out that even under the narrower defamation-law meaning of “publisher,” the Supreme Court deems defamation defendants to be “publishers” simply because they arrange content. Id. at 25. Referring to the surrounding text of the statute, and specifically 47 U.S.C. §§ 230(f)(2) and (f)(4), Google argues that Congress explicitly intended for “publishing” in Section 230 to include selecting content via algorithms. Id. at 26. Thus, according to Google, YouTube is a “publisher” under the plain text of the statute because it filters and recommends videos for users using algorithms. Id. at 27. In response to Gonzalez’s distinction between the acts of recommendation and dissemination, Google counters that the internet has blurred the difference between publication and distribution, causing them to happen at almost the same time. Id. at 50.

RECOMMENDATION OF THIRD-PARTY CONTENT AS “INFORMATION PROVIDED BY ANOTHER INFORMATION CONTENT PROVIDER”

Regarding the statute’s second element, Gonzalez argues that YouTube’s recommendations do not qualify as “information provided by another information content provider” under Section 230 because they are generated by YouTube itself, not by another information content provider. Brief for Petitioners at 33; 47 U.S.C. § 230(c)(1). Specifically, Gonzalez explains that when YouTube’s recommendations contain URLs, those URLs are not “information provided by another information content provider” for purposes of Section 230. Brief for Petitioners at 38. Gonzalez points out that the function of a URL is to give users the video’s address on the computer server. Id. Gonzalez reasons that because URLs supply location information, they are “information” within the meaning of Section 230, but because YouTube generates them itself, they are not provided by “another” content provider. Id. Importantly, Gonzalez distinguishes YouTube’s URLs from the URLs that appear in Google’s search results, pointing out that YouTube generates its own URLs, whereas the URLs in Google’s search results are generated by third-party websites. Id. at 39, 44.

Finally, turning to the statute’s third element, Gonzalez points out that a defendant can trigger the Section 230 defense only when it acts as a “provider . . . of an interactive computer service.” Id. at 44–45; 47 U.S.C. § 230(f)(2). According to Gonzalez, to qualify as interactive, a computer service must send content in response to a user’s request for information. Brief for Petitioners at 44–45. To clarify the distinction, Gonzalez explains that Google’s search engine does qualify as an interactive computer service because it supplies search results in response to a user’s search request. Id. at 47. In contrast, Gonzalez points out that YouTube sends unrequested videos to users via an automated recommender algorithm. Id. at 39, 47.

In opposition, Google contends that plaintiffs cannot hold defendants liable for merely providing access to “information provided by another information content provider” when the defendants did not themselves create or develop the harmful content. Brief for Respondent at 28–29. Google cites a Sixth Circuit case, O’Kroley v. Fastcase, Inc., in which the plaintiff based the claim on Google’s previews of third-party websites in its search results. Id. at 28. According to Google, the Sixth Circuit held that, under Section 230, Google could not be held liable for merely providing access to third-party content in the form of automatically generated excerpts from those websites. Id. From Google’s perspective, those previews are analogous to YouTube’s “Up Next” feed, and the phrase “information provided by another information content provider” therefore cannot be interpreted to exclude YouTube’s recommender function from the protections of Section 230. Id. at 28–29. Because YouTube did not create or develop the ISIS videos, but merely provided access to them, Google argues that YouTube should be able to assert immunity under Section 230. Id. at 29.

In response to Gonzalez’s argument that YouTube’s recommendations do not qualify as an “interactive computer service,” Google contends that nothing in the text of Section 230 requires computer services to provide users only with content they request. Id. at 36. Google points out that websites routinely display information that users do not actively request, including disclaimers and banner images. Id. Google therefore concludes that YouTube qualifies as an “interactive computer service” under Section 230(f)(2) because it allows, and thus “enable[s],” “access by multiple users” to content stored on YouTube’s servers. Id.

Discussion 

PROVIDING REMEDIES FOR INJURED USERS AND ENHANCING PLATFORM ACCOUNTABILITY

Free Speech For People (“FSFP”), in support of Gonzalez, argues that broad Section 230 immunity would allow megaplatforms to escape accountability for their role in harming the public. See Brief of Amicus Curiae Free Speech For People, in Support of Petitioners at 26. FSFP claims that, by exposing people to disinformation, violence, and hate speech through algorithmic selection processes, platforms are subjecting the public to significant social harms, including physical and mental health disorders among young people, the facilitation of sexual predation and exploitation, and interference with free and fair elections. Id. at 25–27.

Scholars of Civil Rights and Social Justice (“SCRSJ”), in support of Google, counters that Section 230 does not provide sweeping immunity that prevents courts from holding platforms accountable. See Brief of Amici Curiae Scholars of Civil Rights and Social Justice, in Support of Respondent at 20–21. SCRSJ points to several housing discrimination cases in which courts refused to grant Section 230 immunity to platforms that functioned not as “neutral tools” of communication but as content creators that designed their systems in a manner that facilitated housing discrimination. Id. at 21–22. SCRSJ argues that it is unnecessary to strip Section 230 immunity from recommender algorithms because the “neutral tools” standard already provides an effective means of holding platforms accountable for the social harms they produce. See id.

EFFECTS ON THE GROWTH OF THE MODERN INTERNET

Counter Extremism Project and Hany Farid (“CEP”), in support of Gonzalez, states that rejecting Section 230 immunity for online platform companies such as Google would not necessarily threaten the growth of the modern internet. See Brief of Amici Curiae Counter Extremism Project and Hany Farid, in Support of Petitioners at 23–24. CEP points out that the modern internet is no longer “a fragile new means of communication” whose growth could easily be damaged by law enforcement efforts to police harmful content. See id. at 22. CEP emphasizes that Google and other megaplatforms possess considerable power to stem the spread of harmful content and therefore have the ability to lessen the risk of injury to users. See id. at 24. CEP concludes that broad Section 230 immunity would provide those platforms with “a shield they do not deserve” when they fail to exercise that control over harmful content. Id.

Wikimedia Foundation (“Wikimedia”), in support of Google, counters that the modern online community needs the protection of Section 230 immunity in order to flourish. See Brief of Amicus Curiae Wikimedia Foundation, in Support of Respondent at 26–27. Wikimedia argues that Section 230 allows small companies and non-profits with limited funds to exist and thrive by protecting them from the risk of litigation and its high costs. See id. at 8–9. Wikimedia states that Section 230 promotes the development of the modern internet by permitting a diverse range of companies, rather than a few dominant megaplatforms, to thrive without the fear of litigation. See id. at 9.

PRESERVATION OF ONLINE FREE SPEECH

Giffords Law Center to Prevent Gun Violence (“Giffords”), in support of Gonzalez, states that granting broad Section 230 immunity to online platforms would weaken checks on hate speech. See Brief of Amicus Curiae Giffords Law Center to Prevent Gun Violence, in Support of Petitioners at 19. Giffords argues that when platforms lack adequate mechanisms to combat hate speech, users tend to “self-censor” to avoid being targeted by hate-speech groups. See id. at 19–20. Giffords claims that this consequence runs directly contrary to the ideal of a free online “marketplace of ideas.” See id.

Internet Law Scholars (“ILS”), in support of Google, counters that Section 230’s legal immunity has been vital to the development of a safer and more diverse online environment because it liberates online platforms from the fear of liability for their users’ actions. See Brief of Amici Curiae Internet Law Scholars, in Support of Respondent at 18. ILS warns that denying Section 230 immunity to algorithmic selection processes, such as YouTube’s recommendation algorithm, would lead platforms to overmoderate their content to avoid liability. Id. at 20. ILS points out that this censorship would directly contradict the free speech values that characterize many online communities. See id.
