If the U.S. Supreme Court decides to hear (as is widely expected) either or both of the NetChoice cases challenging state efforts to regulate content moderation on social media, that decision would be grounds enough to call the upcoming term a blockbuster for media law. Not content to stop there, apparently, the justices added to their docket last week their first opportunity to interpret Section 230 of the Communications Decency Act — the statutory provision that Professor Jeff Kosseff memorably nicknamed “the twenty-six words that created the internet.”
The case, Gonzalez v. Google, involves whether YouTube can be held liable for allegedly aiding and abetting terrorism by serving users automated recommendations that they view ISIS propaganda. As a general matter, Section 230 — per its text — prevents litigants from attempting to treat an “interactive computer service” as “the publisher or speaker of any information provided by another information content provider.” In other words, the bare fact that an ISIS video appears on YouTube doesn’t leave YouTube on the hook for its contents, and the Gonzalez petitioners don’t dispute “that section 230 immunizes Google from liability for permitting those postings.” Still, they maintain that the recommendations “are communications by and from [the] service itself” and that holding Google liable for them doesn’t treat it as the speaker of ISIS’s videos.
The U.S. Court of Appeals for the Ninth Circuit rejected that claim, and the Supreme Court’s decision to review it comes as something of a surprise. The justices typically prefer to let an issue percolate — waiting for disagreement to emerge in the lower courts — before weighing in. But as the Gonzalez petition conceded, this issue “has not resulted in a conflict in the precedents in the circuits at issue”; the two courts of appeals that have addressed the question agree that Section 230 protects recommendations. And while Justice Clarence Thomas had previously expressed an interest in addressing Section 230’s scope, no other justice had spoken up.
We’ll be watching this case with concern. As we’ve written before, while Section 230 is sometimes framed as a shield for large social media companies in particular, its protections come into play in contexts as diverse as a news organization’s comment section or an individual user’s retweet. And as Google’s brief in opposition highlighted, it’s hard to see where the line between “recommending” and displaying content lies when “every publication tells a reader how to access content — ‘click here’ or ‘read on’ — and implicitly represents that the content may be worth reading.” A win for the Gonzalez petitioners could not only chill the moderation that news organizations engage in themselves but also discourage platforms from hosting unvetted user content — cutting off a key avenue for the free flow of information to the press and public.
Expect this case to draw enormous attention, and stay tuned for more.
Like what you’ve read? Sign up to get The Nuance newsletter delivered straight to your inbox!
The Technology and Press Freedom Project at the Reporters Committee for Freedom of the Press uses integrated advocacy — combining the law, policy analysis, and public education — to defend and promote press rights on issues at the intersection of technology and press freedom, such as reporter-source confidentiality protections, electronic surveillance law and policy, and content regulation online and in other media. TPFP is directed by Reporters Committee attorney Gabe Rottman. He works with RCFP Staff Attorney Grayson Clary and Technology and Press Freedom Project Fellow Emily Hockett.