For years, YouTube has been accused of enabling terrorist recruitment. This allegedly happens when a user clicks on a terrorist video hosted on the platform, then spirals down a rabbit hole of extremist content automatically queued “up next” through YouTube’s recommendation engine. In 2016, the family of Nohemi Gonzalez—who was killed in a 2015 Paris terrorist attack after extremists allegedly relied on YouTube for recruitment—sued YouTube owner Google, forcing courts to consider YouTube’s alleged role in aiding and abetting terrorists. Google has been defending YouTube ever since. Then, last year, the Supreme Court agreed to hear the case.
Now, the Gonzalez family is hoping that the high court will agree that Section 230 protections—designed to shield websites from liability for hosting third-party content—shouldn’t be extended to also protect platforms when they recommend harmful content.
Google argues that shielding recommendations is exactly how the liability shield is supposed to work. Yesterday, in a court filing, Google contended that Section 230 protects YouTube’s recommendation engine as a legitimate tool “meant to facilitate the communication and content of others.”
“Section 230 includes sorting content via algorithms by defining ‘interactive computer service’ to include ‘tools’ that ‘pick, choose,’ ‘filter,’ ‘search, subset, organize,’ or ‘reorganize’ content,” Google argued. “Congress intended to provide protection for these functions, not for simply hosting third-party content.”
Google claimed that denying that Section 230 protections apply to YouTube’s recommendation engine would remove shields protecting all websites using algorithms to sort and surface relevant content—from search engines to online shopping websites. This, Google warned, would trigger “devastating spillover effects” that would devolve the Internet “into a disorganized mess and a litigation minefield”—which is exactly what Section 230 was designed to prevent.
In Google’s view, a ruling against it would transform the Internet into a dystopia where any website, and even individual users, could be sued for sharing links to content deemed offensive. In a statement, Google general counsel Halimah DeLaine Prado said that such liability would lead bigger websites to over-censor content out of extreme caution, while websites with fewer resources would probably go the other direction and censor nothing.
“A decision undermining Section 230 would make websites either remove potentially controversial material or shut their eyes to objectionable content to avoid knowledge of it,” DeLaine Prado said. “You would be left with a forced choice between overly curated mainstream sites or fringe sites flooded with objectionable content.”
The Supreme Court will hear oral arguments in the case on February 21.
Google has asked the court to affirm the 9th Circuit Court of Appeals judgment, which found that Section 230 indeed shields YouTube’s recommendation engine. The Gonzalez family seeks a ruling that Section 230 immunity does not extend to YouTube’s act of recommending terrorist videos posted by third parties.
Ars could not immediately reach either legal team for comment.
Up next: Deciding the fate of Section 230
In the court filing, Google argued that YouTube already works to counter recruitment efforts with community guidelines that prohibit content promoting terrorist organizations.
Since 2017, Google has taken steps to remove violating content and limit its reach, including refining YouTube’s algorithms to better recognize extremist content. Perhaps most relevant to this case, YouTube also implemented a “redirect method” at that time, using targeted advertising to steer potential ISIS recruits away from radicalization videos.
Today, Google said in the court filing, YouTube functions differently from how it did in 2015, with the video-sharing platform investing more heavily in enforcing its violent extremism policy. In the last quarter of 2022, YouTube automatically detected and removed approximately 95 percent of the videos that violated that policy, the filing said.
According to Google, companies operating under Section 230’s protections are already motivated to make the Internet safer, and the Supreme Court must weigh how any decision reinterpreting Section 230 risks disturbing that delicate balance.
Google argues that reforming Section 230 should be left to Congress, not the Supreme Court. Lawmakers’ recent attempts at reform have failed, but this week Joe Biden urged Congress to join him in reversing course on how the liability shield operates. If Biden gets his way, platforms like YouTube could be held liable for hosting offensive third-party content in the future. Such a change could give the Gonzalez family peace of mind, knowing that YouTube would be legally required to proactively block terrorist videos—but Google’s filing suggests that such sweeping Section 230 reform would inevitably “upend the Internet.”