The U.S. Supreme Court on Monday agreed to hear a challenge to federal protections that shield internet and social media companies from responsibility for content posted by users, in a case involving an American student fatally shot in a 2015 rampage by Islamist militants in Paris.
The justices took up an appeal by the parents and other relatives of Nohemi Gonzalez, a 23-year-old California woman who was studying in Paris, against a lower court’s ruling that cleared Google LLC-owned YouTube of wrongdoing in a lawsuit the family brought under a U.S. anti-terrorism law seeking monetary damages. Google and YouTube are part of Alphabet Inc.
The Supreme Court also agreed to hear a separate appeal by Twitter Inc challenging the lower court’s decision to revive a similar lawsuit against that company, an appeal that does not turn on Section 230.
The lawsuit against Google accused it of materially supporting terrorism, violating the Anti-Terrorism Act. This federal law allows Americans to recover damages related to “an act of international terrorism.” The lawsuit alleged that YouTube, through computer algorithms, recommended videos by the Islamic State militant group, which claimed responsibility for the Paris attacks, to certain users.
The San Francisco-based 9th U.S. Circuit Court of Appeals in 2021 dismissed the lawsuit in a ruling relying largely on another law, known as Section 230 of the Communications Decency Act of 1996.
Section 230, enacted before the rise of today’s major social media companies, protects “interactive computer services” by ensuring they cannot be treated as the “publisher or speaker” of any information provided by other users.
The lawsuit argued that such immunity should not apply when the company’s platform recommends certain content via algorithms that identify and display content most likely to interest users based on how people use the service.
Section 230 has drawn criticism from across the political spectrum. Democrats have faulted it for giving social media companies a pass for spreading hate speech and misinformation. Republicans have painted it as a tool for censoring voices on the right, especially after Twitter and other platforms banned then-President Donald Trump after a mob of his supporters attacked the U.S. Capitol in a deadly riot on Jan. 6, 2021. Trump, as president, unsuccessfully sought the law’s repeal.
Gonzalez was among 130 people killed in the 2015 Paris attacks, which included suicide bombings and mass shootings. She was at a bistro called La Belle Equipe when militants fired on the crowd of diners.
The plaintiffs said that YouTube’s algorithm helped Islamic State spread its militant message by recommending the group’s videos to users, including videos aimed at recruiting jihadist fighters, and that the company’s “assistance” was a cause of the 2015 attacks.
Gonzalez’s family appealed the 9th Circuit ruling to the Supreme Court, noting that while algorithms may suggest benign dance videos to some, “other recommendations suggest that users look at materials inciting dangerous, criminal or self-destructive behavior.”
The family added that removing Section 230 protections for recommendations would prompt websites to stop suggesting harmful materials, while extending the immunity “denies redress to victims who could have shown that those recommendations had caused their injuries or the deaths of their loved ones.”
In the case against Twitter, American family members of Nawras Alassaf, a Jordanian citizen who died in a 2017 nightclub mass shooting in Istanbul, also claimed by Islamic State, accused the social media company of violating the anti-terrorism law by failing to police the platform for Islamic State accounts or posts.
The 9th Circuit, in the same ruling, reversed a federal judge’s decision to throw out the case against Twitter but did not assess Twitter’s claim of immunity under Section 230.