WASHINGTON (AP) — The Supreme Court is weighing whether Facebook, Twitter and YouTube can be sued over a 2017 Islamic State group attack on a Turkish nightclub, based on the argument that the platforms helped fuel the growth of the terrorist organization.
What the justices decide in this case and a related one they heard yesterday, Feb. 21, is particularly important because the companies have been shielded from liability on the internet, allowing them to grow into the giants they are today.
On the first day of arguments, the justices suggested they had little appetite for a far-reaching ruling that would upend the internet. Today’s case about the nightclub attack, in which 39 people died, could provide an off-ramp if the justices want to limit the impact of their decision. Although they seemed unlikely to side with the families of people killed in the separate attacks, they indicated they were wary of Google’s claims that the law gives it and other companies immunity from lawsuits.
At the heart of the cases before the justices are two federal laws. The first is Section 230 of the federal Communications Decency Act, which protects tech companies from being sued over material put on their sites by users. The second is the Justice Against Sponsors of Terrorism Act, which allows Americans injured by a terrorist attack abroad to sue for money damages in federal court.
In today’s case, the family of a man killed in the Reina nightclub attack in Istanbul sued Twitter, Facebook and YouTube parent Google under the terrorism law. Nawras Alassaf’s family members, who are U.S. citizens, say the companies aided and abetted the attack because they assisted in the growth of the Islamic State group, which claimed responsibility for the attack. A lower court let the lawsuit go forward.
The platforms argue that they can’t be sued because they did not knowingly or substantially assist in the Reina attack. If the justices agree, they won’t have to reach the bigger questions about whether Section 230 protects platforms when they recommend content.
The broader questions about Section 230 were at the center of the case the justices heard Feb. 21. In that case, the family of an American college student who was one of 130 people killed in the 2015 Paris attacks sued under the terrorism law.
The family of Nohemi Gonzalez argued that the Islamic State group used YouTube to spread its message and recruit people to its cause. They said YouTube’s algorithm, which recommends videos to users based on their viewing habits, was critical to the Islamic State group’s growth. Lower courts ruled Section 230 barred the lawsuit.