Supreme Court justices searched Wednesday for a way to determine when large social media companies used by terrorists cross the line into aiding and abetting them in their attacks.
Relatives of a man killed in a terrorist attack in Istanbul have sued three large tech companies, arguing the companies didn’t do enough to keep terrorists off their platforms, which fueled the terrorists’ power and reach.
The companies said they can’t be held liable for general use of their services.
Justices sought to figure out how to apply anti-terrorism laws to a world where the internet dominates transactions and communications.
“We’re used to thinking about banks as providing very important services to terrorists. Maybe we are not so used to [it], but it seems to be true that various kinds of social media platforms also provide very important services to terrorists,” said Justice Elena Kagan. “If you know that you’re providing an important service to terrorists, why aren’t you providing substantial assistance?”
Twitter, the chief defendant in the case, said it depends on the actors involved, their intent and the level of assistance they provide.
“You have to have a general awareness that you are assisting in illegal or tortious activity,” said Seth Waxman, Twitter’s lawyer. “You had to have provided substantial assistance to an act of international terrorism that injured the plaintiff.”
He said if the court were to rule that social media platforms are liable for acts of terrorism because Islamic State adherents used those services to win recruits and spread their ideology, it would implicate all sorts of other services. He wondered whether a taxi cab company used by terrorists could be held liable.
“That is a line-drawing problem,” said Justice Samuel A. Alito Jr.
The issue before the justices now is whether victims’ claims can even be brought under anti-terrorism laws such as the Justice Against Sponsors of Terrorism Act.
Justice Kagan seemed skeptical of social media companies’ claims of broad immunity, suggesting there should be room for a trial to explore the specifics of a case.
“This should be a jury question, shouldn’t it?” Justice Kagan said. “You’re helping by providing your service to those people.”
The Biden administration largely sided with Twitter.
Deputy Solicitor General Edwin Kneedler said that simply using a computer service is too “remote from the act of terrorism.”
“You can’t come up with a test that will answer every case,” Mr. Kneedler said. “They turn on the level of knowledge. They turn on the level of culpability.”
Wednesday’s case was brought by U.S. relatives of Nawras Alassaf, a Jordanian who was killed in an Islamic State-inspired mass shooting in Istanbul. They said Twitter, Facebook and Google aided and abetted ISIS by hosting its content and, in some cases, deriving ad revenue from it.
The case was the second before the high court this week dealing with tech giants’ liability. On Tuesday, the justices heard a challenge to Google, which owns YouTube. That case was brought by relatives of Nohemi Gonzalez, an American killed in an ISIS attack in Paris. They argued YouTube helped build the Islamic State through algorithms that recommend ISIS content to users who go searching for it.
The tech companies say they are shielded by Section 230 of the Communications Decency Act, which generally extends immunity for content posted by third parties on their platforms.
Wednesday’s case goes further, treading into JASTA’s aiding-and-abetting provisions, which allow victims to recover civil damages against people or entities that provide material support in connection with acts of international terrorism.
Eric Schnapper, who represented the victims’ families, said the companies are responsible because they “recommend things … knowing what’s happening.”
“It’s recruiting and fundraising,” he said. “The assistance doesn’t have to be connected to a specific act.”
“Aiding and abetting can include encouragement,” Mr. Schnapper added.
He noted that media and federal officials had warned the platforms that they were being used by terrorists.
In Tuesday’s Section 230 case, however, the justices appeared hesitant to chip away all legal protection for Big Tech, suggesting that doing so would open a floodgate of lawsuits.
On Wednesday, they appeared open to the possibility that the companies bear at least some responsibility for assisting ISIS by promoting the group’s violent content on their platforms.
A ruling in both cases is expected by the end of June.