Did Mark Zuckerberg make Facebook vulnerable to anti-terrorism lawsuits?
Facebook CEO Mark Zuckerberg was extremely careful in his testimony to Congress earlier in April.
Innumerable times throughout the questioning he answered that he had to check with his team and would respond later.
It is therefore somewhat astonishing that one answer he gave may have put Facebook in hot water and led to anti-terrorism lawsuits being filed against it.
Zuckerberg said that Facebook is responsible for the content on its platform, a position that Shurat Hadin – a civil rights nongovernmental organization focused on representing terrorism victims, Jewish issues and Israeli causes – and many others have been pushing for years.
Following Zuckerberg’s statement, Shurat Hadin filed a legal brief to use his own statement to hold Facebook responsible for the content posted by terrorists on its platform, which the NGO alleges has led to the murders of five Americans.
Where does this legal brief fit into Facebook’s previous legal battles?
In May 2017, Facebook successfully fought off an earlier lawsuit by Shurat Hadin, led by director Nitsana Darshan-Leitner, that sought to hold Facebook liable for $1 billion in damages for providing “material support to designated terrorist groups when it allows Hamas, Hezbollah, ISIS and the PLO to utilize its social media pages.”
“There is no difference between providing banking services to terrorists and providing a Facebook or Twitter account,” she argued.
Shurat Hadin’s innovative argument had been that Facebook was liable for terrorists posting on its social media platform not just because it should be responsible for content, but also because its platform concretely aided terrorists in carrying out attacks by helping them recruit, communicate and plan logistics.
A federal court in Brooklyn, while showing strong sympathy for Shurat Hadin’s criticism of Facebook, rejected this new legal angle.
The court ruled that the “plaintiffs claim that Facebook contributed to their harm by allowing Hamas to use its platform to post particular offensive content that incited or encouraged those attacks. Facebook’s role in publishing that content is thus an essential causal element of the claims.”
The bottom line of this confusing formulation is that the US Communications Decency Act (1996), which bars legal claims against an entity over posts made by third parties on its platform, gave Facebook immunity for such posts – including posts by terrorists.
This was exactly what Shurat Hadin had hoped to avoid by taking a new angle: it was attacking Facebook not only as a publisher but also as an aider and abettor of terrorists – a claim it hoped the CDA could not shield Facebook from.
FAST-FORWARD to the present.
Shurat Hadin is now asking the same court to revisit its May 2017 ruling.
Essentially, the NGO is arguing: “Okay, maybe in May 2017 you could let Facebook off the hook because of the CDA and Facebook saying that it was not responsible for third parties’ content on its platform – but now Zuckerberg himself says Facebook is responsible!”
Facebook did not respond to requests to clarify its position on the issue, but Shurat Hadin’s argument sounds like it may be a slam dunk. Moreover, the judge in the earlier case was sympathetic to the NGO’s criticism of Facebook, so if Shurat Hadin can keep its argument simple, it may stand a chance.
But it also may be far from a slam dunk.
The truth is that even though this is the farthest Zuckerberg has gone in taking responsibility for content posted by third parties, his past statements show a gradual shift toward accepting more responsibility for the content on Facebook and framing the company as something of a media outlet. Yet, despite those statements, Facebook has successfully fought off every lawsuit over third-party content.
Likewise, Zuckerberg has a few specific outs: he could say he meant that the company bears moral responsibility, not legal responsibility. And while he was under oath before Congress, that is not the same as giving a binding answer to a complex question of legal interpretation.
But his answer to Congress seems to go deeper into identity-defining issues that could have legal implications.
“When people ask if we’re a media company what I heard is, ‘Do we have a responsibility for the content that people share on Facebook,’ and I believe the answer to that question is yes,” Zuckerberg said.
Treating Facebook as responsible for all content, like a media company, even though it is primarily a tech company, could have legal consequences.
Another possible out is that, even on the legal level, what Zuckerberg may have had in mind was only that Facebook is moving toward accepting certain future advertising regulations. That is a different legal issue from whether the CDA still shields Facebook from liability for terrorists’ posts, or whether the US Anti-Terrorism Act applies.
BUT THERE are two bigger problems with Shurat Hadin’s case.
First, the second part of Shurat Hadin’s lawsuit was a claim by 20,000 Israeli victims of terrorism seeking an injunction requiring Facebook to better police posts by terrorists.
Though Facebook was not addressing the issue effectively when Shurat Hadin filed the case in 2015, even top Israeli ministers have acknowledged that the company got serious about it roughly 18 months ago.
In a recent interview with The Jerusalem Post, Facebook’s global counterterrorism chief, Erin Saltman, described a massive 18-month effort to combat terrorists using its platform and to counter hate speech.
The effort includes roughly 200 experts, some with law enforcement backgrounds, directing a team of more than 7,500 people who review and remove posts, as well as complex algorithms that detect terrorist content.
This means the injunction part of the case is likely moot. In fact, in an earlier Post article, Shurat Hadin took credit for shifting Facebook’s conduct through its legal pressure.
The second big problem with Shurat Hadin’s case is that Zuckerberg’s statement really has nothing to do with this anti-terrorism issue.
He did not admit that Facebook provides material support to terrorists, which means that Shurat Hadin’s innovative idea likely received no help from his testimony.
That means the real question is, setting aside the radical claim that Facebook’s platform was a tool that materially aided terrorism, whether his taking responsibility for content on Facebook opens it to a barrage of lawsuits – related to terrorism, pornography or otherwise – over content that third parties have posted on the platform.
As unambiguous as Zuckerberg was on the moral level, it may be a hard sell to argue that his statement means Facebook concedes it should now lose the flood of legal cases against it over third-party content.