
Facebook accused of promoting terrorism with auto-generated content

After five months of research, a whistleblower says Facebook allows terrorist sympathizers to thrive not only by failing to remove their accounts, but also by auto-creating celebratory videos and pages for terrorist groups.

The National Whistleblowers Center (NWC) in Washington has published a comprehensive 48-page study into how Facebook fails to curb, and even amplifies, the outreach of Al-Qaeda-affiliated militants, arguing that the platform’s inaction facilitates terrorist recruitment.

The study, conducted by an unnamed whistleblower from August through December 2018, showed that out of 3,228 accounts belonging to ‘friends’ of self-identified terrorists, only some 30 percent were removed by the platform.


These “friends” hail from around the world. Those who were singled out for the survey are based in the Middle East, Europe, Asia and Latin America. Among them, “many openly identified as terrorists themselves and shared extremist content,” the executive summary of the report states.

The social media giant, which recently prided itself on taking down 99 percent of Al-Qaeda and Islamic State (IS, formerly ISIS/ISIL)-related content before it is reported by users, actually creates such content itself with its auto-generating tools, according to the report.

As an example, it refers to an auto-generated page Facebook “set up” for the Al Shabaab militant group, complete with an ISIS logo. Somalia-based Al Shabaab has long been associated with Al-Qaeda; in 2015, a splinter group of its fighters pledged allegiance to IS.

The page came about because some users listed their membership in the terror group as their job occupation, prompting Facebook’s AI to create a “local business” page.

Another popular feature embedded in Facebook’s algorithm allows for the glorification of terrorism and extremism, the report says, referring to “celebration” and “memories” videos. The seemingly innocuous feature makes no distinction between a user sharing, say, a birthday video and violence-filled extremist content. One such chilling “celebration” video, featuring puddles of blood, racked up over 2,000 likes, the report notes.

The “frame” feature, which allows users to pledge allegiance to a favorite sports team or express support for a country, has been abused by terrorist sympathizers to pledge allegiance to extremist groups – and Facebook has apparently approved of it, since every frame must be reviewed by a Facebook moderator before it can be used publicly, the study points out.

The whistleblower was able to establish that at least some of the users who self-identify as terrorists on their account pages actually are terrorists. Of 63 profiles that liked the auto-generated “business page” of Al-Qaeda affiliate Tahrir Al Sham (formerly Al-Nusra), 31 were confirmed by a local NGO to be actual terrorists active in Syria’s Idlib province.

The NWC says it has filed a petition with the US Securities and Exchange Commission (SEC) for it to “sanction Facebook for its dishonesty about terror and hate content on its website.”

Facebook has come under fire for what some saw as a slow and half-hearted response to the recent mosque attack in New Zealand, in which the attacker livestreamed the massacre of some 50 Muslims. Although the video was taken down, copies of it spread across social media.

Responding to the allegations that it aids terrorism supporters, Facebook told AP that it has been investing heavily in stamping out terrorist content and is “detecting and removing terrorism content at a far higher success rate than even two years ago.”

“We don’t claim to find everything and we remain vigilant in our efforts against terrorist groups around the world,” the company said.
