CEP Statement On Documentary Investigating Facebook Content Moderation Policies

Findings Show Multiple Instances Where Company Valued Revenue Over Safety – Contrary to Facebook’s Congressional Testimonies

(New York, NY) – Counter Extremism Project (CEP) Executive Director David Ibsen released the following statement today regarding the United Kingdom’s Channel 4 investigation into Facebook’s content moderation policies, which revealed multiple instances of incendiary and questionable content that was not taken down simply because it generates revenue:

“The findings from Channel 4’s documentary expose how Facebook has focused on spinning the public instead of making substantive changes to help prevent misuse of their platform. Despite multiple claims to lawmakers that Facebook is committed to removing extremist content and that safety is a top priority, it is clear these regurgitated talking points are simply not true. To the contrary, the Channel 4 report shows explicitly that Facebook employees believe it is in Facebook’s interest to limit removal of controversial content because it drives revenue,” stated Ibsen.

“As content that depicts and incites violence widely persists on social media platforms, the Channel 4 investigation reinforces the notion that talk has not led to adequate action. This investigation is a reminder that Facebook and other entities in the tech industry are for-profit companies and will therefore place revenue above all else. Profit margins should never be the top priority when lives are at stake,” Ibsen concluded.

On Tuesday, Channel 4 released an undercover documentary shedding light on how Facebook’s content moderation policies work in practice. In the course of the investigation, a Facebook content moderator was told to ignore pages run by the far-right extremist group Britain First because “they have a lot of followers so they’re generating a lot of revenue for Facebook.” Those comments stand in stark contrast to statements made by company leadership to lawmakers, even on that same day.

On July 17, Facebook’s Head of Global Policy Management Monika Bickert testified before the U.S. House Judiciary Committee, saying, “We feel a tremendous sense of accountability for how we operate our service, and it is in our business interest to make sure that our service is a safe place.” These statements, along with those made by CEO Mark Zuckerberg during his earlier testimony, are clearly contradicted by Facebook’s practices and are therefore either uninformed or deceptive.

In April, Facebook CEO Mark Zuckerberg testified before the U.S. Senate Committees on the Judiciary and Commerce, Science and Transportation, saying, “[T]here is certain content that clearly we do not allow, right? Hate speech, terrorist content, nudity, anything that makes people feel unsafe in the community.”
