For immediate release | Thursday, November 15, 2018

CEP Statement in Response to Facebook’s Most Recent & Dubious Claims

Press Contact:

Media at CEP

Counter Extremism Project (CEP) Executive Director David Ibsen released the following statement today in response to Facebook’s claims to the media that it has removed more than one billion fake accounts and taken action against content containing hate speech, terrorist propaganda and child exploitation:

“Facebook has predictably gone on a public relations blitz, distracting from the issues at hand and touting its so-called efforts to remove hate speech, extremist propaganda, and more from its platform. But this smokescreen, complete with unverifiable claims that millions of posts and accounts have been removed, is little more than window dressing.

“Just as with the claims Mr. Zuckerberg made months ago before both houses of the U.S. Congress, namely that the company’s artificial intelligence algorithms remove 99 percent of ISIS and al Qaeda content online, Facebook is intent on garnering headlines in an effort to ‘delay, deny and deflect’ from its scandal of the day.

“Throwing out numbers that appear substantive but in reality mislead, representing only a small fraction of the activity on a platform with more than two billion active users, is a well-rehearsed practice in Menlo Park. Further, examples of terrorist propaganda remain readily available on Facebook, as CEP routinely points out. Rather than continue to tout false or misleading claims, Facebook should fix its platform once and for all.”

BACKGROUND

The Washington Post: “Facebook Says It Removed A Flood Of Hate Speech, Terrorist Propaganda And Fake Accounts From Its Site”:

“Facebook said Thursday it had removed more than a billion fake accounts and taken action against millions of posts, photos and other forms of content that violated its prohibition against hate speech, terrorist propaganda and child exploitation, the latest sign that the social-networking giant faces an onslaught of online abuse as it builds tools to spot it. The report shows that Facebook still struggles to identify hate speech and bullying, in particular, even at a time when social media companies are grappling with the rising tide of racist, sexist and anti-Semitic content online and the United States is experiencing a rise in hate crimes. In the new report reflecting the company’s activities between April and September, Facebook said it had found and removed roughly 1.5 billion fake accounts, while targeting 12.4 million pieces of terrorist propaganda, 2.2 billion pieces of spam and 66 million pieces of content that ran afoul of rules barring adult nudity and sexual activity. In doing so, Facebook said it had made progress at deploying its thousands of newly hired reviewers – and powerful artificial-intelligence tools – to enforce its community standards more aggressively.  The company said that it catches more than 95 percent of nudity, fake accounts and graphic violence before users report it to Facebook. But for hate speech and a related category, bullying, the company catches 51.6 percent and 14.9 percent of incidents before they are flagged by Facebook users.” (Tony Romm & Elizabeth Dwoskin, “Facebook Says It Removed A Flood Of Hate Speech, Terrorist Propaganda And Fake Accounts From Its Site,” The Washington Post, 11/15/18)

The New York Times: “Delay, Deny And Deflect: How Facebook’s Leaders Fought Through Crisis”:

“Sheryl Sandberg was seething. Inside Facebook’s Menlo Park, Calif., headquarters, top executives gathered in the glass-walled conference room of its founder, Mark Zuckerberg. It was September 2017, more than a year after Facebook engineers discovered suspicious Russia-linked activity on its site, an early warning of the Kremlin campaign to disrupt the 2016 American election. Congressional and federal investigators were closing in on evidence that would implicate the company. But it wasn’t the looming disaster at Facebook that angered Ms. Sandberg. It was the social network’s security chief, Alex Stamos, who had informed company board members the day before that Facebook had yet to contain the Russian infestation. Mr. Stamos’s briefing had prompted a humiliating boardroom interrogation of Ms. Sandberg, Facebook’s chief operating officer, and her billionaire boss. She appeared to regard the admission as a betrayal. ‘You threw us under the bus!’ she yelled at Mr. Stamos, according to people who were present.” (Sheera Frenkel, Nicholas Confessore, Cecilia Kang, Matthew Rosenberg & Jack Nicas, “Delay, Deny And Deflect: How Facebook’s Leaders Fought Through Crisis,” The New York Times, 11/14/18)