Zuckerberg’s 99% Myth Exposed: CEO Described as “Frustrating & Evasive” in Latest PR Stunt

European Union Cracks Down on Tech Companies & CEP Calls for More Stringent Measures

Facebook was back on the public relations trail this week, as Mark Zuckerberg gave an in-depth interview to The New Yorker. Speaking with Mr. Zuckerberg, reporter Evan Osnos recalled Facebook’s failures to moderate hate speech and extremism, from the WhatsApp-driven riots, lynchings and beatings in India, to the facilitation of weapons trading in Libya, to the fueling of false rumors that led to religious violence in Sri Lanka. Most striking, however, was Facebook’s five-year-long struggle to clamp down on ethnic violence in Myanmar, despite frequent promises to do so. When questioned about what was taking so long, Mr. Zuckerberg stated, “I think, fundamentally, we’ve been slow at the same thing in a number of areas, because it’s actually the same problem. But, yeah, I think the situation in Myanmar is terrible.” Osnos described that answer as “frustrating and evasive.”

Because of Mr. Zuckerberg’s laissez-faire attitude toward problems on Facebook-owned platforms, European Union (EU) officials this week proposed legislation that would crack down on terrorist propaganda and recruitment online. Under the proposal, tech companies would be forced to remove terror content within an hour of it being flagged. The legislation marks a welcome step forward in how Europe fights extremism online, but the Counter Extremism Project (CEP) believes it does not go far enough – especially when technology such as eGLYPH exists that can remove terrorist propaganda the second it is uploaded. As CEP Senior Advisor Dr. Hany Farid stated, tech companies’ dragging of their feet “for the last three, four, five years is particularly offensive given that we had already solved this problem in the child pornography space. It wasn’t that they couldn’t do it – they honestly didn’t want to do it.”

That terrorists have for years leveraged the Internet and social media for their own gain should surprise no one. What should be abhorrent, however, is how tolerant tech companies are of the brutal and violent propaganda found on their platforms. This week, CEP identified two posts inspired by the deceased ISIS leader and spokesperson Abu Mohammad al-Adnani. One, an audio file paired with a static image of his likeness, criticized democracy and called for violence. Another is a fan-made ISIS propaganda video that contains footage from at least seven different official ISIS sources. The video, which features footage of combat, corpses and severed heads, is accompanied by a voiceover of a call for violence from Adnani.

To read more about this issue and see the aforementioned examples, as well as other examples of extremist content, please see the background below.

EXTREMIST FACEBOOK CONTENT

1. ISIS's Call For Immigration

  • Located on Facebook: September 12, 2018
  • Time on Facebook: 10 hours
  • Views: 40, five likes/reacts and five shares
  • URL: Link
  • Profile Language: Indonesian
  • Description: A segment from the ISIS video “To Establish the Religion” originally released on March 30, 2016. The video includes an English-speaking ISIS fighter in Libya, shown with his daughter, describing the moral and healthy conditions in ISIS-held territory and calling for others to immigrate to the Islamic State.

2. Violent Fan-Made ISIS Propaganda Video

  • Located on Facebook: September 12, 2018
  • Time on Facebook: Two days
  • Views: 17 and one like/react
  • URL: Link
  • Profile Language: Arabic
  • Description: A fan-made ISIS propaganda video that contains footage from at least seven different official ISIS sources. The video, which includes footage of combat, corpses and severed heads, is accompanied by a voiceover of a call for violence from the deceased ISIS leader and spokesperson Abu Mohammad al-Adnani.

3. ISIS Propaganda On Facebook

  • Located on Facebook: September 12, 2018
  • Time on Facebook: One day
  • Views: N/A, 35 likes/reacts
  • URL: Link
  • Profile Language: Arabic
  • Description: An ISIS martyrdom notice for Asif Nazir Dar, one of the leaders of the group in Jammu and Kashmir.

4. ISIS Call For Violence

  • Located on Facebook: September 12, 2018
  • Time on Facebook: Approximately two months
  • Views: 2,000-plus, 289 likes/reacts and 34 shares
  • URL: Link
  • Profile Language: Arabic
  • Description: An audio file paired with a static image for the Abu Mohammad al-Adnani speech, “Now, Now the Fighting Comes,” in which the now-deceased ISIS leader and spokesperson criticized democracy and called for violence.

5. Violent ISIS Propaganda On Facebook

  • Located on Facebook: September 12, 2018
  • Time on Facebook: One day
  • Views: N/A, 25 likes/reacts
  • URL: Link
  • Profile Language: Arabic
  • Description: An ISIS propaganda photo that shows the assassination of a member of the Somali security forces. The photo was originally released on September 11, 2018.

BACKGROUND

Facebook Has Had Trouble With Extremism Online In India, Libya And Sri Lanka. “As Facebook expanded, so did its blind spots. The company’s financial future relies partly on growth in developing countries, but the platform has been a powerful catalyst of violence in fragile parts of the globe. In India, the largest market for Facebook’s WhatsApp service, hoaxes have triggered riots, lynchings, and fatal beatings. Local officials resorted to shutting down the Internet sixty-five times last year. In Libya, people took to Facebook to trade weapons, and armed groups relayed the locations of targets for artillery strikes. In Sri Lanka, after a Buddhist mob attacked Muslims this spring over a false rumor, a Presidential adviser told the Times, ‘The germs are ours, but Facebook is the wind.’” (Evan Osnos, “Can Mark Zuckerberg Fix Facebook Before It Breaks Democracy?,” The New Yorker, 9/17/18)

Its Problems Have Been Worst In Myanmar, Where The Rohingya Minority Has Been “Subject To Brutal Killings, Gang Rapes, And Torture.” “Nowhere has the damage been starker than in Myanmar, where the Rohingya Muslim minority has been subject to brutal killings, gang rapes, and torture. In 2012, around one per cent of the country’s population had access to the Internet. Three years later, that figure had reached twenty-five per cent. Phones often came preloaded with the Facebook app, and Buddhist extremists seeking to inflame ethnic tensions with the Rohingya mastered the art of misinformation. Wirathu, a monk with a large Facebook following, sparked a deadly riot against Muslims in 2014 when he shared a fake report of a rape and warned of a ‘Jihad against us.’ Others gamed Facebook’s rules against hate speech by fanning paranoia about demographic change. Although Muslims make up no more than five per cent of the country, a popular graphic appearing on Facebook cautioned that ‘when Muslims become the most powerful’ they will offer ‘Islam or the sword.’” (Evan Osnos, “Can Mark Zuckerberg Fix Facebook Before It Breaks Democracy?,” The New Yorker, 9/17/18)

When Questioned Why Facebook Was Taking So Long To Address Issues In Myanmar, Zuckerberg’s Answer Was Described As Frustrating And Evasive. “More than three months later, I asked Jes Kaliebe Petersen, the C.E.O. of Phandeeyar, a tech hub in Myanmar, if there had been any progress. ‘We haven’t seen any tangible change from Facebook,’ he told me. ‘We don’t know how much content is being reported. We don’t know how many people at Facebook speak Burmese. The situation is getting worse and worse here.’  I saw Zuckerberg the following morning, and asked him what was taking so long. He replied, ‘I think, fundamentally, we’ve been slow at the same thing in a number of areas, because it’s actually the same problem.  But, yeah, I think the situation in Myanmar is terrible.’ It was a frustrating and evasive reply.” (Evan Osnos, “Can Mark Zuckerberg Fix Facebook Before It Breaks Democracy?,” The New Yorker, 9/17/18)

The European Union On Wednesday Unveiled New Legislation To Fight Extremism Online. “On Wednesday, the EU published details of how the bloc will seek to make these recommendations obligatory. The new legislation will force tech companies to comply with at least one of the recommendations passed earlier this year: that they remove terror content within an hour of it being reported by local law enforcement agencies. If the bill is passed – it would need to be approved by the European Parliament and a majority of the member states – they would face fines for noncompliance, although it’s unclear what those would be.” (David Gilbert, “The EU Is Done Warning Tech Companies About Removing Extremist Content — Now They Want To Act,” Vice News, 9/12/18)

CEP Senior Advisor Dr. Hany Farid: “The Dragging Of Their Feet For The Last Three, Four, Five Years Is Particularly Offensive Given That We Had Already Solved This Problem In The Child Pornography Space. It Wasn’t That They Couldn’t Do It – They Honestly Didn’t Want To Do It.” “Hany Farid, a digital forensics expert at Dartmouth University and senior adviser to the Counter Extremism Project, said that the tech giants are still not doing enough, and that the legislation doesn’t go far enough to force them to solve that problem. ‘The dragging of their feet for the last three, four, five years is particularly offensive given that we had already solved this problem in the child pornography space,’ he said. ‘It wasn't that they couldn't do it – they honestly didn’t want to do it.’ Tech companies have a shared database of terror content that they can use to prevent the same videos or photos being uploaded repeatedly. By the end of 2018, the database will have more than 100,000 entries, according to The Global Internet Forum to Counter Terrorism, which maintains the database – a fraction of the number of photos and videos identified and removed by tech giants. Farid believes the new legislation should make it mandatory for companies to prevent this reuploading of content. ‘If you don’t want to play the whack-a-mole problem with this content and with these groups, once it has been identified, you should say this must be entered into your hashing database – which the tech companies claim to have.’” (David Gilbert, “The EU Is Done Warning Tech Companies About Removing Extremist Content – Now They Want To Act,” Vice News, 9/12/18)

Daily Dose

Extremists: Their Words. Their Actions.

In Their Own Words:

We reiterate once again that the brigades will directly target US bases across the region in case the US enemy commits a folly and decides to strike our resistance fighters and their camps [in Iraq].

Abu Ali al-Askari, Kata’ib Hezbollah (KH) Security Official, Mar. 2023