Nick Clegg Sets Out Collaborative Vision for Facebook Content Regulation

(Brussels, Belgium) - Nick Clegg’s appearance in Brussels was intended to demonstrate the world’s largest social media platform’s commitment to addressing concerns. However, after this and three other appearances by Facebook executives in the last nine months, important questions remain unanswered. What is new is that Facebook now says clearly that it welcomes a collaborative approach between governments and large tech platforms. This is emblematic of a company that has typically addressed serious issues with its platform only after they have come under public scrutiny.

Facebook again repeated its claim that its Artificial Intelligence (AI) catches 99 percent of the ISIS- and al-Qaeda-related content ultimately removed from its platform before users flag it. What we still don’t know, and what Facebook still won’t tell us, is how long terrorist-related content stays online, how many people see it before it is removed, and how much content the company is missing entirely.

“Until Facebook starts to engage seriously with all of those concerned about extremism, as well as releasing the data needed to assess harm caused by the platform, serious questions will remain over its approach to tackling terrorist content,” said Counter Extremism Project Executive Director David Ibsen. “For Facebook to be credible, it needs to work with outside actors dealing with the negative externalities of their platform. This means civil society and global experts, not just selected governments.”
Nick Clegg’s appearance marks the fourth public appearance in Brussels by a Facebook executive since the Cambridge Analytica scandal. The others were:

  • May 2018: CEO Mark Zuckerberg
  • June 2018: COO Sheryl Sandberg
  • July 2018: VP of Public Policy for EMEA Richard Allan