This week, Google filed a lawsuit in a German administrative court to challenge a portion of the country’s expanded 2018 Network Enforcement Act (NetzDG). NetzDG regulates online content moderation and requires online platforms to remove “manifestly illegal” content within 24 hours of it being reported by users. In challenging NetzDG, Google took issue with a broadened provision requiring tech companies to share with federal law enforcement the personal data of users who post content suspected to be illegal. The company claimed the provision would violate user privacy, calling it a “massive intervention in the rights of our users.”
“NetzDG is a critical milestone in legislative efforts to address the tech industry’s deficiencies in moderating illegal content. Rather than present a false argument about privacy when discussing a provision that helps law enforcement officials do their jobs and protect the public, tech companies should instead address the hypocrisy in their own rhetoric,” said Counter Extremism Project (CEP) Executive Director David Ibsen. “Companies like Google make billions in revenue by selling access to their users’ data. So long as they continue to thrive off that business model, their arguments for protecting user privacy remain shallow and only serve to bolster their public image.”
In 2018, CEP and the Centre for European Policy Studies (CEPS) authored a joint report on NetzDG. When the law first entered into force, critics predicted that it would lead both to overreporting, thereby suppressing free speech, and to the stifling of innovation, because smaller companies would be unable to absorb the cost of the resources needed to comply. The CEP-CEPS report showed that these arguments were baseless and the concerns unfounded. Six months after NetzDG’s implementation, the law had not resulted in a flood of reports or in over-blocking. Furthermore, the study found that the expense of implementing NetzDG was minimal, at 1 percent of total revenue.
Last April, CEP Germany released a follow-up NetzDG study that tested the tech industry’s compliance with the law between January 31 and February 14, 2020. CEP’s study revealed that YouTube, Facebook, and Instagram removed a mere 43.5 percent of clearly extremist and terrorist content, even after that material had been reported as illegal under NetzDG. CEP Germany’s findings suggested that this “notice and takedown” method for removing illegal content can only be effective if platforms are searched continuously and systematically for such material.
To read CEP and CEPS’ joint study on NetzDG, Germany’s NetzDG: A Key Test For Combatting Online Hate, please click here.
To read the results of CEP Germany’s NetzDG study and the policy paper, please click here.
To read about the failings of social media companies’ “notice and action” content moderation systems, please click here.