On July 14, the Counter Extremism Project (CEP) will host a webinar discussion on the upcoming European regulation of extremist content online in the context of the Digital Services Act.
Lucinda Creighton is Senior Europe Advisor to CEP Europe and a former Europe Minister.
Dr. Irene Roche Laguna is an expert on intermediary liability and content policies. She is the Deputy Head of Unit for the implementation of the regulatory framework in DG CONNECT, European Commission.
Dr. Hany Farid is a global leader in digital forensics, image analysis, and human perception and serves as a Professor at the University of California, Berkeley with a joint appointment in Electrical Engineering & Computer Science and the School of Information. Dr. Farid is also a Senior Advisor to CEP.
Alexander Ritzmann leads CEP Germany's work on internet regulation and online content moderation, with a particular focus on the German NetzDG law, the proposed EU Terrorist Content Online (TCO) Regulation, and the proposed EU Digital Services Act (DSA).
Dr. Hans-Jakob Schindler is a Senior Director at CEP.
In early June, the European Commission launched a public consultation on the Digital Services Act (DSA) package, which aims to update the e-Commerce Directive for the digital age. As part of that package, the European Commission is seeking feedback on how best to increase and harmonise the responsibilities of online platforms and information service providers in the EU. Stakeholders have until 8 September to provide feedback on the liability regime for intermediaries and on the means of keeping users safe from illegal content while protecting their fundamental rights.
This webinar will explain the political process the proposed DSA will undergo; introduce the European Commission's current position on the DSA; and discuss concrete proposals for making content moderation systems more effective at tackling illegal extremist content, including requirements for platforms to provide explainable and comprehensible transparency about their systems.
We will also highlight the role of algorithmically amplified illegal or harmful content and discuss some relevant lessons learned from the German Network Enforcement Act (NetzDG).
To read CEP’s recommendations for amending Germany’s NetzDG law and CEP’s recommendations on how to build comprehensible transparency for automated decision-making systems (ADM), please go to: