For immediate release | Thursday, April 11, 2019

Tech & Terrorism: Presence of Banned Neo-Nazi Content Demonstrates YouTube’s Failure to Enforce its Own Rules

Press Contact:

Media at CEP

Tech Giant’s Inaction Shows Once Again that Reality Does Not Match up With Corporate Rhetoric

Earlier this week, the Counter Extremism Project (CEP) released findings showing that supporters of the neo-Nazi group Atomwaffen Division (AWD) have once again re-uploaded AWD’s videos to YouTube. In January, CEP also flagged a two-month-old YouTube channel that the group’s supporters were using to upload 15 videos with a total of 7,000 views. Despite the tech giant having banned AWD’s channel more than a year ago, YouTube failed to prevent the upload of this known extremist content. Clearly, YouTube’s promises – as touted by Google’s Counsel for Free Expression and Human Rights Alexandria Walden before the U.S. House Judiciary Committee on Tuesday – to ban content that “promotes and incites violence against individuals or groups, or promotes hatred against individuals or groups based on their characteristics, including race, gender, ethnicity, religion” remain categorically unfulfilled.

“The fact that known extremist content, which even YouTube claims to ban from its platform, is being re-uploaded speaks to the company’s lack of seriousness when it comes to fighting online extremism. AWD supporters on YouTube have repeatedly posted an audiobook of Siege, a book that glorifies terrorism against the U.S. government and against religious and ethnic minorities,” said CEP Executive Director David Ibsen. “YouTube continues to rely on corporate spin tactics to placate critics and lawmakers – long enough to evade tough questions and forestall any discussion of regulating the tech industry. YouTube owes lawmakers and the general public answers, not the same old sales pitches for hopelessly failed safety and security measures.”

Below is a brief excerpt of Google’s testimony from this week:

CONGRESSMAN HANK JOHNSON: “Ms. Walden, many white nationalists have used misinformation, propaganda to radicalize social media users. How is YouTube working to stop the spread of far-right conspiracies intent on skewing users’ perspective of fact and fiction?”

ALEXANDRIA WALDEN: “Congressman, thank you for the question. Most recently, we have made a – updates to our recommendation algorithm so that content that’s on the ‘borderline’ is not pushed out through our recommendation system. So content that violates our guidelines – our hate speech guidelines, which prohibit anything that promotes and incites violence against individuals or groups or promotes hatred against individuals or groups based on their characteristics, including race, gender, ethnicity, religion – all of that content is violative against our community guidelines. But content that’s on the border is content that we no longer include in our recommendation algorithm and it can also be demonetized and comments are disabled, etcetera. So we do our best to ensure that content that’s on the border isn’t fully distributed across the platform.”