(New York, N.Y.) – Counter Extremism Project (CEP) Executive Director David Ibsen issued a statement today in response to Google-owned YouTube announcing a new policy to ban “videos alleging that a group is superior in order to justify discrimination, segregation or exclusion” in its plan to remove more neo-Nazi, white supremacy, and other hateful content:
“Google-owned YouTube is well within its rights to adjust, amend, and add to its Terms and Conditions to justify removal of objectionable content. Of course, YouTube’s terms have long justified the company’s removal of any content on its platform without explanation and without notice.
“Instead of simply enforcing its long-standing community guidelines consistently and transparently, YouTube has once again resorted to big tech’s usual tired PR playbook: publicizing a policy change via the media only after a highly publicized incident, namely a well-known YouTube personality’s harassment of a Vox journalist, which generated negative publicity for the company.
“YouTube has issued similarly vague announcements in the past, only for them to be followed by inconsistent and non-transparent enforcement. This is why YouTube remains a preferred platform for hateful material promoting far-right and Islamic extremism, which continues to radicalize people around the world.”
For example, CEP previously released findings showing that supporters of the neo-Nazi group Atomwaffen Division had used a two-month-old YouTube channel to post 15 of the group’s videos, which accumulated 7,000 views, even though YouTube had announced the deletion of the Atomwaffen Division’s channel more than a year earlier. Despite that announcement, YouTube did not prevent the re-upload of additional Atomwaffen content, nor did it act with any urgency to remove it even after it was reported for violating YouTube’s Community Guidelines.
CEP has also documented the numerous instances in which Google/YouTube made express policy changes only after public scandals, negative media coverage, or pressure from lawmakers. While one would hope that Google is continuously working to improve security on YouTube and its other platforms, there is no excuse for so many policy changes being reactive, and the pattern raises the question of what other scandals are in the making due to still-undiscovered lapses in Google’s current policies.
In addition, CEP has previously documented YouTube’s inability to follow through on its extremist content removal policies in its report, OK Google, Show Me Extremism. On July 21, 2017, Google announced the launch of one such measure, its Redirect Method Pilot Program. The program is intended to identify individuals searching for ISIS-related content on YouTube and direct them to counter-narrative videos that seek to undermine the messaging of extremist groups. Between August 2 and August 3, 2018, CEP manually reviewed a total of 649 YouTube videos and found that viewers were four times more likely to encounter extremist material than counter-narrative messaging.
In July 2018, CEP deployed its own hashing technology, eGLYPH, to better understand how ISIS content was being uploaded to YouTube, how long it was staying online, and how many views those videos received. The resulting report called into question YouTube’s claims of being able to remove ISIS videos quickly and effectively. Using a narrow set of 229 previously identified ISIS terror-related videos, CEP found that over a three-month period, no fewer than 1,348 videos were uploaded via 278 separate accounts, garnering at least 163,000 views; 91 percent of the videos were uploaded more than once.