Tech & Terrorism: Dr. Hany Farid & UC Berkeley Researchers Release Report On YouTube’s Recommendation Algorithm

Report Analyzes YouTube’s Role In Spreading Disinformation And Supporting Extremist Ideologies

(New York, N.Y.) – This week, researchers at the University of California, Berkeley, including Counter Extremism Project (CEP) Senior Advisor Dr. Hany Farid, released a new report analyzing YouTube’s policies and its efforts to curb the recommendation algorithm’s tendency to spread divisive conspiracy theories. Such content has been known to exacerbate online extremism and can lead to real-life violence.

After reviewing eight million recommendations over 15 months, the researchers determined that the progress YouTube claimed in June 2019 (a 50 percent reduction in the amount of time users spent watching recommended conspiracy videos) and in December 2019 (a 70 percent reduction) did not make the “problem of radicalization on YouTube obsolete nor fictional.” The study, A Longitudinal Analysis Of YouTube’s Promotion Of Conspiracy Videos, found that a more complete analysis of YouTube’s algorithmic recommendations showed conspiratorial recommendations are “now only 40% less common than when YouTube’s measures were first announced.”

Dr. Farid, who also worked with CEP to develop eGLYPH hashing technology, told the New York Times on March 2 that YouTube’s posture on misinformation “is a technological problem, but it is really at the end of the day also a policy problem… If you have the ability to essentially drive some of the particularly problematic content close to zero, well then you can do more on lots of things. They use the word ‘can’t’ when they mean ‘won’t’.”

From March 8 to June 8, 2018, CEP and Dr. Farid conducted a study to better understand how ISIS content was being uploaded to YouTube, how long it stayed online, and how many views these videos received. To accomplish this, CEP conducted a limited search for a small set of just 229 known ISIS terror-related videos. CEP used two computer programs to locate them: a web crawler that searched the titles and descriptions of videos uploaded to YouTube for relevant keywords, and the eGLYPH hashing technology. The study found that hundreds of ISIS videos were being uploaded to YouTube every month and that those videos garnered thousands of views.
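
As a rough illustration of the keyword-matching step only, the Python sketch below filters already-collected video metadata for tracked terms. The function name check_metadata, the placeholder keyword set, and the sample records are hypothetical and are not drawn from CEP’s actual tooling; eGLYPH’s hash-based matching is not reproduced here.

```python
# Minimal sketch of keyword matching over crawled video metadata.
# Assumptions: metadata has already been collected by a crawler; the keyword
# list and record fields are placeholders, not CEP's actual tooling.

TRACKED_KEYWORDS = {"example_keyword_1", "example_keyword_2"}  # placeholder terms


def check_metadata(video: dict) -> bool:
    """Return True if a video's title or description contains any tracked keyword."""
    text = f"{video.get('title', '')} {video.get('description', '')}".lower()
    return any(keyword in text for keyword in TRACKED_KEYWORDS)


if __name__ == "__main__":
    # Hypothetical crawler output: one dict per video with its public metadata.
    crawled = [
        {"video_id": "abc123", "title": "Cooking tutorial", "description": "Pasta recipe"},
        {"video_id": "def456", "title": "example_keyword_1 footage", "description": ""},
    ]
    flagged = [v["video_id"] for v in crawled if check_metadata(v)]
    print(f"Flagged {len(flagged)} of {len(crawled)} videos for review: {flagged}")
```

In practice, keyword flagging of this kind would only surface candidates for review; matching actual video content against known material is what the hashing step is for.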

For its report, OK Google, Show Me Extremism, CEP reviewed a total of 649 YouTube videos for extremist and counter-narrative content between August 2 and August 3, 2018. Among the 649 videos checked, CEP was four times more likely to encounter extremist material than counter-narrative content. The results of CEP’s searches highlighted the extent of extremist content on YouTube and undermined YouTube’s claims touting the efficacy of its efforts to promote counter-narrative videos.

Lastly, in 2019, CEP released a nine-part video series in which Dr. Farid called upon tech companies to take more responsibility for what happens on their platforms.

To read CEP’s report, The eGLYPH Web Crawler: ISIS Content On YouTube, please click here.

To read CEP’s report, OK Google, Show Me Extremism: Analysis Of YouTube’s Extremist Video Takedown Policy And Counter-Narrative Program, please click here.

To view CEP’s nine-part video series featuring Dr. Farid, please click here.
