On Wednesday, Counter Extremism Project (CEP) Senior Advisor Dr. Hany Farid, a professor at the University of California, Berkeley with a joint appointment in Electrical Engineering & Computer Sciences and the School of Information, testified before a joint subcommittee of the U.S. House Committee on Energy & Commerce on the effect online disinformation has had on the country. In his testimony, Dr. Farid criticized tech firms, including Facebook and Google-owned YouTube, for their unwillingness to moderate harmful content on their respective platforms. Because algorithms determine much of what users see on social media platforms, tech firms have a financial incentive to amplify divisive content, which increases user engagement and drives advertising revenue.
For years, the influence of fake news and the manipulation of public and political perception have been a threat to political systems. During the COVID-19 pandemic, the spread of misinformation and conspiracy theories has risen to new heights, as non-state actors—among them extremists and populists—have tried to take advantage of the situation. State actors have also engaged in the distribution of fake news concerning the virus, putting their own citizens at risk.
Tech & Terrorism: YouTube Moves To Undermine Its Program Protecting Advertisers From Extremist Videos
This week, the Wall Street Journal reported on efforts by Google-owned YouTube to restrict the platform’s “brand safety partners” from sharing critical information about advertising on its video platform. According to the report, YouTube has altered its contractual Terms of Service, which, once signed, prevent third-party auditors from disclosing to clients “when ads have run in videos with sensitive subject matter, including hate speech, adult content, children's content, profanity, violence and illegal substances.”
Tech & Terrorism: Following Public Outcry and Government Criticism, Facebook Makes Another Reactive Policy Change
Last week, Facebook announced a policy change aimed at eliminating deepfakes and the spread of manipulated media ahead of the 2020 presidential election. In the updated Manipulated Media section of its Community Standards, Facebook said it would remove misleading manipulated media “that has been edited or synthesized, beyond adjustments for clarity or quality, in ways that are not apparent to an average person, and would likely mislead an average person to believe that a subject of the video said words that they did not say AND is the product of artificial intelligence or machine learning, including deep learning techniques (e.g., a technical deepfake), that merges, combines, replaces, and/or superimposes content onto a video, creating a video that appears authentic.”
CEP’s Dr. Hany Farid: “There is Real, Measurable Harm Happening From Moving Fast And Breaking Things”
CEP today released the second in a nine-part video series featuring CEP Senior Advisor Dr. Hany Farid, titled "Internet." In the video, Dr. Farid warns that the tech industry’s famed “Move Fast and Break Things” mantra, popularized by Facebook, is no longer acceptable on today’s highly developed internet.
ICYMI: CEP Senior Advisor Dr. Hany Farid, in NPR Interview, Says Suspend Facebook Live Following Mosque Shootings
CEP Senior Advisor Dr. Hany Farid, a leading expert on digital forensics, was interviewed by National Public Radio about tech companies’ failure to prevent the reuploading of the New Zealand attack video. Said Farid: "The repeated uploading is an absolute failure, and it is inexcusable because we have the technology to stop it."
CEP issued a statement and research today in response to Facebook CEO Mark Zuckerberg’s claims that the company removes 99 percent of extremist content on its platform. The statement read in part: “There is no way to independently verify Mr. Zuckerberg’s claims that 99 percent of terrorist and extremist content is removed from Facebook,” said CEP Executive Director David Ibsen. “In addition, even if those claims are true, given the volume of content uploaded to Facebook by the platform’s estimated 2.2 billion active users on a daily basis, the one percent of terrorist content that is not removed is a significant amount that needs to be addressed."
Following Facebook CEO Mark Zuckerberg's testimony before members of the U.S. Senate Committees on the Judiciary and Commerce, CEP points out that extremist and terrorist material remains on Facebook. "Clearly, Mr. Zuckerberg's claims on successfully removing extremist and terrorist material from Facebook are absolutely false," said David Ibsen, CEP Executive Director.
In advance of the upcoming Congressional hearings featuring testimony from Facebook CEO Mark Zuckerberg, CEP called on Facebook and other tech companies to take a more aggressive approach to the spread of fake news enabled by widely available artificial intelligence (AI) tools. “As we have seen, the issue of fake news proliferating online is something that will not simply disappear,” said Dr. Hany Farid, CEP Senior Advisor and Dartmouth College computer science professor.
Journalist Greta Van Susteren will moderate a panel discussion on the myriad challenges to public safety posed by extremists and terrorists operating online at CEP's November 13 public policy event at the Newseum in Washington, D.C. Senator Ron Johnson of Wisconsin will keynote the event. Panelists will include: Fran Townsend, former U.S. Homeland Security Advisor; Dr. Hany Farid, Dartmouth College professor and the world’s leading authority on digital forensics; and national security expert and Vice President of New America Peter Bergen.