Columbia Journalism Review: With a shrinking user base and executive exits, a watershed moment for Twitter
Most platforms rely on users to flag content to trigger moderation—in other words, platforms don’t police content before someone flags it as offensive. Hany Farid, a computer scientist at Dartmouth, is one of the original developers of PhotoDNA, a tool that allows companies to recognize and remove problematic images from their networks. It works by computing a distinctive digital signature of each image that users flag and reviewers determine to be graphic, so that copies of the image can be recognized automatically wherever they reappear. The tool was originally intended for removing child pornography and preventing its repeated dissemination, but Farid says it can be extended to “any content deemed in violation of the terms of service of a social media platform.”
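The signature-matching idea can be sketched in a few lines. PhotoDNA's actual algorithm is proprietary, so this is a minimal stand-in using a simple "average hash": downscale an image to a small grayscale grid, set one bit per pixel depending on whether it is above the mean, and compare signatures by Hamming distance so near-duplicate copies still match. All function names and the tiny 4x4 "thumbnails" below are illustrative, not part of PhotoDNA.

```python
# Sketch of signature-based image matching in the spirit of PhotoDNA.
# NOT the real PhotoDNA algorithm; a toy "average hash" for illustration.

def average_hash(pixels):
    """Compute a bit-string signature from a grayscale pixel grid."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Count differing bits between two equal-length signatures."""
    return sum(x != y for x, y in zip(a, b))

def is_known(signature, blocklist, threshold=2):
    """Match against previously flagged signatures, tolerating small edits."""
    return any(hamming(signature, s) <= threshold for s in blocklist)

# Hypothetical 4x4 grayscale thumbnails (values 0-255).
flagged   = [[10, 200, 30, 220], [15, 210, 25, 215],
             [12, 205, 28, 218], [11, 202, 27, 216]]
reupload  = [[12, 198, 31, 221], [16, 209, 26, 214],  # slightly altered copy
             [13, 204, 29, 217], [10, 203, 26, 215]]
unrelated = [[200, 10, 190, 20], [210, 15, 205, 12],
             [195, 18, 200, 14], [205, 11, 198, 16]]

blocklist = {average_hash(flagged)}
print(is_known(average_hash(reupload), blocklist))   # near-duplicate: True
print(is_known(average_hash(unrelated), blocklist))  # different image: False
```

The key property, which PhotoDNA shares, is robustness: small edits to a flagged image (recompression, resizing, minor cropping) change the signature only slightly, so a distance threshold catches re-uploads without exact byte-for-byte matching.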