Columbia Journalism Review: With a shrinking user base and executive exits, a watershed moment for Twitter


Most platforms rely on users to flag content to trigger moderation; in other words, platforms don’t police content before someone flags it as offensive. Hany Farid, a computer scientist at Dartmouth, is one of the original developers of PhotoDNA, a tool that allows companies to recognize and remove problematic images from their networks. It works by extracting a digital signature from images that users have flagged and that have been determined to be graphic, so that copies of those images can be recognized automatically when they resurface. The tool was originally intended to remove child pornography and prevent its repeated dissemination, but Farid says it can be extended to “any content deemed in violation of the terms of service of a social media platform.”
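
The mechanism Farid describes is, at its core, perceptual hashing: compute a compact signature for every image that reviewers confirm as violating, store those signatures, and compare the signature of each new upload against the stored list. The sketch below illustrates that idea in Python using a simple “difference hash” as a stand-in; PhotoDNA’s actual algorithm is proprietary and far more robust, and the file name, 64-bit hash size, and match threshold here are assumptions for illustration only.

```python
# Illustrative sketch of signature-based image matching. This uses a simple
# "difference hash" (dHash) as a stand-in; it is NOT the proprietary PhotoDNA
# algorithm. Requires Pillow (pip install Pillow).
from PIL import Image


def dhash(path: str, size: int = 8) -> int:
    """Compute a 64-bit perceptual hash: greyscale, shrink, compare neighbouring pixels."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = pixels[row * (size + 1) + col]
            right = pixels[row * (size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Count the bits on which two signatures differ."""
    return bin(a ^ b).count("1")


# Signatures of images that users flagged and reviewers confirmed as violations.
flagged_signatures = {dhash("flagged_image.jpg")}  # hypothetical example file


def is_known_violation(upload_path: str, threshold: int = 10) -> bool:
    """Return True if an upload is a near-duplicate of a previously flagged image."""
    signature = dhash(upload_path)
    return any(hamming(signature, known) <= threshold for known in flagged_signatures)
```

At scale, a platform would compare each upload against millions of stored signatures using an index rather than a linear scan, but the matching principle is the same: once an image has been flagged and fingerprinted, later copies can be caught automatically.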

Date: August 8, 2016
