Last weekend, Facebook CEO Mark Zuckerberg called for governments and regulators to become involved in policing harmful content online. However, Mr. Zuckerberg’s call was intentionally heavy on idealism and rhetoric and light on specifics. Once more, the company’s PR strategy is carefully calibrated to give the public very little understanding of how Mr. Zuckerberg plans to back his words with action.
Mr. Zuckerberg suggested an idea “for third-party bodies to set standards governing the distribution of harmful content and to measure companies against those standards.” Prior to Mr. Zuckerberg’s op-ed, CEP called for this very concept, indicating that the third-party body could include governments, civil society, and even the tech industry’s much-touted Global Internet Forum to Counter Terrorism (GIFCT). However, two years after Facebook, Google/YouTube, and Twitter established the GIFCT, the group has failed to establish an industry-wide restriction on the “worst of the worst” content and has allowed that material to remain online. Moreover, the tech industry has actively and repeatedly opposed EU government efforts to establish baseline standards for content removal – the very action Facebook’s CEO is now claiming to support. Clearly, given the GIFCT’s failures, governments must step in to protect the public, and Facebook must not be permitted to use its lobbying influence to weaken that effort.
Further, if Facebook and Mr. Zuckerberg are sincere about making the Internet safer and more secure, then they must be prepared to support specific proposals for regulation by public officials. CEP is calling on Facebook to stay true to Mr. Zuckerberg’s pronouncement that Internet companies should be held “accountable for enforcing standards on harmful content” and to support amending Section 230 of the Communications Decency Act (CDA). Section 230 must be amended to remove companies’ blanket protections from liability for content posted by third parties on their platforms when that content is incontrovertibly known to be extremist in nature or otherwise harmful. This would expose companies to civil liability for content they host that incites or could incite terrorist or violent attacks, and would incentivize them to monitor their platforms more closely. A similar amendment was signed into law to ensure that the CDA could not be used to shield violators of sex trafficking laws.
CEP is also calling on Facebook not only to voluntarily release transparency reports about its efforts to monitor and remove extremist or otherwise harmful content, but also to support amending the securities laws administered by the Securities and Exchange Commission (SEC). Mr. Zuckerberg’s suggestion that companies would voluntarily release information about their inner workings without the government mandating and standardizing such reports is unsupported by these companies’ track record. They have repeatedly demonstrated that they cannot be trusted to do so. With that in mind, Congress must amend the securities laws to require that publicly traded Internet companies file SEC reports on how much extremist content exists on their platforms as well as the efficacy of their removal efforts. Releasing this information is a responsibility companies have not only to the general public, but also to their shareholders.
Lastly, Mr. Zuckerberg’s upcoming trip to Washington, D.C., reported as an effort to “get ahead of global regulators by proposing rules of his own,” itself contradicts his op-ed. Behind closed doors, Facebook told European lawmakers that it believes “the industry does not need a regulatory push to improve.” And between 2017 and 2018, the tech industry expanded its lobbying expenditures in Washington, D.C. alone by 10 percent, spending a small fortune of $64 million to quash regulations. If the company insists on maintaining its lobbying presence, it should instead commit only to bringing its Silicon Valley counterparts on board with meaningful legislation.
Facebook must definitively demonstrate its commitment to keeping communities safe and pledge not to spend any more money or time lobbying against government regulation. Rather than simply issuing platitudes, the tech giant must lead by example and allow lawmakers to discuss and negotiate – without industry interference – the legislation most appropriate to public safety and security.