Facebook At Center Of Storm Over Child Sexual Exploitation Online

There has been an explosion of child sexual abuse material (CSAM) online, and it is likely to get much worse unless tech companies take more aggressive action. What was once the province of individual child predators taking photos for their own use has, through the proliferation of smartphones, social networks, and data storage, grown exponentially as the internet has expanded and children have gone online. (One third of internet users are children, and 800 million children are now on social media.)

Twenty years ago, there were about 3,000 reports of child sexual abuse imagery known to law enforcement. Ten years ago, that figure had grown to more than 100,000. By 2018, it had leapt to more than 18 million reports, encompassing 45.5 million images and videos. Experts lament that this is just the tip of the iceberg.

In 2018, Facebook (especially Facebook Messenger) was responsible for 16.8 million of the 18.4 million CSAM reports worldwide, or 91 percent of the total. To be fair, Facebook accounts for the bulk of reported images because it scans for them more actively than any other company; most barely bother. Yet despite Facebook's claim to be the industry leader in combating CSAM, its other activities hamper its reporting and help enable child abuse online.

For example, Facebook's searches rely on artificial intelligence that generally detects previously identified images but struggles with new images, videos, and livestreams. Human confirmation is typically needed, and most of Facebook's reports go to nonprofit groups, such as the National Center for Missing and Exploited Children, that are overwhelmed by the avalanche of material they are sent.

Reporting aside, Facebook and its tech peers have grossly insufficient age verification procedures, allowing predators access to kids. Tech companies also often lobby against legislation to control online child data collection, safety measures, and sexual exploitation.

Most importantly, Facebook's rush toward end-to-end encryption in Messenger and Instagram, in the name of protecting user privacy, seems to ignore the overwhelming threat to children's privacy and safety. Encryption will give child predators cover that will dramatically expand their reach and the number of their victims. As Facebook CEO Mark Zuckerberg himself blogged, "Encryption is a powerful tool for privacy, but that includes the privacy of people doing bad things." In late 2019, 129 child protection groups implored the company to halt its move to encryption because of the child sexual abuse harms it would enable.

Electronic service providers (websites, email, social media, and cloud storage) are currently not liable for what users say or do on their platforms. However, U.S. and U.K. lawmakers are considering legislation to remove this immunity in CSAM cases.

A shareholder group led by Christian Brothers Investment Services has engaged Alphabet/Google, Apple, AT&T, Facebook, Microsoft, Sprint, Deutsche Telekom, T-Mobile US and Verizon on this issue. Proxy Impact, on behalf of Lisette Cooper and along with several co-filers, has a resolution pending at Facebook asking it to assess the risk of increased child sexual exploitation from end-to-end encryption. 

The information and communications technology sector is the world's main facilitator of child sexual exploitation. Facebook is the world's largest social media company, with 2.45 billion monthly active users. It is not unreasonable to expect a $70 billion company to help solve a problem it has helped to create, and one that Facebook and the tech industry are about to make much worse.

Michael Passoff
CEO, Proxy Impact