Finding the Balance Between Child Safety and Internet Privacy

Online child sexual exploitation is a global crisis, and it is growing at an exponential rate. Yet efforts to promote online child safety have met strong opposition from privacy and human rights proponents. Although advocates on each side seem to be at odds, child safety and internet privacy do not have to conflict.

Apple’s recent experience illustrates the apparent conflict. The company, renowned for its strong privacy protections, announced in August 2021 that it would expand measures to protect children. Child safety advocates called this a long-overdue step that would help reduce the tens of millions of pieces of child sexual abuse material (CSAM) posted online. But Apple delayed implementation of its plan in the face of intense backlash from privacy and civil society groups. While many privacy concerns are valid, objections to Apple’s policy were often alarmist, with cries of ‘they’re spying on our phones,’ even though Apple’s hashing technology neither searches phones nor provides Apple with any non-CSAM information.
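To make that distinction concrete, the sketch below shows hash-list matching in principle: a device computes a fingerprint of a file and checks whether it appears in a list of known fingerprints, learning nothing about files that do not match. This is a simplified illustration only, using an ordinary SHA-256 digest; Apple’s actual system uses a perceptual hash (NeuralHash) and a cryptographic private-set-intersection protocol, and the hash value shown here is a placeholder.

```python
import hashlib

# Simplified illustration of hash-list matching (NOT Apple's actual system,
# which uses a perceptual hash and private set intersection). The known-hash
# value below is a placeholder, not a real database entry.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_digest(path: str) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_known_list(path: str) -> bool:
    """True only if the file's digest appears in the known-hash list.

    A non-matching file reveals nothing beyond 'no match' -- the key
    difference between hash matching and reading or scanning content.
    """
    return file_digest(path) in KNOWN_HASHES
```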

Meta (formerly Facebook), not renowned for its privacy protections, said its new end-to-end encryption on Facebook, Messenger and Instagram would address longstanding concerns. While applauded by privacy advocates, Meta soon faced backlash from other stakeholders. Meta is the main source of online CSAM reports: in 2021, its platforms accounted for 94 percent of the 29 million CSAM reports. Child safety advocates, law enforcement and governments worldwide are intensely concerned that the new encryption program will make most instances of CSAM invisible, protect child predators and leave children more vulnerable. This has led to proposed online child safety legislation in several countries, which in turn is opposed by privacy and human rights proponents.

Companies, and society, are being asked to choose between child safety and internet privacy.

There must be a better way.

In an attempt to find common ground that addresses both privacy and CSAM, the Interfaith Center on Corporate Responsibility launched an initiative to bring together shareholders and advocates for child safety, internet privacy and human rights to better understand the conflicts and find solutions. All agree that a crucial first step is to fix failed age verification and enforcement policies, which allow children, and adults pretending to be different ages, onto the same platforms. Meta’s new metaverse seems poised to compound the problem, since it can allow easy access by under-age participants, raising the specter of even more direct, inappropriate and dangerous contact between predators and children.

Since 2019, shareholder engagement with Facebook, Apple, Alphabet, AT&T and Verizon on CSAM has produced mixed results. Verizon and AT&T have conducted child risk assessments and reduced specific risks. But Facebook and Alphabet have been less willing to discuss the problem, and it remains difficult to independently assess what actions they have taken. Apple’s announcement of its child safety policy was a welcome surprise; its subsequent cancellation, a disappointment.

Part of any solution will be better information, which is why Proxy Impact has resubmitted a resolution that asks Meta to report on the risk of increased child sexual exploitation from end-to-end encryption and other privacy plans. Before putting more children at risk, Meta and the IT industry need to do more to help find a workable solution supported by advocates on all sides.

 

Michael Passoff
CEO, Proxy Impact