Meta Pushed Messenger Encryption Despite Internal Child Safety Warnings, Court Documents Reveal

Internal documents reveal Meta advanced Messenger encryption plans despite warnings it could significantly reduce child exploitation reporting capabilities.

Meta executives moved ahead with plans to introduce default end-to-end encryption across Facebook Messenger and Instagram even after senior staff warned the shift could weaken efforts to combat child exploitation, according to newly disclosed court documents.

The filings, made public in a New Mexico state court case, include internal emails, chats, and briefing materials obtained during discovery in a lawsuit filed by New Mexico Attorney General Raúl Torrez. The case alleges the company failed to adequately protect minors from online predators.

In one internal message dated March 2019, Monika Bickert, then the company’s head of content policy, voiced strong concern as CEO Mark Zuckerberg prepared to publicly announce the encryption rollout.

“We are about to do a bad thing as a company. This is so irresponsible,” Bickert wrote in an internal exchange.

The documents indicate that some senior safety and policy leaders feared the move would significantly reduce the company’s ability to proactively detect and report child exploitation cases to law enforcement agencies. While Zuckerberg promoted encryption as a major privacy enhancement, internal discussions reflected anxiety about its potential consequences.

“I'm not very invested in helping him sell this, I must say,” Bickert wrote regarding Zuckerberg’s public messaging. She added that with end-to-end encryption, “there is no way to find the terror attack planning or child exploitation” and proactively refer those cases to law enforcement.

End-to-end encryption ensures that only the sender and recipient can read a message, a feature widely used across messaging platforms such as Apple’s iMessage, Google Messages, and WhatsApp. However, child safety advocates, including the National Center for Missing & Exploited Children (NCMEC), have argued that applying such encryption within large social networking ecosystems presents heightened risks.

According to a February 2019 internal briefing cited in the court filing, Meta estimated that if Messenger had already been encrypted the previous year, its reports of child nudity and sexual exploitation imagery to NCMEC would have fallen from 18.4 million cases to 6.4 million, a decrease of roughly 65%.

A later revision of the document warned that Meta would have been “unable to provide data proactively to law enforcement in 600 child exploitation cases, 1,454 sextortion cases, 152 terrorist cases [and] 9 threatened school shootings.”

The lawsuit alleges that Meta allowed predators “unfettered access” to underage users and facilitated harmful connections that sometimes resulted in real-world abuse and trafficking. The trial, now underway, marks the first case of its kind against the company to reach a jury.

In response to questions, Meta spokesperson Andy Stone said the concerns raised internally in 2019 led to the development of additional safety tools before encrypted messaging was fully rolled out across Facebook and Instagram in 2023.

While chats are now encrypted by default, users can still report suspicious or harmful messages for review and possible referral to authorities. The company also introduced safeguards for teen accounts that restrict adults from initiating contact with minors they do not know.

In a 2019 email cited in the filings, Antigone Davis, then Global Head of Safety, warned: “FB [Facebook] allows pedophiles to find each other and kids via social graph with easy transition to Messenger.”

She contrasted Messenger with WhatsApp, noting: “WA (WhatsApp) does not make it easy to make social connections, meaning making Messenger e2ee (end-to-end encrypted) will be far, far worse than anything we have seen/gotten a glimpse of on WA.”

The broader case unfolds amid mounting legal and regulatory scrutiny over youth safety across Meta’s platforms.
