As law enforcement agencies scramble to address threats of terrorism, child sexual abuse, and human trafficking—and repressive governments around the world look to broadly expand their surveillance capabilities—researchers fear that Meta’s retreat from its commitments to protect user privacy with end-to-end encryption on Instagram chat could create a problematic precedent in big tech.
Meta spent the better part of a decade working to deploy end-to-end encryption by default across all of its chat apps. It was a saga, fraught with both technical and political hurdles. But in December 2023, the company declared victory, announcing default end-to-end encryption for Messenger and promising that the same protection was being tested for Instagram Direct Messaging as well. In the end, though, end-to-end encryption only came to Instagram chat as a backwater opt-in feature. And as threats to end-to-end encryption from governments around the world loom larger than ever, Meta quietly announced last week that it intends to eliminate the feature from Instagram chat entirely on May 8.
Crucially, few companies have the scale and stability needed to stake out an influential pro-end-to-end encryption position. And an even smaller group—namely, Meta and Apple—have made it a priority. Experts say that Meta’s decision about Instagram chat could give other companies, or even simply other divisions within Meta, permission to do less, too.
“Meta’s deployment of encryption was a public commitment, and they were weathering a lot of pressure from various governments to do it,” says Johns Hopkins cryptographer Matt Green, who has consulted for Meta over the years on its end-to-end encryption rollout as both an unpaid advisor and paid reviewer. “Public commitments to support privacy features are literally the only thing that we the public have. If they’re worthless, then why should we assume we’ll continue to have end-to-end encryption in Messenger and WhatsApp?”
Meta’s decision to revoke end-to-end encryption for Instagram chat seems to have been particularly alarming for researchers and privacy advocates because of the company’s stated reason for the change: low user adoption.
“Very few people were opting in to end-to-end encrypted messaging in DMs, so we’re removing this option from Instagram in the coming months,” a Meta spokesperson told WIRED and other outlets. “Anyone who wants to keep messaging with end-to-end encryption can easily do that on WhatsApp.”
The statement struck many as disingenuous given that Meta emphasized for years that it was committed specifically to default end-to-end encryption, not the opt-in version that ultimately emerged for Instagram chat buried behind layers of menus.
“Designed the feature so nobody could find it, killed it for not being easy enough to find and, therefore, unpopular. It’s deeply cynical,” says Davi Ottenheimer, a longtime security executive and creator of the post-quantum cryptography assessment tool pqprobe.
Johns Hopkins’ Green adds that Meta originally rolled out opt-in encryption for Messenger and, after seeing low adoption in that trial, seemingly learned the lesson that encryption needs to be on by default.
“This is a Meta post where they publicly commit to default encryption in Instagram chat. Then, seemingly without even looking back over it, they add an update to the top that implies that it was optional encryption, and blames lack of opt-in as the reason they need to remove this feature,” Green says. “Nothing about this is honest. They know what they promised.”
WIRED gave Meta multiple opportunities to comment for this story, but the company ultimately declined.
In a key 2019 treatise laying out his vision for privacy and security across Meta’s properties, CEO Mark Zuckerberg wrote, “I understand that many people don’t think Facebook can or would even want to build this kind of privacy-focused platform—because frankly, we don’t currently have a strong reputation for building privacy-protective services, and we’ve historically focused on tools for more open sharing.” But, he added, “we’ve repeatedly shown that we can evolve to build the services that people really want, including in private messaging and stories.”
Years later, in a December 2023 background call with WIRED ahead of the announcement that full, default end-to-end encryption was ready for Messenger and Instagram chat, a Meta employee who had worked for years on the project specifically described a phase from 2016 to early 2019 in which engineers were working on an optional encryption feature for Messenger, and then a phase beginning with Zuckerberg’s 2019 announcement where the team shifted to implementing default encryption.
Documents from inside Meta released as part of a lawsuit alleging that Meta did not adequately protect underage users from abuse show, though, that implementing end-to-end encryption by default has long been politically fraught within the company as well as outside of it. Reuters reported at the end of February that Meta head of content policy Monika Bickert wrote internally in March 2019 ahead of Zuckerberg’s announcement, “We are about to do a bad thing as a company. This is so irresponsible.”
Privacy advocates have long pointed out that while child safety and public safety should always be paramount, child sexual abuse and other crimes still play out daily on chat apps and other digital services that do not offer the universal protection of end-to-end encryption. In other words, adding default end-to-end encryption gives numerous protections to everyone, while eliminating it takes those protections away without solving threats to the most vulnerable.
Meanwhile, Meta’s statement about its decision to revoke end-to-end encryption for Instagram chat notably does not mention that the protection is available on Messenger, pointing users instead to WhatsApp. This is, perhaps, a revealing omission. As Casey Newton pointed out on Monday in a Platformer story, Meta will eliminate the stand-alone Messenger website in April and is in the process of recoupling Messenger with Facebook after a major push in 2014 and 2015 to separate Messenger as a stand-alone product. In 2015 WIRED reported that “Facebook wants to position Messenger as your default chat app.” Now, it seems the product is being sidelined.
Meta is investing in at least one new project that would bring encryption protections to more of its users: a partnership with Signal creator Moxie Marlinspike to deploy his new private AI technology, known as Confer, for Meta AI. The move, announced by Marlinspike this week, could protect millions of conversations people have with Meta’s AI chatbot. The collaboration is in an early phase, though, and there are no details yet on exactly how Confer might be integrated.
Some sources tell WIRED that, absent more information from Meta about its Instagram chat decision, the only conclusion they can come to is that the company’s commitment to implementing end-to-end encryption by default across its chat platforms was always a ploy to improve its public image and mend user trust in the wake of numerous scandals in which the company mismanaged user data or suffered breaches.
“Encryption was used politically as a shield and a sword,” pqprobe’s Ottenheimer says. “The shield was against the post-Cambridge Analytica trust collapse and the sword was against governments who’d been pressuring Meta on safety and content moderation concerns. Now I guess the privacy brand isn’t as valuable, so they just reverse it and blame the users.”
Publisher: wired.com