In his latest announcement, Facebook CEO Mark Zuckerberg embraces privacy and security fundamentals like end-to-end encrypted messaging. But announcing a plan is one thing. Implementing it is entirely another. And for those reading between the lines of Zuckerberg’s pivot-to-privacy manifesto, it’s clear that this isn’t just about privacy. It’s also about competition.
The Proof Is in the Pudding
At the core of Zuckerberg’s announcement is Facebook’s plan to merge its three messaging platforms: Facebook’s Messenger, Instagram’s Direct, and WhatsApp. The announcement promises security and privacy features across the board, including end-to-end encryption, ephemerality, reduced data retention, and a commitment to not store data in countries with poor human rights records. This would mean that your messages on any of these platforms would be unreadable to anyone but you and your recipients; could be set to disappear at certain intervals; and would not be stored indefinitely or in countries that are likely to attempt to improperly access your data. Even better, the announcement promises that Facebook will not store your encryption keys for any of these services, as is already the case with WhatsApp.
This all sounds great, in theory. But secure messaging is not easy to get right at either the technical or policy level.
In technical terms, end-to-end encryption is only part of the story. In practice, the choices that undermine messaging security often lie far from the encryption engine. Strong authentication, for example, is necessary to ensure that you are messaging only with your intended recipients and not with any law enforcement “ghosts.” Automatic backups are another potential chink in the armor; if you choose to have WhatsApp back up your messages, it stores an unencrypted copy of your messages on iCloud (for iPhone) or Google Drive (for Android), essentially undermining the app’s end-to-end encryption.
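The authentication point above can be made concrete. One common approach is comparing key fingerprints out-of-band: each party derives a short string from the other's public key and they compare the strings over a trusted channel. The sketch below is a minimal illustration of that idea using a plain SHA-256 hash; real messengers (WhatsApp's "security code," Signal's "safety numbers") use their own derivations, and the key values here are hypothetical placeholders.

```python
import hashlib

def key_fingerprint(public_key_bytes: bytes) -> str:
    """Derive a short, human-comparable fingerprint from a public key.

    Illustrative sketch only; production messengers use their own
    fingerprint formats, not a truncated SHA-256 hex digest.
    """
    digest = hashlib.sha256(public_key_bytes).hexdigest()
    # Group into 4-character blocks so two people can read them aloud.
    return " ".join(digest[i:i + 4] for i in range(0, 16, 4))

# Alice and Bob each compute the fingerprint their app displays for the
# other party, then compare the strings in person or over a phone call.
alice_sees = key_fingerprint(b"bob-public-key")       # hypothetical key
bob_sees = key_fingerprint(b"bob-public-key")
assert alice_sees == bob_sees  # fingerprints match: same key on both ends

# If a "ghost" key were silently inserted, the fingerprints would differ.
ghost_sees = key_fingerprint(b"attacker-public-key")  # hypothetical key
assert alice_sees != ghost_sees
```

The comparison only helps if it happens over a channel the server cannot tamper with, which is why apps surface these codes for manual, out-of-band verification.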
The prospect of merging WhatsApp, Instagram, and Messenger also raises concerns about combining identities that users intended to keep separate. Each of the three uses a different way to establish your identity: WhatsApp uses your phone number; Instagram asks for a username; and Messenger requires your “authentic name.” It’s not unusual for people to use each app for different parts of their life; therapists, sex workers, and activists, for example, face huge risks if they can no longer manage separate identities across these platforms.
Zuckerberg’s announcement claims that merging the three apps “would be opt-in and you will be able to keep your accounts separate if you like.” An opt-in—not an opt-out—is an important safety valve and the right choice. Time will tell if a merged “Whatstamessenger” can pull off this promise.
Above all, Facebook needs to be transparent about its business model. For example, while end-to-end encryption protects the contents of your messages, it cannot protect the metadata: who the recipients are, when messages are sent, and even where you are. Will Facebook be tracking and retaining that metadata? What about the possibility of a “super-app” model like WeChat’s? Without transparency about how Facebook will monetize its end-to-end encrypted services, users and advocates cannot scrutinize the various pressure points that business model might place on privacy and security.
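The contents-versus-metadata distinction can be seen in a toy message envelope. In the sketch below, the "ciphertext" is just a stand-in hash rather than real encryption, and the field names are invented for illustration; the point is which fields a server still reads even when the payload is end-to-end encrypted.

```python
import hashlib
import json
import time

def make_envelope(sender: str, recipient: str, plaintext: bytes) -> dict:
    """Toy message envelope. The 'ciphertext' is a placeholder hash,
    not real encryption; the structure shows what the server can see."""
    return {
        # Metadata the service must read to route and deliver the message:
        "sender": sender,
        "recipient": recipient,
        "timestamp": time.time(),
        "size": len(plaintext),
        # Content the service cannot read under end-to-end encryption:
        "ciphertext": hashlib.sha256(plaintext).hexdigest(),
    }

envelope = make_envelope("alice", "bob", b"meet at 6pm")
# The server never sees the plaintext...
assert "meet" not in json.dumps(envelope)
# ...but it still learns who talked to whom, when, and roughly how much.
assert envelope["sender"] == "alice" and envelope["recipient"] == "bob"
```

Whether Facebook retains, mines, or monetizes those routing fields is exactly the transparency question the encryption promise leaves open.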
We could never get on board with a tool—even one that made solid technical choices—unless it was developed, and its infrastructure maintained, by a trustworthy group with a history of responsible stewardship. Zuckerberg’s statement is vague about how Facebook will consult with “safety experts, law enforcement and governments on the best way to implement safety measures,” and what that will mean for how Facebook responds to government data requests.
Recent news also does not inspire optimism that Facebook can be a responsible steward of security and privacy features. One need look no further than this week’s headlines about the extent to which Facebook has abused two-factor authentication—a security feature—to share and expose users’ phone numbers.
Pay No Attention to the Competition Concerns Behind the Curtain
Facebook’s privacy-focused vision is also a competition move. Zuckerberg’s out-of-character privacy focus in this announcement takes a page from The Wizard of Oz: “Pay no attention to the competition concerns behind the curtain!”
This is clearest when Zuckerberg’s announcement turns to “interoperability,” describing how users will be able to message friends on WhatsApp, Instagram, or Messenger from any one of the three apps. But it appears Facebook’s aim isn’t necessarily to make its messaging properties interoperable, but to make them indistinguishable—at least as far as regulators are concerned. Combining the services beyond recognition might give Facebook a technical excuse to sidestep impending competition and data-sharing regulation. Timing is key here: This privacy announcement comes on the heels of a German order to prevent Facebook from pooling user data without consent.
More broadly, Zuckerberg’s idea of interoperability might better be called “consolidation.” The announcement lays out a convenient future in which users have the freedom to communicate however they want…as long as they use Facebook-owned apps or SMS texting to do it. Zuckerberg’s excuse for excluding everyone else’s apps and messengers from this vision is security: “[I]t would create safety and spam vulnerabilities in an encrypted system to let people send messages from unknown apps where our safety and security systems couldn’t see the patterns of activity.” But a future in which Facebook is the sole owner and guardian of our communication methods is not good news for user security, choice, and control.
If Facebook really cares about interoperability, it should pursue open standards that level the playing field, not a closed proprietary family of apps that entrenches Facebook’s own dominance.
Gennie Gebhardt earned a Master of Library and Information Science from the University of Washington, where her thesis with the Department of Computer Science & Engineering’s Security & Privacy Research Lab investigated user reactions to censorship. She is associate director for research at the Electronic Frontier Foundation, which first published this piece.