By Sue Halpern
On Wednesday, a few hours before the C.E.O. of Facebook, Mark Zuckerberg, published a thirty-two-hundred-word post on his site titled “A Privacy-Focused Vision for Social Networking,” a new study from the market-research firm Edison Research revealed that Facebook had lost fifteen million users in the United States since 2017. “Fifteen million is a lot of people, no matter which way you cut it,” Larry Rosin, the president of Edison Research, said on American Public Media’s “Marketplace.” “This is the second straight year we’ve seen this number go down.” The trend is likely related to the public’s dawning recognition that Facebook has become both an unbridled surveillance tool and a platform for propaganda and misinformation. According to a recent Harris/Axios survey of the hundred most visible companies in the U.S., Facebook’s reputation has taken a precipitous dive in the last five years, with its most acute plunge in the past year, and it scores particularly low in the categories of citizenship, ethics, and trust.
While Zuckerberg’s blog post can be read as a response to this loss of faith, it is also a strategic move to capitalize on the social-media platform’s failures. To be clear, what Zuckerberg calls “town square” Facebook, where people post updates about new jobs, and share prom pictures and erroneous information about vaccines, will continue to exist. (On Thursday, Facebook announced that it would ban anti-vaccine advertisements on the site.) His new vision is to create a separate product that merges Facebook Messenger, WhatsApp, and Instagram into an encrypted and interoperable communications platform that will be more like a “living room.” According to Zuckerberg, “We’ve worked hard to build privacy into all our products, including those for public sharing. But one great property of messaging services is that, even as your contacts list grows, your individual threads and groups remain private. As your friends evolve over time, messaging services evolve gracefully and remain intimate.”
This new Facebook promises to store data securely in the cloud and to delete messages after a set amount of time to reduce “the risk of your messages resurfacing and embarrassing you later.” (Apparently, Zuckerberg already uses this feature, as TechCrunch reported in April, 2018.) Its interoperability means, for example, that users will be able to buy something from Facebook Marketplace and communicate with the seller via WhatsApp; Zuckerberg says this will enable the buyer to avoid sharing a phone number with a stranger. Just last week, however, a user discovered that phone numbers provided for two-factor authentication on Facebook can be used to track people across the Facebook universe. Zuckerberg does not address how the new product will handle this vulnerability, which will persist as long as “town square” Facebook continues to exist.
Once Facebook has merged all of its messaging products, the company plans to build other services on top of the combined platform, including payment portals, banking services, and, not surprisingly, advertising. In an interview with Wired’s editor-in-chief, Nicholas Thompson, Zuckerberg explained that “What I’m trying to lay out is a privacy-focused vision for this kind of platform that starts with messaging and making that as secure as possible with end-to-end encryption, and then building all of the other kinds of private and intimate ways that you would want to interact—from calling, to groups, to stories, to payments, to different forms of commerce, to sharing location, to eventually having a more open-ended system to plug in different kinds of tools for providing the interaction with people in all the ways that you would want.”
If this sounds familiar, it is. Zuckerberg’s concept borrows liberally from WeChat, the multipurpose Chinese social-networking platform, popularly known as China’s “app for everything.” WeChat’s billion monthly active users employ the app for texting, video conferencing, broadcasting, money transfers, paying fines, and making medical appointments. Privacy, however, is not one of its attributes. According to a 2015 article in Quartz, WeChat’s “heat map” feature alerts Chinese authorities to unusual crowds of people, which the government can then surveil.
Zuckerberg is quick to point out that his vision for this new Facebook is aspirational. It has not yet been built, and it is not certain that it can be built—at least in the way he imagines it—with across-the-board encryption. “I believe the future of communication will increasingly shift to private, encrypted services where people can be confident what they say to each other stays secure and their messages and content won’t stick around forever,” Zuckerberg tells us. “This is the future I hope we will help bring about.” By announcing it now, and framing it in terms of privacy, he appears to be addressing the concerns of both users and regulators, while failing to acknowledge that a consolidated Facebook will provide advertisers with an even richer and more easily accessed database of users than the site currently offers. As Wired reported in January, when the merger of Facebook’s apps was floated in the press, “the move will unlock huge quantities of user information that was previously locked away in silos.” (As Privacy Matters recently noted on Twitter, Facebook’s data policy states, “We collect information about the people, Pages, accounts, hashtags and groups that you are connected to and how you interact with them across our Products, such as people you communicate with the most or groups that you are part of.”)
Zuckerberg also acknowledges that an encrypted Facebook may pose problems for law enforcement and intelligence services, but he promises that the company will work with authorities to root out bad guys who “misuse it for truly terrible things like child exploitation, terrorism, and extortion.” It’s unclear how, with end-to-end encryption, it will be able to do this. Facebook’s private groups have already been used to incite genocide and other acts of violence, suppress voter turnout, and disseminate misinformation. Its pivot to privacy will not only give such activities more space to operate behind the relative shelter of a digital wall but will also relieve Facebook of the responsibility of policing them. Instead of more—and more exacting—content moderation, there will be less. Instead of removing bad actors from the service, the pivot to privacy will give them a safe harbor.
If “town square” Facebook is, as Zuckerberg and his associates like to say, “a neutral platform,” this new “living room” Facebook, where people gather in smallish, obscure groups, is likely to be a broadcast channel for all sorts of odious and malevolent ideas and behaviors. We’ve seen this already on WhatsApp. Last year, mobs in India were incited to kill more than two dozen innocent people after false rumors of a child-kidnapping ring were spread through the messaging app. As the New York Times reported at the time, “WhatsApp’s design makes it easy to spread false information. Many messages are shared in groups, and when they are forwarded, there is no indication of their origin.” (According to the Times, WhatsApp subsequently introduced new labels for forwarded messages, ran newspaper ads to warn the public about misinformation, and vowed to work more closely with police.) Instagram has also been an effective tool for spreading misinformation. According to data shared with the Senate Intelligence Committee last December, between 2015 and 2017, Russian propagandists working out of the Internet Research Agency generated posts that garnered a hundred and eighty-seven million interactions on the photo-sharing app, which was more engagement than they achieved on either Facebook or Twitter. An encrypted Instagram will be a boon to trolls.
Last week, the relentless British journalist Carole Cadwalladr, along with her colleague Duncan Campbell, reported in the Observer that Facebook has been pressuring politicians around the world to oppose regulation of the social network, offering investments and other incentives. (A Facebook spokesperson told the Observer that the documents forming the basis of the reporting were “cherrypicked.”) In the United States, the Federal Trade Commission is considering whether to fine Facebook billions of dollars for violating a 2011 consent decree intended to protect users’ private data. In Wednesday’s announcement, Zuckerberg, who is famous for shambling apologies, seemed to have finally run out of mea culpas. “I understand that many people don’t think Facebook can or would even want to build this kind of privacy-focused platform,” he wrote, “because frankly we don’t currently have a strong reputation for building privacy protective services.” On that point, at least, he’s right.