WIRED

YESTERDAY’S WIKILEAKS DUMP reiterated something we already knew: Our devices are fundamentally unsafe. No matter what kind of encryption we use, no matter which secure messaging apps we take care to run, no matter how careful we are to sign up for two-factor authentication, the CIA—and, we have to assume, other hackers—can infiltrate our operating systems, take control of our cameras and microphones, and bend our phones to their will. The same can be said of smart TVs, which could be made to surreptitiously record our living-room conversations, and internet-connected cars, which could potentially be commandeered and even crashed.
Previous security revelations told us that our data wasn’t safe. The Vault 7 leak reminded us that our machines aren’t secure—and, because those machines live in our homes and on our bodies, they render our homes and bodies insecure as well. There is a word for these flaws and holes in code that leave us open to harm, and it is the same word for the unease that accompanies them: vulnerability.
Take the iPhone—a single example among many, but an especially instructive one. Last year, while fighting the FBI’s request to access the iPhone of the San Bernardino shooter, Apple CEO Tim Cook presented his company as a bulwark against intruders. “Customers expect Apple and other technology companies to do everything in our power to protect their personal information,” he wrote. Now, like a child learning of his parents’ inability to prevent bad things from happening, we understand Cook’s promises to be unfulfillable. This morning Apple announced that it had already patched most of these holes, but we can never know whether other holes remain, unknown to us or to the company.
If we feel freshly vulnerable, we are not alone. The darlings of the tech industry—which for much of the past decade have convincingly presented themselves as swaggering inevitabilities—are showing signs of vulnerability as well. Google and Facebook, which pride themselves as algorithmically pristine information-delivery systems, fell prey to fake-news mills and virulent troll armies. Uber’s scorched-earth approach to capitalism and human resources, which once made it a seemingly indomitable competitor, now threatens to sink its once-bulletproof CEO. The more powerful and inevitable something appears, the more startling and devastating its weaknesses are when they are exposed. Or, to borrow a phrase, the harder they come, the harder they fall.
That’s useful to remember when you consider the transformation we are currently undergoing, one in which more and more of our devices become connected to the internet. Whether you call it the “Internet of Things” or the “Internet of Everything” or the “Third Wave” or the “Programmable World,” the long-predicted moment when connectivity becomes as ubiquitous as electricity is nearly upon us. The benefits will be staggering—a world that will know us and adjust to our needs and desires, a universe of data that will impart new wisdom. But so will the vulnerabilities, the opportunities for our worlds to be penetrated, manipulated, and even destroyed by malevolent intruders.
This exposes yet another vulnerability for the tech industry—a meta-vulnerability, really. That vision depends on trust. It requires us to put our faith in our self-driving cars and Alexa-enabled virtual assistants and thermostats and, yes, smart televisions. Every time we learn of a new zero-day exploit, it renews fears of an entirely hackable world, where our machines can be enlisted against us. It reminds us that the future is a necessarily more vulnerable place.
The Vault 7 leak is not the tech industry’s fault, exactly, but it forces the question: at what point do we stop placing our trust in devices, systems, and people that are inherently undeserving of it? Actually, never mind, we’re past that point already. The most troubling aspect of the latest revelations is that there is no way to protect yourself beyond not buying a smartphone, or at least not having any meaningful conversations when you are in the same room with one. These vulnerabilities and cracks are not optional, but woven throughout the fabric of our social and commercial lives. They are coming from inside the house.
Exclusive: Privacy campaigners criticise WhatsApp vulnerability as a ‘huge threat to freedom of speech’ and warn it could be exploited by government agencies
A security backdoor that can be used to allow Facebook and others to intercept and read encrypted messages has been found within its WhatsApp messaging service.
Facebook claims that no one can intercept WhatsApp messages, not even the company and its staff, ensuring privacy for its billion-plus users. But new research shows that the company could in fact read messages due to the way WhatsApp has implemented its end-to-end encryption protocol.
Privacy campaigners said the vulnerability is a “huge threat to freedom of speech” and warned it can be used by government agencies to snoop on users who believe their messages to be secure. WhatsApp has made privacy and security a primary selling point, and has become a go-to communications tool of activists, dissidents and diplomats.
WhatsApp’s end-to-end encryption relies on unique security keys, generated using the acclaimed Signal protocol developed by Open Whisper Systems, which are exchanged and verified between users to guarantee that communications are secure and cannot be intercepted by a middleman. However, WhatsApp has the ability to force the generation of new encryption keys for offline users, unbeknown to the sender and recipient of the messages, and to make the sender re-encrypt any messages that have not been marked as delivered with the new keys and send them again.
The recipient is not made aware of this change in encryption, while the sender is only notified if they have opted-in to encryption warnings in settings, and only after the messages have been resent. This re-encryption and rebroadcasting effectively allows WhatsApp to intercept and read users’ messages.
The security backdoor was discovered by Tobias Boelter, a cryptography and security researcher at the University of California, Berkeley. He told the Guardian: “If WhatsApp is asked by a government agency to disclose its messaging records, it can effectively grant access due to the change in keys.”
The backdoor is not inherent to the Signal protocol. Open Whisper Systems’ messaging app, Signal, the app used and recommended by whistleblower Edward Snowden, does not suffer from the same vulnerability. If a recipient changes the security key while offline, for instance, a sent message will fail to be delivered and the sender will be notified of the change in security keys without automatically resending the message.
WhatsApp’s implementation automatically resends an undelivered message with a new key without warning the user in advance or giving them the ability to prevent it.
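The difference between the two implementations can be sketched in a toy model. Everything here is illustrative (the class, policy names and behaviour are stand-ins for the reported designs, not actual WhatsApp or Signal code):

```python
from dataclasses import dataclass, field

@dataclass
class Sender:
    """Toy sender modelling how a client might react when a recipient's
    security key changes while messages are still queued as undelivered.
    policy: "block" (Signal-style) or "auto_resend" (WhatsApp-style)."""
    policy: str
    outbox: list = field(default_factory=list)   # messages not yet marked delivered
    log: list = field(default_factory=list)

    def on_recipient_key_change(self, new_key: str) -> list:
        if self.policy == "block":
            # Signal-style: delivery fails and the user is warned;
            # nothing is resent until the new key is verified.
            self.log.append("WARNING: key changed; queued messages NOT resent")
            return []
        # WhatsApp-style: silently re-encrypt the queue under the new key
        # and resend, with at most an after-the-fact notification.
        self.log.append("key changed; queue re-encrypted and resent")
        return [(msg, new_key) for msg in self.outbox]

signal_like = Sender(policy="block", outbox=["hello"])
whatsapp_like = Sender(policy="auto_resend", outbox=["hello"])

assert signal_like.on_recipient_key_change("K2") == []
assert whatsapp_like.on_recipient_key_change("K2") == [("hello", "K2")]
```

The design difference is small in code but large in consequence: whichever party controls key changes for the "auto_resend" policy can obtain re-encrypted copies of queued messages without the sender's consent.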
Boelter reported the backdoor vulnerability to Facebook in April 2016, but was told that Facebook was aware of the issue, that it was “expected behaviour” and wasn’t being actively worked on. The Guardian has verified the backdoor still exists.
Steffen Tor Jensen, head of information security and digital counter-surveillance at the European-Bahraini Organisation for Human Rights, verified Boelter’s findings. He said: “WhatsApp can effectively continue flipping the security keys when devices are offline and re-sending the message, without letting users know of the change till after it has been made, providing an extremely insecure platform.”
Boelter said: “[Some] might say that this vulnerability could only be abused to snoop on ‘single’ targeted messages, not entire conversations. This is not true if you consider that the WhatsApp server can just forward messages without sending the ‘message was received by recipient’ notification (or the double tick), which users might not notice. Using the retransmission vulnerability, the WhatsApp server can then later get a transcript of the whole conversation, not just a single message.”
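Boelter's scenario — withhold delivery receipts so messages stay queued, then force a key change to receive the whole backlog — can be simulated in a few lines. This is a toy model under the auto-resend assumption described above; the names and the `enc[...]` notation are invented for illustration:

```python
class Client:
    """Illustrative sender with an auto-resend policy on key change."""
    def __init__(self, recipient_key: str):
        self.recipient_key = recipient_key
        self.undelivered = []            # queued until a delivery receipt arrives

    def send(self, server, text: str):
        server.relay(f"enc[{self.recipient_key}]({text})")
        self.undelivered.append(text)

    def on_key_change(self, server, new_key: str):
        # Auto-resend: silently re-encrypt the whole queue under the new key.
        self.recipient_key = new_key
        for text in self.undelivered:
            server.relay(f"enc[{new_key}]({text})")

class Server:
    """Relay that deliberately never forwards delivery receipts,
    so nothing ever leaves the sender's undelivered queue."""
    def __init__(self):
        self.seen = []

    def relay(self, ciphertext: str):
        self.seen.append(ciphertext)     # no "delivered" tick goes back

alice, server = Client("BOB_KEY"), Server()
for text in ["msg1", "msg2", "msg3"]:
    alice.send(server, text)

# Later the server announces a key it controls; the sender re-encrypts
# and resends the entire backlog, yielding a readable transcript.
alice.on_key_change(server, "SERVER_KEY")
assert server.seen[-3:] == [f"enc[SERVER_KEY](msg{i})" for i in (1, 2, 3)]
```

This is why the attack is not limited to single messages: every message whose receipt was suppressed is still in the queue when the forced rekey arrives.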
The vulnerability calls into question the privacy of messages sent across the service, which is used around the world, including by people living in oppressive regimes.
Professor Kirstie Ball, co-director and founder of the Centre for Research into Information, Surveillance and Privacy, called the existence of a backdoor within WhatsApp’s encryption “a gold mine for security agencies” and “a huge betrayal of user trust”. She added: “It is a huge threat to freedom of speech, for it to be able to look at what you’re saying if it wants to. Consumers will say, I’ve got nothing to hide, but you don’t know what information is looked for and what connections are being made.”
In the UK, the recently passed Investigatory Powers Act allows the government to intercept bulk data of users held by private companies, without suspicion of criminal activity, similar to the activity of the US National Security Agency uncovered by the Snowden revelations. The government also has the power to force companies to “maintain technical capabilities” that allow data collection through hacking and interception, and requires companies to remove “electronic protection” from data. Intentional or not, WhatsApp’s backdoor to the end-to-end encryption could be used in such a way as to facilitate government interception.
Jim Killock, executive director of Open Rights Group, said: “If companies claim to offer end-to-end encryption, they should come clean if it is found to be compromised – whether through deliberately installed backdoors or security flaws. In the UK, the Investigatory Powers Act means that technical capability notices could be used to compel companies to introduce flaws – which could leave people’s data vulnerable.”
A WhatsApp spokesperson told the Guardian: “Over 1 billion people use WhatsApp today because it is simple, fast, reliable and secure. At WhatsApp, we’ve always believed that people’s conversations should be secure and private. Last year, we gave all our users a better level of security by making every message, photo, video, file and call end-to-end encrypted by default. As we introduce features like end-to-end encryption, we focus on keeping the product simple and take into consideration how it’s used every day around the world.
“In WhatsApp’s implementation of the Signal protocol, we have a ‘Show Security Notifications’ setting (option under Settings > Account > Security) that notifies you when a contact’s security code has changed. We know the most common reasons this happens are because someone has switched phones or reinstalled WhatsApp. This is because in many parts of the world, people frequently change devices and Sim cards. In these situations, we want to make sure people’s messages are delivered, not lost in transit.”
Asked to comment specifically on whether Facebook/WhatsApp had accessed users’ messages, and whether it had done so at the request of government agencies or other third parties, the company directed the Guardian to its site that details aggregate data on government requests by country.
Concerns over the privacy of WhatsApp users have been repeatedly highlighted since Facebook acquired the company for $22bn in 2014. In August 2016, Facebook announced a change to the privacy policy governing WhatsApp that allowed the social network to merge WhatsApp users’ data, including phone numbers and app usage, with Facebook’s, for advertising and development purposes.