How Facebook Undermines Privacy Protections for Its 2 Billion WhatsApp Users

Source: https://www.propublica.org/article/how-facebook-undermines-privacy-protections-for-its-2-billion-whatsapp-users

When Mark Zuckerberg unveiled a new “privacy-focused vision” for Facebook in March 2019, he cited the company’s global messaging service, WhatsApp, as a model. Acknowledging that “we don’t currently have a strong reputation for building privacy protective services,” the Facebook CEO wrote that “I believe the future of communication will increasingly shift to private, encrypted services where people can be confident what they say to each other stays secure and their messages and content won’t stick around forever. This is the future I hope we will help bring about. We plan to build this the way we’ve developed WhatsApp.”

Zuckerberg’s vision centered on WhatsApp’s signature feature, which he said the company was planning to apply to Instagram and Facebook Messenger: end-to-end encryption, which converts all messages into an unreadable format that is only unlocked when they reach their intended destinations. WhatsApp messages are so secure, he said, that nobody else — not even the company — can read a word. As Zuckerberg had put it earlier, in testimony to the U.S. Senate in 2018, “We don’t see any of the content in WhatsApp.”

WhatsApp emphasizes this point so consistently that a flag with a similar assurance automatically appears on-screen before users send messages: “No one outside of this chat, not even WhatsApp, can read or listen to them.”

Given those sweeping assurances, you might be surprised to learn that WhatsApp has more than 1,000 contract workers filling floors of office buildings in Austin, Texas, Dublin and Singapore. Seated at computers in pods organized by work assignments, these hourly workers use special Facebook software to sift through millions of private messages, images and videos. They pass judgment on whatever flashes on their screen — claims of everything from fraud or spam to child porn and potential terrorist plotting — typically in less than a minute.

The workers have access to only a subset of WhatsApp messages — those flagged by users and automatically forwarded to the company as possibly abusive. The review is one element in a broader monitoring operation in which the company also reviews material that is not encrypted, including data about the sender and their account.

Policing users while assuring them that their privacy is sacrosanct makes for an awkward mission at WhatsApp. A 49-slide internal company marketing presentation from December, obtained by ProPublica, emphasizes the “fierce” promotion of WhatsApp’s “privacy narrative.” It compares its “brand character” to “the Immigrant Mother” and displays a photo of Malala Yousafzai, who survived a shooting by the Taliban and became a Nobel Peace Prize winner, in a slide titled “Brand tone parameters.” The presentation does not mention the company’s content moderation efforts.

WhatsApp’s director of communications, Carl Woog, acknowledged that teams of contractors in Austin and elsewhere review WhatsApp messages to identify and remove “the worst” abusers. But Woog told ProPublica that the company does not consider this work to be content moderation, saying: “We actually don’t typically use the term for WhatsApp.” The company declined to make executives available for interviews for this article, but responded to questions with written comments. “WhatsApp is a lifeline for millions of people around the world,” the company said. “The decisions we make around how we build our app are focused around the privacy of our users, maintaining a high degree of reliability and preventing abuse.”

WhatsApp’s denial that it moderates content is noticeably different from what Facebook Inc. says about WhatsApp’s corporate siblings, Instagram and Facebook. The company has said that some 15,000 moderators examine content on Facebook and Instagram, neither of which is encrypted. It releases quarterly transparency reports that detail how many accounts Facebook and Instagram have “actioned” for various categories of abusive content. There is no such report for WhatsApp.

Deploying an army of content reviewers is just one of the ways that Facebook Inc. has compromised the privacy of WhatsApp users. Together, the company’s actions have left WhatsApp — the largest messaging app in the world, with two billion users — far less private than its users likely understand or expect. A ProPublica investigation, drawing on data, documents and dozens of interviews with current and former employees and contractors, reveals how, since purchasing WhatsApp in 2014, Facebook has quietly undermined its sweeping security assurances in multiple ways. (Two articles this summer noted the existence of WhatsApp’s moderators but focused on their working conditions and pay rather than their effect on users’ privacy. This article is the first to reveal the details and extent of the company’s ability to scrutinize messages and user data — and to examine what the company does with that information.)

Many of the assertions by content moderators working for WhatsApp are echoed by a confidential whistleblower complaint filed last year with the U.S. Securities and Exchange Commission. The complaint, which ProPublica obtained, details WhatsApp’s extensive use of outside contractors, artificial intelligence systems and account information to examine user messages, images and videos. It alleges that the company’s claims of protecting users’ privacy are false. “We haven’t seen this complaint,” the company spokesperson said. The SEC has taken no public action on it; an agency spokesperson declined to comment.

Facebook Inc. has also downplayed how much data it collects from WhatsApp users, what it does with it and how much it shares with law enforcement authorities. For example, WhatsApp shares metadata, unencrypted records that can reveal a lot about a user’s activity, with law enforcement agencies such as the Department of Justice. Some rivals, such as Signal, intentionally gather much less metadata to avoid incursions on their users’ privacy, and thus share far less with law enforcement. (“WhatsApp responds to valid legal requests,” the company spokesperson said, “including orders that require us to provide on a real-time going forward basis who a specific person is messaging.”)

WhatsApp user data, ProPublica has learned, helped prosecutors build a high-profile case against a Treasury Department employee who leaked confidential documents to BuzzFeed News that exposed how dirty money flows through U.S. banks.

Like other social media and communications platforms, WhatsApp is caught between users who expect privacy and law enforcement entities that effectively demand the opposite: that WhatsApp turn over information that will help combat crime and online abuse. WhatsApp has responded to this dilemma by asserting that it’s no dilemma at all. “I think we absolutely can have security and safety for people through end-to-end encryption and work with law enforcement to solve crimes,” said Will Cathcart, whose title is Head of WhatsApp, in a YouTube interview with an Australian think tank in July.

The tension between privacy and disseminating information to law enforcement is exacerbated by a second pressure: Facebook’s need to make money from WhatsApp. Since paying $22 billion to buy WhatsApp in 2014, Facebook has been trying to figure out how to generate profits from a service that doesn’t charge its users a penny.

That conundrum has periodically led to moves that anger users, regulators or both. The goal of monetizing the app was part of the company’s 2016 decision to start sharing WhatsApp user data with Facebook, something the company had told European Union regulators was technologically impossible. The same impulse spurred a controversial plan, abandoned in late 2019, to sell advertising on WhatsApp. And the profit-seeking mandate was behind another botched initiative in January: the introduction of a new privacy policy for user interactions with businesses on WhatsApp, allowing businesses to use customer data in new ways. That announcement triggered a user exodus to competing apps.

WhatsApp’s increasingly aggressive business plan is focused on charging companies for an array of services — letting users make payments via WhatsApp and managing customer service chats — that offer convenience but fewer privacy protections. The result is a confusing two-tiered privacy system within the same app: the protections of end-to-end encryption are further eroded when WhatsApp users employ the service to communicate with businesses.

The company’s December marketing presentation captures WhatsApp’s diverging imperatives. It states that “privacy will remain important.” But it also conveys what seems to be a more urgent mission: the need to “open the aperture of the brand to encompass our future business objectives.”


I. “Content Moderation Associates”

In many ways, the experience of being a content moderator for WhatsApp in Austin is identical to being a moderator for Facebook or Instagram, according to interviews with 29 current and former moderators. Mostly in their 20s and 30s, many with past experience as store clerks, grocery checkers and baristas, the moderators are hired and employed by Accenture, a huge corporate contractor that works for Facebook and other Fortune 500 behemoths.

The job listings advertise “Content Review” positions and make no mention of Facebook or WhatsApp. Employment documents list the workers’ initial title as “content moderation associate.” Pay starts around $16.50 an hour. Moderators are instructed to tell anyone who asks that they work for Accenture, and are required to sign sweeping non-disclosure agreements. Citing the NDAs, almost all the current and former moderators interviewed by ProPublica insisted on anonymity. (An Accenture spokesperson declined comment, referring all questions about content moderation to WhatsApp.)

When the WhatsApp team was assembled in Austin in 2019, Facebook moderators already occupied the fourth floor of an office tower on Sixth Street, adjacent to the city’s famous bar-and-music scene. The WhatsApp team was installed on the floor above, with new glass-enclosed work pods and nicer bathrooms that sparked a tinge of envy in a few members of the Facebook team. Most of the WhatsApp team scattered to work from home during the pandemic. Whether in the office or at home, they spend their days in front of screens, using a Facebook software tool to examine a stream of “tickets,” organized by subject into “reactive” and “proactive” queues.

Collectively, the workers scrutinize millions of pieces of WhatsApp content each week. Each reviewer handles upwards of 600 tickets a day, which gives them less than a minute per ticket. WhatsApp declined to reveal how many contract workers are employed for content review, but a partial staffing list reviewed by ProPublica suggests that, at Accenture alone, it’s more than 1,000. WhatsApp moderators, like their Facebook and Instagram counterparts, are expected to meet performance metrics for speed and accuracy, which are audited by Accenture.

Their jobs differ in other ways. Because WhatsApp’s content is encrypted, artificial intelligence systems can’t automatically scan all chats, images and videos, as they do on Facebook and Instagram. Instead, WhatsApp reviewers gain access to private content when users hit the “report” button on the app, identifying a message as allegedly violating the platform’s terms of service. This forwards five messages — the allegedly offending one along with the four previous ones in the exchange, including any images or videos — to WhatsApp in unscrambled form, according to former WhatsApp engineers and moderators. Automated systems then feed these tickets into “reactive” queues for contract workers to assess.

Artificial intelligence initiates a second set of queues — so-called proactive ones — by scanning unencrypted data that WhatsApp collects about its users and comparing it against suspicious account information and messaging patterns (a new account rapidly sending out a high volume of chats is evidence of spam), as well as terms and images that have previously been deemed abusive. The unencrypted data available for scrutiny is extensive. It includes the names and profile images of a user’s WhatsApp groups as well as their phone number, profile photo, status message, phone battery level, language and time zone, unique mobile phone ID and IP address, wireless signal strength and phone operating system, as well as a list of their electronic devices, any related Facebook and Instagram accounts, the last time they used the app and any previous history of violations.

The WhatsApp reviewers have three choices when presented with a ticket for either type of queue: Do nothing, place the user on “watch” for further scrutiny, or ban the account. (Facebook and Instagram content moderators have more options, including removing individual postings. It’s that distinction — the fact that WhatsApp reviewers can’t delete individual items — that the company cites as its basis for asserting that WhatsApp reviewers are not “content moderators.”)

WhatsApp moderators must make subjective, sensitive and subtle judgments, interviews and documents examined by ProPublica show. They examine a wide range of categories, including “Spam Report,” “Civic Bad Actor” (political hate speech and disinformation), “Terrorism Global Credible Threat,” “CEI” (child exploitative imagery) and “CP” (child pornography). Another set of categories addresses the messaging and conduct of millions of small and large businesses that use WhatsApp to chat with customers and sell their wares. These queues have such titles as “business impersonation prevalence,” “commerce policy probable violators” and “business verification.”

Moderators say the guidance they get from WhatsApp and Accenture relies on standards that can be simultaneously arcane and disturbingly graphic. Decisions about abusive sexual imagery, for example, can rest on an assessment of whether a naked child in an image appears adolescent or prepubescent, based on comparison of hip bones and pubic hair to a medical index chart. One reviewer recalled a grainy video in a political-speech queue that depicted a machete-wielding man holding up what appeared to be a severed head: “We had to watch and say, ‘Is this a real dead body or a fake dead body?’”

In late 2020, moderators were informed of a new queue for alleged “sextortion.” It was defined in an explanatory memo as “a form of sexual exploitation where people are blackmailed with a nude image of themselves which have been shared by them or someone else on the Internet.” The memo said workers would review messages reported by users that “include predefined keywords typically used in sextortion/blackmail messages.”

WhatsApp’s review system is hampered by impediments, including buggy language translation. The service has users in 180 countries, with the vast majority located outside the U.S. Even though Accenture hires workers who speak a variety of languages, for messages in some languages there’s often no native speaker on site to assess abuse complaints. That means using Facebook’s language-translation tool, which reviewers said could be so inaccurate that it sometimes labeled messages in Arabic as being in Spanish. The tool also offered little guidance on local slang, political context or sexual innuendo. “In the three years I’ve been there,” one moderator said, “it’s always been horrible.”

The process can be rife with errors and misunderstandings. Companies have been flagged for offering weapons for sale when they’re selling straight shaving razors. Bras can be sold, but if the marketing language registers as “adult,” the seller can be labeled a forbidden “sexually oriented business.” And a flawed translation tool set off an alarm when it detected kids for sale and slaughter, which, upon closer scrutiny, turned out to involve young goats intended to be cooked and eaten in halal meals.

The system is also undercut by the human failings of the people who instigate reports. Complaints are frequently filed to punish, harass or prank someone, according to moderators. In messages from Brazil and Mexico, one moderator explained, “we had a couple of months where AI was banning groups left and right because people were messing with their friends by changing their group names” and then reporting them. “At the worst of it, we were probably getting tens of thousands of those. They figured out some words the algorithm did not like.”

Other reports fail to meet WhatsApp standards for an account ban. “Most of it is not violating,” one of the moderators said. “It’s content that is already on the internet, and it’s just people trying to mess with users.” Still, each case can reveal up to five unencrypted messages, which are then examined by moderators.

The judgment of WhatsApp’s AI is less than perfect, moderators say. “There were a lot of innocent photos on there that were not allowed to be on there,” said Carlos Sauceda, who left Accenture last year after nine months. “It might have been a photo of a child taking a bath, and there was nothing wrong with it.” As another WhatsApp moderator put it, “A lot of the time, the artificial intelligence is not that intelligent.”

Facebook’s written guidance to WhatsApp moderators acknowledges many problems, noting “we have made mistakes and our policies have been weaponized by bad actors to get good actors banned. When users write inquiries pertaining to abusive matters like these, it is up to WhatsApp to respond and act (if necessary) accordingly in a timely and pleasant manner.” If a user appeals a ban that was prompted by a user report, according to one moderator, the appeal entails having a second moderator examine the user’s content.


II. “Industry Leaders” in Detecting Bad Behavior

In public statements and on the company’s websites, Facebook Inc. is noticeably vague about WhatsApp’s monitoring process. The company does not provide a regular accounting of how WhatsApp polices the platform. WhatsApp’s FAQ page and online complaint form note that it will receive “the most recent messages” from a user who has been flagged. They do not, however, disclose how many unencrypted messages are revealed when a report is filed, or that those messages are examined by outside contractors. (WhatsApp told ProPublica it limits that disclosure to keep violators from “gaming” the system.)

By contrast, both Facebook and Instagram post lengthy “Community Standards” documents detailing the criteria their moderators use to police content, along with articles and videos about “the unrecognized heroes who keep Facebook safe” and announcements on new content-review sites. Facebook’s transparency reports detail how many pieces of content are “actioned” for each type of violation. WhatsApp is not included in this report.

When dealing with legislators, Facebook Inc. officials also offer few details — but are eager to assure them that they don’t let encryption stand in the way of protecting users from images of child sexual abuse and exploitation. For example, when members of the Senate Judiciary Committee grilled Facebook about the impact of encrypting its platforms, the company, in written responses to follow-up questions in January 2020, cited WhatsApp in boasting that it would remain responsive to law enforcement. “Even within an encrypted system,” one response noted, “we will still be able to respond to lawful requests for metadata, including potentially critical location or account information… We already have an encrypted messaging service, WhatsApp, that — in contrast to some other encrypted services — provides a simple way for people to report abuse or safety concerns.”

Sure enough, WhatsApp reported 400,000 instances of possible child-exploitation imagery to the National Center for Missing and Exploited Children in 2020, according to its head, Cathcart. That was ten times as many as in 2019. “We are by far the industry leaders in finding and detecting that behavior in an end-to-end encrypted service,” he said.

During his YouTube interview with the Australian think tank, Cathcart also described WhatsApp’s reliance on user reporting and its AI systems’ ability to examine account information that isn’t subject to encryption. Asked how many staffers WhatsApp employed to investigate abuse complaints from an app with more than two billion users, Cathcart didn’t mention content moderators or their access to encrypted content. “There’s a lot of people across Facebook who help with WhatsApp,” he explained. “If you look at people who work full time on WhatsApp, it’s above a thousand. I won’t get into the full breakdown of customer service, user reports, engineering, etc. But it’s a lot of that.”

In written responses for this article, the company spokesperson said: “We build WhatsApp in a manner that limits the data we collect while providing us tools to prevent spam, investigate threats, and ban those engaged in abuse, including based on user reports we receive. This work takes extraordinary effort from security experts and a valued trust and safety team that works tirelessly to help provide the world with private communication.” The spokesperson noted that WhatsApp has released new privacy features, including “more controls about how people’s messages can disappear” or be viewed only once. He added, “Based on the feedback we’ve received from users, we’re confident people understand when they make reports to WhatsApp we receive the content they send us.”


III. “Deceiving Users” About Personal Privacy

Since the moment Facebook announced plans to buy WhatsApp in 2014, observers wondered how the service, known for its fervent commitment to privacy, would fare inside a corporation known for the opposite. Zuckerberg had become one of the wealthiest people on the planet by using a “surveillance capitalism” approach: collecting and exploiting reams of user data to sell targeted digital ads. Facebook’s relentless pursuit of growth and profits has generated a series of privacy scandals in which it was accused of deceiving customers and regulators.

By contrast, WhatsApp knew little about its users apart from their phone numbers and shared none of that information with third parties. WhatsApp ran no ads, and its co-founders, Jan Koum and Brian Acton, both former Yahoo engineers, were hostile to them. “At every company that sells ads,” they wrote in 2012, “a significant portion of their engineering team spends their day tuning data mining, writing better code to collect all your personal data, upgrading the servers that hold all the data and making sure it’s all being logged and collated and sliced and packed and shipped out,” adding: “Remember, when advertising is involved you the user are the product.” At WhatsApp, they noted, “your data isn’t even in the picture. We are simply not interested in any of it.”

Zuckerberg publicly vowed in a 2014 keynote speech that he would keep WhatsApp “exactly the same.” He declared, “We are absolutely not going to change plans around WhatsApp and the way it uses user data. WhatsApp is going to operate completely autonomously.”

In April 2016, WhatsApp completed its long-planned adoption of end-to-end encryption, which helped establish the app as a prized communications platform in 180 countries, including many where text messages and phone calls are cost-prohibitive. International dissidents, whistleblowers and journalists also turned to WhatsApp to escape government eavesdropping.

Four months later, however, WhatsApp disclosed it would begin sharing user data with Facebook — precisely what Zuckerberg had said would not happen — a move that cleared the way for an array of future revenue-generating plans. The new WhatsApp terms of service said the app would share information such as users’ phone numbers, profile photos, status messages and IP addresses for the purposes of ad targeting, fighting spam and abuse and gathering metrics. “By connecting your phone number with Facebook’s systems,” WhatsApp explained, “Facebook can offer better friend suggestions and show you more relevant ads if you have an account with them.”

Such actions were increasingly bringing Facebook into the crosshairs of regulators. In May 2017, European Union antitrust regulators fined the company 110 million euros (about $122 million) for falsely claiming three years earlier that it would be impossible to link the user information between WhatsApp and the Facebook family of apps. The EU concluded that Facebook had “intentionally or negligently” deceived regulators. Facebook insisted its false statements in 2014 were not intentional, but didn’t contest the fine.

By the spring of 2018, the WhatsApp co-founders, now both billionaires, were gone. Acton, in what he later described as an act of “penance” for the “crime” of selling WhatsApp to Facebook, gave $50 million to a foundation backing Signal, a free encrypted messaging app that would emerge as a WhatsApp rival. (Acton’s donor-advised fund has also given money to ProPublica.)

Meanwhile, Facebook was under fire for its security and privacy failures as never before. The pressure culminated in a landmark $5 billion fine by the Federal Trade Commission in July 2019 for violating a previous agreement to protect user privacy. The fine was almost 20 times greater than any previous privacy-related penalty, according to the FTC, and Facebook’s transgressions included “deceiving users about their ability to control the privacy of their personal information.”

The FTC announced that it was ordering Facebook to take steps to protect privacy going forward, including for WhatsApp users: “As part of Facebook’s order-mandated privacy program, which covers WhatsApp and Instagram, Facebook must conduct a privacy review of every new or modified product, service, or practice before it is implemented, and document its decisions about user privacy.” Compliance officers would be required to generate a “quarterly privacy review report” and share it with the company and, upon request, the FTC.

Facebook agreed to the FTC’s fine and order. Indeed, the negotiations over that agreement were the backdrop for Zuckerberg’s announcement of his new commitment to privacy just four months earlier.

By that point, WhatsApp had begun using Accenture and other outside contractors to hire hundreds of content reviewers. But the company was eager not to step on its larger privacy message — or spook its global user base. It said nothing publicly about its hiring of contractors to review content.


IV. “We Kill People Based On Metadata”

Even as Zuckerberg was touting Facebook Inc.’s new commitment to privacy in 2019, he didn’t mention that WhatsApp was apparently sharing more of its users’ metadata than ever with its parent company — and with law enforcement.

To the lay ear, the term “metadata” can sound abstract, a word that evokes the intersection of literary criticism and statistics. To use an old, pre-digital analogy, metadata is the equivalent of what’s written on the outside of an envelope — the names and addresses of the sender and recipient and the postmark reflecting where and when it was mailed — while the “content” is what’s written on the letter sealed inside the envelope. So it is with WhatsApp messages: The content is protected, but the envelope reveals a multitude of telling details (as noted: time stamps, phone numbers and much more).

Those in the information and intelligence fields understand how crucial this information can be. It was metadata, after all, that the National Security Agency was gathering about millions of Americans not suspected of a crime, prompting a global outcry when it was exposed in 2013 by former NSA contractor Edward Snowden. “Metadata absolutely tells you everything about somebody’s life,” former NSA general counsel Stewart Baker once said. “If you have enough metadata, you don’t really need content.” In a symposium at Johns Hopkins University in 2014, Gen. Michael Hayden, former director of both the CIA and NSA, went even further: “We kill people based on metadata.”

U.S. law enforcement has used WhatsApp metadata to help put people in jail. ProPublica found more than a dozen instances in which the Justice Department sought court orders for the platform’s metadata since 2017. These represent a fraction of overall requests, known as pen register orders (a phrase borrowed from the technology used to track numbers dialed by landline telephones), as many more are kept from public view by court order. U.S. government requests for data on outgoing and incoming messages from all Facebook platforms increased by 276% from the first half of 2017 to the second half of 2020, according to Facebook Inc. statistics (which don’t break out the numbers by platform). The company’s rate of handing over at least some data in response to such requests has risen from 84% to 95% during that period.

It’s not clear exactly what government investigators have been able to gather from WhatsApp, as the results of those orders, too, are often kept from public view. Internally, WhatsApp calls such requests for information about users “prospective message pairs,” or PMPs. These provide data on a user’s messaging patterns in response to requests from U.S. law enforcement agencies, as well as those in at least three other countries — the United Kingdom, Brazil and India — according to a person familiar with the matter who shared this information on condition of anonymity. Law enforcement requests from other countries might only receive basic subscriber profile information.

WhatsApp metadata was pivotal in the arrest and conviction of Natalie “May” Edwards, a former Treasury Department official with the Financial Crimes Enforcement Network, for leaking confidential banking reports about suspicious transactions to BuzzFeed News. The FBI’s criminal complaint detailed hundreds of messages between Edwards and a BuzzFeed reporter using an “encrypted application,” which interviews and court records confirmed was WhatsApp. “On or about August 1, 2018, within approximately six hours of the Edwards pen becoming operative — and the day after the July 2018 Buzzfeed article was published — the Edwards cellphone exchanged approximately 70 messages via the encrypted application with the Reporter-1 cellphone during an approximately 20-minute time span between 12:33 a.m. and 12:54 a.m.,” FBI Special Agent Emily Eckstut wrote in her October 2018 complaint. Edwards and the reporter used WhatsApp because Edwards believed the platform to be secure, according to a person familiar with the matter.

Edwards was sentenced on June 3 to six months in prison after pleading guilty to a conspiracy charge and reported to prison last week. Edwards’ attorney declined to comment, as did representatives from the FBI and the Justice Department.

WhatsApp has for years downplayed how much unencrypted information it shares with law enforcement, largely limiting mentions of the practice to boilerplate language buried deep in its terms of service. It does not routinely keep permanent logs of who users are communicating with and how often, but company officials confirmed they do turn on such tracking at their own discretion — even for internal Facebook leak investigations — or in response to law enforcement requests. The company declined to tell ProPublica how frequently it does so.

The privacy page for WhatsApp assures users that they have total control over their own metadata. It says users can “decide if only contacts, everyone, or nobody can see your profile photo” or when they last opened their status updates or when they last opened the app. Regardless of the settings a user chooses, WhatsApp collects and analyzes all of that data — a fact not mentioned anywhere on the page.


V. “Opening the Aperture to Encompass Business Objectives”

The conflict between privacy and security on encrypted platforms seems to be only intensifying. Law enforcement and child safety advocates have urged Zuckerberg to abandon his plan to encrypt all of Facebook’s messaging platforms. In June 2020, three Republican senators introduced the “Lawful Access to Encrypted Data Act,” which would require tech companies to assist in providing access to even encrypted content in response to law enforcement warrants. For its part, WhatsApp recently sued the Indian government to block its requirement that encrypted apps provide “traceability” — a method to identify the sender of any message deemed relevant to law enforcement. WhatsApp has fought similar demands in other countries.

Other encrypted platforms take a vastly different approach to monitoring their users than WhatsApp. Signal employs no content moderators, collects far less user and group data, allows no cloud backups and generally rejects the notion that it should be policing user activities. It submits no child exploitation reports to NCMEC.

Apple has touted its commitment to privacy as a selling point. Its iMessage system displays a “report” button only to alert the company to suspected spam, and the company has made just a few hundred annual reports to NCMEC, all of them originating from scanning outgoing email, which is unencrypted.

But Apple recently took a new tack, and appeared to stumble along the way. Amid intensifying pressure from Congress, in August the company announced a complex new system for identifying child-exploitative imagery on users’ iCloud backups. Apple insisted the new system poses no threat to private content, but privacy advocates accused the company of creating a backdoor that potentially allows authoritarian governments to demand broader content searches, which could result in the targeting of dissidents, journalists or other critics of the state. On Sept. 3, Apple announced it would delay implementation of the new system.

Still, it’s Facebook that seems to face the most constant skepticism among major tech platforms. It is using encryption to market itself as privacy-friendly, while saying little about the other ways it collects data, according to Lloyd Richardson, the director of IT at the Canadian Centre for Child Protection. “This whole idea that they’re doing it for personal protection of people is completely ludicrous,” Richardson said. “You’re trusting an app owned and written by Facebook to do exactly what they’re saying. Do you trust that entity to do that?” (On Sept. 2, Irish authorities announced that they are fining WhatsApp 225 million euros, about $267 million, for failing to properly disclose how the company shares user information with other Facebook platforms. WhatsApp is contesting the finding.)

Facebook’s emphasis on promoting WhatsApp as a paragon of privacy is evident in the December marketing document obtained by ProPublica. The “Brand Foundations” presentation says it was the product of a 21-member global team across all of Facebook, involving a half-dozen workshops, quantitative research, “stakeholder interviews” and “endless brainstorms.” Its aim: to offer “an emotional articulation” of WhatsApp’s benefits, “an inspirational toolkit that helps us tell our story,” and a “brand purpose to champion the deep human connection that leads to progress.” The marketing deck identifies a feeling of “closeness” as WhatsApp’s “ownable emotional territory,” saying the app delivers “the closest thing to an in-person conversation.”

WhatsApp should portray itself as “courageous,” according to another slide, because it’s “taking a strong, public stance that is not financially motivated on things we care about,” such as defending encryption and fighting misinformation. But the presentation also speaks of the need to “open the aperture of the brand to encompass our future business objectives. While privacy will remain important, we must accommodate for future innovations.”

WhatsApp is now in the midst of a major drive to make money. It has experienced a rocky start, in part because of broad suspicions of how WhatsApp will balance privacy and profits. An announced plan to begin running ads inside the app didn’t help; it was abandoned in late 2019, just days before it was set to launch. Early this January, WhatsApp unveiled a change in its privacy policy — accompanied by a one-month deadline to accept the policy or get cut off from the app. The move sparked a revolt, impelling tens of millions of users to flee to rivals such as Signal and Telegram.

The policy change focused on how messages and data would be handled when users communicate with a business in the ever-expanding array of WhatsApp Business offerings. Companies now could store their chats with users and use information about users for marketing purposes, including targeting them with ads on Facebook or Instagram.

Elon Musk tweeted “Use Signal,” and WhatsApp users rebelled. Facebook delayed for three months the requirement for users to approve the policy update. In the meantime, it struggled to convince users that the change would have no effect on the privacy protections for their personal communications, with a slightly modified version of its usual assurance: “WhatsApp cannot see your personal messages or hear your calls and neither can Facebook.” Just as when the company first bought WhatsApp years before, the message was the same: Trust us.

Source: https://www.propublica.org/article/how-facebook-undermines-privacy-protections-for-its-2-billion-whatsapp-users


Metadata: Where WhatsApp's Real Privacy Problem Lies

A piece of news from the US is currently making the rounds: the investigative newsroom ProPublica has devoted a lengthy article to data protection at WhatsApp and concludes that parent company Facebook is undermining the privacy of its two billion users. As accurate as that conclusion is, the framing chosen by the authors, and by many German media outlets that picked up the story superficially, is just as problematic.

The main part of the article describes how Facebook employs an army of content moderators to review reported content from WhatsApp chats. That is not news, but ProPublica is able to report in more detail than before on how this work is done. The authors contrast the fact that potentially any WhatsApp message can be read by the company's moderators with the messenger's privacy promise: “No one outside of this chat, not even WhatsApp, can read or listen to them.”

However, and this is where it becomes problematic, the authors then adopt a framing that presents this content moderation (which WhatsApp refuses to call by that name) as a weakening of end-to-end encryption. One ProPublica author even described the moderation as a “backdoor,” a term that usually refers to a deliberately built-in mechanism for circumventing encryption. Various security experts, including Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, have criticized the reporting for this reason.

The encryption does what it is supposed to do

So where is the problem? One thing is clear: Mark Zuckerberg's 2018 assurance that his company cannot read any communication content from WhatsApp chats is misleading. Every message, image and video that is reported by a chat participant lands with WhatsApp and its contractors for review. According to ProPublica, roughly 1,000 people in Austin, Dublin and Singapore work around the clock to screen the reported content. Because the company needs its privacy promise for marketing, WhatsApp hides this information from its users.

It is also clear that, like every form of content moderation, this brings serious problems with it. Drawing on conversations with numerous sources, the authors show, for instance, that the moderators have little time for their weighty decisions and must work with guidelines that are sometimes ambiguous. As with moderation for Facebook and Instagram, they are also assisted by an automated system that occasionally makes faulty suggestions. As a result, content that should not be blocked, such as harmless photos or satire, gets blocked again and again. WhatsApp has no proper appeals mechanism, and it is to the article's credit that it brings these difficulties to light.

These problems, however, are not the result of deficient end-to-end encryption of WhatsApp messages. Technically, the encryption continues to work well. Messages are initially readable only on the devices of the participants in the conversation (provided those devices have not been compromised by criminal or state hackers). It is the users reporting content from chats who forward it to WhatsApp. Anyone can do that, and it is not an encryption problem.

The real danger lies elsewhere

The option to report abusive content has existed on WhatsApp for quite some time. The reporting system is meant to help when, say, hate speech is shared, ex-partners are threatened, or groups call for violence against minorities. It is an intrusion into private communication, but one can argue that it is justified when weighed against those dangers. Of course, WhatsApp would be obliged to inform its users far better about how the reporting system works and about the fact that their messages can be forwarded to moderators with a few taps.

The greater danger to privacy on WhatsApp, however, comes from somewhere else: the metadata, which reveals almost as much about people as the content of their conversations. It includes the identity of sender and recipient, their phone numbers and associated Facebook accounts, profile photos, status messages, and even the phone's battery level. It also includes information about communication behavior: Who communicates with whom? Who uses the app how often, and for how long?

Studies show that far-reaching psychological profiles can be built from such data. Facebook managers have been known to promise advertisers that they can find “emotionally vulnerable teenagers” on the platform. “We kill people based on metadata,” former NSA chief Michael Hayden once revealed, referring to US missile strikes targeted on the basis of metadata.
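How revealing metadata can be is easy to demonstrate with a toy example. The sketch below (with entirely made-up names and timestamps) derives a person's closest contact from nothing but “who messaged whom, when” records, without ever touching message content:

```python
from collections import Counter

# Each record is pure metadata: (sender, recipient, unix timestamp).
# The names and timestamps are invented for illustration.
metadata_log = [
    ("alice", "bob",   1627804800),
    ("alice", "bob",   1627805100),
    ("bob",   "alice", 1627805400),
    ("alice", "carol", 1627891200),
]

# Count how often each pair communicates, ignoring direction.
edges = Counter(frozenset((s, r)) for s, r, _ in metadata_log)

# The most frequent edge identifies the closest contact, content unseen.
pair, count = edges.most_common(1)[0]
print(sorted(pair), count)
```

Scaled up to billions of messages, the same counting exercise yields a social graph of an entire user base, which is exactly why “we only collect metadata” is a weak reassurance.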

How WhatsApp delivered a whistleblower to the authorities

WhatsApp collects this data on a grand scale because it can be turned into money. The original ProPublica report does address this aspect; in many German news summaries it unfortunately gets lost. In fact, the US outlet even reports the case of a whistleblower who went to prison because WhatsApp passed her metadata to the FBI. Natalie Edwards was an employee of the US Treasury Department and passed information about suspicious transactions to BuzzFeed News. She was caught and convicted in part because investigators could prove she was in frequent WhatsApp contact with the BuzzFeed reporter.

According to the report, WhatsApp regularly hands such metadata over to law enforcement agencies in the US. The same is likely true in Germany and Europe. On top of that, it is not only state agencies that receive this revealing information, but also Facebook, which uses it to refine users' data profiles and, in much of the world, to target advertising more precisely. When the data giant bought the messenger in 2014, it promised the European competition authority that this was technically impossible. A brazen lie, for which the company had to pay a fine of more than 100 million euros.

That is why it cannot be said often enough: even though the messenger's end-to-end encryption works, WhatsApp is not a good place for private communication. Journalists who hold confidential conversations with their sources on this messenger are acting irresponsibly. Anyone who really wants to communicate securely and with minimal data traces should use alternatives such as Threema or Signal, which store hardly any metadata.

 

Is it time to leave WhatsApp – and is Signal the answer?

 

The Facebook-owned messaging service has been hit by a global backlash over privacy. Many users are migrating to Signal or Telegram. Should you join them?

WhatsApp, Signal and Telegram: three leading choices for messaging services. Photograph: Rafael Henrique/Sopa Images/Rex/Shutterstock

Earlier this month, WhatsApp issued a new privacy policy along with an ultimatum: accept these new terms, or delete WhatsApp from your smartphone. But the new privacy policy wasn’t particularly clear, and it was widely misinterpreted to mean WhatsApp would be sharing more sensitive personal data with its parent company Facebook. Unsurprisingly, it prompted a fierce backlash, with many users threatening to stop using the service.

WhatsApp soon issued a clarification, explaining that the new policy only affects the way users’ accounts interact with businesses (i.e. not with their friends) and does not mandate any new data collection. The messaging app also delayed the introduction of the policy by three months. Crucially, WhatsApp said, the new policy doesn’t affect the content of your chats, which remain protected by end-to-end encryption – the “gold standard” of security that means no one can view the content of messages, not even WhatsApp, Facebook, or the authorities.
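The server-side view that end-to-end encryption creates can be illustrated with a deliberately simplified toy. This is not real cryptography (messengers like WhatsApp and Signal use the Signal Protocol), but it shows the key property: the relay in the middle only ever handles ciphertext, and only the two endpoints hold the key.

```python
import secrets

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR each byte with a random key (one-time-pad style). Real messengers
    # use far more sophisticated schemes, but the server's view is the same:
    # ciphertext only.
    return bytes(k ^ p for k, p in zip(key, plaintext))

toy_decrypt = toy_encrypt  # XOR is its own inverse

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # known only to the two endpoints

ciphertext = toy_encrypt(key, message)  # this is all the relay server sees
assert toy_decrypt(key, ciphertext) == message  # only key holders can read it
```

Note what the toy does not hide: the server still learns who sent a message to whom, when, and how large it was – the metadata that the rest of this piece is concerned with.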

 

But the damage had already been done. The bungled communication attempts have raised awareness that WhatsApp does collect a lot of data, and some of this could be shared with Facebook. The BBC reported that Signal was downloaded 246,000 times worldwide in the week before WhatsApp announced the change on 4 January, and 8.8m times the week after.

WhatsApp does share some data with Facebook, including phone numbers and profile names, but this has been happening for years. WhatsApp has stated that in the UK and EU the update does not share further data with Facebook – because of strict privacy regulation, known as the General Data Protection Regulation (GDPR). The messaging app doesn’t gather the content of your chats, but it does collect the metadata attached to them – such as the sender, the time a message was sent and who it was sent to. This can be shared with “Facebook companies”.

Facebook’s highly criticised data collection ethos has eroded trust in the social network. Its practices can put vulnerable people at risk, says Emily Overton, a data protection expert and managing director of RMGirl. She cites the example of Facebook’s “people you may know” algorithm exposing sex workers’ real names to their clients – despite both parties taking care to set up fake identities. “The more data they profile, the more they put people in vulnerable positions at risk.”

And the social network isn’t known for keeping promises. When Facebook bought WhatsApp in 2014, it pledged to keep the two services separate. Yet only a few years later, Facebook announced aims to integrate the messaging systems of Facebook, Instagram and WhatsApp. This appears to have stalled owing to technical and regulatory difficulties around encryption, but it’s still the long-term plan.


Why are people choosing Signal over Telegram?

Signal, a secure messaging app recommended by authorities such as the Electronic Frontier Foundation and Edward Snowden, has been the main beneficiary of the WhatsApp exodus. Another messaging app, Telegram, has also experienced an uptick in downloads, but Signal has been topping the charts on the Apple and Android app stores.

Signal benefits from being the most similar to WhatsApp in terms of features, while Telegram has had problems as a secure and private messaging app, with its live location feature recently coming under fire for privacy infringements. Crucially, Telegram is not end-to-end encrypted by default, instead storing your data in the cloud. Signal is end-to-end encrypted, collects less data than Telegram and stores messages on your device rather than in the cloud.


Does Signal have all the features I am used to and why is it more private?

Yes, Signal has most of the features you are used to on WhatsApp, such as stickers and emojis. You can set up and name groups, and it’s easy to send a message: just bring up the pen sign in the right-hand corner.

Signal has a desktop app, and you can voice and video chat with up to eight people. Like WhatsApp, Signal uses your phone number as your identity, something that has concerned some privacy and security advocates. However, the company has introduced pin codes in the hope of moving to a more secure and private way of identifying users in the future.

As well as being end-to-end encrypted, both WhatsApp and Signal have a “disappearing messages” feature for additional privacy. The major difference is how each app is funded. WhatsApp is owned by Facebook, whose business model is based on advertising. Signal is privacy focused and has no desire to analyse, share or profit from users’ private information, says Jake Moore, cybersecurity specialist at ESET.

Signal is supported by the non-profit Signal Foundation, set up in 2018 by WhatsApp founder Brian Acton and security researcher (and Signal Messenger CEO) Moxie Marlinspike, who created an encryption protocol that is used by several messaging services, including WhatsApp and Skype as well as Signal itself. Acton, who left Facebook in 2017 after expressing concerns over how the company operated, donated an initial $50m to Signal, and the open-source app is now funded by the community. Essentially that means developers across the world will continually work on it and fix security issues as part of a collaborative effort, making the app arguably more secure.

But there are concerns over whether Signal can maintain this free model as its user base grows into the tens of millions – and potentially, in future, hundreds of millions. Signal is adamant it can continue to offer its service for free. “As a non-profit, we simply need to break even,” says Aruna Harder, the app’s COO.

Signal is exclusively supported by grants and donations, says Acton. “We believe that millions of people value privacy enough to sustain it, and we’re here to demonstrate that there is an alternative to the ad-based business models that exploit user privacy.”


I want to move to Signal. How do you persuade WhatsApp groups to switch?

The momentum away from WhatsApp does appear to be building, and you may find more of your friends have switched to Signal already. But persuading a larger contact group can be more challenging.

Overton has been using Signal for several years and says all her regular contacts use the app. “Even when dating online, I ask the person I want to go on a date with to download Signal, or they don’t get my number.”

Some Signal advocates have already begun to migrate their groups over from WhatsApp. Jim Creese, a security expert, is moving a neighbourhood text group of 100 people to Signal. He is starting with a smaller sub-group of 20, some of whom struggle with technology. Creese says most are ambivalent about switching “as long as the new method isn’t more difficult”.

He advises anyone who’s moving groups across apps to focus on the “why” first. “Explain the reasons for the change, how it is likely to affect them, and the benefits. Don’t rush the process. While WhatsApp might not be where you want to be today, there’s no emergency requiring an immediate move.”

Moore thinks the shift away from WhatsApp will continue to gain momentum, but he says it will take time to move everyone across. Until then, it’s likely you will need to keep both WhatsApp and Signal on your phone.

Moore is in the process of moving a family chat to Signal, for the second time. “When I originally tried, one family member didn’t understand my concerns and thought I was being overcautious.

“However, the recent news has helped him understand the potential issues and why moving isn’t such a bad idea. The next hurdle will be getting my mother to download a new app and use it for the first time without me physically assisting her.”

Source: https://www.theguardian.com/technology/2021/jan/24/is-it-time-to-leave-whatsapp-and-is-signal-the-answer

The Messenger Alternatives

Some use the internet, some function without servers, some are paid and others are free, but all these apps claim to have one thing in common—respect for user privacy

Image: Jaap Arriens/NurPhoto via Getty Images

Ever since WhatsApp announced an update to its privacy policy, thousands of people have rushed to download messenger alternatives such as Signal and Telegram. While these two have been in the news for security features tighter than the messaging giant’s, other applications have been around for a while, used both for consumer-to-consumer messaging and within enterprises for internal communication.

While some of these alternative apps need the internet, others don’t. Some function without servers, using peer-to-peer technology; some are on a subscription model, while others are free to use. But they all claim to have one thing in common – respect for users’ privacy.

Although security and privacy-related technologies are constantly evolving, making it difficult to lay down a clear benchmark for which app is completely secure, there are a few things users should be aware of to ensure their privacy is not compromised, say technology and privacy experts.

First, says Divij Joshi, technology policy fellow at Mozilla Foundation, a global non-profit, “It’s definitely important to have a communications protocol based on end-to-end encryption.” End-to-end encryption refers to a system of communication wherein only the sender and receiver can read the messages and see the content shared. However, Joseph Aloysius, a Singapore-based student researcher in surveillance studies, says, “Even with encryption it is important that it is device-based end-to-end encryption, and not cloud-based. In addition, the encryption setting should be a default setting, not optional as seen in Telegram.” Another point to keep in mind is to ensure that technologies collect as little metadata – information not related to the message content, such as the quantum or location of messages – as possible, adds Joshi.

Second, apps should be open source and left open for public auditing. “Ideally, it’s best if companies leave the server code open as Signal has done,” says Aloysius.

Both Joshi and Aloysius are of the view that it is also necessary to ensure that the corporate practices of the application are clear and fair. “For instance, the terms of use and the privacy policy, so they can’t alter the technology or data collection practices arbitrarily,” says Joshi.

Although there has been an uproar about the latest changes to its privacy policy, WhatsApp remains popular, primarily due to its ease of use and convenience, say experts. “For some, it may also be a cost concern. There may also be a false sense of security, since nothing apparent has gone wrong and there have been no consequences to date for them using the app for business purposes,” explains Heidi Shey, principal analyst, security and risk, Forrester.

However, if you are a user who is concerned about privacy, here is a lowdown on alternatives to WhatsApp and the features they offer.


Wickr

The San Francisco-based app, founded in 2012, is used by some of the biggest players in the federal space, including the U.S. Department of Defense. It has also been validated by the National Security Agency as “the most secure collaboration tool in the world,” says co-founder and CTO of Wickr, Chris Howell. He adds, “Our government and enterprise customers choose Wickr because we have the most secure, end-to-end encrypted platform on the market that enables sensitive mission and business communications without compromising compliance.”

Wickr’s largest user base is in the US, followed by Europe, India and Australia, but it has seen an uptick on both its consumer and commercial platforms ever since WhatsApp announced plans to update its privacy policy, says Howell.

While the app can be deployed by organisations in highly regulated industries such as banking, energy, healthcare and the federal government, one of its versions, Wickr Me, is more suitable for one-on-one conversations with family and friends. Wickr cannot identify account owners because it doesn’t have access to any personal information. The data is encrypted and not accessible to the company. All messages are stored on the user’s device and, for a brief period, on Wickr’s servers, but they get deleted upon delivery. Since messages are end-to-end encrypted, they are not available to the company even while they sit on the server.

With Wickr Me, users can share files, photos, videos and voice messages, and also do video and audio conferencing. The messages are ephemeral, meaning they exist only for a limited amount of time and are permanently deleted from both the sending and the receiving device after a while. Therefore, if the recipient doesn’t check Wickr frequently, the messages may never get delivered. “Wickr’s security architecture and proprietary encryption methodology is designed to ensure that only users can gain access to their message content. Users’ content is encrypted locally on their device and is accessible only to intended recipients,” explains Howell.
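The ephemeral-message idea is essentially a time-to-live (TTL) on every message. The sketch below is a hypothetical illustration of the concept in a few lines of Python, not Wickr’s actual implementation:

```python
import time

class EphemeralStore:
    """Holds messages only until their time-to-live runs out."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._messages = []  # list of (expiry_time, text)

    def add(self, text, now=None):
        # Stamp each message with an expiry time at the moment it arrives.
        now = time.time() if now is None else now
        self._messages.append((now + self.ttl, text))

    def read(self, now=None):
        # Purge anything past its expiry before returning what's left.
        now = time.time() if now is None else now
        self._messages = [(exp, t) for exp, t in self._messages if exp > now]
        return [t for _, t in self._messages]

store = EphemeralStore(ttl_seconds=60)
store.add("hello", now=0)
assert store.read(now=30) == ["hello"]  # still within its lifetime
assert store.read(now=61) == []         # expired and purged
```

A real implementation must additionally guarantee secure deletion on every device that held a copy, which is the hard part and the reason apps like Wickr build this into the protocol rather than bolting it on.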


Jami

An open-source service, Jami doesn’t store users’ personal information on a central server, guaranteeing users full anonymity and privacy. Around since 2013, it is still a small player. Christophe Villemer, advocacy vice-president of the Canada-based messenger app, says, “We really are a newcomer in the market; we estimate there are around 100,000 users around the globe, but our community is growing every day.”

He says Jami is peer-to-peer, which means it doesn’t require a server for relaying data between users. Therefore, users don’t have to worry about a third party keeping their video or data on its servers. With features such as HD video calling, instant and voice messaging, and file sharing, the service is free to use. All connections are end-to-end encrypted.

“At Jami, we think that privacy is a primary right on the internet. Everybody should be free not to give their data to corporations to benefit from an essential service on the internet,” says Villemer. “Also, we think that our solution, as it’s peer-to-peer, is globally better for the environment because it does not rely on huge server farms or data centers,” he adds.

Users of the service face no restrictions on the size of the files they share, speed, bandwidth, features, number of accounts or storage. In addition, if users are on the same local network, they can communicate using Jami even if they are disconnected from the internet. “There will never be advertising on Jami,” says Villemer.


Briar

Briar Messenger is a not-for-profit organisation that started off as a project by Michael Rogers in an attempt to support freedom of expression, freedom of association, and the right to privacy. In India, Briar is extremely popular in Kashmir. The reason? It can work without the internet, via Wi-Fi or Bluetooth. Launched in 2018, the application uses direct, encrypted connections to prevent surveillance and censorship. Briar allows users to form private groups (with one admin who can invite others), write blogs, and create public discussion forums. The application doesn’t rely on central servers and sends messages without leaking metadata.

Torsten Grote, senior developer, Briar Messenger, says, “Briar is for users who have higher security requirements, such as not wanting to reveal who their contacts are (think journalist and source), or for users who need to keep communication going when the internet is not available, be it because of natural disasters or deliberate shutdowns.”

So far, Briar has around 200,000 downloads on Google Play and around 100,000 downloads from its website. The application is also available on F-Droid and other independent stores, which don’t track downloads. However, “thanks to the WhatsApp policy change,” says Grote, “we are seeing 7x the usual number of downloads.”


Threema

In 2012, three young software developers from Switzerland decided to create a secure instant messenger that would prevent the misuse of user data by companies and surveillance by governments. After Facebook bought WhatsApp in early 2014, the number of users climbed to 2 million in just a few weeks. “In Threema, all communication is protected in the best possible way by end-to-end encryption. Since Threema is open source, users can independently verify that Threema doesn’t have access to any user data that could be handed over to third parties,” says Roman Flepp, head of marketing and sales, Threema.

One of Threema’s guiding principles is “metadata restraint”: if there is no data, no data can be misused, whether by corporations, hackers or surveillance authorities. Currently, the messenger has over 9 million users. In light of the recent WhatsApp privacy issue, Flepp claims daily download numbers have increased significantly, by a factor of 10, and this growth has been consistently high since the policy change was announced. He adds, “This whole controversy could be a game changer. Now more and more people are looking around for a more private and secure messaging solution.”

The application can be used not only by individual users but also by businesses. Threema has various business solutions, such as Threema Work and Threema Education. “Especially in the business environment, it is crucial that a secure and privacy-compliant solution is used for work-related communication. We see a great demand; more than 5,000 companies are already using our business solution Threema Work,” says Flepp. Currently, the team is working on a multi-device solution that will allow users to run Threema on multiple devices.

****

While many of these applications are great options for secure peer-to-peer messaging, it is not a very sustainable revenue model for most of these companies. Hence, a few of them have moved to offering enterprise solutions. “For business use, a consumer-focused messaging app [like WhatsApp] is insufficient because it isn’t designed with business requirements for security, privacy, and compliance in mind,” says Shey.

After the recent announcement about the policy changes, many government organisations and companies banned the use of applications like WhatsApp on company-issued devices and for work. We take a look at some applications that offer paid messaging solutions to businesses.


Though the idea for Wire was conceived in 2012, the product was only launched in 2014 and initially for consumers. However, in 2017, the Germany-based company decided to focus mainly on enterprises. This was because, says Morten Brøgger, CEO of Wire, “We were against giants like Facebook, and consumers were not willing to understand the importance of privacy and pay for it.” This was also around the same time that the General Data Protection Regulation (EU GDPR) was coming up, and privacy was becoming a major concern for organisations. “Hence, we felt the solution we built would be extremely compelling to enterprise consumers,” he adds.Currently, Wire has close to 1,800 paid customers, which mainly include governments and large enterprises, whereas, for the general free solution, they have about half a million monthly active users. Most of their paid customers are in Germany, North America, Australia, the Middle East, and some European countries.Most of the traditional enterprise SaaS solutions have a few risk points, including “man in the middle vulnerability” since the cloud provider is in the middle, which means all the processing and storage happens on the cloud. The main weakness here is that the cloud provider can technically access the encryption key, which means the cloud provider can technically read and listen to all your content. However, Wire has a very different architecture, wherein there is no man in the middle. “All the data resides in the application on your device. There is some storage on the cloud, for bigger files, and these are secured with individual encryption keys. But the encryption keys only exist on the devices of our users, there’s no copy of the keys on the cloud,” Brøgger says.Another USP of this open-source application is that every time you send or receive a message—be it a text message, call, video conference or screen share—the encryption key updates, hence giving each individual message a unique encryption key. 
Says Brøgger, “We don’t know who the users are, what they are using it for, and we barely collect any metadata; whatever little is collected to help synchronise different devices is also anonymised.”

Currently, the company’s revenue is growing 400 percent year-on-year. “We saw a great spike in the paid clients at the beginning of the pandemic, and now [due to the WhatsApp privacy policy issue] since enterprises are becoming more aware of the importance of privacy.”

Troop


Troop Messenger was launched in mid-2018 as an internal messaging app for enterprises. “It is a home-grown, made in India, robust and a secured business messaging platform,” says CEO and founder Sudhir Naidu. A single platform, it enables internal teams to chat, make audio and video calls, convert them into conferences, share screens, and create groups. It also features a self-destructing chat window for exchanging secured information, and will shortly introduce an email client so users can send both e-mails and messages. “We have pledged that we would not sell any kind of user data to any third-party organisations. We assess and track all kinds of intrusions and attacks and follow the policy of honestly disclosing to clients if there is a breach which involves a threat to their data,” says Naidu. Additionally, Troop follows a stringent and comprehensive internal security framework and policy in terms of development, testing and release.

Besides Indian enterprises, Troop Messenger has been seeing good traction from the US, UK and the Middle East, says Naidu. “We see three times the usual daily registrations for our platform since the [WhatsApp] policy came out,” he says. “Businesses that were using WhatsApp before are actively looking out for much safer and business-oriented platforms such as ours,” he adds.

Arattai


Zoho Corp, which has products like Zoho Mail and the Zoho Business Suite, released a beta version of its messaging application Arattai (meaning “chit-chat” in Tamil) in the middle of the pandemic in 2020. “More than 70,000 users have already downloaded Arattai, and we didn’t advertise at all,” says Praval Singh, VP, marketing at Zoho Corp. “The final application is close to being launched,” he adds. As a privately held company, Singh says, their focus is on user privacy. “We’ve held that stance in many ways for our enterprise and business users, and we would like to take it forward with consumer applications as well. For example, we don’t use our own application or data of users to share with third parties, either as a monetisation strategy or for any other reason. So, data that sits on an application doesn’t go to a third party,” he says. In fact, Zoho owns its own data centers, so it is not dependent on any third party or public clouds for storage.

Spike


Initially released in October 2018, Spike is a conversational and collaborative email application that turns legacy email into a synchronous, chat-like experience, adding tasks, collaborative notes and multimedia to create a single feed for work. Instead of using another application, Spike turns an individual’s email inbox into a hub for chatting with co-workers, friends, and family, as well as a place to work on documents, manage tasks, and share files. Unlike WhatsApp groups, says Dvir Ben-Aroya, co-founder and CEO of Spike, “Spike groups provide a real-time collaborative tool for businesses, without switching between separate team messenger apps.” The application promises to store minimal data, to provide fast communication and ensure privacy. Currently, Spike has over 100,000 active teams using the application.

“We’ve seen a drastic uptick in users after the WhatsApp announcement, but since we track minimal user data, we cannot access specific data or directly attribute these users’ behaviour with correlation to using WhatsApp,” he says. Its highest user base is in the US, Germany and the UK, and it is very popular in India, especially among students and educators.

(With inputs from Namrata Sahoo)

Source: https://www.forbesindia.com/article/take-one-big-story-of-the-day/whatsalt-the-messenger-alternatives/65909/1

How WhatsApp Spies on Your Messages – WhatsApp Retransmission Vulnerability

According to Tobias Boelter tobias@boelter.it

Download the Slides from Tobias here: Whatsapp Slides from Tobias Boelter

Setting: Three phones. Phone A is Alice’s phone. Phone B is Bob’s phone. Phone C is the attacker’s phone.

Alice starts by communicating with Bob and, being a good human, of course meets with Bob in person so they can verify each other’s identities, i.e. confirm that the key exchange was not compromised.

Remember, Alice encrypts her messages with the public key she has received from Bob. But this key is sent through the WhatsApp servers, so she cannot know for sure that it is actually Bob’s key. That’s why they use a secure channel (the physical channel) to verify this.
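That in-person verification amounts to comparing fingerprints of the exchanged public keys over a channel the server cannot tamper with. A toy illustration of the idea follows; the fingerprint format here is invented for the example, and WhatsApp’s real “security code” (a 60-digit number) is computed differently:

```python
import hashlib

def fingerprint(public_key: bytes) -> str:
    """Short human-comparable digest of a public key.

    Illustrative only: real messengers use their own fingerprint
    or 'safety number' formats, not this one.
    """
    digest = hashlib.sha256(public_key).hexdigest()
    # Group the first 16 hex characters into blocks that are
    # easy to read aloud when meeting in person.
    return " ".join(digest[i:i + 4] for i in range(0, 16, 4))

# What Alice received via the server vs. what Bob reads off his phone.
bob_key_as_received = b"bob-public-key"
bob_key_on_phone = b"bob-public-key"
assert fingerprint(bob_key_as_received) == fingerprint(bob_key_on_phone)

# Had the server substituted a key, the fingerprints would not match.
mitm_key = b"attacker-public-key"
assert fingerprint(mitm_key) != fingerprint(bob_key_on_phone)
```

If the two fingerprints match, Alice knows the key the server handed her really belongs to Bob; any substitution in transit would change the digest.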

Now, Alice sends a message to Bob. And then another message. But this time the message does not get delivered, for example because Bob is offline, or because the WhatsApp server simply does not forward the message.


Now the attacker comes in. He registers Bob’s phone number with the WhatsApp server (by attacking the vulnerable GSM network, by putting WhatsApp under pressure, or by being WhatsApp itself).

Alice’s WhatsApp client will now automatically, without Alice’s interaction, re-encrypt the second message with the attacker’s key and send it to the attacker, who receives it.


Only after the fact is a warning displayed to Alice (and only if she has explicitly chosen to see such warnings in her settings).


Conclusion

Proprietary closed-source crypto software is the wrong path. After all, this potentially malicious code handles all our decrypted messages. Next time the FBI will not ask Apple but WhatsApp to ship a version of its code that sends all decrypted messages directly to the FBI.

Signal is better

Signal is doing it right. Alice’s second message (“Offline message”) was never sent to the attacker.


Signal is also open source and experimenting with reproducible builds. Have a look at it.

Update (May 31, 2016)

Facebook responded to my white-hat report

“[…] We were previously aware of the issue and might change it in the future, but for now it’s not something we’re actively working on changing. […]”

https://tobi.rocks/2016/04/whats-app-retransmission-vulnerability/

Download the Presentation here: Whatsapp Slides from Tobias Boelter

WhatsApp spies on your encrypted messages

Exclusive: Privacy campaigners criticise WhatsApp vulnerability as a ‘huge threat to freedom of speech’ and warn it could be exploited by government agencies

Research shows that WhatsApp can read messages due to the way the company has implemented its end-to-end encryption protocol. Photograph: Ritchie B Tongo/EPA

A security backdoor that can be used to allow Facebook and others to intercept and read encrypted messages has been found within its WhatsApp messaging service.

Facebook claims that no one can intercept WhatsApp messages, not even the company and its staff, ensuring privacy for its billion-plus users. But new research shows that the company could in fact read messages due to the way WhatsApp has implemented its end-to-end encryption protocol.

Privacy campaigners said the vulnerability is a “huge threat to freedom of speech” and warned it can be used by government agencies to snoop on users who believe their messages to be secure. WhatsApp has made privacy and security a primary selling point, and has become a go-to communications tool of activists, dissidents and diplomats.

WhatsApp’s end-to-end encryption relies on the generation of unique security keys, using the acclaimed Signal protocol, developed by Open Whisper Systems, that are traded and verified between users to guarantee communications are secure and cannot be intercepted by a middleman. However, WhatsApp has the ability to force the generation of new encryption keys for offline users, unbeknown to the sender and recipient of the messages, and to make the sender re-encrypt messages with new keys and send them again for any messages that have not been marked as delivered.

The recipient is not made aware of this change in encryption, while the sender is only notified if they have opted-in to encryption warnings in settings, and only after the messages have been resent. This re-encryption and rebroadcasting effectively allows WhatsApp to intercept and read users’ messages.

The security backdoor was discovered by Tobias Boelter, a cryptography and security researcher at the University of California, Berkeley. He told the Guardian: “If WhatsApp is asked by a government agency to disclose its messaging records, it can effectively grant access due to the change in keys.”

The backdoor is not inherent to the Signal protocol. Open Whisper Systems’ messaging app, Signal, the app used and recommended by whistleblower Edward Snowden, does not suffer from the same vulnerability. If a recipient changes the security key while offline, for instance, a sent message will fail to be delivered and the sender will be notified of the change in security keys without automatically resending the message.

WhatsApp’s implementation automatically resends an undelivered message with a new key without warning the user in advance or giving them the ability to prevent it.
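The difference between the two implementations boils down to the policy applied when a peer’s key changes while messages are still queued. A simplified sketch of the two policies as the article describes them (the names and return shape here are invented for illustration):

```python
def handle_key_change(client: str, undelivered: list) -> dict:
    """Contrast the two documented key-change policies (simplified).

    'signal'  : block -- delivery fails and the user is notified
                before anything is (re)sent.
    'whatsapp': resend -- queued messages are re-encrypted and sent
                automatically; notification (if enabled) comes after.
    """
    if client == "signal":
        return {"resent": [], "blocked": list(undelivered),
                "user_notified_before_send": True}
    if client == "whatsapp":
        return {"resent": list(undelivered), "blocked": [],
                "user_notified_before_send": False}
    raise ValueError(f"unknown client: {client}")

queue = ["offline message"]
assert handle_key_change("signal", queue)["resent"] == []
assert handle_key_change("whatsapp", queue)["resent"] == ["offline message"]
```

Nothing about the Signal protocol forces either choice; it is a client-side policy decision, which is why the same protocol yields a blocked message in one app and a silently retransmitted one in the other.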

Boelter reported the backdoor vulnerability to Facebook in April 2016, but was told that Facebook was aware of the issue, that it was “expected behaviour” and wasn’t being actively worked on. The Guardian has verified the backdoor still exists.

The WhatsApp vulnerability calls into question the privacy of messages sent across the service used around the world, including by people living in oppressive regimes. Photograph: Marcelo Sayão/EPA

Steffen Tor Jensen, head of information security and digital counter-surveillance at the European-Bahraini Organisation for Human Rights, verified Boelter’s findings. He said: “WhatsApp can effectively continue flipping the security keys when devices are offline and re-sending the message, without letting users know of the change till after it has been made, providing an extremely insecure platform.”

Boelter said: “[Some] might say that this vulnerability could only be abused to snoop on ‘single’ targeted messages, not entire conversations. This is not true if you consider that the WhatsApp server can just forward messages without sending the ‘message was received by recipient’ notification (or the double tick), which users might not notice. Using the retransmission vulnerability, the WhatsApp server can then later get a transcript of the whole conversation, not just a single message.”
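Boelter’s escalation can be sketched in code: a message leaves the retry queue only when a delivery receipt arrives, so a server that withholds receipts keeps the entire conversation eligible for retransmission under a forced key change. A toy model of that queue, not actual client code:

```python
class ToySender:
    """Toy model of Boelter's whole-transcript escalation.

    If the server never sends delivery receipts, every message stays
    queued, and one forced key change later leaks the whole
    conversation rather than a single message.
    """

    def __init__(self):
        self.queue = []

    def send(self, text: str) -> None:
        self.queue.append(text)  # held until a delivery receipt arrives

    def ack(self, text: str) -> None:
        self.queue.remove(text)  # receipt arrived; message is settled

    def retransmit_with_new_key(self) -> list:
        # Everything still queued gets re-encrypted to the new key.
        return list(self.queue)

sender = ToySender()
for line in ["hi", "meet at 8", "bring the documents"]:
    sender.send(line)
# A malicious server simply never sends the double-tick receipts...
transcript = sender.retransmit_with_new_key()
assert transcript == ["hi", "meet at 8", "bring the documents"]
```

Once any message is acknowledged it drops out of the queue, which is why the attack depends on the server suppressing the “delivered” notification that users might otherwise notice.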


Professor Kirstie Ball, co-director and founder of the Centre for Research into Information, Surveillance and Privacy, called the existence of a backdoor within WhatsApp’s encryption “a gold mine for security agencies” and “a huge betrayal of user trust”. She added: “It is a huge threat to freedom of speech, for it to be able to look at what you’re saying if it wants to. Consumers will say, I’ve got nothing to hide, but you don’t know what information is looked for and what connections are being made.”

In the UK, the recently passed Investigatory Powers Act allows the government to intercept bulk data of users held by private companies, without suspicion of criminal activity, similar to the activity of the US National Security Agency uncovered by the Snowden revelations. The government also has the power to force companies to “maintain technical capabilities” that allow data collection through hacking and interception, and requires companies to remove “electronic protection” from data. Intentional or not, WhatsApp’s backdoor to the end-to-end encryption could be used in such a way to facilitate government interception.

Jim Killock, executive director of Open Rights Group, said: “If companies claim to offer end-to-end encryption, they should come clean if it is found to be compromised – whether through deliberately installed backdoors or security flaws. In the UK, the Investigatory Powers Act means that technical capability notices could be used to compel companies to introduce flaws – which could leave people’s data vulnerable.”

A WhatsApp spokesperson told the Guardian: “Over 1 billion people use WhatsApp today because it is simple, fast, reliable and secure. At WhatsApp, we’ve always believed that people’s conversations should be secure and private. Last year, we gave all our users a better level of security by making every message, photo, video, file and call end-to-end encrypted by default. As we introduce features like end-to-end encryption, we focus on keeping the product simple and take into consideration how it’s used every day around the world.

“In WhatsApp’s implementation of the Signal protocol, we have a ‘Show Security Notifications’ setting (option under Settings > Account > Security) that notifies you when a contact’s security code has changed. We know the most common reasons this happens are because someone has switched phones or reinstalled WhatsApp. This is because in many parts of the world, people frequently change devices and Sim cards. In these situations, we want to make sure people’s messages are delivered, not lost in transit.”

Asked to comment specifically on whether Facebook/WhatsApp had accessed users’ messages, and whether it had done so at the request of government agencies or other third parties, the company directed the Guardian to its site that details aggregate data on government requests by country.

Concerns over the privacy of WhatsApp users have been repeatedly highlighted since Facebook acquired the company for $22bn in 2014. In August 2016, Facebook announced a change to the privacy policy governing WhatsApp that allowed the social network to merge data from WhatsApp users and Facebook, including phone numbers and app usage, for advertising and development purposes.

Facebook halted the use of the shared user data for advertising purposes in November after pressure from the pan-European data protection agency group, the Article 29 Working Party, in October. The European commission then filed charges against Facebook for providing “misleading” information in the run-up to the social network’s acquisition of messaging service WhatsApp, following its data-sharing change.

https://www.theguardian.com/technology/2017/jan/13/whatsapp-backdoor-allows-snooping-on-encrypted-messages

Facebook Messenger On Android Hits 1 Billion Downloads

Only two companies have apps with over 1 billion Google Play downloads, and the other is Google. Today Facebook proved just how big a business replacing SMS can be, as its leader David Marcus announced Messenger has now been downloaded over 1 billion times on Android. It joins Facebook and WhatsApp, and Google’s Gmail, YouTube, Search, and Maps in this very exclusive club.

Messenger’s strategy of layering modern mobile sharing features over a speedy texting app has paid off, and it looks like Facebook’s just getting started. With VOIP, video calling, stickers, voice clips, peer-to-peer payments, location, and a whole platform of third-party content creation apps, Messenger wants to own every way you communicate. And it partially is for well over 600 million users.


Combined with WhatsApp’s streamlined SMS alternative, Facebook controls messaging in a way that deeply insulates it from disruption. Snapchat and Yik Yak might steal a few users from its social network feed, but Facebook’s already focusing on the next fundamental communication utility.

In fact, Facebook has been subtly baking Messenger much deeper into its product.

When you graph search for people, like friends who like a certain band, Facebook shortcuts you to ping them on Messenger, not visit their profile. When it’s a friend’s birthday, in some cases Facebook now recommends that you message them Happy Birthday, rather than writing it on their wall.


Just last week, Facebook overhauled how Messenger handles map and location sharing to lay the groundwork for a slew of new GPS-enabled features. Before, finding where to meet up with people was the domain of Nearby Friends in the main Facebook app.

And Facebook’s secret weapon in the messaging wars is that chat isn’t where it makes its money. Rather than having to cram Messenger full of ads or convince you to buy Sticker packs, it just has to tie people closer to its big brother Facebook where lucrative mobile ads earn enough money to provide for the whole family.


Getting to this point wasn’t easy. Facebook had to offend the pride of its whole user base by telling them they were required to download a whole other app for messaging. The medicine wasn’t sweet, but it went down, and Facebook saw engagement rise once chat wasn’t buried in its blue behemoth. Freed from the extra weight, Messenger was thin and agile enough to build out its bells and whistles.

With former PayPal President David Marcus in command and expert product guy Stan Chudnovsky as his first mate, in just the last six months Messenger has:

Meanwhile, the other giant with deep enough pockets to fund a true attempt at owning messaging has spent the past few years distracted by moonshots. Google was late to launch its mobile messenger, which was dragged down by Google Plus. It squandered its Hangouts product’s early lead in video chat, and missed on the chance to acquire WhatsApp, which could have turned this into a two-horse race.


Instead, Facebook saw that messaging was the center of mobile, the app you use the most times per day. If it’s the reason you open your phone at first, it’s wedged a foot in the door to become the second and third thing you do too. And with China’s WeChat pioneering the chat-app-as-a-portal roadmap, Facebook can just port what works to the rest of the world.

After years of people asking what would be the Facebook killer, Facebook happily provided its own answer.

Quote: http://techcrunch.com/2015/06/09/the-new-facebook

Facebook’s WhatsApp Will Be How the World Makes Phone Calls

Further Reading: http://www.wired.com/2015/04/facebooks-whatsapp-worlds-next-phone

WhatsApp is the world’s most popular smartphone messaging app, letting more than 800 million people send and receive texts on the cheap. But it’s evolving into something more.

On Tuesday, the company, which is owned by Facebook, released a new version of the app that allows people with iPhones to not only text people, but actually talk to them. This built on a similar move the company made at the end of March, when it quietly released an Android update that did the same thing. And in the week following the addition of voice calling on Android, WhatsApp-related traffic increased about 5 percent on carrier networks, according to a study by Allot Communications—an Israeli company that helps manage wireless network traffic worldwide.

That figure will likely get a lot bigger as WhatsApp shifts from being the world’s favorite messaging app to become a more wide-ranging—and bandwidth-intensive—communication tool.

Others have offered internet voice calls on smartphones, most notably Skype and Viber. But WhatsApp is different. So many people already use the app, and the company is intent on keeping it free (or nearly free). Though it has little traction here in the US, WhatsApp is enormously popular in parts of Europe and the developing world—areas where there’s a hunger for cheap communication. The result is an app that could bring inexpensive Internet calls to an audience of unprecedented size.

Developing World

The rapidly evolving WhatsApp is but one face of the dramatic technological changes sweeping across the developing world. Many companies are working to bring affordable smartphones to market, from China’s Xiaomi to Silicon Valley’s Cyanogen, even as many others, from China’s WeChat to Viber, push cheap communication services onto these devices.

These technologies face the usual obstacles—and WhatsApp is no exception. Though the app is expected to reach a billion users by year’s end, its push into voice calls could alienate many wireless carriers. If you have free internet calls, after all, you don’t need to pay for cellular calls. Some carriers may fight the tool as a result, says Allot associate vice president Yaniv Sulkes.

But the same could be said of messaging on WhatsApp. It too cuts into the carriers’ way of doing things. And yet WhatsApp has thrived. It has so much traction in large part because it has cultivated partnerships with carriers, striking deals that bundle its app with low-cost wireless services. According to another Allot survey, about 37 percent of carriers now have deals with WhatsApp or similar inexpensive Internet-based services, a sharp rise over the past few years. “More and more operators are adopting the strategy of ‘let’s partner with them’ rather than ‘let’s fight them,’” Sulkes says.

In the meantime, Facebook is pushing for somewhat similar arrangements, through its Internet.org initiative, that bundle limited Internet access with access to specific apps. Mark Zuckerberg and company have encountered some opposition to these deals. But the combined might of Facebook and WhatsApp will be hard for carriers to resist.

Video Next?

As WhatsApp spreads, Sulkes believes, it will keep pushing into new services. After rolling out voice calling, he says, it may venture into video calling. The app already lets you send files, including videos, and other messaging apps, such as Snapchat, have already ventured into video calls.

None of these tools—video calls, voice calls, file sharing—are new technologies. But not everyone has them. WhatsApp has the leverage to change that. The app has grabbed hold of the developing world in rapid fashion, and now it can serve as a platform for bringing all sorts of modern communications to the far reaches of the globe. Yes, there’s another major obstacle to overcome: so much of the developing world doesn’t have the network infrastructure to accommodate these kinds of modern services. But Facebook is set to change that, too.

WhatsApp Calls on iPhone

Further Reading: http://www.forbes.com/sites/amitchowdhry/2015/04/21/whatsapp-voice-calling-ios/ and http://www.macrumors.com/2015/04/21/whatsapp-gains-voice-calling/

WhatsApp, the popular mobile messaging service owned by Facebook, has released a major update to its iPhone app today. The update includes the highly-anticipated WhatsApp Calling feature, which rolled out to every Android user late last month. The WhatsApp Calling feature is comparable to Skype and the FaceTime Audio service on iOS. Data charges may apply while using the WhatsApp Calling feature.

“Call your friends and family using WhatsApp for free, even if they’re in another country. WhatsApp calls uses your phone’s Internet connection rather than your cellular plan’s voice minutes,” said WhatsApp in its app update description. 

Unfortunately, the WhatsApp Calling feature is rolling out slowly, so you may not see it right away. The new calling feature should be available to every iOS user within the next few weeks. Prior to launching WhatsApp Calling for Android, the messaging company ran a lengthy beta test.

WhatsApp version 2.12.1 also includes an iOS 8 share extension, a quick camera button in chats, the ability to edit your contacts right from WhatsApp and an option to send multiple videos at once. You can also crop and rotate videos before sending them. The iOS 8 share extension lets you share photos, videos and links to WhatsApp from other apps. And the quick camera button lets you seamlessly capture photos and videos or choose a recent camera roll photo or video.

WhatsApp Update For iOS / Credit: WhatsApp

How does WhatsApp Calling for iOS work? If someone calls you through WhatsApp, you will see a push notification from the messaging service showing who the call is from. Once you answer the call, you will notice that there are options to mute the call or put it on speakerphone. You can also send a message to the person calling you. If the WhatsApp Calling feature for iOS is similar to the Android app, then you will see a Calls tab that has a list of your incoming, outgoing and missed WhatsApp calls. Personally, I do not have access to WhatsApp Calling for iOS app yet.

Launched in 2009, WhatsApp started out as a simple group text messaging app. Four years later, WhatsApp added a voice messaging service. And then Facebook acquired WhatsApp for $19 billion in February 2014. Several months ago, WhatsApp launched a desktop client called WhatsApp Web — which you can activate with an Android, BlackBerry, Windows Phone or Nokia S60 device.

Earlier this month, WhatsApp hit 800 million monthly active users. WhatsApp has been adding about 100 million monthly active users every four months since August. In January, WhatsApp hit 700 million monthly active users. WhatsApp now has more users than every other messaging app, including Facebook Messenger. It took Facebook about 8 years to hit 1 billion users. Facebook now has about 1.4 billion monthly users and Facebook Messenger has roughly 600 million users.

“After promising to deliver voice calling capabilities back in 2014, WhatsApp has finally delivered, introducing voice over IP features in its latest update. With the new version of the app, it’s possible for WhatsApp users to call friends and family directly within the app using a Wi-Fi or cellular connection at no cost.

The introduction of voice calling to the Facebook-owned WhatsApp app puts it on par with Facebook’s other messaging app, Facebook Messenger, which gained voice calling back in 2013. It also allows the app to better compete with other iOS-based VoIP calling options like Skype and FaceTime Audio.

Today’s WhatsApp update also brings a few other features, including the iOS 8 share extension for sharing videos, photos, and links to WhatsApp from other apps, contact editing tools, and the ability to send multiple videos at one time.

What’s new
- WhatsApp Calling: Call your friends and family using WhatsApp for free, even if they’re in another country. WhatsApp calls use your phone’s Internet connection rather than your cellular plan’s voice minutes. Data charges may apply. Note: WhatsApp Calling is rolling out slowly over the next several weeks.

- iOS 8 share extension: Share photos, videos, and links right to WhatsApp from other apps.

- Quick camera button in chats: Now you can capture photos and videos, or quickly choose a recent camera roll photo or video.

- Edit your contacts right from WhatsApp.

- Send multiple videos at once, and crop and rotate videos before sending them.

WhatsApp can be downloaded from the App Store for free. The new WhatsApp calling feature will be rolling out to users over the next few weeks.”