Tag Archives: Privacy

Suing to See the Feds’ Encrypted Messages? Good Luck

The recent rise of end-to-end encrypted messaging apps has given billions of people access to strong surveillance protections. But as one federal watchdog group may soon discover, it also creates a transparency conundrum: Delete the conversation from those two ends, and there may be no record left.

The conservative group Judicial Watch is suing the Environmental Protection Agency under the Freedom of Information Act, seeking to compel the EPA to hand over any employee communications sent via Signal, the encrypted messaging and calling app. In its public statement about the lawsuit, Judicial Watch points to reports that EPA staffers have used Signal to communicate secretly, in the face of an adversarial Trump administration.

But encryption and forensics experts say Judicial Watch may have picked a tough fight. Delete Signal’s texts, or the app itself, and virtually no trace of the conversation remains. “The messages are pretty much gone,” says Johns Hopkins cryptographer Matthew Green, who has closely followed the development of secure messaging tools. “You can’t prove something was there when there’s nothing there.”

End-to-Dead-End

Signal, like other end-to-end encryption apps, protects messages such that only the people participating in a conversation can read them. No outside observer—not even the Signal server that the messages route through—can sneak a look. Delete the messages from the devices of two Signal communicants, and no other unencrypted copy exists.
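
To make the “two ends” concrete, here is a minimal sketch of the end-to-end property using the PyNaCl library’s public-key box construction. It is not Signal’s actual protocol (Signal uses the Double Ratchet, which adds forward secrecy among other things); it only illustrates the core idea that anything sitting between the two key holders, including a relay server, ever sees only ciphertext.

```python
# Minimal end-to-end encryption sketch (illustrative only, not Signal's protocol).
# pip install pynacl
from nacl.public import PrivateKey, Box

# Each party generates a key pair; only the public halves are ever shared.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob with her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# A relay server would only ever handle `ciphertext`; without one of the two
# private keys it cannot recover the plaintext.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext))  # b'meet at noon'

# Delete the plaintext on both devices and the only copy left anywhere is
# undecryptable ciphertext, assuming the relay even kept it.
```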

In fact, Signal’s own server doesn’t keep a record of even the encrypted versions of those communications. Last October, Signal’s developers at the non-profit Open Whisper Systems revealed that a grand jury subpoena had yielded practically no useful data. “The only information we can produce in response to a request like this is the date and time a user registered with Signal and the last date of a user’s connectivity to the Signal service,” Open Whisper Systems wrote at the time. (That’s the last time they opened the app, not the last time they sent or received a message.)

Even seizing and examining the phones of EPA employees likely won’t help if users have deleted their messages or the full app, Green says. They could even do so on autopilot. Six months ago, Signal added a Snapchat-like feature to allow automated deletion of a conversation from both users’ phones after a certain amount of time. Forensic analyst Jonathan Zdziarski, who now works as an Apple security engineer, wrote in a blog post last year that after Signal messages are deleted, the app “leaves virtually nothing, so there’s nothing to worry about. No messy cleanup.” (Open Whisper Systems declined to comment on the Judicial Watch FOIA request, or how exactly it deletes messages.)

Still, despite its best sterilization efforts, even Signal might leave some forensic trace of deleted messages on phones, says Green. And other less-secure ephemeral messaging apps like Confide, which has also become popular among government staffers, likely leave more fingerprints behind. But Green argues that recovering deleted messages from even sloppier apps would take deeper digging than FOIA requests typically compel—so long as users are careful to delete messages on both sides of the conversation and any cloud backups. “We’re talking about expensive, detailed forensic analysis,” says Green. “It’s a lot more work than you’d expect from someone carrying out FOIA requests.”

For the Records

Deleting records of government business from government-issued devices is—let’s be clear—illegal. That smartphone scrubbing, says Georgetown Law professor David Vladeck, would blatantly violate the Federal Records Act. “It’s no different from taking records home and burning them,” says Vladeck. “They’re not your records, they’re the federal government’s, and you’re not supposed to do that.”

Judicial Watch, for its part, acknowledges that it may be tough to dig up deleted Signal communications. But another element of its FOIA request asks for any EPA information about whether it has approved Signal for use by agency staffers. “They can’t use these apps to thwart the Federal Records Act just because they don’t like Donald Trump,” says Judicial Watch president Tom Fitton. “This serves also as an educational moment for any government employees, that using the app to conduct government business to ensure the deletion of records is against the law, and against record-keeping policies in almost every agency.”

Fitton hopes the lawsuit will at least compel the EPA to prevent employees from installing Signal or similar apps on government-issued phones. “The agency is obligated to ensure their employees are following the rules so that records subject to FOIA are preserved,” he says. “If they’re not doing that, they could be answerable to the courts.”

Georgetown’s Vladeck says that even evidence employees have used Signal at all should be troubling, and might warrant a deeper investigation. “I would be very concerned if employees were using an app designed to leave no trace. That’s smoke, if not a fire, and it’s deeply problematic,” he says.

But Johns Hopkins’ Green counters that FOIA has never been an all-seeing eye into government agencies. And he points out that sending a Signal message to an EPA colleague isn’t so different from simply walking into their office and closing the door. “These ephemeral communications apps give us a way to have those face-to-face conversations electronically and in a secure way,” says Green. “It’s a way to communicate without being on the record. And people need that.”

https://www.wired.com/2017/04/suing-see-feds-encrypted-messages-good-luck/

The CIA Leak Exposes Tech’s Vulnerable Future

Source: https://www.wired.com/2017/03/cia-leak-exposes-techs-vulnerable-future/

Apple Ditched Secrecy for Openness

Encryption Is Being Scapegoated To Mask The Failures Of Mass Surveillance

Source: http://techcrunch.com/2015/11/17/the-blame-game/

Well that took no time at all. Intelligence agencies rolled right into the horror and fury in the immediate wake of the latest co-ordinated terror attacks in the French capital on Friday, to launch their latest co-ordinated assault on strong encryption — and on the tech companies creating secure comms services — seeking to scapegoat end-to-end encryption as the enabling layer for extremists to perpetrate mass murder.

There’s no doubt they were waiting for just such an ‘opportune moment’ to redouble their attacks on encryption after recent attempts to lobby for encryption-perforating legislation foundered. (A strategy confirmed by a leaked email sent by the intelligence community’s top lawyer, Robert S. Litt, this August — and subsequently obtained by the Washington Post — in which he anticipated that a “very hostile legislative environment… could turn in the event of a terrorist attack or criminal event where strong encryption can be shown to have hindered law enforcement”. Et voila Paris… )

Speaking to CBS News over the weekend, in the immediate aftermath of the Paris attacks, former CIA deputy director Michael Morell said: “I think this is going to open an entire new debate about security versus privacy.”

“We, in many respects, have gone blind as a result of the commercialization and the selling of these devices that cannot be accessed either by the manufacturer or, more importantly, by us in law enforcement, even equipped with search warrants and judicial authority,” added New York City police commissioner, William J. Bratton, quoted by the NYT in a lengthy article probing the “possible” role of encrypted messaging apps in the Paris attacks.

Elsewhere the fast-flowing attacks on encrypted tech services have come without a byline — from unnamed European and American officials who say they are “not authorized to speak publicly”. Yet are happy to speak publicly, anonymously.

The NYT published an article on Sunday alleging that attackers had used “encryption technology” to communicate — citing “European officials who had been briefed on the investigation but were not authorized to speak publicly”. (The paper subsequently pulled the article from its website, as noted by InsideSources, although it can still be read via the Internet Archive.)

The irony of government/intelligence agency sources briefing against encryption on condition of anonymity as they seek to undermine the public’s right to privacy would be darkly comic if it weren’t quite so brazen.

Here’s what one such unidentified British intelligence source told Politico: “As members of the general public get preoccupied that the government is spying on them, they have adopted these applications and terrorists have found them tailor-made for their own use.”

It’s a pretty incredible claim when you examine it. This unknown spook mouthpiece is saying terrorists are able to organize acts of mass murder as a direct consequence of the public’s dislike of government mass surveillance. Take even a cursory glance at the history of terrorism and that claim folds in on itself immediately. The highly co-ordinated 9/11 attacks of 2001 required no backdrop of public privacy fears in order to be carried out — and with horrifying, orchestrated effectiveness.

In the same Politico article, an identified source — J.M. Berger, the co-author of a book about ISIS — makes a far more credible claim: “Terrorists use technology improvisationally.”

Of course they do. The co-founder of secure messaging app Telegram, Pavel Durov, made much the same point earlier this fall when asked directly by TechCrunch about ISIS using his app to communicate. “Ultimately the ISIS will always find a way to communicate within themselves. And if any means of communication turns out to be not secure for them, then they switch to another one,” Durov argued. “I still think we’re doing the right thing — protecting our users’ privacy.”

Bottom line: banning encryption or forcing tech companies to backdoor communications services has zero chance of being effective at stopping terrorists from finding ways to communicate securely. They can and will route around such attempts to infiltrate their comms, as others have detailed at length.

Here’s a recap: terrorists can use encryption tools that are freely distributed from countries where your anti-encryption laws have no jurisdiction. Terrorists can (and do) build their own securely encrypted communication tools. Terrorists can switch to newer (or older) technologies to circumvent enforcement laws or enforced perforations. They can use plain old obfuscation to code their communications within noisy digital platforms like the PlayStation 4 network, folding their chatter into general background digital noise (of which there is no shortage). And terrorists can meet in person, using a network of trusted couriers to facilitate those meetings, as Al Qaeda — the terrorist group that perpetrated the highly sophisticated 9/11 attacks at a time when smartphones were far less common and there was no ready supply of easy-to-use end-to-end encrypted messaging apps — is known to have done.

Point is, technology is not a two-lane highway that can be regulated with a couple of neat roadblocks — whatever many politicians appear to think. All such roadblocks will do is catch the law-abiding citizens who rely on digital highways to conduct more and more aspects of their daily lives. And make those law-abiding citizens less safe in multiple ways.

There’s little doubt that the lack of technological expertise in the upper echelons of governments is snowballing into a very ugly problem indeed as technology becomes increasingly sophisticated yet political rhetoric remains grounded in age-old kneejerkery. Of course we can all agree it would be beneficial if we were able to stop terrorists from communicating. But the hard political truth of the digital era is that’s never going to be possible. It really is putting the proverbial finger in the dam. (There are even startups working on encryption that’s futureproofed against quantum computers — and we don’t even have quantum computers yet.)

Another hard political truth is that effective counter terrorism policy requires spending money on physical, on-the-ground resources — putting more agents on the ground, within local communities, where they can gain trust and gather intelligence. (Not to mention having a foreign policy that seeks to promote global stability, rather than generating the kind of regional instability that feeds extremism by waging illegal wars, for instance, or selling arms to regimes known to support the spread of extremist religious ideologies.)

Yet, in the U.K. at least, the opposite is happening — police force budgets are being slashed. Meanwhile domestic spy agencies are now being promised more staff, yet spooks’ time is increasingly taken up with remote analysis of data, rather than on the ground intelligence work. The U.K. government’s draft new surveillance laws aim to cement mass surveillance as the officially sanctioned counter terror modus operandi, and will further increase the noise-to-signal ratio with additional data capture measures, such as mandating that ISPs retain data on the websites every citizen in the country has visited for the past year. Truly the opposite of a targeted intelligence strategy.

The draft Investigatory Powers Bill also has some distinctly ambiguous wording when it comes to encryption — suggesting the U.K. government is still seeking to legislate a general ability that companies be able to decrypt communications. Ergo, to outlaw end-to-end encryption. Yes, we’re back here again. You’d be forgiven for thinking politicians lacked a long-term memory.

Effective encryption might be a politically convenient scapegoat to kick around in the wake of a terror attack — given it can be used to divert attention from big-picture geopolitical failures of governments. And from immediate on-the-ground intelligence failures — whether those are due to poor political direction, or a lack of resources, or bad decision-making/prioritization by overstretched intelligence agency staff. Pointing the finger of blame at technology companies’ use of encryption is a trivial diversion tactic to distract from wider political and intelligence failures with much more complex origins.

(On the intelligence failures point, questions certainly need to be asked, given that French and Belgian intelligence agencies apparently knew about the jihadi backgrounds of perpetrators of the Paris attacks. Yet they weren’t, apparently, targeting them closely enough to prevent Friday’s attacks. And all this despite France having hugely draconian counter-terrorism digital surveillance laws…)

But seeking to outlaw technology tools that are used by the vast majority of people to protect the substance of law-abiding lives is not just bad politics, it’s dangerous policy.

Mandating that vulnerabilities be built into digital communications opens up an even worse prospect: new avenues for terrorists and criminals to exploit. As officials are busy spinning the notion that terrorism is only really possible because of the rise of robust encryption, consider this: if the public is deprived of its digital privacy — with terrorism applied as the justification to strip out the robust safeguard of strong encryption — then individuals become more vulnerable to acts of terrorism, given their communications cannot be safeguarded from terrorists. Or criminals. Or fraudsters. Or anyone incentivized by malevolent intent.

If you want to speculate on fearful possibilities, think about terrorists being able to target individuals at will via legally-required-to-be insecure digital services. If you think terror tactics are scary right now, think about terrorists having the potential to single out, track and terminate anyone at will based on whatever twisted justification fits their warped ideology — perhaps after that person expressed views they do not approve of in an online forum.

In a world of guaranteed insecure digital services it’s a far more straightforward matter for a terrorist to hack into communications to obtain the identity of a person they deem a target, and to use other similarly perforated technology services to triangulate and track someone’s location to a place where they can be made the latest victim of a new type of hyper-targeted, mass surveillance-enabled terrorism. Inherently insecure services could also be more easily compromised by terrorists to broadcast their own propaganda, or send out phishing scams, or otherwise divert attention en masse.

The only way to protect against these scenarios is to expand the reach of properly encrypted services. To champion the cause of safeguarding the public’s personal data and privacy, rather than working to undermine it — and undermining the individual freedoms the West claims to be so keen to defend in the process.

Meanwhile, when it comes to counter-terrorism strategy, what’s needed is more intelligent targeting, not more mass measures that treat everyone as a potential suspect and deluge security agencies in an endless churn of irrelevant noise. Even the robust end-to-end encryption that’s now being briefed against as a ‘terrorist-enabling evil’ by shadowy officials on both sides of the Atlantic can be compromised at the level of an individual device. There’s no guaranteed shortcut to achieve that. Nor should there be — that’s the point. It takes sophisticated, targeted work.

But blanket measures to compromise the security of the many in the hopes of catching out the savvy few are even less likely to succeed on the intelligence front. We have mass surveillance already, and we also have blood on the streets of Paris once again. Encryption is just a convenient scapegoat for wider policy failures of an industrial surveillance complex.

So let’s not be taken in by false flags flown by anonymous officials trying to mask bad political decision-making. And let’s redouble our efforts to fight bad policy which seeks to entrench a failed ideology of mass surveillance rather than focusing intelligence resources where they are really needed: homing in on signals instead of being drowned out by noise.

Mobile Carriers Are Working With Partners to Manage, Package and Sell Data

Source: http://adage.com/article/datadriven-marketing/24-billion-data-business-telcos-discuss/301058/

U.K. grocer Morrisons, ad-buying behemoth GroupM and other marketers and agencies are testing never-before-available data from cellphone carriers that connects device location and other information with telcos' real-world files on subscribers. Some services offer real-time heat maps showing the neighborhoods where store visitors go home at night, lists of the sites they visited recently on mobile browsers, and more.

Under the radar, Verizon, Sprint, Telefonica and other carriers have partnered with firms including SAP, IBM, HP and AirSage to manage, package and sell various levels of data to marketers and other clients. It's all part of a push by the world's largest phone operators to counteract diminishing subscriber growth through new business ventures that tap into the data that showers from consumers' mobile web surfing, text messaging and phone calls.

Morrisons. Credit: Chris Ratcliffe/Bloomberg

SAP's Consumer Insight 365 ingests regularly updated data representing as many as 300 cellphone events per day for each of the 20 million to 25 million mobile subscribers. SAP won't disclose the carriers providing this data. It "tells you where your consumers are coming from, because obviously the mobile operator knows their home location," said Lori Mitchell-Keller, head of SAP's global retail industry business unit.
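
As a rough sense of scale, using only the figures quoted above (the back-of-the-envelope arithmetic is ours, not SAP's):

```python
# Back-of-the-envelope volume implied by the figures above:
# up to 300 events/day for each of 20-25 million subscribers.
events_per_subscriber_per_day = 300
subscribers_low, subscribers_high = 20_000_000, 25_000_000

daily_low = events_per_subscriber_per_day * subscribers_low    # 6.0 billion/day
daily_high = events_per_subscriber_per_day * subscribers_high  # 7.5 billion/day
print(f"{daily_low / 1e9:.1f} to {daily_high / 1e9:.1f} billion events per day")
```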

There is a lot of marketer interest in that information because it is tied to actual individuals. For the same reason, however, there is potential for resistance from privacy advocates.

WPP units such as Kantar Media and GroupM's Mindshare have "kicked the tires" for three years on Consumer Insight 365, testing and helping develop applications for the service, said Nick Nyhan, CEO of WPP's Data Alliance. The extensive time spent so far partly reflects "high sensitivity to not doing something that would be too close for comfort from a consumer point of view," Mr. Nyhan said.

The service also combines data from telcos with other information, telling businesses whether shoppers are checking out competitor prices on their phones or just emailing friends. It can tell them the age ranges and genders of people who visited a store location between 10 a.m. and noon, and link location and demographic data with shoppers' web browsing history. Retailers might use the information to arrange store displays to appeal to certain customer segments at different times of the day, or to help determine where to open new locations.

"It used to be that this data was a lot harder to come by," said Ross Shanken, CEO of LeadID, a lead generation analytics firm. In a previous position at data firm TargusInfo between 2008 and 2010, he nonetheless partnered with "a very large telco" to validate names, addresses and phone numbers for data appending.

Too risky for the E.U.?
To protect privacy, SAP receives non-personally-identifiable, anonymized information from telcos, and only provides aggregated information to its clients to prevent reidentification of individuals. Still, sharing and using data this way is controversial. Nearly all the players exploring the burgeoning Telecom Data as a Service field, or TDaaS for short, are reluctant to provide the details of their operations, much less freely name their clients. And despite privacy safeguards, SAP is focused on selling its 365 product in North America and the Asia-Pacific region because it cannot get the data it needs from telcos representing consumers in the E.U., where data protections are stricter than in the U.S. and elsewhere.
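
The article doesn't describe how that aggregation is enforced, but a common safeguard of this kind is a minimum group size: a statistic is only released if enough distinct subscribers contribute to it. A hypothetical sketch (the threshold and data shape are assumptions, not SAP's actual mechanism):

```python
# Hypothetical minimum-group-size aggregation, the general kind of safeguard
# the article alludes to; not SAP's actual implementation.
MIN_GROUP_SIZE = 50  # assumed threshold; smaller groups are suppressed

def aggregate_visitors(visits):
    """visits: iterable of (neighborhood, subscriber_id) pairs."""
    members = {}
    for neighborhood, subscriber in visits:
        members.setdefault(neighborhood, set()).add(subscriber)
    # Only publish counts for groups large enough that no single subscriber
    # can be picked out of the released figures.
    return {hood: len(subs) for hood, subs in members.items()
            if len(subs) >= MIN_GROUP_SIZE}
```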

But the rewards may outweigh the possible tangles with government regulators, consumer advocates and even squeamish board members.

The global market for telco data as a service is potentially worth $24.1 billion this year, on its way to $79 billion in 2020, according to estimates by 451 Research based on a survey of likely customers. "Challenges and constraints" mean operators are scraping just 10% of the possible market right now, though that will rise to 30% by 2020, 451 Research said.

"If I was a CEO of any telecom operator in the U.S., I would be saying to myself I can do the same," said Michael Provenzano, CEO and co-founder of Vistar Media, which teams up with mobile operators to provide anonymized and aggregated data for targeting digital out-of-home ads based on consumers' comings and goings. "That's going to be something these guys are talking about in the boardroom."

Perhaps the most prominent recent moves in the burgeoning TDaaS realm are Verizon’s $4.4 billion acquisition of AOL in May, followed by its purchase of mobile ad network Millennial Media for $238 million in September. Many saw the AOL buy as a means for Verizon to turn its data into a viable business, in part because AOL provides ad-tech infrastructure and marketer relationships that Verizon lacks.

The level of authenticated information derived from Verizon and other mobile operators is seen as potentially more valuable than some other consumer data because it directly connects mobile phone interactions to individuals through actual billing information. "We're talking about linking a household and a billing relationship with a human being," said Seth Demsey, CTO of AOL Platforms.

Verizon’s Precision Market Insights division previously stumbled in its attempts to aggregate and package mobile data to help marketers target consumers and measure campaigns. Sprint’s similar Pinsight Media division and AT&T’s AdWorks—which segments and targets TV audiences—have not fared much better, according to observers.

But lackluster results from going it alone have driven telcos toward companies that can facilitate cashing in on data. Along with SAP on the marketer-facing side, others including HP and IBM have stepped in to help phone carriers on the back-end data management and analysis side.

When Spanish operator Telefonica embarked on its Dynamic Insights offering, it partnered with consumer insights firm GfK to help package the telco’s mobile data for clients including U.K. food purveyor Morrisons. The grocery chain used the service to garner anonymized data connecting consumer demographic data to location visits.

SAP's Rohit Tripathi. Credit: Courtesy SAP

Some of these data relationships have long histories. SAP America owns Sybase, a subsidiary it bought in 2010 that serves as a technology hub for multiple mobile carriers and counts Verizon as a partner. The Sybase business has provided "deep relationships with mobile operators around the globe," said Rohit Tripathi, global VP and general manager of SAP Mobile Services, in an email.

AirSage, another firm that has tight integrations with mobile operators, supplies data to municipal planners, retail store developers and city tourism boards. The company integrated its technology with telecom companies in the 1990s to enable 911 call support services. More recently it has signed data deals with Verizon Wireless and Sprint. "Our solution is actually plugged into the network behind the firewall of the carrier," said Ryan Kinskey, director of business development and sales at AirSage. Device IDs tracked by AirSage are anonymized, he added.

Verizon and Sprint declined to comment for this story. AT&T and T-Mobile said they don’t share consumer or location data with SAP, Sybase, AirSage or Vistar.

Why the secrecy?
Insiders say phone carriers exploring data-sharing businesses are tight-lipped because they don’t want to reveal too many details to competitors, but fear of consumer complaints is always lurking in the background.

EFF's Peter Eckersley. Credit: Courtesy EFF

"The practices that carriers have gotten into, the sheer volume of data and the promiscuity with which they're revealing their customers' data creates enormous risk for their businesses," said Peter Eckersley, chief computer scientist at the Electronic Frontier Foundation, a privacy watchdog. Mr. Eckersley and others suggest that anonymization techniques are faulty in many cases because even information associated with a hashed or encrypted identification code can be linked back to a home address and potentially reidentified by hackers.
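
Eckersley's point is easy to demonstrate. Hashing an identifier drawn from a small, structured space, such as a ten-digit phone number, is not anonymization, because anyone holding the "anonymized" value can simply enumerate the candidates and match them. A toy sketch, assuming (purely for illustration) an unsalted SHA-256 over the raw digits:

```python
# Toy re-identification of a "hashed" phone number.
# Assumes an unsalted SHA-256 over the digits, purely for illustration.
import hashlib

def pseudonymize(phone: str) -> str:
    return hashlib.sha256(phone.encode()).hexdigest()

leaked_pseudonym = pseudonymize("2025550114")  # what a data partner might hold

# An attacker who knows the number format just walks a plausible range
# (a single area code and exchange here) until the hashes match.
for n in range(2_025_550_000, 2_025_560_000):
    if pseudonymize(str(n)) == leaked_pseudonym:
        print("re-identified:", n)
        break
```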

Unlike other types of location tracking, such as beacon technologies that work only with mobile apps that people have agreed to let track them, many services employing telco data require no explicit opt-ins by consumers. Companies like SAP instead rely on carriers' terms and conditions with their subscribers, calling acceptance of the terms equivalent to opting in. Verizon's privacy policy, for example, says that information collected on its customers may "be aggregated or anonymized for business and marketing uses by us or by third parties."

Ultimately, for mobile operators, these relationships could reap substantial income from the data generated by subscribers who already account for their primary revenue streams. The telcos do not break out revenue derived from their data-related sales in their quarterly earnings reports, so just how much money they’re making from these deals is not known.

SAP will "effectively share the revenue back with the operator, so they get to make money from data that they're basically not utilizing or under-utilizing today," former SAP Mobile President John Sims said at an industry conference in Las Vegas in 2013 as the company introduced Consumer Insight 365.

"The mobile operators don't want to reveal this," said Mr. Tripathi, the SAP Mobile Services executive. No matter how much telcos and their partners stress that the data is anonymized and aggregated, he said, "they are fearful people will take this and twist it into something that it isn't."

Spotify, Facebook, and the Open Graph

Spotify has now made it to Austria, too. Congratulations.

Why is it mandatory to be a Facebook customer in order to use a legal, free music-listening service?
Behind this lies the master plan called Open Graph (aka "automatic sharing").

You are a Facebook customer in the sense of a free membership, of course, but at the price of surrendering a great deal of personal data. So far, that is. Open Graph's goal is to know you even more precisely, to characterize you as accurately as possible as the image of a database record, and to describe your personal needs better than you would introduce yourself to someone you have not yet met.

Ideally, every website you visit regularly is meant to complete your virtual advertising profile a little further: every article on a news site, every online book you have bought, every free song you have listened to, and every photo album you have created.

All of this information is automatically shared with your friends via your Facebook profile.

You are classified precisely as a 40-year-old jazz fan with an online preference for Der Standard, a concert-goer and keen traveller, and you receive precisely targeted advertising for jazz concerts, newspaper subscriptions, and travel offers.

A blessing for every advertising businessman, but at the same time a curse for that very same person. In any case, an innovation: never before in human history has such a huge sample of the real taste preferences of real people been made available to paying advertisers, analyzable at the push of a button. Facebook has pulled it off.

How can precisely targeted profiles improve your advertising design? Discuss it with us.
innovativ@dieIdee.eu

Sources:
http://news.cnet.com/8301-31322_3-57324406-256/how-facebook-is-ruining-sharing/

http://blog.freestyleinteractive.co.uk/2011/09/facebook-frictionless-sharing-will-share-everything-you-read-with-your-friends/