Let’s Get Rid of the “Nothing to Hide, Nothing to Fear” Mentality

With Mark Zuckerberg testifying to the US Congress over Facebook’s data privacy practices and the implementation of GDPR fast approaching, the debate around data ownership has suddenly burst into the public psyche. Collecting user data to serve targeted advertising on a free platform is one thing; harvesting the social graphs of people interacting with apps and using them to sway an election is somewhat worse.

Suffice it to say that neither of the above compares to the indiscriminate collection of ordinary civilians’ data by governments every day.

In 2013, Edward Snowden blew the whistle on the systematic US spy programs he had helped to administer. Perhaps the biggest revelation to come out of the trove of documents he released was PRISM, an NSA program that collects internet communications data from US internet companies such as Microsoft, Yahoo, Google, Facebook and Apple. The data collected included the audio and video chat logs, photographs, emails, documents and connection logs of anyone using the services of nine leading US internet companies. PRISM benefited from changes to FISA that allowed warrantless domestic surveillance of any target without the need for probable cause. Bill Binney, a former US intelligence official, explains how, where cooperation from companies wasn’t achievable, the NSA enticed third-party countries to clandestinely tap communication lines on the internet backbone via the RAMPART-A program. What this means is that the NSA was able to assemble near-complete dossiers of the web activity of virtually anyone using the internet.

But this is just in the US, right? Surely policies like this wouldn’t be implemented in Europe.

Wrong, unfortunately.

GCHQ, the UK’s signals intelligence agency, allegedly collects considerably more metadata than the NSA. Under Tempora, GCHQ can intercept all internet communications passing through submarine fibre-optic cables and store the information for 30 days at the Bude facility in Cornwall. This includes complete web histories and the contents of all emails and Facebook entries, and given that more than 25% of all internet communications flow through these cables, the implications are astronomical. Elsewhere, JTRIG, a unit of GCHQ, has intercepted private Facebook pictures, changed the results of online polls and spoofed websites in real time. Many of these techniques have been made possible by the 2016 Investigatory Powers Act, which Snowden described as “the most extreme surveillance in the history of western democracy”.

But despite all this, the age-old refrain, “if you’ve got nothing to hide, you’ve got nothing to fear”, often rings out in debates over privacy.

Indeed, the idea is so pervasive that politicians often lean on the phrase to justify ever more draconian methods of surveillance. Yes, they draw upon the selfsame rhetoric of Joseph Goebbels, propaganda minister for the Nazi regime.

In drafting the legislation for the Investigatory Powers Act, Theresa May said that such extremes were necessary to ensure “no area of cyberspace becomes a haven for those who seek to harm us, to plot, poison minds and peddle hatred under the radar”.

When weighed against the fear of terrorism and death, it’s easy to see how people passively accept ever greater levels of surveillance. Indeed, Naomi Klein writes extensively in The Shock Doctrine about how the fear of external threats can be used as a smokescreen to implement ever more invasive policy. But indiscriminate mass surveillance should never be blindly accepted; privacy should be, and always will be, a social norm, despite what Mark Zuckerberg said in 2010. Although I’m sure he might give a different answer now.

So if you just read emails and look at cat memes online, why would you care about privacy?

In the same way that we’re able to close our living room curtains and be alone and unmonitored, we should be able to explore our identities online unimpeded. It’s a well-rehearsed idea that nowadays we’re more honest with our web browsers than we are with each other, but what happens when you become cognisant that everything you do online is intercepted and catalogued? As with CCTV, when we know we’re being watched, we alter our behaviour in line with what’s expected.

As soon as this happens online, the liberating quality provided by the anonymity of the internet is lost. Our thinking aligns with the status quo, and we lose the boundless ability of the internet to explore and develop our identities. No progress can be made when everyone thinks the same way. Difference of opinion fuels innovation.

This draws obvious comparisons with Bentham’s Panopticon, a prison blueprint for enforcing control from within. The basic setup is as follows: a central guard tower is surrounded by cells, and in the cells are prisoners. The tower shines bright light so that the watchman can see each inmate silhouetted in their cell, but the prisoners cannot see the watchman. The prisoners must assume they could be observed at any point and therefore act accordingly. In literature, the common comparison is Orwell’s 1984, where omnipresent government surveillance enforces control and distorts reality. With revelations about surveillance states, the relevance of these metaphors is plain to see.

In reality, there’s a lot more at stake here.

With the Panopticon, certain individuals are watched; in 1984, everyone is watched. On the modern internet, every person, irrespective of the threat they pose, is not only watched but has their information stored and archived for analysis.

Kafka’s The Trial, in which a bureaucracy uses citizens’ information to make decisions about them but denies them the ability to participate in how that information is used, therefore seems a more apt comparison. The issue here is that corporations, and even more so states, have been allowed to comb our data and make decisions that affect us without our consent.

Maybe, as a member of a western democracy, you don’t think this matters. But what if you’re a member of a minority group in an oppressive regime? What if you’re arrested because a computer algorithm can’t separate humour from intent to harm?

On the other hand, maybe you trust the intentions of your government, but how much faith do you have in them to keep your data private? The recent hack of the SEC shows that even government systems aren’t safe from attackers. When a business database is breached, maybe your credit card details become public; when a government database that has aggregated millions of data points on every aspect of your online life is hacked, you’ve lost all control of your ability to selectively reveal yourself to the world. Just as Lyndon Johnson sought to control the physical clouds, he who controls the modern cloud will rule the world.

Perhaps you think that even this doesn’t matter; if it allows the government to protect us from those who intend to cause harm, then it’s worth the loss of privacy. The trouble with indiscriminate surveillance is that, with so much data, you see everything but, paradoxically, still know nothing.

Intelligence is the strategic collection of pertinent facts; bulk data collection cannot, therefore, be intelligent. As Bill Binney puts it, “bulk data kills people”, because technicians are so overwhelmed that they can’t isolate what’s useful. Data collection as it stands can only serve retribution after the fact rather than prevention.
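
To make that concrete, here is a rough back-of-the-envelope sketch of the base-rate problem. The population size, threat count and error rates below are illustrative assumptions of mine, not figures from any agency, but the imbalance they expose holds for any realistic numbers:

# Illustrative base-rate arithmetic: why dragnet collection buries the signal.
# Every number here is an assumption chosen for illustration, not an official figure.

population = 60_000_000          # people swept up by blanket collection
true_threats = 600               # genuine would-be attackers among them (assumed)
detection_rate = 0.99            # the filter flags 99% of genuine threats (assumed)
false_positive_rate = 0.001      # and wrongly flags 0.1% of everyone else (assumed)

flagged_real = true_threats * detection_rate
flagged_innocent = (population - true_threats) * false_positive_rate
precision = flagged_real / (flagged_real + flagged_innocent)

print(f"People flagged for follow-up: {flagged_real + flagged_innocent:,.0f}")
print(f"Chance a flagged person is a genuine threat: {precision:.2%}")
# Roughly 60,000 people are flagged, and fewer than 1 in 100 of them is a real threat.

The exact numbers don’t matter; what matters is that when the base rate of real threats is tiny, even a very accurate filter hands analysts a haystack of false positives for every needle.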

Granted, GDPR is a big step forward for individual consent, but will it stop corporations handing over your data to the government? Depending on how cynical you are, you might think that GDPR is just a tool to clean up and create more reliable, deterministic data anyway. The “nothing to hide, nothing to fear” mentality renders us passive supplicants in the removal of our civil liberties. We should be thinking about how we relate to one another and to our governments, and how much power we want to have in that relationship.

To paraphrase Edward Snowden, saying you don’t care about privacy because you’ve got nothing to hide is analogous to saying you don’t care about freedom of speech because you have nothing to say.

http://behindthebrowser.space/index.php/2018/04/22/nothing-to-fear-nothing-to-hide/

Encryption Is Being Scapegoated To Mask The Failures Of Mass Surveillance

Source: http://techcrunch.com/2015/11/17/the-blame-game/

Well that took no time at all. Intelligence agencies rolled right into the horror and fury in the immediate wake of the latest co-ordinated terror attacks in the French capital on Friday, to launch their latest co-ordinated assault on strong encryption — and on the tech companies creating secure comms services — seeking to scapegoat end-to-end encryption as the enabling layer for extremists to perpetrate mass murder.

There’s no doubt they were waiting for just such an ‘opportune moment’ to redouble their attacks on encryption after recent attempts to lobby for encryption-perforating legislation foundered. (A strategy confirmed by a leaked email sent by the intelligence community’s top lawyer, Robert S. Litt, this August — and subsequently obtained by the Washington Post — in which he anticipated that a “very hostile legislative environment… could turn in the event of a terrorist attack or criminal event where strong encryption can be shown to have hindered law enforcement”. Et voila Paris… )

Speaking to CBS News over the weekend, in the immediate aftermath of the Paris attacks, former CIA deputy director Michael Morell said: “I think this is going to open an entire new debate about security versus privacy.”

“We, in many respects, have gone blind as a result of the commercialization and the selling of these devices that cannot be accessed either by the manufacturer or, more importantly, by us in law enforcement, even equipped with search warrants and judicial authority,” added New York City police commissioner, William J. Bratton, quoted by the NYT in a lengthy article probing the “possible” role of encrypted messaging apps in the Paris attacks.

Elsewhere the fast-flowing attacks on encrypted tech services have come without a byline — from unnamed European and American officials who say they are “not authorized to speak publicly”. Yet are happy to speak publicly, anonymously.

The NYT published an article on Sunday alleging that attackers had used “encryption technology” to communicate — citing “European officials who had been briefed on the investigation but were not authorized to speak publicly”. (The paper subsequently pulled the article from its website, as noted by InsideSources, although it can still be read via the Internet Archive.)

The irony of government/intelligence agency sources briefing against encryption on condition of anonymity as they seek to undermine the public’s right to privacy would be darkly comic if it weren’t quite so brazen.

Here’s what one such unidentified British intelligence source told Politico: “As members of the general public get preoccupied that the government is spying on them, they have adopted these applications and terrorists have found them tailor-made for their own use.”

It’s a pretty incredible claim when you examine it. This unknown spook mouthpiece is saying terrorists are able to organize acts of mass murder as a direct consequence of the public’s dislike of government mass surveillance. Take even a cursory glance at the history of terrorism and that claim folds in on itself immediately. The highly co-ordinated 9/11 attacks of 2001 required no backdrop of public privacy fears in order to be carried out — and with horrifying, orchestrated effectiveness.

In the same Politico article, an identified source — J.M. Berger, the co-author of a book about ISIS — makes a far more credible claim: “Terrorists use technology improvisationally.”

Of course they do. The co-founder of secure messaging app Telegram, Pavel Durov, made much the same point earlier this fall when asked directly by TechCrunch about ISIS using his app to communicate. “Ultimately the ISIS will always find a way to communicate within themselves. And if any means of communication turns out to be not secure for them, then they switch to another one,” Durov argued. “I still think we’re doing the right thing — protecting our users privacy.”

Bottom line: banning encryption or forcing tech companies to backdoor communications services has zero chance of being effective at stopping terrorists from finding ways to communicate securely. They can and will route around such attempts to infiltrate their comms, as others have detailed at length.

Here’s a recap: terrorists can use encryption tools that are freely distributed from countries where your anti-encryption laws have no jurisdiction. Terrorists can (and do) build their own securely encrypted communication tools. Terrorists can switch to newer (or older) technologies to circumvent enforcement laws or enforced perforations. They can use plain old obfuscation to code their communications within noisy digital platforms like the PlayStation 4 network, folding their chatter into general background digital noise (of which there is no shortage). And terrorists can meet in person, using a network of trusted couriers to facilitate these meetings, as Al Qaeda (the group that perpetrated the highly sophisticated 9/11 attacks at a time when smartphones were far less common and easy-to-use end-to-end encrypted messaging apps were not in ready supply) is known to have done.
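
It’s worth underlining how low the bar for “building their own” really is. As a minimal sketch, assuming nothing more than the freely available, open-source PyNaCl library (any of a dozen similar libraries would serve equally well; the names and message below are purely illustrative), end-to-end public-key encryption takes a handful of lines:

# Minimal sketch of end-to-end encryption using the open-source PyNaCl library
# (pip install pynacl). Purely illustrative: the point is that the capability
# lives in freely available code, not in any particular app that could be banned.
from nacl.public import PrivateKey, Box

# Each party generates a keypair and shares only the public half.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts a message that only Bob's private key can decrypt.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"meet at the usual place")

# Bob decrypts it with his private key and Alice's public key.
receiver_box = Box(bob_key, alice_key.public_key)
print(receiver_box.decrypt(ciphertext).decode())

Outlaw the finished messaging apps and this capability doesn’t disappear; it simply moves into a few lines of code like the above, or into tools hosted beyond any one country’s jurisdiction.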

Point is, technology is not a two-lane highway that can be regulated with a couple of neat roadblocks — whatever many politicians appear to think. All such roadblocks will do is catch the law-abiding citizens who rely on digital highways to conduct more and more aspects of their daily lives. And make those law-abiding citizens less safe in multiple ways.

There’s little doubt that the lack of technological expertise in the upper echelons of governments is snowballing into a very ugly problem indeed as technology becomes increasingly sophisticated yet political rhetoric remains grounded in age-old kneejerkery. Of course we can all agree it would be beneficial if we were able to stop terrorists from communicating. But the hard political truth of the digital era is that’s never going to be possible. It really is putting the proverbial finger in the dam. (There are even startups working on encryption that’s futureproofed against quantum computers — and we don’t even have quantum computers yet.)

Another hard political truth is that effective counter terrorism policy requires spending money on physical, on-the-ground resources — putting more agents on the ground, within local communities, where they can gain trust and gather intelligence. (Not to mention having a foreign policy that seeks to promote global stability, rather than generating the kind of regional instability that feeds extremism by waging illegal wars, for instance, or selling arms to regimes known to support the spread of extremist religious ideologies.)

Yet, in the U.K. at least, the opposite is happening — police force budgets are being slashed. Meanwhile domestic spy agencies are now being promised more staff, yet spooks’ time is increasingly taken up with remote analysis of data, rather than on the ground intelligence work. The U.K. government’s draft new surveillance laws aim to cement mass surveillance as the officially sanctioned counter terror modus operandi, and will further increase the noise-to-signal ratio with additional data capture measures, such as mandating that ISPs retain data on the websites every citizen in the country has visited for the past year. Truly the opposite of a targeted intelligence strategy.

The draft Investigatory Powers Bill also has some distinctly ambiguous wording when it comes to encryption, suggesting the U.K. government is still seeking to legislate a general requirement that companies be able to decrypt communications. Ergo, to outlaw end-to-end encryption. Yes, we’re back here again. You’d be forgiven for thinking politicians lacked a long-term memory.

Effective encryption might be a politically convenient scapegoat to kick around in the wake of a terror attack, given that it can be used to deflect attention from big-picture geopolitical failures of governments, and from immediate on-the-ground intelligence failures, whether those are due to poor political direction, a lack of resources, or bad decision-making and prioritization by overstretched intelligence agency staff. Pointing the finger of blame at technology companies’ use of encryption is a trivial diversion tactic that distracts from wider political and intelligence failures with much more complex origins.

(On the intelligence failures point, questions certainly need to be asked, given that French and Belgian intelligence agencies apparently knew about the jihadi backgrounds of perpetrators of the Paris attacks, yet weren’t, apparently, targeting them closely enough to prevent Friday’s attack. And all this despite France having hugely draconian counter-terrorism digital surveillance laws…)

But seeking to outlaw technology tools that are used by the vast majority of people to protect the substance of law-abiding lives is not just bad politics, it’s dangerous policy.

Mandating vulnerabilities be built into digital communications opens up an even worse prospect: new avenues for terrorists and criminals to exploit. As officials are busy spinning the notion that terrorism is all-but only possible because of the rise of robust encryption, consider this: if the public is deprived of its digital privacy — with terrorism applied as the justification to strip out the robust safeguard of strong encryption — then individuals become more vulnerable to acts of terrorism, given their communications cannot be safeguarded from terrorists. Or criminals. Or fraudsters. Or anyone incentivized by malevolent intent.

If you want to speculate on fearful possibilities, think about terrorists being able to target individuals at will via legally-required-to-be insecure digital services. If you think terror tactics are scary right now, think about terrorists having the potential to single out, track and terminate anyone at will based on whatever twisted justification fits their warped ideology — perhaps after that person expressed views they do not approve of in an online forum.

In a world of guaranteed insecure digital services it’s a far more straightforward matter for a terrorist to hack into communications to obtain the identity of a person they deem a target, and to use other similarly perforated technology services to triangulate and track someone’s location to a place where they can be made the latest victim of a new type of hyper-targeted, mass surveillance-enabled terrorism. Inherently insecure services could also be more easily compromised by terrorists to broadcast their own propaganda, or send out phishing scams, or otherwise divert attention en masse.

The only way to protect against these scenarios is to expand the reach of properly encrypted services. To champion the cause of safeguarding the public’s personal data and privacy, rather than working to undermine it — and undermining the individual freedoms the West claims to be so keen to defend in the process.

Meanwhile, when it comes to counter-terrorism strategy, what’s needed is more intelligent targeting, not more mass measures that treat everyone as a potential suspect and deluge security agencies in an endless churn of irrelevant noise. Even the robust end-to-end encryption that’s now being briefed against as a ‘terrorist-enabling evil’ by shadowy officials on both sides of the Atlantic can be compromised at the level of an individual device. There’s no guaranteed shortcut to achieve that, nor should there be; that’s the point. It takes sophisticated, targeted work.

But blanket measures to compromise the security of the many in the hopes of catching out the savvy few are even less likely to succeed on the intelligence front. We have mass surveillance already, and we also have blood on the streets of Paris once again. Encryption is just a convenient scapegoat for wider policy failures of an industrial surveillance complex.

So let’s not be taken in by false flags flown by anonymous officials trying to mask bad political decision-making. And let’s redouble our efforts to fight bad policy that seeks to entrench a failed ideology of mass surveillance instead of focusing intelligence resources where they are really needed: homing in on signals rather than being drowned out by noise.