Archiv der Kategorie: Privacy

Secure your Privacy – HERE’S WHY YOU SHOULD USE SIGNAL

Source: https://www.wired.com/story/ditch-all-those-other-messaging-apps-heres-why-you-should-use-signal/

STOP ME IF you’ve heard this before. You text a friend to finalize plans, anxiously awaiting their reply, only to get a message from them on Snapchat to say your latest story was hilarious. So, you move the conversation over to Snapchat, decide to meet up at 10:30, but then you close the app and can’t remember if you agreed on meeting at Hannegan’s or that poppin’ new brewery downtown. You can’t go back and look at the message since Snapchat messages have a short shelf life, so you send a text, but your friend has already proven to be an unreliable texter. You’d be lucky if they got back to you by midnight.

All of this illustrates a plain truth. There are just too many messaging apps. As conversations can bounce between Snapchat, iMessage, Skype, Instagram, Twitter, and Hangouts/Allo or whatever Google’s latest attempt at messaging is, they’re rendered confusing and unsearchable. We could stick to SMS, but it’s pretty limited compared to other options, and it has some security holes. Rather than just chugging along with a dozen chat apps, letting your notifications pile up, it’s time to pick one messaging app and get all of your friends on board. That way, everyone can just pick up their phones and shoot a message to anyone without hesitation.

Here comes the easy part. There’s one messaging app we should all be using: Signal. It has strong encryption, it’s free, it works on every mobile platform, and the developers are committed to keeping it simple and fast by not mucking up the experience with ads, web-tracking, stickers, or animated poop emoji.

Tales From the Crypto

Signal looks and works a lot like other basic messaging apps, so it’s easy to get started. It’s especially convenient if you have friends and family overseas because, like iMessage and WhatsApp, Signal lets you sidestep expensive international SMS fees. It also supports voice and video calls, so you can cut out Skype and FaceTime. Sure, you don’t get fancy stickers or games like some of the competition, but you can still send pictures, videos, and documents. It’s available on iOS, Android, and desktop.

But plenty of apps have all that stuff. The thing that actually makes Signal superior is that it’s easy to ensure that the contents of every chat remain private and unable to be read by anyone else. As long as both parties are using the app to message each other, every single message sent with Signal is encrypted. Also, the encryption Signal uses is available under an open-source license, so experts have had the chance to test and poke at the app to make sure it stays as secure as intended.
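The core idea behind end-to-end encryption can be illustrated with a toy symmetric cipher: only someone holding the shared key can turn the ciphertext back into readable text, so nothing in between (not even the server relaying the message) learns the contents. The Python sketch below is deliberately simplified and is not the actual Signal Protocol, which uses the Double Ratchet algorithm with fresh keys per message:

```python
import hashlib
from itertools import count

def _keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudo-random byte stream by hashing the key
    # together with a running counter.
    out = b""
    for i in count():
        if len(out) >= length:
            break
        out += hashlib.sha256(key + i.to_bytes(8, "big")).digest()
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the plaintext with the keystream.
    return bytes(p ^ k for p, k in zip(plaintext, _keystream(key, len(plaintext))))

# XORing twice with the same keystream restores the original bytes,
# so decryption is the same operation.
decrypt = encrypt

key = b"shared secret between the two phones"
ciphertext = encrypt(key, b"meet at 10:30")
assert decrypt(key, ciphertext) == b"meet at 10:30"
assert decrypt(b"wrong key", ciphertext) != b"meet at 10:30"
```

Anyone intercepting `ciphertext` without the key sees only noise; that is the property Signal provides, with far more machinery around key exchange and forward secrecy.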

If you’re super concerned about messages being read by the wrong eyes, Signal lets you force individual conversations to delete themselves after a designated amount of time. Signal’s security doesn’t stop at texts. All of your calls are encrypted, so nobody can listen in. Even if you have nothing to hide, it’s nice to know that your private life is kept, you know, private.

What About WhatsApp

Yes, this list of features sounds a lot like WhatsApp. It’s true, the Facebook-owned messaging app has over a billion users, offers most of the same features, and even employs Signal’s encryption to keep chats private. But WhatsApp raises a few concerns that Signal doesn’t. First, it’s owned by Facebook, a company whose primary interest is in collecting information about you to sell you ads. That alone may steer away those who feel Facebook already knows too much about us. Even though the content of your WhatsApp messages are encrypted, Facebook can still extract metadata from your habits, like who you’re talking to and how frequently.

Still, if you use WhatsApp, chances are you already know a lot of other people who are using it. Getting all of them to switch to Signal is highly unlikely. And you know, that’s OK—WhatsApp really is the next-best option to Signal. The encryption is just as strong, and while it isn’t as cleanly stripped of extraneous features as Signal, that massive user base makes it easy to reach almost anyone in your contact list.

Chat Heads

While we’re talking about Facebook, it’s worth noting that the company’s Messenger app isn’t the safest place to keep your conversations. Aside from all the clutter inside the app, the two biggest issues with Facebook Messenger are that you have to encrypt conversations individually by flipping on the “Secret Conversations” option (good luck remembering to do that), and that anyone with a Facebook profile can just search for your name and send you a message. (Yikes!) There are too many variables in the app, and a lot of the security is out of your hands. iMessage may seem like a solid remedy to all of these woes, but it’s tucked behind Apple’s walled iOS garden, so you’re bound to leave out your closest friends who use Android devices. And if you ever switch platforms, say bye-bye to your chat history.

Signal isn’t going to win a lot of fans among those who’ve grown used to the more novel features inside their chat apps. There are no stickers, and no animoji. Still, as privacy issues come to the fore in the minds of users, and as mobile messaging options proliferate, and as notifications pile up, everyone will be searching for a path to sanity. It’s easy to invite people to Signal. Once you’re using it, just tap the “invite” button inside the chat window, and your friend will be sent a link to download the app. Even stubborn people who only send texts can get into it—Signal can be set as your phone’s default SMS client, so the pain involved in the switch is minimal.

So let’s make a pact right now. Let’s all switch to Signal, keep our messages private, and finally put an end to the untenable multi-app shuffle that’s gone on far too long.


Macron, May, Merkel – weakening encryption and making messengers (WhatsApp) vulnerable leads to data security catastrophes

By weakening strong encryption, that is, by compromising software such as the Android or iOS operating systems (their subroutines, libraries, and core components) in order to enable mass surveillance, you, the leaders of Europe, risk the data security of thousands of European companies. Is it worth it?

Even Microsoft is now warning that the government practice of “stockpiling” software vulnerabilities so that they can be used as weapons is a misguided tactic that weakens security for everybody.

“An equivalent scenario with conventional weapons would be the U.S. military having some of its Tomahawk missiles stolen,” the company said Sunday.

Why are you doing this? Hopefully not out of a need to hand over information in order to receive intelligence from the USA in return?

[Photo: French President Emmanuel Macron (L) talks with German Chancellor Angela Merkel (R) as US President Donald J. Trump (C) walks by, during the line-up for the group photo at the NATO summit in Brussels, Belgium, 25 May 2017. EPA/ARMANDO BABANI]

You saw, recognised, and understood WannaCry, which affected thousands of companies throughout Europe?

The vulnerability in Windows that WannaCry takes advantage of was discovered by the NSA for its surveillance toolkit. But word got out when a hacker group known as the Shadow Brokers dumped a bunch of leaked NSA information onto the Internet in April. Microsoft, however, had already issued a software update the month before; those that downloaded and installed the patch were protected from WannaCry, but many others lagged behind and became victims.

More Android phones are using encryption and lock screen security than ever before

Many users have a false sense of just how secure their private data is on their devices — that is, if they’re thinking about it at all. Your average smartphone user just wants to access the apps and people they care about, and not worry about security.

That’s why it was extremely encouraging to hear some of the security metrics announced at Google I/O 2017. For devices running Android Nougat, roughly 80% of users are running them fully encrypted. At the same time, about 70% of Nougat devices are using a secure lock screen of some form.

[Charts: Android encryption adoption; Android lock screen adoption]

That 80% encryption number isn’t amazingly surprising when you remember that Nougat has full-device encryption turned on by default, but that number also includes devices that were upgraded from Marshmallow, which didn’t have default encryption. Devices running on Marshmallow have a device encryption rate of just 25%, though, so this is a massive improvement. And the best part about Google’s insistence on default encryption is that eventually older devices will be replaced by those running Nougat or later out of the box, meaning this encryption rate could get very close to 100%.

The default settings are immensely important.

Full-device encryption is particularly effective when paired with a secure lock screen, and the roughly 70% adoption Google’s metrics show in this regard definitely needs some work. It’s a small increase from the roughly 60% secure lock screen rate of Marshmallow phones but a decent jump from the sub-50% rate of devices running Lollipop. The most interesting aspect of these numbers to my eyes is that having a fingerprint sensor on the device doesn’t signal a very large increase in adoption — perhaps just a five percentage point jump. On one hand it’s great to see people using secured lock screens even when they don’t have something as convenient as a fingerprint sensor, but then again I’d expect the simplicity of that sensor to help adoption more than these numbers show.

The trend is heading in the right direction in both of these metrics, and that’s a great sign despite the fact that secure lock screens show a slower growth rate. The closer we get both of these numbers to 100%, the better.

http://www.androidcentral.com/more-android-phones-are-using-encryption-and-lock-screen-security-eve

Don’t wanna Cry? Use Linux 

Don’t wanna Cry? Use Linux. Life is too short to reboot. 

So far, over 213,000 computers across 99 countries around the world have been infected, and the infection is still rising even hours after the kill switch was triggered by the 22-year-old British security researcher behind the Twitter handle ‘MalwareTech.’

For those unaware, WannaCry is an insanely fast-spreading ransomware that leverages a Windows SMB exploit to remotely target computers running unpatched or unsupported versions of Windows.

So far, the criminals behind the WannaCry ransomware have received nearly 100 payments from victims, totaling 15 Bitcoins, equal to roughly USD $26,090.


Once infected, WannaCry also scans for other vulnerable computers connected to the same network, as well scans random hosts on the wider Internet, to spread quickly.
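A defensive first step against a worm like this is knowing which of your own machines expose the service it targets. The sketch below is a hypothetical audit helper (not part of any named tool) that checks whether TCP port 445, the SMB port EternalBlue abused, accepts connections:

```python
import socket

SMB_PORT = 445  # the service WannaCry's exploit targeted

def smb_port_open(host: str, timeout: float = 1.0) -> bool:
    """Return True if the host accepts TCP connections on port 445 (SMB)."""
    try:
        with socket.create_connection((host, SMB_PORT), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timed out, or host unreachable.
        return False

# Audit a list of your own hosts; anything reachable here needs the
# MS17-010 patch applied or a firewall rule blocking SMB from the outside.
for host in ["127.0.0.1"]:
    print(host, "SMB reachable:", smb_port_open(host, timeout=0.5))
```

Scanning machines you do not administer is, of course, off the table; this is for inventorying your own network.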

The SMB exploit currently being used by WannaCry has been identified as EternalBlue, one of a collection of hacking tools allegedly created by the NSA and then subsequently dumped by a hacking group calling itself “The Shadow Brokers” over a month ago.

“If NSA had privately disclosed the flaw used to attack hospitals when they *found* it, not when they lost it, this may not have happened,” NSA whistleblower Edward Snowden says.

http://thehackernews.com/2017/05/wannacry-ransomware-cyber-attack.html

Securing Driverless Cars From Hackers Is Hard, according to Charlie Miller, Ex-NSA’s Tailored Access Operations Hacker

Securing Driverless Cars From Hackers Is Hard. Ask the Ex-Uber Guy Who Protects Them

Two years ago, Charlie Miller and Chris Valasek pulled off a demonstration that shook the auto industry, remotely hacking a Jeep Cherokee via its internet connection to paralyze it on a highway. Since then, the two security researchers have been quietly working for Uber, helping the startup secure its experimental self-driving cars against exactly the sort of attack they proved was possible on a traditional one. Now, Miller has moved on, and he’s ready to broadcast a message to the automotive industry: Securing autonomous cars from hackers is a very difficult problem. It’s time to get serious about solving it.

Last month, Miller left Uber for a position at Chinese competitor Didi, a startup that’s just now beginning its own autonomous ridesharing project. In his first post-Uber interview, Miller talked to WIRED about what he learned in those 19 months at the company—namely that driverless taxis pose a security challenge that goes well beyond even those faced by the rest of the connected car industry.

Miller couldn’t talk about any of the specifics of his research at Uber; he says he moved to Didi in part because the company has allowed him to speak more openly about car hacking. But he warns that before self-driving taxis can become a reality, the vehicles’ architects will need to consider everything from the vast array of automation in driverless cars that can be remotely hijacked, to the possibility that passengers themselves could use their physical access to sabotage an unmanned vehicle.

“Autonomous vehicles are at the apex of all the terrible things that can go wrong,” says Miller, who spent years on the NSA’s Tailored Access Operations team of elite hackers before stints at Twitter and Uber. “Cars are already insecure, and you’re adding a bunch of sensors and computers that are controlling them…If a bad guy gets control of that, it’s going to be even worse.”

At A Computer’s Mercy

In a series of experiments starting in 2013, Miller and Valasek showed that a hacker with either wired or over-the-internet access to a vehicle—including a Toyota Prius, Ford Escape, and a Jeep Cherokee—could disable or slam on a victim’s brakes, turn the steering wheel, or, in some cases, cause unintended acceleration. But to trigger almost all those attacks, Miller and Valasek had to exploit vehicles’ existing automated features. They used the Prius’ collision avoidance system to apply its brakes, and the Jeep’s cruise control feature to accelerate it. To turn the Jeep’s steering wheel, they tricked it into thinking it was parking itself—even if it was moving at 80 miles per hour.

Their car-hacking hijinks, in other words, were limited to the few functions a vehicle’s computer controls. In a driverless car, the computer controls everything. “In an autonomous vehicle, the computer can apply the brakes and turn the steering wheel any amount, at any speed,” Miller says. “The computers are even more in charge.”

An alert driver could also override many of the attacks Miller and Valasek demonstrated on traditional cars: Tap the brakes and that cruise control acceleration immediately ceases. Even the steering wheel attacks could be easily overcome if the driver wrests control of the wheel. When the passenger isn’t in the driver’s seat—or there is no steering wheel or brake pedal—no such manual override exists. “No matter what we did in the past, the human had a chance to control the car. But if you’re sitting in the backseat, that’s a whole different story,” says Miller. “You’re totally at the mercy of the vehicle.”

Hackers Take Rides, Too

A driverless car that’s used as a taxi, Miller points out, poses even more potential problems. In that situation, every passenger has to be considered a potential threat. Security researchers have shown that merely plugging an internet-connected gadget into a car’s OBD2 port—a ubiquitous outlet under its dashboard—can offer a remote attacker an entry point into the vehicle’s most sensitive systems. (Researchers at the University of California at San Diego showed in 2015 that they could take control of a Corvette’s brakes via a common OBD2 dongle distributed by insurance companies—including one that partnered with Uber.)

“There’s going to be someone you don’t necessarily trust sitting in your car for an extended period of time,” says Miller. “The OBD2 port is something that’s pretty easy for a passenger to plug something into and then hop out, and then they have access to your vehicle’s sensitive network.”

Permanently plugging that port is illegal under federal regulations, Miller says. He suggests ridesharing companies that use driverless cars could cover it with tamper-evident tape. But even then, they might only be able to narrow down which passenger could have sabotaged a vehicle to a certain day or week. A more comprehensive fix would mean securing the vehicle’s software so that not even a malicious hacker with full physical access to its network would be able to hack it—a challenge Miller says only a few highly locked-down products like an iPhone or Chromebook can pass.

“It’s definitely a hard problem,” he says.

Deep Fixes

Miller argues that solving autonomous vehicles’ security flaws will require some fundamental changes to their security architecture. Their internet-connected computers, for instance, will need “codesigning,” a measure that ensures they only run trusted code signed with a certain cryptographic key. Today only Tesla has talked publicly about implementing that feature. Cars’ internal networks will need better internal segmentation and authentication, so that critical components don’t blindly follow commands from the OBD2 port. They need intrusion detection systems that can alert the driver—or rider—when something anomalous happens on the cars’ internal networks. (Miller and Valasek designed one such prototype.) And to prevent hackers from getting an initial, remote foothold, cars need to limit their “attack surface,” any services that might accept malicious data sent over the internet.
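The code-signing measure Miller describes boils down to: refuse to run anything whose signature does not verify against a trusted key. Real implementations use asymmetric signatures (e.g. Ed25519), where the car holds only a public key; the Python sketch below substitutes an HMAC to stay dependency-free, and the function names are illustrative, not any vendor's API:

```python
import hmac
import hashlib

def sign_firmware(signing_key: bytes, firmware: bytes) -> bytes:
    # Stand-in for an asymmetric signature: in a real ECU the private key
    # never leaves the manufacturer, and the vehicle verifies with a
    # public key instead of sharing this secret.
    return hmac.new(signing_key, firmware, hashlib.sha256).digest()

def verify_and_install(signing_key: bytes, firmware: bytes,
                       signature: bytes) -> bool:
    # compare_digest performs a constant-time comparison to avoid
    # timing side channels during verification.
    if not hmac.compare_digest(sign_firmware(signing_key, firmware), signature):
        raise ValueError("untrusted code: signature verification failed")
    # ...only now would the firmware be flashed to the controller...
    return True

key = b"manufacturer signing key"
blob = b"brake-controller firmware v2"
sig = sign_firmware(key, blob)
assert verify_and_install(key, blob, sig)  # genuine update is accepted
```

A single flipped byte in `blob` or `sig` makes verification raise, which is exactly the property that keeps a remote attacker from pushing their own code to a critical component.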

Complicating those fixes? Companies like Uber and Didi don’t even make the cars they use, but instead have to bolt on any added security after the fact. “They’re getting a car that already has some attack surface, some vulnerabilities, and a lot of software they don’t have any control over, and then trying to make that into something secure,” says Miller. “That’s really hard.”

That means solving autonomous vehicles’ security nightmares will require far more open conversation and cooperation among companies. That’s part of why Miller left Uber, he says: He wants the freedom to speak more openly within the industry. “I want to talk about how we’re securing cars and the scary things we see, instead of designing these things in private and hoping that we all know what we’re doing,” he says.

Car hacking, fortunately, remains largely a concern for the future: No car has yet been digitally hijacked in a documented, malicious case. But that means now’s the time to work on the problem, Miller says, before cars become more automated and make the problem far more real. “We have some time to build up these security measures and get them right before something happens,” says Miller. “And that’s why I’m doing this.”

https://www.wired.com/2017/04/ubers-former-top-hacker-securing-autonomous-cars-really-hard-problem/

Delete Signal’s texts, or the app itself, and virtually no trace of the conversation remains.


Suing to See the Feds’ Encrypted Messages? Good Luck

The recent rise of end-to-end encrypted messaging apps has given billions of people access to strong surveillance protections. But as one federal watchdog group may soon discover, it also creates a transparency conundrum: Delete the conversation from those two ends, and there may be no record left.

The conservative group Judicial Watch is suing the Environmental Protection Agency under the Freedom of Information Act, seeking to compel the EPA to hand over any employee communications sent via Signal, the encrypted messaging and calling app. In its public statement about the lawsuit, Judicial Watch points to reports that EPA staffers have used Signal to communicate secretly, in the face of an adversarial Trump administration.

But encryption and forensics experts say Judicial Watch may have picked a tough fight. Delete Signal’s texts, or the app itself, and virtually no trace of the conversation remains. “The messages are pretty much gone,” says Johns Hopkins cryptographer Matthew Green, who has closely followed the development of secure messaging tools. “You can’t prove something was there when there’s nothing there.”

End-to-Dead-End

Signal, like other end-to-end encryption apps, protects messages such that only the people participating in a conversation can read them. No outside observer—not even the Signal server that the messages route through—can sneak a look. Delete the messages from the devices of two Signal communicants, and no other unencrypted copy exists.

In fact, Signal’s own server doesn’t keep a record of even the encrypted versions of those communications. Last October, Signal’s developers at the non-profit Open Whisper Systems revealed that a grand jury subpoena had yielded practically no useful data. “The only information we can produce in response to a request like this is the date and time a user registered with Signal and the last date of a user’s connectivity to the Signal service,” Open Whisper Systems wrote at the time. (That’s the last time they opened the app, not sent or received a message.)

Even seizing and examining the phones of EPA employees likely won’t help if users have deleted their messages or the full app, Green says. They could even do so on autopilot. Six months ago, Signal added a Snapchat-like feature to allow automated deletion of a conversation from both users’ phones after a certain amount of time. Forensic analyst Jonathan Zdziarski, who now works as an Apple security engineer, wrote in a blog post last year that after Signal messages are deleted, the app “leaves virtually nothing, so there’s nothing to worry about. No messy cleanup.” (Open Whisper Systems declined to comment on the Judicial Watch FOIA request, or how exactly it deletes messages.)

Still, despite its best sterilization efforts, even Signal might leave some forensic trace of deleted messages on phones, says Green. And other less-secure ephemeral messaging apps like Confide, which has also become popular among government staffers, likely leave more fingerprints behind. But Green argues that recovering deleted messages from even sloppier apps would take deeper digging than FOIA requests typically compel—so long as users are careful to delete messages on both sides of the conversation and any cloud backups. “We’re talking about expensive, detailed forensic analysis,” says Green. “It’s a lot more work than you’d expect from someone carrying out FOIA requests.”

For the Records

Deleting records of government business from government-issued devices is—let’s be clear—illegal. That smartphone scrubbing, says Georgetown Law professor David Vladeck, would blatantly violate the Federal Records Act. “It’s no different from taking records home and burning them,” says Vladeck. “They’re not your records, they’re the federal government’s, and you’re not supposed to do that.”

Judicial Watch, for its part, acknowledges that it may be tough to dig up deleted Signal communications. But another element of its FOIA request asks for any EPA information about whether it has approved Signal for use by agency staffers. “They can’t use these apps to thwart the Federal Records Act just because they don’t like Donald Trump,” says Judicial Watch president Tom Fitton. “This serves also as an educational moment for any government employees, that using the app to conduct government business to ensure the deletion of records is against the law, and against record-keeping policies in almost every agency.”

Fitton hopes the lawsuit will at least compel the EPA to prevent employees from installing Signal or similar apps on government-issued phones. “The agency is obligated to ensure their employees are following the rules so that records subject to FOIA are preserved,” he says. “If they’re not doing that, they could be answerable to the courts.”

Georgetown’s Vladeck says that even evidence employees have used Signal at all should be troubling, and might warrant a deeper investigation. “I would be very concerned if employees were using an app designed to leave no trace. That’s smoke, if not a fire, and it’s deeply problematic,” he says.

But Johns Hopkins’ Green counters that FOIA has never been an all-seeing eye into government agencies. And he points out that sending a Signal message to an EPA colleague isn’t so different from simply walking into their office and closing the door. “These ephemeral communications apps give us a way to have those face-to-face conversations electronically and in a secure way,” says Green. “It’s a way to communicate without being on the record. And people need that.”

https://www.wired.com/2017/04/suing-see-feds-encrypted-messages-good-luck/

The CIA Leak Exposes Tech’s Vulnerable Future

Source: https://www.wired.com/2017/03/cia-leak-exposes-techs-vulnerable-future/