The millions of dots on the map trace highways, side streets and bike trails — each one following the path of an anonymous cellphone user.
One path tracks someone from a home outside Newark to a nearby Planned Parenthood, remaining there for more than an hour. Another represents a person who travels with the mayor of New York during the day and returns to Long Island at night.
Yet another leaves a house in upstate New York at 7 a.m. and travels to a middle school 14 miles away, staying until late afternoon each school day. Only one person makes that trip: Lisa Magrin, a 46-year-old math teacher. Her smartphone goes with her.
An app on the device gathered her location information, which was then sold without her knowledge. It recorded her whereabouts as often as every two seconds, according to a database of more than a million phones in the New York area that was reviewed by The New York Times. While Ms. Magrin’s identity was not disclosed in those records, The Times was able to easily connect her to that dot.
The app tracked her as she went to a Weight Watchers meeting and to her dermatologist’s office for a minor procedure. It followed her hiking with her dog and staying at her ex-boyfriend’s home, information she found disturbing.
“It’s the thought of people finding out those intimate details that you don’t want people to know,” said Ms. Magrin, who allowed The Times to review her location data.
Like many consumers, Ms. Magrin knew that apps could track people’s movements. But as smartphones have become ubiquitous and technology more accurate, an industry of snooping on people’s daily habits has spread and grown more intrusive.
At least 75 companies receive anonymous, precise location data from apps whose users enable location services to get local news and weather or other information, The Times found. Several of those businesses claim to track up to 200 million mobile devices in the United States — about half those in use last year. The database reviewed by The Times — a sample of information gathered in 2017 and held by one company — reveals people’s travels in startling detail, accurate to within a few yards and in some cases updated more than 14,000 times a day.
These companies sell, use or analyze the data to cater to advertisers, retail outlets and even hedge funds seeking insights into consumer behavior. It’s a hot market, with sales of location-targeted advertising reaching an estimated $21 billion this year. IBM has gotten into the industry, with its purchase of the Weather Channel’s apps. The social network Foursquare remade itself as a location marketing company. Prominent investors in location start-ups include Goldman Sachs and Peter Thiel, the PayPal co-founder.
Businesses say their interest is in the patterns, not the identities, that the data reveals about consumers. They note that the information apps collect is tied not to someone’s name or phone number but to a unique ID. But those with access to the raw data — including employees or clients — could still identify a person without consent. They could follow someone they knew, by pinpointing a phone that regularly spent time at that person’s home address. Or, working in reverse, they could attach a name to an anonymous dot, by seeing where the device spent nights and using public records to figure out who lived there.
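The re-identification technique described above — find where a device spends its nights, then look up who lives there — is simple enough to sketch in a few lines. The snippet below is a hypothetical illustration only; the data layout and field names are invented, not drawn from any real data set.

```python
from collections import Counter

def likely_home(pings, night_hours=range(0, 6)):
    """Return the grid cell where a device most often appears overnight.

    `pings` is a list of (hour, lat, lon) tuples; coordinates are rounded
    to ~100 m cells so repeated visits cluster together. All names here
    are illustrative, not taken from any vendor's schema.
    """
    cells = Counter(
        (round(lat, 3), round(lon, 3))
        for hour, lat, lon in pings
        if hour in night_hours
    )
    return cells.most_common(1)[0][0] if cells else None

# A device that spends every night at one spot is only nominally anonymous:
pings = [(2, 40.7128, -74.0060), (3, 40.7129, -74.0061), (14, 40.7580, -73.9855)]
print(likely_home(pings))  # → (40.713, -74.006)
```

Once the nighttime cell is known, cross-referencing it with public property records is all it takes to put a name on the dot.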
Many location companies say that when phone users enable location services, their data is fair game. But, The Times found, the explanations people see when prompted to give permission are often incomplete or misleading. An app may tell users that granting access to their location will help them get traffic information, but not mention that the data will be shared and sold. That disclosure is often buried in a vague privacy policy.
“Location information can reveal some of the most intimate details of a person’s life — whether you’ve visited a psychiatrist, whether you went to an A.A. meeting, who you might date,” said Senator Ron Wyden, Democrat of Oregon, who has proposed bills to limit the collection and sale of such data, which are largely unregulated in the United States.
“It’s not right to have consumers kept in the dark about how their data is sold and shared and then leave them unable to do anything about it,” he added.
Mobile Surveillance Devices
After Elise Lee, a nurse in Manhattan, saw that her device had been tracked to the main operating room at the hospital where she works, she expressed concern about her privacy and that of her patients.
“It’s very scary,” said Ms. Lee, who allowed The Times to examine her location history in the data set it reviewed. “It feels like someone is following me, personally.”
The mobile location industry began as a way to customize apps and target ads for nearby businesses, but it has morphed into a data collection and analysis machine.
Retailers look to tracking companies to tell them about their own customers and their competitors’. For a web seminar last year, Elina Greenstein, an executive at the location company GroundTruth, mapped out the path of a hypothetical consumer from home to work to show potential clients how tracking could reveal a person’s preferences. For example, someone may search online for healthy recipes, but GroundTruth can see that the person often eats at fast-food restaurants.
“We look to understand who a person is, based on where they’ve been and where they’re going, in order to influence what they’re going to do next,” Ms. Greenstein said.
Financial firms can use the information to make investment decisions before a company reports earnings — seeing, for example, if more people are working on a factory floor, or going to a retailer’s stores.
Health care facilities are among the more enticing but troubling areas for tracking, as Ms. Lee’s reaction demonstrated. Tell All Digital, a Long Island advertising firm that is a client of a location company, says it runs ad campaigns for personal injury lawyers targeting people anonymously in emergency rooms.
“The book ‘1984,’ we’re kind of living it in a lot of ways,” said Bill Kakis, a managing partner at Tell All.
Jails, schools, a military base and a nuclear power plant — even crime scenes — appeared in the data set The Times reviewed. One person, perhaps a detective, arrived at the site of a late-night homicide in Manhattan, then spent time at a nearby hospital, returning repeatedly to the local police station.
Two location firms, Fysical and SafeGraph, mapped people attending the 2017 presidential inauguration. On Fysical’s map, a bright red box near the Capitol steps indicated the general location of President Trump and those around him, cellphones pinging away. Fysical’s chief executive said in an email that the data it used was anonymous. SafeGraph did not respond to requests for comment.
More than 1,000 popular apps contain location-sharing code from such companies, according to 2018 data from MightySignal, a mobile analysis firm. Google’s Android system was found to have about 1,200 apps with such code, compared with about 200 on Apple’s iOS.
The most prolific company was Reveal Mobile, based in North Carolina, which had location-gathering code in more than 500 apps, including many that provide local news. A Reveal spokesman said that the popularity of its code showed that it helped app developers make ad money and consumers get free services.
To evaluate location-sharing practices, The Times tested 20 apps, most of which had been flagged by researchers and industry insiders as potentially sharing the data. Together, 17 of the apps sent exact latitude and longitude to about 70 businesses. Precise location data from one app, WeatherBug on iOS, was received by 40 companies. When contacted by The Times, some of the companies that received that data described it as “unsolicited” or “inappropriate.”
WeatherBug, owned by GroundTruth, asks users’ permission to collect their location and tells them the information will be used to personalize ads. GroundTruth said that it typically sent the data to ad companies it worked with, but that if they didn’t want the information they could ask to stop receiving it.
The Times also identified more than 25 other companies that have said in marketing materials or interviews that they sell location data or services, including targeted advertising.
The spread of this information raises questions about how securely it is handled and whether it is vulnerable to hacking, said Serge Egelman, a computer security and privacy researcher affiliated with the University of California, Berkeley.
“There are really no consequences” for companies that don’t protect the data, he said, “other than bad press that gets forgotten about.”
A Question of Awareness
Companies that use location data say that people agree to share their information in exchange for customized services, rewards and discounts. Ms. Magrin, the teacher, noted that she liked that tracking technology let her record her jogging routes.
Brian Wong, chief executive of Kiip, a mobile ad firm that has also sold anonymous data from some of the apps it works with, says users give apps permission to use and share their data. “You are receiving these services for free because advertisers are helping monetize and pay for it,” he said, adding, “You would have to be pretty oblivious if you are not aware that this is going on.”
But Ms. Lee, the nurse, had a different view. “I guess that’s what they have to tell themselves,” she said of the companies. “But come on.”
Ms. Lee had given apps on her iPhone access to her location only for certain purposes — helping her find parking spaces, sending her weather alerts — and only if they did not indicate that the information would be used for anything else, she said. Ms. Magrin had allowed about a dozen apps on her Android phone access to her whereabouts for services like traffic notifications.
But it is easy to share information without realizing it. Of the 17 apps that The Times saw sending precise location data, just three on iOS and one on Android told users in a prompt during the permission process that the information could be used for advertising. Only one app, GasBuddy, which identifies nearby gas stations, indicated that data could also be shared to “analyze industry trends.”
More typical was theScore, a sports app: When prompting users to grant access to their location, it said the data would help “recommend local teams and players that are relevant to you.” The app passed precise coordinates to 16 advertising and location companies.
A spokesman for theScore said that the language in the prompt was intended only as a “quick introduction to certain key product features” and that the full uses of the data were described in the app’s privacy policy.
The Weather Channel app, owned by an IBM subsidiary, told users that sharing their locations would let them get personalized local weather reports. IBM said the subsidiary, the Weather Company, discussed other uses in its privacy policy and in a separate “privacy settings” section of the app. Information on advertising was included there, but a part of the app called “location settings” made no mention of it.
The app did not explicitly disclose that the company had also analyzed the data for hedge funds — a pilot program that was promoted on the company’s website. An IBM spokesman said the pilot had ended. (IBM updated the app’s privacy policy on Dec. 5, after queries from The Times, to say that it might share aggregated location data for commercial purposes such as analyzing foot traffic.)
Even industry insiders acknowledge that many people either don’t read those policies or may not fully understand their opaque language. Policies for apps that funnel location information to help investment firms, for instance, have said the data is used for market analysis, or simply shared for business purposes.
“Most people don’t know what’s going on,” said Emmett Kilduff, the chief executive of Eagle Alpha, which sells data to financial firms and hedge funds. Mr. Kilduff said responsibility for complying with data-gathering regulations fell to the companies that collected it from people.
Many location companies say they voluntarily take steps to protect users’ privacy, but policies vary widely.
For example, Sense360, which focuses on the restaurant industry, says it scrambles data within a 1,000-foot square around the device’s approximate home location. Another company, Factual, says that it collects data from consumers at home, but that its database doesn’t contain their addresses.
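As a rough illustration of what "scrambling data within a 1,000-foot square" could mean in practice, the sketch below snaps a coordinate to the corner of a grid cell of roughly that size, blurring the exact home location while preserving the neighborhood. This is an assumption-laden toy, not Sense360's actual method.

```python
import math

def scramble_home(lat, lon, cell_ft=1000.0):
    """Snap a coordinate to the corner of a ~1,000-foot grid cell.

    Hypothetical illustration of coordinate coarsening; the constants
    are rough approximations, not any company's real parameters.
    """
    ft_per_deg_lat = 364_000.0  # rough feet per degree of latitude
    ft_per_deg_lon = ft_per_deg_lat * math.cos(math.radians(lat))
    lat_step = cell_ft / ft_per_deg_lat
    lon_step = cell_ft / ft_per_deg_lon
    return (round(lat // lat_step * lat_step, 6),
            round(lon // lon_step * lon_step, 6))

blurred = scramble_home(40.7128, -74.0060)
print(blurred)  # a nearby grid corner, not the exact coordinate
```

Note that coarsening alone is a weak defense: if a device is the only one that sleeps in a given cell, the blurred point can still be re-identified.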
Some companies say they delete the location data after using it to serve ads, some use it for ads and pass it along to data aggregation companies, and others keep the information for years.
Several people in the location business said that it would be relatively simple to figure out individual identities in this kind of data, but that they didn’t do it. Others suggested it would require so much effort that hackers wouldn’t bother.
It “would take an enormous amount of resources,” said Bill Daddi, a spokesman for Cuebiq, which analyzes anonymous location data to help retailers and others, and raised more than $27 million this year from investors including Goldman Sachs and Nasdaq Ventures. Nevertheless, Cuebiq encrypts its information, logs employee queries and sells aggregated analysis, he said.
There is no federal law limiting the collection or use of such data. Still, apps that ask for access to users’ locations, prompting them for permission while leaving out important details about how the data will be used, may run afoul of federal rules on deceptive business practices, said Maneesha Mithal, a privacy official at the Federal Trade Commission.
“You can’t cure a misleading just-in-time disclosure with information in a privacy policy,” Ms. Mithal said.
Following the Money
Apps form the backbone of this new location data economy.
The app developers can make money by directly selling their data, or by sharing it for location-based ads, which command a premium. Location data companies pay half a cent to two cents per user per month, according to offer letters to app makers reviewed by The Times.
Targeted advertising is by far the most common use of the information.
Google and Facebook, which dominate the mobile ad market, also lead in location-based advertising. Both companies collect the data from their own apps. They say they don’t sell it but keep it for themselves to personalize their services, sell targeted ads across the internet and track whether the ads lead to sales at brick-and-mortar stores. Google, which also receives precise location information from apps that use its ad services, said it modified that data to make it less exact.
Smaller companies compete for the rest of the market, including by selling data and analysis to financial institutions. This segment of the industry is small but growing, expected to reach about $250 million a year by 2020, according to the market research firm Opimas.
Apple and Google have a financial interest in keeping developers happy, but both have taken steps to limit location data collection. In the most recent version of Android, apps that are not in use can collect locations “a few times an hour,” instead of continuously.
Apple has been stricter, for example requiring apps to justify collecting location details in pop-up messages. But Apple’s instructions for writing these pop-ups do not mention advertising or data sale, only features like getting “estimated travel times.”
A spokesman said the company mandates that developers use the data only to provide a service directly relevant to the app, or to serve advertising that met Apple’s guidelines.
Apple recently shelved plans that industry insiders say would have significantly curtailed location collection. Last year, the company said an upcoming version of iOS would show a blue bar onscreen whenever an app not in use was gaining access to location data.
The discussion served as a “warning shot” to people in the location industry, David Shim, chief executive of the location company Placed, said at an industry event last year.
After examining maps showing the locations extracted by their apps, Ms. Lee, the nurse, and Ms. Magrin, the teacher, immediately limited what data those apps could get. Ms. Lee said she told the other operating-room nurses to do the same.
“I went through all their phones and just told them: ‘You have to turn this off. You have to delete this,’” Ms. Lee said. “Nobody knew.”
Records show a device entering Gracie Mansion, the mayor’s residence, before traveling to a Y.M.C.A. in Brooklyn that the mayor frequents.
It travels to an event on Staten Island that the mayor attended. Later, it returns to a home on Long Island.
An app on Lisa Magrin’s cellphone collected her location information, which was then shared with other companies. The data revealed her daily habits, including hikes with her dog, Lulu. (Nathaniel Brooks for The New York Times)
A notice that Android users saw when theScore, a sports app, asked for access to their location data.
The Weather Channel app showed iPhone users this message when it first asked for their location data.
In the data set reviewed by The Times, phone locations are recorded in sensitive areas including the Indian Point nuclear plant near New York City. By Michael H. Keller | Satellite imagery by Mapbox and DigitalGlobe
It’s not just Facebook: The Android and iOS app stores have incentivized an app economy in which free apps make money by selling your personal data and location history to advertisers.
Monday morning, the New York Times published a horrifying investigation in which the publication reviewed a huge, “anonymized” dataset of smartphone location data from a third-party vendor, de-anonymized it, and tracked ordinary people through their day-to-day lives—including sensitive stops at places like Planned Parenthood, their homes, and their offices.
The article lays bare what the privacy-conscious have suspected for years: The apps on your smartphone are tracking you, and for all the talk about “anonymization” and claims that the data is collected only in aggregate, our habits are so specific, and often so unique, that anonymized identifiers can be reverse engineered and used to track individual people.
Along with the investigation, the New York Times published a guide to managing and restricting location data on specific apps. This is easier on iOS than it is on Android, and it is something everyone should do periodically. But the main takeaway, I think, is not just that we need to be more scrupulous about our location data settings. It’s that we need to be much, much more restrictive about the apps we install on our phones.
Everywhere we go, we are carrying a device that not only has a GPS chip designed to track our location, but an internet or LTE connection designed to transmit that information to third parties, many of whom have monetized that data. Rough location data can be gleaned by tracking the cell phone towers your phone connects to, and the best way to guarantee privacy would be to have a dumb phone, an iPod Touch, or no phone at all. But for most people, that’s not terribly practical, and so I think it’s worth taking a look at the types of apps that we have installed on our phone, and their value propositions—both to us, and to their developers.
The early design decisions of Apple, Google, and app developers continue to haunt us all more than a decade later. Broadly and historically speaking, we have been willing to spend hundreds of dollars on a smartphone, but balk at the idea of spending $.99 on an app. Our reluctance to pay any money up front for apps has come at an unknowable but massive cost to our privacy. Even a lowly flashlight or fart noise app is not free to make, and the overwhelming majority of “free” apps are not altruistic—they are designed to make money, which usually means by harvesting and reselling your data.
A good question to ask yourself when evaluating your apps is “why does this app exist?” If it exists because it costs money to buy, or because it’s the free app extension of a service that costs money, then it is more likely to be able to sustain itself without harvesting and selling your data. If it’s a free app that exists for the sole purpose of amassing a large amount of users, then chances are it has been monetized by selling data to advertisers.
The New York Times noted that much of the data used in its investigation came from free weather and sports scores apps that turned around and sold their users’ data; hundreds of free games, flashlight apps, and podcast apps ask for permissions they don’t actually need for the express purpose of monetizing your data.
Even apps that aren’t blatantly sketchy data grabs often function that way: Facebook and its suite of apps (Instagram, Messenger, etc.) collect loads of data about you, both from your behavior on the apps themselves and directly from your phone (Facebook went to great lengths to hide the fact that its Android app was collecting call log data.) And Android itself is a smartphone ecosystem that also serves as yet another data collection apparatus for Google. Unless you feel particularly inclined to read privacy policies that are dozens of pages long for every app you download, who knows what information bespoke apps for news, podcasts, airlines, ticket buying, travel, and social media are collecting and selling.
This problem is getting worse, not better: Facebook made WhatsApp, an app that managed to be profitable with a $1 per year subscription fee, into a “free” service because it believed it could make more money with an advertising-based business model.
What this means is that the dominant business model on our smartphones is one that’s predicated on monetizing you, and only through paying obsessive attention to your app permissions and seeking paid alternatives can you hope to minimize these impacts on yourself. If this bothers you, your only options are to get rid of your smartphone altogether or to rethink what apps you want installed on your phone and act accordingly.
It might be time to get rid of all the free single-use apps that are essentially re-sized websites. Generally speaking, it is safer, privacy-wise, to access your data in a browser, even if it’s more inconvenient. On second thought, it may be time to delete all your apps and start over using only apps that respect your privacy and that have sustainable business models that don’t rely on monetizing your data. On iOS, this might mean using more of Apple’s first-party apps, even if they don’t work as well as free third-party versions.
For weeks, a small team of security researchers and developers have been putting the finishing touches on a new privacy app, which its founder says can nix some of the hidden threats that mobile users face — often without realizing it.
Phones track your location, apps siphon off your data, and aggressive ads try to grab your attention. Your phone has long been a beacon of data, broadcasting to ad networks and data trackers that build up profiles on you wherever you go to sell you things you’ll never want.
Will Strafach knows that all too well. A security researcher and former iPhone jailbreaker, Strafach has shifted his time to digging into apps for insecure, suspicious and unethical behavior. Last year, he found that AccuWeather was secretly sending precise location data without users’ permission. And just a few months ago, he revealed a list of dozens of apps that were sneakily siphoning off tracking data to data monetization firms without their users’ explicit consent.
Now his team — including co-founder Joshua Hill and chief operating officer Chirayu Patel — will soon bake those findings into its new “smart firewall” app, which he says will filter and block traffic that invades a user’s privacy.
“We’re in a ‘wild west’ of data collection,” he said, “where data is flying out from your phone under the radar — not because people don’t care but there’s no real visibility and people don’t know it’s happening,” he told me in a call last week.
At its heart, the Guardian Mobile Firewall — currently in a closed beta — funnels all of an iPhone or iPad’s internet traffic through an encrypted virtual private network (VPN) tunnel to Guardian’s servers, outsourcing the filtering and enforcement to the cloud to reduce the hit on the device’s battery and performance. That means the Guardian app can near-instantly spot when another app secretly sends a device’s tracking data to a tracking firm, warning the user or offering to stop it in its tracks. The aim isn’t to prevent a potentially dodgy app from working properly, but to give users awareness of, and choice over, what data leaves their device.
Strafach described the app as “like a junk email filter for your web traffic,” and the app’s dedicated tabs show what data gets blocked and why. A future version plans to let users modify or block their precise geolocation from being sent to certain servers. Strafach said the app will later tell users how many times an app accesses device data, like their contact lists.
But unlike other ad and tracker blockers, the app doesn’t use overkill third-party lists that prevent apps from working properly. Instead, it takes a tried-and-tested approach drawn from the team’s own research. The team periodically scans a range of apps in the App Store to identify problematic and privacy-invasive behavior, and those findings are fed into the app to improve it over time. If an app is known to have security issues, the Guardian app can alert the user to the threat. The team plans to keep building machine learning models that help identify new threats — including so-called “aggressive ads” that hijack your mobile browser and redirect you to dodgy pages or apps.
Screenshots of the Guardian app, set to be released in December (Image: supplied)
Strafach said that the app will “err on the side of usability” by warning users first — with the option of blocking it. A planned future option will allow users to go into a higher, more restrictive privacy level — “Lockdown mode” — which will deny bad traffic by default until the user intervenes.
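Conceptually, the "warn first, block on request" behavior Strafach describes is a classifier over outgoing connections. The sketch below is purely illustrative: the domain list, names and categories are made up, and Guardian's real filtering runs server-side against its own research data.

```python
# Illustrative only: how a cloud filter might classify outgoing requests.
# These domains and categories are invented, not Guardian's actual lists.
TRACKER_DOMAINS = {
    "tracker.example-ads.com": "location tracker",
    "telemetry.example-sdk.net": "data monetization",
}

def classify(hostname):
    """Return (verdict, reason) for a single outgoing connection."""
    for domain, category in TRACKER_DOMAINS.items():
        if hostname == domain or hostname.endswith("." + domain):
            return ("warn", category)  # err on the side of usability
    return ("allow", None)

print(classify("api.tracker.example-ads.com"))  # → ('warn', 'location tracker')
```

In a "Lockdown mode" like the one described, the default verdict for unrecognized traffic would flip from allow to deny until the user intervenes.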
What sets the Guardian app apart from its distant competitors is its refusal to collect data.
Whenever you use a VPN — to evade censorship, site blocks or surveillance — you have to put more trust in the VPN server to keep your internet traffic safe than in your internet provider or cell carrier. Strafach said that neither he nor the team wants to know who uses the app. The less data they have, the less they know, and the safer and more private their users are.
“We don’t want to collect data that we don’t need,” said Strafach. “We consider data a liability. Our rule is to collect as little as possible. We don’t even use Google Analytics or any kind of tracking in the app — or even on our site, out of principle.”
The app works by generating a random set of VPN credentials to connect to the cloud. The connection uses IPSec (IKEv2) with a strong cipher suite, he said. In other words, the Guardian app isn’t a creepy VPN app like Facebook’s Onavo, which Apple pulled from the App Store for collecting data it shouldn’t have. “On the server side, we’ll only see a random device identifier, because we don’t have accounts, so you can’t be attributed to your traffic,” he said.
“We don’t even want to say ‘you can trust us not to do anything,’ because we don’t want to be in a position that we have to be trusted,” he said. “We really just want to run our business the old fashioned way. We want people to pay for our product and we provide them service, and we don’t want their data or send them marketing.”
“It’s a very hard line,” he said. “We would shut down before we even have to face that kind of decision. It would go against our core principles.”
I’ve been using the app for the past week. It’s surprisingly easy to use. For a semi-advanced user, it can feel unnatural to flip a virtual switch on the app’s main screen and let it run its course. Anyone who cares about their security and privacy is constantly aware of their “opsec” — one wrong move can blow your anonymity shield wide open. Overall, the app works well. It’s non-intrusive and doesn’t interfere, but with the “VPN” icon lit up at the top of the screen, there’s a constant reminder that the app is working in the background.
It’s impressive how much the team has kept privacy and anonymity so front of mind throughout the app’s design process — even down to allowing users to pay by Apple Pay and through in-app purchases so that no billing information is ever exchanged.
The app doesn’t appear to slow down the connection when browsing the web or scrolling through Twitter or Facebook, on either LTE or Wi-Fi. Even streaming a medium-quality live video didn’t cause any issues. But it’s still early days, and even though the closed beta has a few hundred users — myself included — as with any bandwidth-intensive cloud service, the quality could fluctuate over time. Strafach said that the backend infrastructure is scalable and can plug-and-play with almost any cloud service in the case of outages.
In its pre-launch state, the company is financially healthy, having scored a round of initial seed funding to support assembling the team, launching the app, and maintaining its cloud infrastructure. Steve Russell, an experienced investor and board member, said he was “impressed” with the team’s vision and technology.
“Quality solutions for mobile security and privacy are desperately needed, and Guardian distinguishes itself both in its uniqueness and its effectiveness,” said Russell in an email.
He added that the team is “world class,” and has built a product that’s “sorely needed.”
Strafach said the team is running financially conservatively ahead of its public reveal, but that the startup is looking to raise a Series A to support its anticipated growth, as well as the research that feeds the app with new data. “There’s a lot we want to look into and we want to put out more reports on quite a few different topics,” he said.
As the team continues to find new threats, the app will keep getting better.
The app’s early adopter program is open, including its premium options. The app is expected to launch fully in December.
With all of the personal information your iPhone contains, Apple has added plenty of security measures to protect you and your device from unwanted access. iOS 12 introduces several changes to keep your device even more secure and private, and the update builds on previous improvements to ensure your data stays safe.
Even with these improvements, your iPhone’s overall security still largely depends on you — from the security measures you use to how much data you wish to share with Apple and other parties. Because of this, we’ve rounded up the new privacy settings in iOS 12 that you should check, along with settings that have existed since previous versions of iOS that still remain relevant.
1. Use Automated 2FA
Two-factor authentication, known commonly as 2FA, gives you an added layer of security for apps and other services in the form of a six-digit numeric PIN that’s sent to you via Messages. In the past, you had to retrieve and input a time-sensitive code, which made access cumbersome. To alleviate this, iOS 12 has made 2FA security codes available as AutoFill options.
In other words, you no longer have to jump from a login page over to Messages to retrieve your security code, then back again to type it in. Unfortunately, the AutoFill feature doesn’t extend to external 2FA apps like Google Authenticator, and there’s no concrete information on whether it will be added in future updates.
This is a security setting you should simply be aware of, considering how easy it makes 2FA. Once your iPhone gets updated to iOS 12, it would be a good idea to go through any online accounts that contain sensitive data and enable 2FA if it’s available.
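For the curious, the codes generated by authenticator apps like Google Authenticator (mentioned above) are time-based one-time passwords (TOTP, RFC 6238): your device and the service share a secret key and independently derive the same six-digit code from the current time. A minimal illustrative sketch in Python — the base32 secret below is the RFC test-vector key, not a real account secret:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Derive a time-based one-time password (RFC 6238, built on RFC 4226 HOTP)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # the moving factor is the number of 30-second steps since the Unix epoch
    counter = int(time.time() if at is None else at) // step
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation picks 4 bytes of the HMAC
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

# RFC 6238 test-vector secret ("12345678901234567890" in base32), at T = 59 seconds
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", at=59))  # → 287082
```

Because both sides derive the code from the clock, no network round trip is needed — which is also why these codes expire every 30 seconds.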
2. Audit Your Passwords
To further beef up your privacy and security, iOS 12 has introduced Password Reuse Auditing, a feature that keeps track of saved passwords and flags identical ones used for different accounts. You can access it by going to Settings –> Passwords & Accounts –> Website & App Passwords. From there, any accounts that have identical passwords will be marked with a triangle containing an exclamation point.
Tap on any of the suspect accounts, and select “Change Password on Website” on the following page to create a new password.
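Conceptually, this audit is a simple grouping problem: bucket saved logins by password and flag any bucket with more than one account. A rough sketch of that idea in Python — the site names are made up for illustration:

```python
from collections import defaultdict

def flag_reused(credentials):
    """Return passwords shared by more than one account, with the accounts using them."""
    by_password = defaultdict(list)
    for site, password in credentials:
        by_password[password].append(site)
    return {pw: sites for pw, sites in by_password.items() if len(sites) > 1}

saved = [
    ("shop.example.com", "hunter2"),
    ("mail.example.com", "hunter2"),    # same password reused -- gets flagged
    ("bank.example.com", "k3$P-unique"),
]
print(flag_reused(saved))  # → {'hunter2': ['shop.example.com', 'mail.example.com']}
```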
Brute-force USB unlocking tools like Cellebrite and GrayShift have become popular in law enforcement circles nationwide due to their ability to bypass iOS restrictions on the number of incorrect passcode attempts. This enables officers to unlock confiscated devices by entering an unlimited amount of guesses until they finally get past the lock screen.
In an effort to combat this, iOS 12 has USB Restricted Mode, which requires you to unlock your iPhone with a passcode when connecting it to a USB device. Unlike earlier iOS betas, which only required a passcode for devices that hadn’t been unlocked in seven days, iOS 12 (as well as iOS 11.4.1 before it) has significantly reduced the requirement window to one hour.
This stringent requirement effectively nullifies law enforcement’s ability to unlock suspect iPhones with USB unlocking tools, as they will have only a 60-minute window to gain access to the device before the passcode requirement kicks in. If you want to disable this feature, however, head to Settings –> Touch ID & Passcode, and tap on the toggle next to “USB Accessories” so it’s green.
With its first anniversary fast approaching, it’s safe to say that Face ID has proved to be a reliable way of unlocking your iPhone X while keeping it secure from unwanted access. Nothing is bulletproof, however. Apple advertises a false acceptance rate of 1 in a million for Face ID, and considering there are 7.6 billion people on earth, that means roughly 7,600 other people could unlock your iPhone.
If that’s not enough to warrant concern, there’s an even higher chance of someone forcibly using your own face against your will to gain access to your iPhone. So if you want to maximize security, we recommend disabling Face ID altogether by going to Settings –> Face ID & Passcode. Instead, use a strong password, something longer than a six-digit numeric passcode.
If you must keep Face ID on, don’t worry. Apple has included a quick way to disable Face ID temporarily, in case you know your physical security is about to become compromised. Be sure to check out our guide below to find out more about this option, which leaves your phone’s security in the hands of your passcode.
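For perspective, the 1-in-a-million figure above is an expected-value estimate: multiply the false-acceptance rate by the population and you get the rough headcount of strangers whose faces could plausibly unlock your phone:

```python
population = 7_600_000_000     # world population figure cited above
far_denominator = 1_000_000    # Face ID's advertised false-acceptance odds: 1 in a million
expected_matches = population // far_denominator
print(expected_matches)        # → 7600, matching the estimate above
```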
Just like the iPhone X with Face ID, Touch ID on other iPhone models can be a problem. For one, you don’t want to store your fingerprint in any database, even if it’s locally on your iPhone, since someone could potentially pull that record with access to your device. It’s much safer in the long run to use a less-convenient passcode. You can disable Touch ID via the Touch ID & Passcode settings.
7. Disable Touch ID Temporarily
Again, just like with Face ID, you can disable Touch ID temporarily instead if you don’t want to lose the convenience of Touch ID permanently. With a certain button combo press, you can disable it before handing it over to law enforcement, thieves, or even nosy friends and family.
By default, the iPhone passcode is six numeric digits long, though you can still set it to four numeric digits for added convenience. While there is nothing inherently wrong with using these passcode limits, they aren’t the most secure. A four-digit passcode, for instance, has 10,000 possible combinations, and considering there are 85.8 million iPhone users in the United States alone, there just aren’t enough unique combinations to go around.
Increasing the passcode to six digits increases the number of possible combinations to one million and brings it up to par with Face ID’s odds. If you want to go beyond those odds and maximize your iPhone’s security, change your passcode to a password, as using a true password with a combination of letters, numbers, and special characters will make your lock screen virtually impenetrable.
Granted, entering a convoluted password into your phone every time you want to use it is not ideal, but it’s currently the most secure way to lock your iPhone. So if you want to maintain a balance between convenience and security, choose a six-digit passcode over a four-digit one, while making sure to avoid common passcodes like 123456 or six of the same number.
To change your iPhone’s passcode, go to Settings –> Touch ID & Passcode –> Change Passcode. Enter your old passcode when prompted, then tap “Passcode Options” to choose which type of passcode you’d like to make.
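The combination counts above follow directly from the size of the search space: alphabet size raised to the code length. The sketch below also estimates how quickly a four-digit passcode could be exhausted; the guess rate is a hypothetical figure for illustration, not a measured speed for any real unlocking tool:

```python
def search_space(alphabet_size, length):
    """Number of possible codes: alphabet size raised to the code length."""
    return alphabet_size ** length

four_digit = search_space(10, 4)                 # 10,000 combinations
six_digit = search_space(10, 6)                  # 1,000,000 combinations
password_8 = search_space(26 + 26 + 10 + 32, 8)  # 8 chars of letters/digits/symbols

# hypothetical brute-force tool averaging 12 guesses per second (illustrative only)
guesses_per_second = 12
minutes_for_four = four_digit / guesses_per_second / 60

print(four_digit, six_digit, password_8)
print(round(minutes_for_four))  # → 14 (minutes to try every 4-digit passcode)
```

Even a modest alphanumeric password pushes the search space far beyond what the same tool could cover in a lifetime, which is why the article recommends a true password over a numeric passcode.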
If you connect your iPhone to your car either through Bluetooth or CarPlay, your iPhone may be recording the location of where you park. While this information may be useful to some, to others, it may feel like an outright invasion of privacy. If you fall into the latter camp, you’ll naturally want to shut this feature off. To do so, open your Settings app, then tap on “Maps.” From there, simply tap on the toggle next to “Show Parked Locations” to turn the feature off.
10. Disable & Clear Significant Locations
“Significant Locations” is a setting that lets Apple record a list of your most frequently visited locations. And while this may optimize some apps that rely on location services, the improvements might be outweighed by privacy concerns overall.
So if you’d rather not let Apple know about locations you frequently visit, head over to Settings –> Privacy –> Location Services –> System Services –> Significant Locations, then disable it. From there, you also have the added option of clearing the history that your iPhone may have accumulated over time.
11. Turn Off Location-Based Alerts, Apple Ads & Suggestions
When enabled, location-based alerts, Apple ads, and suggestions all track your location to provide targeted notifications, advertisements, and options. To say that these options are not the most privacy-centric features in iOS 12 would be an understatement. In fact, these settings are actually quite creepy.
So if you don’t want to be specifically targeted by Apple wherever you go, open your Settings app, select “Privacy,” and tap on “System Services” on the following page. From there, you can deactivate “Location-Based Alerts,” “Location-Based Apple Ads,” and “Location-Based Suggestions” by turning their corresponding toggles off.
Having “Share My Location” enabled lets you send your current whereabouts to a friend who requests it. While you need to mutually agree to this arrangement with another person using the Find My Friends app, there are ways of tracking your iPhone without your permission. If you’d like to avoid that risk altogether, disable the option by going to Settings –> Privacy –> Location Services –> Share My Location.
Alternatively, if you have more than one device attached to your Apple ID, you can change which device shares your location. You can also review the list of friends you’ve approved to view your location.
13. Turn Off Analytics
Formerly “Diagnostics & Usage,” the “Analytics” page found within your iPhone’s Settings app contains options that share data from your phone with Apple in an effort to help identify bugs in the system and make iOS better overall. Think of it as a beta test, only for the official iOS 12 release.
While this information gives Apple the ability to detect issues and help keep iOS 12 running smoothly, you wouldn’t be alone in feeling that your iPhone may be sharing too much without your knowledge. If you’d like to end hidden communication between your device and Apple, go to Settings –> Privacy –> Analytics.
From there, you have many options you can disable:
Turn off “Share iPhone & Watch Analytics” to disable all analytics with Apple.
“Share With App Developers” shares your app data with that app’s developer. Disable this setting to close that line of communication.
Disable “Share iCloud Analytics” to prevent Apple from using your iCloud data to improve on apps and services associated with that information.
“Improve Health & Activity” shares your health and activity data with Apple to improve these services on your iPhone. Disable this feature if you don’t want Apple to know about such private information.
“Improve Health Records” shares pertinent health data such as medications, lab results, and conditions with Apple. Disable this feature as you did with health and activity above.
“Improve Wheelchair Mode” will send Apple your activity data if you use a wheelchair. Again, turn this feature off as you did “Improve Health & Activity,” regardless of whether you’re in a wheelchair or not.
14. Limit Ad Tracking
Apple shares some of your data with advertisers so that ads can be targeted directly toward you and your interests. If you’re more focused on privacy, that arrangement may not be to your liking, and “Limit Ad Tracking” is the setting that reins it in. Note that this is one setting you actually turn on instead of the other way around. Head to Settings –> Privacy –> Advertising, then enable “Limit Ad Tracking.”
Notice how the option is Limit Ad Tracking, not Stop Ad Tracking. Even with this setting enabled, Apple claims that your iPhone connectivity, time setting, type, language, and location can be used to target advertising. If you disabled Location-Based Ads, location targeting will not apply to you, but all the others still will. Tap “View Ad Information” to learn more.
15. Prevent Replying in Messages
Since iOS 10, your iPhone has given you the option to 3D Touch messages to reply from your lock screen. While convenient, the feature is also easily accessed by other people. So if you’re worried about those around you replying to incoming messages on your iPhone, you might want to disable this option. Be sure to check out the article below to see how.
With “Raise to Wake” enabled, you’ll simply need to raise your phone from a flat position to wake it up. As natural and convenient as this feature is, it does pose a privacy risk. If your iPhone turns face-up accidentally, for instance, anyone within view of your iPhone’s display may see messages and notifications that you want to keep private.
To avoid this scenario, head over to your iPhone’s Settings app and select “Display & Brightness.” From there, simply tap on the toggle next to “Raise to Wake” to disable the feature. If you don’t want to disable “Raise to Wake” but still want your content private on the lock screen, you can disable previews instead.
Lock screen widgets are great for staying on top of your messages, notifications, calendar — basically whatever else you need to know without having to unlock your iPhone. The obvious downside is that the same information is visible without unlocking: anyone can pick up your iPhone and potentially see who’s texting you what, in addition to finding out what your agenda is for the day.
To avoid this potential breach in privacy, you could hit “Edit” at the bottom of the lock screen, then delete all widgets. However, that removes those widgets when your phone is unlocked as well, not just on the lock screen. So if you want to deactivate the widgets for only the lock screen, simply head to the article below.
The Control Center went through a major revamp on iOS 11 and gave us the ability to customize the toggles with a number of features and options. Unfortunately, these nifty additions can be detrimental to you and your iPhone in terms of privacy and security.
While most content-sensitive apps require a passcode from the lock screen to access, there are apps that, at the very least, give users limited access without having to unlock the iPhone. If you have Notes activated, for instance, anyone can freely access it straight from the Control Center to write notes, though they cannot view written notes without unlocking your iPhone first.
You can disable any apps from the Control Center that you don’t want people having access to, but that means you won’t be able to access them when your iPhone is unlocked, either. An alternative option is to disable Control Center entirely from the lock screen by going to Settings –> Touch ID & Passcode and disabling the switch next to “Control Center.” We’ll talk more about Passcode Lock later.
One app that should be disabled from Control Center is Wallet. While you do need your Touch ID, Face ID, or passcode to access any credit cards stored in your iPhone, other types of cards, like Starbucks, travel passes, and various other loyalty cards, do not. So if you want to prevent others from gaining access to these forms of currency, you’ll need to disable Wallet from Control Center.
To further customize options in your Control Center, open your Settings app, select “Control Center,” then tap on “Customize” on the following page.
19. Ask Websites Not to Track Me on Safari
“Ask Websites Not to Track Me” tells Safari to send a Do Not Track request to every website you visit, asking it not to follow your browsing activity. For obvious privacy reasons, you’ll most likely want to send this request, so to enable this setting, tap on “Safari” within the Settings app, then enable the switch next to “Ask Websites Not To Track Me.”
Notice that the setting says Ask. Websites don’t have to comply, so there’s still a chance you’re being tracked. To learn more about this issue, check out the following guide.
Safari has always blocked third-party cookies, but those third parties have always been able to get around the restriction with first-party cookies — cookies the site uses for the site itself. Think of it as nefarious advertisers leeching off a site’s own cookies that are needed to make your visit more convenient. If that all sounds confusing, check out our full guide below on what cross-site tracking is, why it matters, and how to stop it.
As just discussed, cookies allow websites to save bits of your information for faster reloading next time you visit. And while this feature makes web browsing more convenient, cookies aren’t exactly a benefit in terms of overall privacy.
Since iOS 11, Apple has streamlined the blocking of cookies by doing away with various granular options in favor of a single blanket ban. To disable cookies, open the Settings app and tap on “Safari.” From there, simply tap on “Block All Cookies” to turn the option on. While you may notice a difference in performance on some sites, at least you know you’re securing your privacy.
22. Remove App & Website Passwords
Your iPhone and iCloud account have a built-in password manager to make entering passwords easier and more secure. While these passwords are protected by Face ID, Touch ID, or your iPhone’s passcode, disaster will ensue if your iPhone gets breached, with the thief having unfettered access to all of your passwords.
To protect yourself and manage your saved passwords, visit Settings –> Passwords & Accounts –> Website & App Passwords, and input your passcode or Touch ID to view your saved passwords. To delete passwords individually, swipe left on each password and hit “Delete.” To erase en masse, tap “Edit” in the top-right corner, then select each password you’d like to remove. Tap “Delete” in the top-left corner to finish up.
Besides keeping your passwords, your iPhone has the ability to store your personal information for AutoFill. This handy feature makes filling out forms online or in apps a breeze, as your iPhone can now automatically enter pertinent information such as your name, phone number, credit card numbers, and home address, to name a few.
Obviously, the downside is this personal information can be a potential boon for any would-be thief who manages to get into your iPhone. To protect yourself, open Settings, tap on “Safari,” and hit “AutoFill” on the following page. From there, you can investigate what information is already saved, such as Contact Info and Credit Cards, or disable it all by toggling each slider off.
24. Turn Off Microphone Access for Apps
Many apps request microphone access for legitimate purposes. Waze, for instance, uses this access to let you speak to the app to aid in hands-free navigation. That said, there are sketchy apps out there that may not be as forthcoming about what they do when granted access to your iPhone’s microphone.
Naturally, you’ll want to manage which apps have access to your iPhone’s microphone, so open your Settings app, select “Privacy,” and tap on “Microphone” on the following page. Here, you will find a list of all apps that are approved to use your microphone. Disable any or all of them by tapping the toggle next to each app.
25. Disable Camera Access for Apps
Apps like Snapchat depend on camera access to function. The same can’t be said for many apps, however, and some may have gained unjustified access to your iPhone’s camera without you realizing. Because of this, we recommend making a habit out of periodically checking for any wayward apps that have been granted camera access and disabling them accordingly.
To do so, open your Settings app and select “Privacy,” then tap on “Camera” on the following page. From there, tap on the toggle next to any suspect apps to disable camera access on your iPhone.
26. Turn Off Location Services for Apps
Location services are essential for navigation apps like Waze to work, as it enables GPS tracking to tag your location and give you directions accurately. In addition to that, apps like Snapchat can use your position when taking photos to apply exciting and unique filters that are only available where you currently are. Some apps, however, may not be as forthcoming about how they use your location data.
Needless to say, we recommend going to Settings –> Privacy –> Location Services to disable the service for certain apps. And while you have the option to kill “Location Services” entirely, doing so will cause you to lose access to all location functions. It’s a much better option to go through each app and set any you don’t want to have access to your location to “Never.”
27. Empty Out Recently Deleted Photos
Apple saves your deleted photos in a „Recently Deleted“ folder for 30 days before permanently erasing them to make retrieval of accidentally deleted photos easier. If someone were to gain access to your phone, however, they’d have access to any photos deleted within 30 days from that time.
So, in order to avert potential disaster, always be sure to head to the Recently Deleted folder within the Photos app and empty it out of unwanted photos whenever you delete photos from your other galleries.
28. Use Biometrics for App Store Purchases
Let’s say you decide to buy an app. You leave your iPhone for a moment to attend to something important, but as you do, someone manages to break in and gain access to the App Store. Because you just purchased an app, the App Store may not require your password to buy another app, so this person can go crazy buying expensive apps at your expense.
As a preventative measure, it’s always a good idea to require your authorization before purchasing any apps. So if you use Touch ID or Face ID, head over to Settings –> Touch ID & Passcode (or Face ID & Passcode on iPhone X). From there, tap on the toggle next to “Touch ID for iTunes & App Store” to enable the feature. Enter your iTunes password to confirm and you’ll be all set.
If you don’t use Touch ID, tap on your name at the top of the Settings page. Then, go to iTunes & App Stores –> Password Settings. Set the preference to “Always Require” for maximum security. As an added option, you also have the ability to always require a password for free downloads by toggling the security measure on.
29. Frequently Auto-Delete Messages
When it comes to older conversations in the Messages app, Apple stores all your texts on your iPhone indefinitely by default and largely leaves it up to you to delete them manually. Even if you have Messages in iCloud enabled, messages will still be stored locally. As such, erasing conversations can be a tedious process, especially if you’re concerned about your privacy and have made manually cleaning out older texts part of your monthly routine.
Thankfully, your iPhone has a feature that lets you automate the process of deleting old messages and set your device to remove older conversations after a certain period of time. To do so, just jump over to Settings –> Messages –> Keep Messages. Choose either “30 Days” or “1 Year,” and your iPhone will make sure your messages never see a day beyond that time.
For more information on permanently deleting texts from your iPhone, check out the guide below.
By default, your lock screen contains a treasure trove of personal data like recent notifications, your Wallet, and the Today View, which is a collection of widgets of your most useful apps. Fortunately, many of the apps that contain this info can be specifically disabled from the lock screen by going to the “Touch ID & Passcode” menu (or “Face ID & Passcode” on iPhone X) within the Settings app.
From there, you can choose which apps you’d like to prevent access to from your lock screen. If you’d rather not have others see your texts, emails, or app alerts, or if you’d prefer people not see information from your apps in the Today View, you can disable those apps and features here.
If you re-read the first few chapters of The Innovator’s Dilemma and you insert “Apple” every time Clayton Christensen mentions “a company,” a certain picture emerges: Apple is a company on the verge of being disrupted, and the next great idea in tech and consumer electronics will not materialize from within the walls of its Cupertino spaceship.
The Innovator’s Dilemma, of course, is about the trap that successful companies fall into time and time again. They’re well managed, they’re responsive to their customers, and they’re market leaders. And yet, despite doing everything right, they fail to see the next wave of innovation coming, they get disrupted, and they ultimately fail.
In the case of Apple, the company is trapped by its success, and that success is spelled “iPhone.”
Take, for example, Christensen’s description of the principles of good management that inevitably lead to the downfall of successful companies: “that you should always listen to and respond to the needs of your best customers, and that you should focus investments on those innovations that promise the highest returns.”
Molly Wood (@mollywood) is an Ideas contributor at WIRED and the host and senior editor of Marketplace Tech, a daily national radio broadcast covering the business of technology. She has covered the tech industry at CNET, The New York Times, and in various print, television, digital and audio formats for nearly 20 years. (Ouch.)
Then think about the iPhone, which, despite some consumer-unfriendly advances like the lost headphone jack and ever-changing charging ports, has also been adjusted and tweaked and frozen by what customers want: bigger screens, great cameras, ease of use, and a consistent interface. And the bulk of Apple’s investment since 2007, when the iPhone came out, has been about maintaining, developing, and selling this one device.
In the last quarter of 2018, the iPhone accounted for $51 billion of Apple’s $84 billion in revenue. Its success, the economic halo around it, and its seeming invincibility since its launch have propelled Apple to heights few companies have ever imagined. But the device will also be its undoing.
Here’s what happens when you have a product that successful: You get comfortable. More accurately, you get protective. You don’t want to try anything new. The new things you do try have to be justified in the context of that precious jewel—the “core product.”
So even something like Apple’s Services segment—the brightest non-iPhone spot in its earnings lately—mostly consists of services that benefit the iPhone. It’s Apple Music, iTunes, iCloud—and although Apple doesn’t break out its numbers, the best estimate is that a third or more of its Services revenue is driven by the 30 percent cut it takes from … yep, apps downloaded from the App Store.
The other bright spot in the company’s latest earnings report is its Wearables, Home, and Accessories category. Here again, Apple doesn’t break out the numbers, but the wearables part of that segment is where all the growth is, and that means Apple Watches. And you know what’s still tied nice and tight to the iPhone? Apple Watches.
Even Apple’s best-selling accessories are most likely AirPods, which had a meme-tastic holiday season and are, safe to say, used mostly in conjunction with iPhones. (I’d bet the rest of the accessories dollars are coming from dongles and hubs, since there’s nary a port to be found on any of its new MacBooks.) As for stand-alones, its smart speakers are reportedly great, but they’re not putting a dent in Amazon or Google, by latest count. Apple TV, sure. Fine. But Roku shouldn’t have been embedded in a TV before Apple was.
And none of these efforts count as a serious attempt at diversification.
You may be tempted to argue that Apple is, in fact, working on other projects. The Apple acquisition rumors never cease; nor do the confident statements that the company definitely, absolutely, certainly has a magical innovation in the works that will spring full grown like Athena from the forehead of Zeus any day now. I’m here to say, I don’t think there’s a nascent warrior goddess hiding in there.
Witness Apple’s tottering half-steps into new markets that are unrelated to the iPhone: It was early with a voice assistant but has stalled behind Amazon and even Google Assistant. It wasn’t until last year that the company hired a bona fide machine-learning expert in John Giannandrea, former head of search and AI at Google—and he didn’t get put on the executive team until December 2018. That’s late.
There’s its half-hearted dabble in self-driving technology that was going to be a car, then became software, then became 200 people laid off. Its quailing decade-long attempt to build a streaming service would be sort of comical if there weren’t clearly so much money being thrown around, and so tentatively at that. Rumors of its launch go back as far as 2015, although now it’s supposed to launch in April—this time they mean it.
But even if the streaming service actually arrives, can it really compete against YouTube, PlayStation, Sling, DirecTV, Hulu, and just plain old Netflix? Apple’s original programming is also apparently “not coming as soon as you think.” Analysts are, at this point, outright begging Apple to buy a studio or other original content provider, just to have something to show against Netflix and Amazon originals.
Of course, lots of companies innovate through acquisition, and everyone loves to speculate about what companies Apple might buy. Rumors have ranged from GoPro to BlackBerry to Tesla to the chipmaker ARM. Maybe Netflix. Maybe Tesla. Maybe Disney. Maybe Wired. (Apple News is a hugely successful product … mostly on iPhones, of course.) But at every turn, Apple has declined to move, other than its $3 billion Beats buy in 2014 (which it appears to be abandoning, or cannibalizing, these days).
Now, let me be clear, once again. None of this is to suggest that Apple is doing anything wrong. Indeed, according to Christensen, one of the hallmarks of the innovator’s dilemma is the company’s success, smooth operations, great products, and happy customers. That’s one of the things that makes it a dilemma: A company doesn’t realize anything’s wrong, because, well, nothing is. Smartphone sales may be slowing, but Apple is still a beloved brand, its products are excellent, and its history and cachet are unmatched. But that doesn’t mean it has a plan to survive the ongoing decline in global smartphone sales.
The Innovator’s Dilemma does say an entrenched company can sometimes pull out of the quicksand by setting up a small, autonomous spinoff that has the power to move fast, pursue markets that are too small to move the needle for a company making $84 billion a quarter, and innovate before someone else gets there first.
Well, Apple has no autonomous innovation divisions that I know of, and the guys in charge are the same guys who have been in charge for decades: Tim Cook, Eddy Cue, Phil Schiller, Craig Federighi, Jony Ive—all have been associated with Apple since the late ’80s or ’90s. (I mean, has there ever really been a time without Jony Ive?)
You see what I’m saying here: brilliant team with a long record of execution and unparalleled success. Possibly not a lot of fresh ideas.
And then there’s the final option for innovation, one that Apple has availed itself of many times in the past. As Steve Jobs often said, quoting Picasso: “Good artists copy; great artists steal.” The iPod was born of existing MP3 players; the iPhone improved on clunky, ugly smartphones already on the market. The MacOS and the computer mouse were developed to maturity (yes, with permission) after being invented at Xerox PARC.
So maybe Apple will find the hottest thing in tech that’s still slightly unknown and come out with a better version. But is there such a thing as a way-sexier cloud computing business?
I guess it’s possible that the rumored virtual- and augmented-reality headset that Apple is supposed to release in 2020 will take the world by storm and popularize VR in a way that no one imagined, and like AirPods, will take a look that’s painfully dorky on the surface and turn it into a not-quite-ironic must-have statement of affluence and cool. It’s happened before. But this time, I think the company will get beaten to that punch—or whatever punch is next. Apple will be around for a long time. But the next Apple just isn’t Apple.
Wouldn’t it be a different world if everybody thought the way you did? If everybody spontaneously conformed to your every wish, your every thought, your every feeling? Since life doesn’t work that way, you would do well to become skilled at the art of negotiation.
In negotiation, after all, neither party holds all the aces. Instead, negotiation proceeds (or should proceed) on a rather level playing field. Since both parties want to win, what is the best way to proceed? Here are five steps.
1. Establish the relationship
The wise negotiator establishes the relationship before proceeding further. Doing so allows you to get a feeling for the person with whom you are dealing, and vice versa. Though often ignored, “feeling” itself is an essential part of negotiation. So, always be open and sincere. Honesty, integrity and dignity are palpable qualities, and the foundation upon which constructive negotiations are built.
You are best positioned to negotiate when the other party respects you, not only as a businessperson, but as a human being. Trust, which is gained through that respect, is the key to successful negotiation.
2. Choose “honey over vinegar.”
You’ll do better with honey than with vinegar — but the honey must be genuine. Never underestimate the natural ability of other people to sense who you really are. Disingenuousness, manipulation and secretiveness are qualities that simply cannot be hidden.
When negotiating, you too can sense if the other party’s values are subpar or lack integrity altogether. No greater red flag exists in the entire arena of negotiation.
Win-wins are the only way to go. If you approach a negotiation thinking only of yourself, you are a terrible negotiator. Understanding what all parties need, and working for all concerned is vital. Keep in mind that seeing things in only black and white (win-lose) creates limited thinking; creativity is essential to good negotiation.
Ultimately, all people involved should find themselves on the same side of the fence. You want to be a player, not a pain. Keep your eye on the big picture and don’t get caught up in the small stuff. Stay out of the weeds.
4. Embody your inner adult.
Never forget that everyone has an inner adult and an inner child. It is remarkable to witness how even high-level business deals break down because someone at the table starts thinking childishly, instigating that behavior in others. When you see this happening, keep in mind that everyone goes out of balance.
Be the stable anchor, the respectful adult at the table. Helping people come back into balance is often best done by example. Take the high road, embodying your inner adult. Don’t argue; instead, understand.
5. Respect the rhythm of the relationship.
Always remember that there is a rhythm to everything. Don’t push it. Oftentimes, it is best to say nothing. Never forget that silent pauses can be a very powerful tool. Give yourself and others the time and space to reflect upon everything that has been said.
Don’t rush it. Try to sense the natural and appropriate rhythm of all the people at the table, including yourself.
In closing
By implementing these five points, you will be well on your way to mastering the art of negotiation. Negotiation is all about relationships. By cultivating and maintaining a good rapport with everyone at the table, every player can win. You’re not just creating an agreement, you are cultivating a long-term relationship as well as a reputation.
By mastering the subtle art of negotiation, you establish yourself as a top-rank business person, and that in itself may lead to even greater opportunities in the future.
Desperate for data on its competitors, Facebook has been secretly paying people to install a “Facebook Research” VPN that lets the company suck in all of a user’s phone and web activity, similar to Facebook’s Onavo Protect app that Apple banned in June and that was removed in August. Facebook sidesteps the App Store and rewards teenagers and adults to download the Research app and give it root access to network traffic in what may be a violation of Apple policy so the social network can decrypt and analyze their phone activity, a TechCrunch investigation confirms.
Facebook admitted to TechCrunch it was running the Research program to gather data on usage habits.
Since 2016, Facebook has been paying users ages 13 to 35 up to $20 per month plus referral fees to sell their privacy by installing the iOS or Android “Facebook Research” app. Facebook even asked users to screenshot their Amazon order history page. The program is administered through beta testing services Applause, BetaBound and uTest to cloak Facebook’s involvement, and is referred to in some documentation as “Project Atlas” — a fitting name for Facebook’s effort to map new trends and rivals around the globe.
Seven hours after this story was published, Facebook told TechCrunch it would shut down the iOS version of its Research app in the wake of our report. But on Wednesday morning, an Apple spokesperson confirmed that Facebook violated its policies, and it had blocked Facebook’s Research app on Tuesday before the social network seemingly pulled it voluntarily (without mentioning it was forced to do so). You can read our full report on the development here.
An Apple spokesperson provided this statement. “We designed our Enterprise Developer Program solely for the internal distribution of apps within an organization. Facebook has been using their membership to distribute a data-collecting app to consumers, which is a clear breach of their agreement with Apple. Any developer using their enterprise certificates to distribute apps to consumers will have their certificates revoked, which is what we did in this case to protect our users and their data.”
Facebook’s Research program will continue to run on Android.
Facebook’s Research app requires users to ‘Trust’ it with extensive access to their data
We asked Guardian Mobile Firewall’s security expert Will Strafach to dig into the Facebook Research app, and he told us that “If Facebook makes full use of the level of access they are given by asking users to install the Certificate, they will have the ability to continuously collect the following types of data: private messages in social media apps, chats from instant messaging apps – including photos/videos sent to others, emails, web searches, web browsing activity, and even ongoing location information by tapping into the feeds of any location tracking apps you may have installed.” It’s unclear exactly what data Facebook is concerned with, but it gets nearly limitless access to a user’s device once they install the app.
The strategy shows how far Facebook is willing to go and how much it’s willing to pay to protect its dominance — even at the risk of breaking the rules of Apple’s iOS platform on which it depends. Apple may have asked Facebook to discontinue distributing its Research app.
A more stringent punishment would be to revoke Facebook’s permission to offer employee-only apps. The situation could further chill relations between the tech giants. Apple’s Tim Cook has repeatedly criticized Facebook’s data collection practices. Facebook disobeying iOS policies to slurp up more information could become a new talking point.
Facebook’s Research program is referred to as Project Atlas on sign-up sites that don’t mention Facebook’s involvement
“The fairly technical sounding ‘install our Root Certificate’ step is appalling,” Strafach tells us. “This hands Facebook continuous access to the most sensitive data about you, and most users are going to be unable to reasonably consent to this regardless of any agreement they sign, because there is no good way to articulate just how much power is handed to Facebook when you do this.”
Facebook’s surveillance app
Facebook first got into the data-sniffing business when it acquired Onavo for around $120 million in 2014. The VPN app helped users track and minimize their mobile data plan usage, but also gave Facebook deep analytics about what other apps they were using. Internal documents acquired by Charlie Warzel and Ryan Mac of BuzzFeed News reveal that Facebook was able to leverage Onavo to learn that WhatsApp was sending more than twice as many messages per day as Facebook Messenger. Onavo allowed Facebook to spot WhatsApp’s meteoric rise and justify paying $19 billion to buy the chat startup in 2014. WhatsApp has since tripled its user base, demonstrating the power of Onavo’s foresight.
Over the years since, Onavo clued Facebook in to what apps to copy, features to build and flops to avoid. By 2018, Facebook was promoting the Onavo app in a Protect bookmark of the main Facebook app in hopes of scoring more users to snoop on. Facebook also launched the Onavo Bolt app that let you lock apps behind a passcode or fingerprint while it surveils you, but Facebook shut down the app the day it was discovered following privacy criticism. Onavo’s main app remains available on Google Play and has been installed more than 10 million times.
The backlash heated up after security expert Strafach detailed in March how Onavo Protect was reporting to Facebook when a user’s screen was on or off, and its Wi-Fi and cellular data usage in bytes even when the VPN was turned off. In June, Apple updated its developer policies to ban collecting data about usage of other apps or data that’s not necessary for an app to function. Apple proceeded to inform Facebook in August that Onavo Protect violated those data collection policies and that the social network needed to remove it from the App Store, which it did, Deepa Seetharaman of the WSJ reported.
But that didn’t stop Facebook’s data collection.
Project Atlas
TechCrunch recently received a tip that despite Onavo Protect being banished by Apple, Facebook was paying users to sideload a similar VPN app under the Facebook Research moniker from outside of the App Store. We investigated, and learned Facebook was working with three app beta testing services to distribute the Facebook Research app: BetaBound, uTest and Applause. Facebook began distributing the Research VPN app in 2016. It has been referred to as Project Atlas since at least mid-2018, around when backlash to Onavo Protect magnified and Apple instituted its new rules that prohibited Onavo. Previously, a similar program was called Project Kodiak. Facebook didn’t want to stop collecting data on people’s phone usage, and so the Research program continued, in disregard of Apple’s ban on Onavo Protect.
Facebook’s Research App on iOS
Ads (shown below) for the program run by uTest on Instagram and Snapchat sought teens 13-17 years old for a “paid social media research study.” The sign-up page for the Facebook Research program administered by Applause doesn’t mention Facebook, but seeks users “Age: 13-35 (parental consent required for ages 13-17).” If minors try to sign up, they’re asked to get their parents’ permission with a form that reveals Facebook’s involvement and says “There are no known risks associated with the project, however you acknowledge that the inherent nature of the project involves the tracking of personal information via your child’s use of apps. You will be compensated by Applause for your child’s participation.” For kids short on cash, the payments could coerce them to sell their privacy to Facebook.
The Applause site explains what data could be collected by the Facebook Research app:
“By installing the software, you’re giving our client permission to collect data from your phone that will help them understand how you browse the internet, and how you use the features in the apps you’ve installed . . . This means you’re letting our client collect information such as which apps are on your phone, how and when you use them, data about your activities and content within those apps, as well as how other people interact with you or your content within those apps. You are also letting our client collect information about your internet browsing activity (including the websites you visit and data that is exchanged between your device and those websites) and your use of other online services. There are some instances when our client will collect this information even where the app uses encryption, or from within secure browser sessions.”
Meanwhile, the BetaBound sign-up page with a URL ending in “Atlas” explains that “For $20 per month (via e-gift cards), you will install an app on your phone and let it run in the background.” It also offers $20 per friend you refer. That site also doesn’t initially mention Facebook, but the instruction manual for installing Facebook Research reveals the company’s involvement.
Facebook’s intermediary uTest ran ads on Snapchat and Instagram, luring teens to the Research program with the promise of money
Facebook seems to have purposefully avoided TestFlight, Apple’s official beta testing system, which requires apps to be reviewed by Apple and is limited to 10,000 participants. Instead, the instruction manual reveals that users download the app from r.facebook-program.com and are told to install an Enterprise Developer Certificate and VPN and “Trust” Facebook with root access to the data their phone transmits. Apple requires that developers agree to only use this certificate system for distributing internal corporate apps to their own employees. Randomly recruiting testers and paying them a monthly fee appears to violate the spirit of that rule.
Security expert Will Strafach found Facebook’s Research app contains lots of code from Onavo Protect, the Facebook-owned app Apple banned last year
Once installed, users just had to keep the VPN running and sending data to Facebook to get paid. The Applause-administered program requested that users screenshot their Amazon orders page. This data could potentially help Facebook tie browsing habits and usage of other apps with purchase preferences and behavior. That information could be harnessed to pinpoint ad targeting and understand which types of users buy what.
TechCrunch commissioned Strafach to analyze the Facebook Research app and find out where it was sending data. He confirmed that data is routed to “vpn-sjc1.v.facebook-program.com” that is associated with Onavo’s IP address, and that the facebook-program.com domain is registered to Facebook, according to MarkMonitor. The app can update itself without interacting with the App Store, and is linked to the email address PeopleJourney@fb.com. He also discovered that the Enterprise Certificate first acquired in 2016 indicates Facebook renewed it on June 27th, 2018 — weeks after Apple announced its new rules that prohibited the similar Onavo Protect app.
“It is tricky to know what data Facebook is actually saving (without access to their servers). The only information that is knowable here is what access Facebook is capable of based on the code in the app. And it paints a very worrisome picture,” Strafach explains. “They might respond and claim to only actually retain/save very specific limited data, and that could be true, it really boils down to how much you trust Facebook’s word on it. The most charitable narrative of this situation would be that Facebook did not think too hard about the level of access they were granting to themselves . . . which is a startling level of carelessness in itself if that is the case.”
In response to TechCrunch’s inquiry, a Facebook spokesperson confirmed it’s running the program to learn how people use their phones and other services. The spokesperson told us “Like many companies, we invite people to participate in research that helps us identify things we can be doing better. Since this research is aimed at helping Facebook understand how people use their mobile devices, we’ve provided extensive information about the type of data we collect and how they can participate. We don’t share this information with others and people can stop participating at any time.”
Facebook’s Research app requires Root Certificate access, which lets Facebook gather almost any piece of data transmitted by your phone
Facebook’s spokesperson claimed that the Facebook Research app was in line with Apple’s Enterprise Certificate program, but didn’t explain how in the face of evidence to the contrary. They said Facebook first launched its Research app program in 2016. They tried to liken the program to a focus group and said Nielsen and comScore run similar programs, yet neither of those ask people to install a VPN or provide root access to the network. The spokesperson confirmed the Facebook Research program does recruit teens but also other age groups from around the world. They claimed that Onavo and Facebook Research are separate programs, but admitted the same team supports both as an explanation for why their code was so similar.
Facebook’s Research program requested users screenshot their Amazon order history to provide it with purchase data
However, Facebook’s claim that it doesn’t violate Apple’s Enterprise Certificate policy is directly contradicted by the terms of that policy. Those include that developers “Distribute Provisioning Profiles only to Your Employees and only in conjunction with Your Internal Use Applications for the purpose of developing and testing”. The policy also states that “You may not use, distribute or otherwise make Your Internal Use Applications available to Your Customers” unless under direct supervision of employees or on company premises. Given Facebook’s customers are using the Enterprise Certificate-powered app without supervision, it appears Facebook is in violation.
Seven hours after this report was first published, Facebook updated its position and told TechCrunch that it would shut down the iOS Research app. Facebook noted that the Research app was started in 2016 and was therefore not a replacement for Onavo Protect. However, they do share similar code and could be seen as twins running in parallel. A Facebook spokesperson also provided this additional statement:
“Key facts about this market research program are being ignored. Despite early reports, there was nothing ‘secret’ about this; it was literally called the Facebook Research App. It wasn’t ‘spying’ as all of the people who signed up to participate went through a clear on-boarding process asking for their permission and were paid to participate. Finally, less than 5 percent of the people who chose to participate in this market research program were teens. All of them with signed parental consent forms.”
Facebook did not publicly promote the Research VPN itself and used intermediaries that often didn’t disclose Facebook’s involvement until users had begun the signup process. While users were given clear instructions and warnings, the program never stressed nor mentioned the full extent of the data Facebook can collect through the VPN. A small fraction of the users paid may have been teens, but we stand by the newsworthiness of Facebook’s choice not to exclude minors from this data collection initiative.
Facebook disobeying Apple so directly and then pulling the app could hurt their relationship. “The code in this iOS app strongly indicates that it is simply a poorly re-branded build of the banned Onavo app, now using an Enterprise Certificate owned by Facebook in direct violation of Apple’s rules, allowing Facebook to distribute this app without Apple review to as many users as they want,” Strafach tells us. ONV prefixes and mentions of graph.onavo.com, “onavoApp://” and “onavoProtect://” custom URL schemes litter the app. “This is an egregious violation on many fronts, and I hope that Apple will act expeditiously in revoking the signing certificate to render the app inoperable.”
Facebook is particularly interested in what teens do on their phones, as the demographic has increasingly abandoned the social network in favor of Snapchat, YouTube and Facebook’s acquisition Instagram. Insights into the popularity among teens of the Chinese video music app TikTok and of meme sharing led Facebook to launch a clone called Lasso and to begin developing a meme-browsing feature called LOL, TechCrunch first reported. But Facebook’s desire for data about teens riles critics at a time when the company has been battered in the press. Analysts on tomorrow’s Facebook earnings call should inquire about what other ways the company has to collect competitive intelligence now that it’s ceased to run the Research program on iOS.
Last year when Tim Cook was asked what he’d do in Mark Zuckerberg’s position in the wake of the Cambridge Analytica scandal, he said “I wouldn’t be in this situation . . . The truth is we could make a ton of money if we monetized our customer, if our customer was our product. We’ve elected not to do that.” Zuckerberg told Ezra Klein that he felt Cook’s comment was “extremely glib.”
Now it’s clear that even after Apple’s warnings and the removal of Onavo Protect, Facebook was still aggressively collecting data on its competitors via Apple’s iOS platform. “I have never seen such open and flagrant defiance of Apple’s rules by an App Store developer,” Strafach concluded. Now that Facebook has ceased the program on iOS and its Android future is uncertain, it may either have to invent new ways to surveil our behavior amidst a climate of privacy scrutiny, or be left in the dark.
Additional reporting by Zack Whittaker. Updated with comment from Facebook, and on Wednesday with a statement from Apple.
It’s onto me, anyway. I am merely one anecdata point among billions, but I’m sure I’m not the only Facebook user who has found herself shying away from the very public, often performative, and even tiring habit of posting regular updates to Facebook and Instagram. Over the past year I’ve found myself thinking not about quitting social networks, but about redefining them. For me, that process has involved a lot more private messaging.
Facebook, it seems, has noticed. Last week, The New York Times reported that Facebook chief executive Mark Zuckerberg plans to unify Facebook Messenger, WhatsApp, and Instagram messaging on the backend of the services. This would make it possible for people relying on different flavors of Facebook apps to all gorge at the same messaging table. On the one hand, the move is truly Facebookian—just try to extricate yourself from Facebook, and it will try every which way to pull you back in. On the other hand, it makes sense for Facebook for a few reasons.
My personal relationship with Facebook is multi-faceted. I have a personal account and a journalist’s page. I also use Instagram and WhatsApp. But last year, I let my professional page languish. I stopped posting to my personal feed as frequently. Instead I turned to private messaging.
During a trip to Europe last fall, I shared everything I felt compelled to share with a small group of people on Apple Messages. The excursion to see one of the largest waves ever surfed by a human? I shared the photo in a private Slack message with coworkers, instead of posting on Facebook. Wedding photos no longer go up on Instagram. During the holidays, I happily embrace my role as halfway-decent photographer, but when I share the photos with friends and family, it’s only through Messages, WhatsApp, or private photo albums.
These tools have become my preferred method of communicating. It’s not some big revelation, or even anything that’s new; peer-to-peer messaging, or at least the guise of “private” messaging, is as old as the consumer internet itself. When our worlds expand in a way that feels unmanageable, our instinct is sometimes to shrink them until we’re comfortable again, for better or worse. Remember Path, the social network limited to just your closest circle? That didn’t work out, but the entire app was built upon the Dunbar theory that our species can’t handle more than 150 close friends. There just might have been something to that.
“I think a lot of people experience this,” says Margaret Morris, a clinical psychologist and author of Left to Our Own Devices. “When you post something in such a public way, the question is: What are the motivations? But when it’s in a private thread, it’s: Why am I sharing this? Oh, it’s because I think you’ll like this. I think we’ll connect over this. The altruistic motivations can be far more clear in private messaging.”
Of course, “altruism” in this case only applies to the friends exchanging messages and not the messaging service providers. Facebook’s efforts to unify its messaging platforms are at least partly rooted in a desire to monetize our activity, whether that’s by keeping us engaged in an outward-facing News Feed or within a little chat box. And there’s a major distinction between so-called private messages and what Morris calls “Privacy with a capital P.”
“There’s one kind of privacy, which is: what does my cousin know, or what does my co-worker know,” Morris says, “And then there’s the kind of privacy that’s about the data Facebook has.” Facebook’s plan is reportedly to offer end-to-end encryption on all of its messaging apps once the backend systems have been merged. As my WIRED colleague Lily Newman writes, cryptographers and privacy advocates already see obvious hurdles in making this work.
That’s why I often use Apple’s Messages and even iCloud photo sharing. There’s an explicit agreement that exists between the service provider and user: Buy our ridiculously expensive hardware, and we won’t sell your data. (While iCloud has been hacked before, Apple swears by the end-to-end encryption between iPhone users and says it doesn’t share Messages data with third-party apps). But just using Messages isn’t realistic, either. The platform is only functional between two iPhones. Not everyone can afford Apple products, and in other parts of the world, such as China or India, apps like WeChat and WhatsApp dominate private messaging. That means you’re going to end up using other apps if you plan to communicate outside of a bubble of iPhone lovers.
But beyond privacy with a capital P—which is, for many people, the most important consideration when it comes to social media—there’s the psychology of privacy when it comes to sharing updates about our personal lives, and connecting with other humans. Social networks have made human connections infinitely more possible and have also turned the whole notion on its head.
Morris, for example, sees posting something publicly to a Facebook feed as a yearning for interconnectedness, while a private messaging thread is a quest for what she calls attunement, a way to strengthen a bond between two people. But, she notes, some people take a screenshot from a private message and then, having failed in their quest for attunement, publish an identity-stripped version of it to their feed. Guilty as charged. Social networking is no longer just a feed or an app or a chat box or SMS, but some amalgamation of it all.
Posting private messages publicly is not something I plan to make a habit of, but there is still the urge sometimes to share. I’m still on Twitter. I’ll likely still post to Facebook and Instagram from time to time. At some point I may be looking for a sense of community that exists beyond my own small private messaging groups, for a tantalizing blend of familiarity and anonymity in a Facebook group of like-minded hobbyists. For some people, larger social networking communities are lifelines as they struggle with health, with family, with job worries, with life.
But right now, “private” messages are the way to share my life with the people who matter most, an attempt to splinter off my social interactions into something more satisfying—especially when posting to Facebook has never seemed less appealing.
Apple on Wednesday warned investors that its revenue for the last three months of 2018 would not live up to previous estimates, or even come particularly close. The main culprit appears to be China, where the trade war and a broader economic slowdown contributed to plummeting iPhone sales. But CEO Tim Cook’s letter to investors pointed to a secondary thread as well, one that Apple customers, environmentalists, and even the company itself should view not as a liability but an asset: People are holding onto their iPhones longer.
That’s not just in China. Cook noted that iPhone upgrades were “not as strong as we thought they would be” in developed markets as well, citing “macroeconomic conditions,” a shift in how carriers price smartphones, a strong US dollar, and temporarily discounted battery replacements. He neglected to mention the simple fact that an iPhone can perform capably for years—and consumers are finally getting wise.
As recently as 2015, smartphone users on average upgraded their phone roughly every 24 months, says Cliff Maldonado, founder of BayStreet Research, which tracks the mobile industry. As of the fourth quarter of last year, that had jumped to at least 35 months. “You’re looking at people holding onto their devices an extra year,” Maldonado says. “It’s been considerable.”
A few factors contribute to the trend, chief among them the shift from buying phones on a two-year contract—heavily subsidized by the carriers—to installment plans in which the customer pays full freight. T-Mobile introduced the practice in the US in 2014, and by 2015 it had become the norm. The full effects, though, have only kicked in more recently. People still generally pay for their smartphone over two years; once they’re paid off, though, their monthly bill suddenly drops by, say, $25.
The shift has also caused a sharp drop-off in carrier incentives. They turn out not to be worth it. “They’re actually encouraging that dynamic of holding your smartphone longer. It’s in their best interest,” Maldonado says. “It actually costs them to get you into a new phone, to do those promotions, to run the transaction and put it on their books and finance it.”
Bottom line: If your service is reliable and your iPhone still works fine, why go through the hassle?
“There’s not as many subsidies as there used to be from a carrier point of view,” Cook told CNBC Wednesday. “And where that didn’t all happen yesterday, if you’ve been out of the market for two or three years and you come back, it looks like that to you.”
Meanwhile, older iPhones work better, for longer, thanks to Apple itself. When Apple vice president Craig Federighi introduced iOS 12 in June at Apple’s Worldwide Developers Conference, he emphasized how much it improved the performance of older devices. Among the numbers he cited: The 2014 iPhone 6 Plus opens apps 40 percent faster with iOS 12 than it had with iOS 11, and its keyboard appears up to 50 percent faster than before. And while Apple’s battery scandal of a year ago was a black mark for the company, it at least reminded Apple owners that they didn’t necessarily need a new iPhone. Eligible iPhone owners found that a $29 battery replacement—it normally costs $79—made their iPhone 6 feel something close to new.
“There definitely has been a major shift in customer perception, after all the controversy,” says Kyle Wiens, founder of online repair community iFixit. “What it really did more than anything else was remind you that the battery on your phone really can be replaced. Apple successfully brainwashing the public into thinking the battery was something they never needed to think about led people to prematurely buy these devices.”
Combine all of that with the fact that new model iPhones—and Android phones for that matter—have lacked a killer feature, much less one that would inspire someone to spend $1,000 or more if they didn’t absolutely have to. “Phones used to be toys, and shiny objects,” Maldonado says. “Now they’re utilities. You’ve got to have it, and the joy of getting a new one is pretty minor. Facebook and email looks the same; the camera’s still great.”
In the near term, these dynamics aren’t ideal for Apple; its stock dropped more than 7 percent in after-hours trading following Wednesday’s news. But it’s terrific news for consumers, who have apparently realized that a smartphone does not have a two-year expiration date. That saves money in the long run. And pulling the throttle back on iPhone sales may turn out to be equally welcome news for the planet.
According to Apple’s most recent sustainability report, the manufacture of each Apple device generates on average 90 pounds of carbon emissions. Wiens suggests that the creation of each iPhone requires hundreds of pounds of raw materials.
Manufacturing electronics is environmentally intense, Wiens says. “We can’t live in a world where we’re making 3 billion new smartphones a year. We don’t have the resources for it. We have to reduce how many overall devices we’re making. There are lots of ways to do it, but it gets down to demand, and how many we’re buying. That’s not what Apple wants, but it’s what the environment needs.”
Which raises a question: Why does Apple bother extending the lives of older iPhones? The altruistic answer comes from Lisa Jackson, who oversees the company’s environmental efforts.
“We also make sure to design and build durable products that last as long as possible,” Jackson said at Apple’s September hardware event. “Because they last longer, you can keep using them. And keeping using them is the best thing for the planet.”
Given a long enough horizon, Apple may see a financial benefit from less frequent upgrades as well. An iPhone that lasts longer keeps customers in the iOS ecosystem longer. That becomes even more important as the company places greater emphasis not on hardware but on services like Apple Music. It also offers an important point of differentiation from Android, whose fragmented ecosystem means even flagship devices rarely continue to be fully supported beyond two years.
“In reality, the big picture is still very good for Apple,” Maldonado says. Compared with Android, “Apple’s in a better spot, because the phones last longer.”
That’s cold comfort today and doesn’t help a whit with China. But news that people are holding onto their iPhones longer should be taken for what it really is: A sign of progress and a win for everyone. Even Apple.
Andrew Pole had just started working as a statistician for Target in 2002, when two colleagues from the marketing department stopped by his desk to ask an odd question: “If we wanted to figure out if a customer is pregnant, even if she didn’t want us to know, can you do that?”
Pole has a master’s degree in statistics and another in economics, and has been obsessed with the intersection of data and human behavior most of his life. His parents were teachers in North Dakota, and while other kids were going to 4-H, Pole was doing algebra and writing computer programs. “The stereotype of a math nerd is true,” he told me when I spoke with him last year. “I kind of like going out and evangelizing analytics.”
As the marketers explained to Pole — and as Pole later explained to me, back when we were still speaking and before Target told him to stop — new parents are a retailer’s holy grail. Most shoppers don’t buy everything they need at one store. Instead, they buy groceries at the grocery store and toys at the toy store, and they visit Target only when they need certain items they associate with Target — cleaning supplies, say, or new socks or a six-month supply of toilet paper. But Target sells everything from milk to stuffed animals to lawn furniture to electronics, so one of the company’s primary goals is convincing customers that the only store they need is Target. But it’s a tough message to get across, even with the most ingenious ad campaigns, because once consumers’ shopping habits are ingrained, it’s incredibly difficult to change them.
There are, however, some brief periods in a person’s life when old routines fall apart and buying habits are suddenly in flux. One of those moments — the moment, really — is right around the birth of a child, when parents are exhausted and overwhelmed and their shopping patterns and brand loyalties are up for grabs. But as Target’s marketers explained to Pole, timing is everything. Because birth records are usually public, the moment a couple have a new baby, they are almost instantaneously barraged with offers and incentives and advertisements from all sorts of companies. Which means that the key is to reach them earlier, before any other retailers know a baby is on the way. Specifically, the marketers said they wanted to send specially designed ads to women in their second trimester, which is when most expectant mothers begin buying all sorts of new things, like prenatal vitamins and maternity clothing. “Can you give us a list?” the marketers asked.
“We knew that if we could identify them in their second trimester, there’s a good chance we could capture them for years,” Pole told me. “As soon as we get them buying diapers from us, they’re going to start buying everything else too. If you’re rushing through the store, looking for bottles, and you pass orange juice, you’ll grab a carton. Oh, and there’s that new DVD I want. Soon, you’ll be buying cereal and paper towels from us, and keep coming back.”
The desire to collect information on customers is not new for Target or any other large retailer, of course. For decades, Target has collected vast amounts of data on every person who regularly walks into one of its stores. Whenever possible, Target assigns each shopper a unique code — known internally as the Guest ID number — that keeps tabs on everything they buy. “If you use a credit card or a coupon, or fill out a survey, or mail in a refund, or call the customer help line, or open an e-mail we’ve sent you or visit our Web site, we’ll record it and link it to your Guest ID,” Pole said. “We want to know everything we can.”
Also linked to your Guest ID is demographic information like your age, whether you are married and have kids, which part of town you live in, how long it takes you to drive to the store, your estimated salary, whether you’ve moved recently, what credit cards you carry in your wallet and what Web sites you visit. Target can buy data about your ethnicity, job history, the magazines you read, if you’ve ever declared bankruptcy or got divorced, the year you bought (or lost) your house, where you went to college, what kinds of topics you talk about online, whether you prefer certain brands of coffee, paper towels, cereal or applesauce, your political leanings, reading habits, charitable giving and the number of cars you own. (In a statement, Target declined to identify what demographic information it collects or purchases.) All that information is meaningless, however, without someone to analyze and make sense of it. That’s where Andrew Pole and the dozens of other members of Target’s Guest Marketing Analytics department come in.
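The mechanism described above — every touchpoint keyed back to a single Guest ID — is, in essence, a join on a shared key. The following is a toy sketch of that idea, not Target's actual schema; the guest IDs, channel names, and fields are invented for illustration.

```python
from collections import defaultdict

# One profile per guest ID: a list of interaction events plus
# whatever demographic attributes have been linked to the same key.
profiles = defaultdict(lambda: {"events": [], "demographics": {}})

def record_event(guest_id: str, channel: str, detail: str) -> None:
    """Attach any touchpoint (a card swipe, a coupon, an opened
    e-mail, a website visit) to the guest's single profile."""
    profiles[guest_id]["events"].append((channel, detail))

# Different channels, same hypothetical shopper, same key:
record_event("G-1001", "credit_card", "bought lotion")
record_event("G-1001", "email", "opened weekly circular")
profiles["G-1001"]["demographics"].update({"age": 23, "zip": "30301"})

print(len(profiles["G-1001"]["events"]))  # → 2
```

The point of the shared key is that observations which would be anonymous in isolation accumulate into one longitudinal record.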
Almost every major retailer, from grocery chains to investment banks to the U.S. Postal Service, has a “predictive analytics” department devoted to understanding not just consumers’ shopping habits but also their personal habits, so as to more efficiently market to them. “But Target has always been one of the smartest at this,” says Eric Siegel, a consultant and the chairman of a conference called Predictive Analytics World. “We’re living through a golden age of behavioral research. It’s amazing how much we can figure out about how people think now.”
The reason Target can snoop on our shopping habits is that, over the past two decades, the science of habit formation has become a major field of research in neurology and psychology departments at hundreds of major medical centers and universities, as well as inside extremely well financed corporate labs. “It’s like an arms race to hire statisticians nowadays,” said Andreas Weigend, the former chief scientist at Amazon.com. “Mathematicians are suddenly sexy.” As the ability to analyze data has grown more and more fine-grained, the push to understand how daily habits influence our decisions has become one of the most exciting topics in clinical research, even though most of us are hardly aware those patterns exist. One study from Duke University estimated that habits, rather than conscious decision-making, shape 45 percent of the choices we make every day, and recent discoveries have begun to change everything from the way we think about dieting to how doctors conceive treatments for anxiety, depression and addictions.
This research is also transforming our understanding of how habits function across organizations and societies. A football coach named Tony Dungy propelled one of the worst teams in the N.F.L. to the Super Bowl by focusing on how his players habitually reacted to on-field cues. Before he became Treasury secretary, Paul O’Neill overhauled a stumbling conglomerate, Alcoa, and turned it into a top performer in the Dow Jones by relentlessly attacking one habit — a specific approach to worker safety — which in turn caused a companywide transformation. The Obama campaign has hired a habit specialist as its “chief scientist” to figure out how to trigger new voting patterns among different constituencies.
Researchers have figured out how to stop people from habitually overeating and biting their nails. They can explain why some of us automatically go for a jog every morning and are more productive at work, while others oversleep and procrastinate. There is a calculus, it turns out, for mastering our subconscious urges. For companies like Target, the exhaustive rendering of our conscious and unconscious patterns into data sets and algorithms has revolutionized what they know about us and, therefore, how precisely they can sell.
Inside the brain-and-cognitive-sciences department of the Massachusetts Institute of Technology are what, to the casual observer, look like dollhouse versions of surgical theaters. There are rooms with tiny scalpels, small drills and miniature saws. Even the operating tables are petite, as if prepared for 7-year-old surgeons. Inside those shrunken O.R.’s, neurologists cut into the skulls of anesthetized rats, implanting tiny sensors that record the smallest changes in the activity of their brains.
An M.I.T. neuroscientist named Ann Graybiel told me that she and her colleagues began exploring habits more than a decade ago by putting their wired rats into a T-shaped maze with chocolate at one end. The maze was structured so that each animal was positioned behind a barrier that opened after a loud click. The first time a rat was placed in the maze, it would usually wander slowly up and down the center aisle after the barrier slid away, sniffing in corners and scratching at walls. It appeared to smell the chocolate but couldn’t figure out how to find it. There was no discernible pattern in the rat’s meanderings and no indication it was working hard to find the treat.
The probes in the rats’ heads, however, told a different story. While each animal wandered through the maze, its brain was working furiously. Every time a rat sniffed the air or scratched a wall, the neurosensors inside the animal’s head exploded with activity. As the scientists repeated the experiment, again and again, the rats eventually stopped sniffing corners and making wrong turns and began to zip through the maze with more and more speed. And within their brains, something unexpected occurred: as each rat learned how to complete the maze more quickly, its mental activity decreased. As the path became more and more automatic — as it became a habit — the rats started thinking less and less.
This process, in which the brain converts a sequence of actions into an automatic routine, is called “chunking.” There are dozens, if not hundreds, of behavioral chunks we rely on every day. Some are simple: you automatically put toothpaste on your toothbrush before sticking it in your mouth. Some, like making the kids’ lunch, are a little more complex. Still others are so complicated that it’s remarkable to realize that a habit could have emerged at all.
Take backing your car out of the driveway. When you first learned to drive, that act required a major dose of concentration, and for good reason: it involves peering into the rearview and side mirrors and checking for obstacles, putting your foot on the brake, moving the gearshift into reverse, removing your foot from the brake, estimating the distance between the garage and the street while keeping the wheels aligned, calculating how images in the mirrors translate into actual distances, all while applying differing amounts of pressure to the gas pedal and brake.
Now, you perform that series of actions every time you pull into the street without thinking very much. Your brain has chunked large parts of it. Left to its own devices, the brain will try to make almost any repeated behavior into a habit, because habits allow our minds to conserve effort. But conserving mental energy is tricky, because if our brains power down at the wrong moment, we might fail to notice something important, like a child riding her bike down the sidewalk or a speeding car coming down the street. So we’ve devised a clever system to determine when to let a habit take over. It’s something that happens whenever a chunk of behavior starts or ends — and it helps to explain why habits are so difficult to change once they’re formed, despite our best intentions.
To understand this a little more clearly, consider again the chocolate-seeking rats. What Graybiel and her colleagues found was that, as the ability to navigate the maze became habitual, there were two spikes in the rats’ brain activity — once at the beginning of the maze, when the rat heard the click right before the barrier slid away, and once at the end, when the rat found the chocolate. Those spikes showed when the rats’ brains were fully engaged, and the dip in neural activity between the spikes showed when the habit took over. From behind the partition, the rat wasn’t sure what waited on the other side, until it heard the click, which it had come to associate with the maze. Once it heard that sound, it knew to use the “maze habit,” and its brain activity decreased. Then at the end of the routine, when the reward appeared, the brain shook itself awake again and the chocolate signaled to the rat that this particular habit was worth remembering, and the neurological pathway was carved that much deeper.
The process within our brains that creates habits is a three-step loop. First, there is a cue, a trigger that tells your brain to go into automatic mode and which habit to use. Then there is the routine, which can be physical or mental or emotional. Finally, there is a reward, which helps your brain figure out if this particular loop is worth remembering for the future. Over time, this loop — cue, routine, reward; cue, routine, reward — becomes more and more automatic. The cue and reward become neurologically intertwined until a sense of craving emerges. What’s unique about cues and rewards, however, is how subtle they can be. Neurological studies like the ones in Graybiel’s lab have revealed that some cues span just milliseconds. And rewards can range from the obvious (like the sugar rush that a morning doughnut habit provides) to the infinitesimal (like the barely noticeable — but measurable — sense of relief the brain experiences after successfully navigating the driveway). Most cues and rewards, in fact, happen so quickly and are so slight that we are hardly aware of them at all. But our neural systems notice and use them to build automatic behaviors.
Habits aren’t destiny — they can be ignored, changed or replaced. But it’s also true that once the loop is established and a habit emerges, your brain stops fully participating in decision-making. So unless you deliberately fight a habit — unless you find new cues and rewards — the old pattern will unfold automatically.
“We’ve done experiments where we trained rats to run down a maze until it was a habit, and then we extinguished the habit by changing the placement of the reward,” Graybiel told me. “Then one day, we’ll put the reward in the old place and put in the rat and, by golly, the old habit will re-emerge right away. Habits never really disappear.”
Luckily, simply understanding how habits work makes them easier to control. Take, for instance, a series of studies conducted a few years ago at Columbia University and the University of Alberta. Researchers wanted to understand how exercise habits emerge. In one project, 256 members of a health-insurance plan were invited to classes stressing the importance of exercise. Half the participants received an extra lesson on the theories of habit formation (the structure of the habit loop) and were asked to identify cues and rewards that might help them develop exercise routines.
The results were dramatic. Over the next four months, those participants who deliberately identified cues and rewards spent twice as much time exercising as their peers. Other studies have yielded similar results. According to another recent paper, if you want to start running in the morning, it’s essential that you choose a simple cue (like always putting on your sneakers before breakfast or leaving your running clothes next to your bed) and a clear reward (like a midday treat or even the sense of accomplishment that comes from ritually recording your miles in a log book). After a while, your brain will start anticipating that reward — craving the treat or the feeling of accomplishment — and there will be a measurable neurological impulse to lace up your jogging shoes each morning.
Our relationship to e-mail operates on the same principle. When a computer chimes or a smartphone vibrates with a new message, the brain starts anticipating the neurological “pleasure” (even if we don’t recognize it as such) that clicking on the e-mail and reading it provides. That expectation, if unsatisfied, can build until you find yourself moved to distraction by the thought of an e-mail sitting there unread — even if you know, rationally, it’s most likely not important. On the other hand, once you remove the cue by disabling the buzzing of your phone or the chiming of your computer, the craving is never triggered, and you’ll find, over time, that you’re able to work productively for long stretches without checking your in-box.
Some of the most ambitious habit experiments have been conducted by corporate America. To understand why executives are so entranced by this science, consider how one of the world’s largest companies, Procter & Gamble, used habit insights to turn a failing product into one of its biggest sellers. P.& G. is the corporate behemoth behind a whole range of products, from Downy fabric softener to Bounty paper towels to Duracell batteries and dozens of other household brands. In the mid-1990s, P.& G.’s executives began a secret project to create a new product that could eradicate bad smells. P.& G. spent millions developing a colorless, cheap-to-manufacture liquid that could be sprayed on a smoky blouse, stinky couch, old jacket or stained car interior and make it odorless. In order to market the product — Febreze — the company formed a team that included a former Wall Street mathematician named Drake Stimson and habit specialists, whose job was to make sure the television commercials, which they tested in Phoenix, Salt Lake City and Boise, Idaho, accentuated the product’s cues and rewards just right.
Video: TimesCast | Retailers’ Predictions (Feb. 16, 2012) — In a preview of this Sunday’s New York Times Magazine, Charles Duhigg details how some retailers profit by predicting major changes in your life.
The first ad showed a woman complaining about the smoking section of a restaurant. Whenever she eats there, she says, her jacket smells like smoke. A friend tells her that if she uses Febreze, it will eliminate the odor. The cue in the ad is clear: the harsh smell of cigarette smoke. The reward: odor eliminated from clothes. The second ad featured a woman worrying about her dog, Sophie, who always sits on the couch. “Sophie will always smell like Sophie,” she says, but with Febreze, “now my furniture doesn’t have to.” The ads were put in heavy rotation. Then the marketers sat back, anticipating how they would spend their bonuses. A week passed. Then two. A month. Two months. Sales started small and got smaller. Febreze was a dud.
The panicked marketing team canvassed consumers and conducted in-depth interviews to figure out what was going wrong, Stimson recalled. Their first inkling came when they visited a woman’s home outside Phoenix. The house was clean and organized. She was something of a neat freak, the woman explained. But when P.& G.’s scientists walked into her living room, where her nine cats spent most of their time, the scent was so overpowering that one of them gagged.
According to Stimson, who led the Febreze team, a researcher asked the woman, “Doesn’t the cat smell bother you?”
“No,” she said. “Isn’t it wonderful? They hardly smell at all!”
A similar scene played out in dozens of other smelly homes. The reason Febreze wasn’t selling, the marketers realized, was that people couldn’t detect most of the bad smells in their lives. If you live with nine cats, you become desensitized to their scents. If you smoke cigarettes, eventually you don’t smell smoke anymore. Even the strongest odors fade with constant exposure. That’s why Febreze was a failure. The product’s cue — the bad smells that were supposed to trigger daily use — was hidden from the people who needed it the most. And Febreze’s reward (an odorless home) was meaningless to someone who couldn’t smell offensive scents in the first place.
P.& G. employed a Harvard Business School professor to analyze Febreze’s ad campaigns. They collected hours of footage of people cleaning their homes and watched tape after tape, looking for clues that might help them connect Febreze to people’s daily habits. When that didn’t reveal anything, they went into the field and conducted more interviews. A breakthrough came when they visited a woman in a suburb near Scottsdale, Ariz., who was in her 40s with four children. Her house was clean, though not compulsively tidy, and didn’t appear to have any odor problems; there were no pets or smokers. To the surprise of everyone, she loved Febreze.
“I use it every day,” she said.
“What smells are you trying to get rid of?” a researcher asked.
“I don’t really use it for specific smells,” the woman said. “I use it for normal cleaning — a couple of sprays when I’m done in a room.”
The researchers followed her around as she tidied the house. In the bedroom, she made her bed, tightened the sheet’s corners, then sprayed the comforter with Febreze. In the living room, she vacuumed, picked up the children’s shoes, straightened the coffee table, then sprayed Febreze on the freshly cleaned carpet.
“It’s nice, you know?” she said. “Spraying feels like a little mini-celebration when I’m done with a room.” At the rate she was going, the team estimated, she would empty a bottle of Febreze every two weeks.
When they got back to P.& G.’s headquarters, the researchers watched their videotapes again. Now they knew what to look for and saw their mistake in scene after scene. Cleaning has its own habit loops that already exist. In one video, when a woman walked into a dirty room (cue), she started sweeping and picking up toys (routine), then she examined the room and smiled when she was done (reward). In another, a woman scowled at her unmade bed (cue), proceeded to straighten the blankets and comforter (routine) and then sighed as she ran her hands over the freshly plumped pillows (reward). P.& G. had been trying to create a whole new habit with Febreze, but what they really needed to do was piggyback on habit loops that were already in place. The marketers needed to position Febreze as something that came at the end of the cleaning ritual, the reward, rather than as a whole new cleaning routine.
The company printed new ads showing open windows and gusts of fresh air. More perfume was added to the Febreze formula, so that instead of merely neutralizing odors, the spray had its own distinct scent. Television commercials were filmed of women, having finished their cleaning routine, using Febreze to spritz freshly made beds and just-laundered clothing. Each ad was designed to appeal to the habit loop: when you see a freshly cleaned room (cue), pull out Febreze (routine) and enjoy a smell that says you’ve done a great job (reward). When you finish making a bed (cue), spritz Febreze (routine) and breathe a sweet, contented sigh (reward). Febreze, the ads implied, was a pleasant treat, not a reminder that your home stinks.
And so Febreze, a product originally conceived as a revolutionary way to destroy odors, became an air freshener used once things are already clean. The Febreze revamp occurred in the summer of 1998. Within two months, sales doubled. A year later, the product brought in $230 million. Since then Febreze has spawned dozens of spinoffs — air fresheners, candles and laundry detergents — that now account for sales of more than $1 billion a year. Eventually, P.& G. began mentioning to customers that, in addition to smelling sweet, Febreze can actually kill bad odors. Today it’s one of the top-selling products in the world.
Andrew Pole was hired by Target to use the same kinds of insights into consumers’ habits to expand Target’s sales. His assignment was to analyze all the cue-routine-reward loops among shoppers and help the company figure out how to exploit them. Much of his department’s work was straightforward: find the customers who have children and send them catalogs that feature toys before Christmas. Look for shoppers who habitually purchase swimsuits in April and send them coupons for sunscreen in July and diet books in December. But Pole’s most important assignment was to identify those unique moments in consumers’ lives when their shopping habits become particularly flexible and the right advertisement or coupon would cause them to begin spending in new ways.
In the 1980s, a team of researchers led by a U.C.L.A. professor named Alan Andreasen undertook a study of people’s most mundane purchases, like soap, toothpaste, trash bags and toilet paper. They learned that most shoppers paid almost no attention to how they bought these products, that the purchases occurred habitually, without any complex decision-making. Which meant it was hard for marketers, despite their displays and coupons and product promotions, to persuade shoppers to change.
But when some customers were going through a major life event, like graduating from college or getting a new job or moving to a new town, their shopping habits became flexible in ways that were both predictable and potential gold mines for retailers. The study found that when someone marries, he or she is more likely to start buying a new type of coffee. When a couple move into a new house, they’re more apt to purchase a different kind of cereal. When they divorce, there’s an increased chance they’ll start buying different brands of beer.
Consumers going through major life events often don’t notice, or care, that their shopping habits have shifted, but retailers notice, and they care quite a bit. At those unique moments, Andreasen wrote, customers are “vulnerable to intervention by marketers.” In other words, a precisely timed advertisement, sent to a recent divorcee or new homebuyer, can change someone’s shopping patterns for years.
And among life events, none are more important than the arrival of a baby. At that moment, new parents’ habits are more flexible than at almost any other time in their adult lives. If companies can identify pregnant shoppers, they can earn millions.
The only problem is that identifying pregnant customers is harder than it sounds. Target has a baby-shower registry, and Pole started there, observing how shopping habits changed as a woman approached her due date, which women on the registry had willingly disclosed. He ran test after test, analyzing the data, and before long some useful patterns emerged. Lotions, for example. Lots of people buy lotion, but one of Pole’s colleagues noticed that women on the baby registry were buying larger quantities of unscented lotion around the beginning of their second trimester. Another analyst noted that sometime in the first 20 weeks, pregnant women loaded up on supplements like calcium, magnesium and zinc. Many shoppers purchase soap and cotton balls, but when someone suddenly starts buying lots of scent-free soap and extra-big bags of cotton balls, in addition to hand sanitizers and washcloths, it signals they could be getting close to their delivery date.
As Pole’s computers crawled through the data, he was able to identify about 25 products that, when analyzed together, allowed him to assign each shopper a “pregnancy prediction” score. More important, he could also estimate her due date to within a small window, so Target could send coupons timed to very specific stages of her pregnancy.
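The scoring described above — roughly 25 products whose purchases, analyzed together, yield a single “pregnancy prediction” number — is, in spirit, a weighted combination of purchase signals. Here is a minimal sketch of that kind of model; every product name and weight below is a made-up illustration, not Pole's actual model, which Target has never disclosed.

```python
# Hypothetical indicator products and weights. In practice these
# would be fitted from labeled data (e.g. baby-registry shoppers),
# not assigned by hand.
PRODUCT_WEIGHTS = {
    "unscented_lotion": 0.9,
    "calcium_supplement": 0.6,
    "magnesium_supplement": 0.6,
    "zinc_supplement": 0.5,
    "scent_free_soap": 0.8,
    "extra_large_cotton_balls": 0.7,
    "hand_sanitizer": 0.3,
    "washcloths": 0.3,
}

def pregnancy_score(basket: set[str]) -> float:
    """Sum the weights of indicator products present in a shopper's
    recent purchases, normalized to 0..1; higher = stronger signal."""
    raw = sum(w for item, w in PRODUCT_WEIGHTS.items() if item in basket)
    return raw / sum(PRODUCT_WEIGHTS.values())

shopper = {"unscented_lotion", "zinc_supplement", "magnesium_supplement"}
print(round(pregnancy_score(shopper), 2))  # → 0.43
```

No single purchase is informative on its own — lots of people buy lotion — which is why the products are only meaningful "when analyzed together," as the article puts it.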
One Target employee I spoke to provided a hypothetical example. Take a fictional Target shopper named Jenny Ward, who is 23, lives in Atlanta and in March bought cocoa-butter lotion, a purse large enough to double as a diaper bag, zinc and magnesium supplements and a bright blue rug. There’s, say, an 87 percent chance that she’s pregnant and that her delivery date is sometime in late August. What’s more, because of the data attached to her Guest ID number, Target knows how to trigger Jenny’s habits. They know that if she receives a coupon via e-mail, it will most likely cue her to buy online. They know that if she receives an ad in the mail on Friday, she frequently uses it on a weekend trip to the store. And they know that if they reward her with a printed receipt that entitles her to a free cup of Starbucks coffee, she’ll use it when she comes back again.
In the past, that knowledge had limited value. After all, Jenny purchased only cleaning supplies at Target, and there were only so many psychological buttons the company could push. But now that she is pregnant, everything is up for grabs. In addition to triggering Jenny’s habits to buy more cleaning products, they can also start including offers for an array of products, some more obvious than others, that a woman at her stage of pregnancy might need.
Pole applied his program to every regular female shopper in Target’s national database and soon had a list of tens of thousands of women who were most likely pregnant. If they could entice those women or their husbands to visit Target and buy baby-related products, the company’s cue-routine-reward calculators could kick in and start pushing them to buy groceries, bathing suits, toys and clothing, as well. When Pole shared his list with the marketers, he said, they were ecstatic. Soon, Pole was getting invited to meetings above his pay grade. Eventually his pay grade went up.
At which point someone asked an important question: How are women going to react when they figure out how much Target knows?
“If we send someone a catalog and say, ‘Congratulations on your first child!’ and they’ve never told us they’re pregnant, that’s going to make some people uncomfortable,” Pole told me. “We are very conservative about compliance with all privacy laws. But even if you’re following the law, you can do things where people get queasy.”
About a year after Pole created his pregnancy-prediction model, a man walked into a Target outside Minneapolis and demanded to see the manager. He was clutching coupons that had been sent to his daughter, and he was angry, according to an employee who participated in the conversation.
“My daughter got this in the mail!” he said. “She’s still in high school, and you’re sending her coupons for baby clothes and cribs? Are you trying to encourage her to get pregnant?”
The manager didn’t have any idea what the man was talking about. He looked at the mailer. Sure enough, it was addressed to the man’s daughter and contained advertisements for maternity clothing, nursery furniture and pictures of smiling infants. The manager apologized and then called a few days later to apologize again.
On the phone, though, the father was somewhat abashed. “I had a talk with my daughter,” he said. “It turns out there’s been some activities in my house I haven’t been completely aware of. She’s due in August. I owe you an apology.”
When I approached Target to discuss Pole’s work, its representatives declined to speak with me. “Our mission is to make Target the preferred shopping destination for our guests by delivering outstanding value, continuous innovation and exceptional guest experience,” the company wrote in a statement. “We’ve developed a number of research tools that allow us to gain insights into trends and preferences within different demographic segments of our guest population.” When I sent Target a complete summary of my reporting, the reply was more terse: “Almost all of your statements contain inaccurate information and publishing them would be misleading to the public. We do not intend to address each statement point by point.” The company declined to identify what was inaccurate. They did add, however, that Target “is in compliance with all federal and state laws, including those related to protected health information.”
When I offered to fly to Target’s headquarters to discuss its concerns, a spokeswoman e-mailed that no one would meet me. When I flew out anyway, I was told I was on a list of prohibited visitors. “I’ve been instructed not to give you access and to ask you to leave,” said a very nice security guard named Alex.
Using data to predict a woman’s pregnancy, Target realized soon after Pole perfected his model, could be a public-relations disaster. So the question became: how could they get their advertisements into expectant mothers’ hands without making it appear they were spying on them? How do you take advantage of someone’s habits without letting them know you’re studying their lives?
Before I met Andrew Pole, before I even decided to write a book about the science of habit formation, I had another goal: I wanted to lose weight.
I had gotten into a bad habit of going to the cafeteria every afternoon and eating a chocolate-chip cookie, which contributed to my gaining a few pounds. Eight, to be precise. I put a Post-it note on my computer reading “NO MORE COOKIES.” But every afternoon, I managed to ignore that note, wander to the cafeteria, buy a cookie and eat it while chatting with colleagues. Tomorrow, I always promised myself, I’ll muster the willpower to resist.
Tomorrow, I ate another cookie.
When I started interviewing experts in habit formation, I concluded each interview by asking what I should do. The first step, they said, was to figure out my habit loop. The routine was simple: every afternoon, I walked to the cafeteria, bought a cookie and ate it while chatting with friends.
Next came some less obvious questions: What was the cue? Hunger? Boredom? Low blood sugar? And what was the reward? The taste of the cookie itself? The temporary distraction from my work? The chance to socialize with colleagues?
Rewards are powerful because they satisfy cravings, but we’re often not conscious of the urges driving our habits in the first place. So one day, when I felt a cookie impulse, I went outside and took a walk instead. The next day, I went to the cafeteria and bought a coffee. The next, I bought an apple and ate it while chatting with friends. You get the idea. I wanted to test different theories regarding what reward I was really craving. Was it hunger? (In which case the apple should have worked.) Was it the desire for a quick burst of energy? (If so, the coffee should have sufficed.) Or, as turned out to be the answer, was it that after several hours spent focused on work, I wanted to socialize, to make sure I was up to speed on office gossip, and the cookie was just a convenient excuse? When I walked to a colleague’s desk and chatted for a few minutes, it turned out, my cookie urge was gone.
All that was left was identifying the cue.
Deciphering cues is hard, however. Our lives often contain too much information to figure out what is triggering a particular behavior. Do you eat breakfast at a certain time because you’re hungry? Or because the morning news is on? Or because your kids have started eating? Experiments have shown that most cues fit into one of five categories: location, time, emotional state, other people or the immediately preceding action. So to figure out the cue for my cookie habit, I wrote down five things the moment the urge hit:
Where are you? (Sitting at my desk.)
What time is it? (3:36 p.m.)
What’s your emotional state? (Bored.)
Who else is around? (No one.)
What action preceded the urge? (Answered an e-mail.)
The next day I did the same thing. And the next. Pretty soon, the cue was clear: I always felt an urge to snack around 3:30.
Once I figured out all the parts of the loop, it seemed fairly easy to change my habit. But the psychologists and neuroscientists warned me that, for my new behavior to stick, I needed to abide by the same principle that guided Procter & Gamble in selling Febreze: To shift the routine — to socialize, rather than eat a cookie — I needed to piggyback on an existing habit. So now, every day around 3:30, I stand up, look around the newsroom for someone to talk to, spend 10 minutes gossiping, then go back to my desk. The cue and reward have stayed the same. Only the routine has shifted. It doesn’t feel like a decision, any more than the M.I.T. rats made a decision to run through the maze. It’s now a habit. I’ve lost 21 pounds since then (12 of them from changing my cookie ritual).
After Andrew Pole built his pregnancy-prediction model, after he identified thousands of female shoppers who were most likely pregnant, after someone pointed out that some of those women might be a little upset if they received an advertisement making it obvious Target was studying their reproductive status, everyone decided to slow things down.
The marketing department conducted a few tests by choosing a small, random sample of women from Pole’s list and mailing them combinations of advertisements to see how they reacted.
“We have the capacity to send every customer an ad booklet, specifically designed for them, that says, ‘Here’s everything you bought last week and a coupon for it,’ ” one Target executive told me. “We do that for grocery products all the time.” But for pregnant women, Target’s goal was selling them baby items they didn’t even know they needed yet.
“With the pregnancy products, though, we learned that some women react badly,” the executive said. “Then we started mixing in all these ads for things we knew pregnant women would never buy, so the baby ads looked random. We’d put an ad for a lawn mower next to diapers. We’d put a coupon for wineglasses next to infant clothes. That way, it looked like all the products were chosen by chance.
“And we found out that as long as a pregnant woman thinks she hasn’t been spied on, she’ll use the coupons. She just assumes that everyone else on her block got the same mailer for diapers and cribs. As long as we don’t spook her, it works.”
In other words, if Target piggybacked on existing habits — the same cues and rewards they already knew got customers to buy cleaning supplies or socks — then they could insert a new routine: buying baby products, as well. There’s a cue (“Oh, a coupon for something I need!”), a routine (“Buy! Buy! Buy!”) and a reward (“I can take that off my list”). And once the shopper is inside the store, Target will hit her with cues and rewards to entice her to purchase everything she normally buys somewhere else. As long as Target camouflaged how much it knew, as long as the habit felt familiar, the new behavior took hold.
Soon after the new ad campaign began, Target’s Mom and Baby sales exploded. The company doesn’t break out figures for specific divisions, but between 2002 — when Pole was hired — and 2010, Target’s revenues grew from $44 billion to $67 billion. In 2005, the company’s president, Gregg Steinhafel, boasted to a room of investors about the company’s “heightened focus on items and categories that appeal to specific guest segments such as mom and baby.”
Pole was promoted. He has been invited to speak at conferences. “I never expected this would become such a big deal,” he told me the last time we spoke.
A few weeks before this article went to press, I flew to Minneapolis to try to speak to Andrew Pole one last time. I hadn’t talked to him in more than a year. Back when we were still friendly, I mentioned that my wife was seven months pregnant. We shop at Target, I told him, and had given the company our address so we could start receiving coupons in the mail. As my wife’s pregnancy progressed, I noticed a subtle upswing in the number of advertisements for diapers and baby clothes arriving at our house.
Pole didn’t answer my e-mails or phone calls when I visited Minneapolis. I drove to his large home in a nice suburb, but no one answered the door. On my way back to the hotel, I stopped at a Target to pick up some deodorant, then also bought some T-shirts and a fancy hair gel. On a whim, I threw in some pacifiers, to see how the computers would react. Besides, our baby is now 9 months old. You can’t have too many pacifiers.
When I paid, I didn’t receive any sudden deals on diapers or formula, to my slight disappointment. It made sense, though: I was shopping in a city I had never previously visited, at 9:45 p.m. on a weeknight, buying a random assortment of items. I was using a corporate credit card, and besides the pacifiers, hadn’t purchased any of the things that a parent needs. It was clear to Target’s computers that I was on a business trip. Pole’s prediction calculator took one look at me, ran the numbers and decided to bide its time. Back home, the offers would eventually come. As Pole told me the last time we spoke: “Just wait. We’ll be sending you coupons for things you want before you even know you want them.”