The confusing rollout of meaningful social interactions—marked by internal dissent, blistering external criticism, genuine efforts at reform, and foolish mistakes—set the stage for Facebook’s 2018. This is the story of that annus horribilis, based on interviews with 65 current and former employees. It’s ultimately a story about the biggest shifts ever to take place inside the world’s biggest social network. But it’s also about a company trapped by its own pathologies and, perversely, by the inexorable logic of its own recipe for success.
Facebook’s powerful network effects have kept advertisers from fleeing, and overall user numbers remain healthy if you include people on Instagram, which Facebook owns. But the company’s original culture and mission kept creating a set of brutal debts that came due with regularity over the past 16 months. The company floundered, dissembled, and apologized. Even when it told the truth, people didn’t believe it. Critics appeared on all sides, demanding changes that ranged from the essential to the contradictory to the impossible. As crises multiplied and diverged, even the company’s own solutions began to cannibalize each other. And the most crucial episode in this story—the crisis that cut the deepest—began not long after Davos, when some reporters from The New York Times, The Guardian, and Britain’s Channel 4 News came calling. They’d learned some troubling things about a shady British company called Cambridge Analytica, and they had some questions.
15 Months of Fresh Hell Inside Facebook
Scandals. Backstabbing. Resignations. Record profits. Time Bombs. In early 2018, Mark Zuckerberg set out to fix Facebook. Here’s how that turned out:
The Ad ID
Persistent identifiers are the bread and butter of the online tracking industry. They allow companies to learn which websites you visit and which apps you use, including what you do within those apps. A persistent identifier is just a unique number used to identify either you or your device. Your Social Security number and phone number are examples of persistent identifiers used in real life; cookies use persistent identifiers to recognize you across websites.
On your mobile device, there are many different types of persistent identifiers that are used by app developers and third parties contacted by those apps. For example, one app might send an advertising network your device’s serial number. When a different app on your same phone sends that same advertising network your device’s serial number, that advertising network now knows that you use both of these apps, and can use that information to profile you. This sort of profiling is what is meant by “behavioral advertising.” That is, they track your behaviors so that they can infer your interests from those behaviors, and then send you ads targeted to those inferred interests.
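The linking step described above can be sketched in a few lines. This is a simplified illustration, not any ad network's actual code; the app names, serial number, and activities are all invented:

```python
# Simplified sketch of cross-app profiling via a shared persistent identifier.
from collections import defaultdict

# Profiles keyed by the persistent identifier (here, a device serial number).
profiles = defaultdict(list)

def record_transmission(device_serial, app_name, activity):
    """Store what the user did, keyed by the device's serial number."""
    profiles[device_serial].append((app_name, activity))

# Two unrelated apps report the same serial number to the same ad network...
record_transmission("SN-123456", "WeatherApp", "viewed 7-day forecast")
record_transmission("SN-123456", "RunTracker", "logged a 5 km run")

# ...so the network now knows one person uses both apps and can infer
# interests from the combined activity.
print(profiles["SN-123456"])
# → [('WeatherApp', 'viewed 7-day forecast'), ('RunTracker', 'logged a 5 km run')]
```

The only ingredient needed is an identifier that is the same across apps; everything else is bookkeeping.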
On the web, if you don’t want to be tracked in this manner, you can periodically clear your cookies or configure your browser to simply not accept cookies (though this breaks a lot of the web, given that there are many other uses for cookies beyond tracking). Clearing your cookies resets all of the persistent identifiers, which means that new persistent identifiers will be sent to third parties, making it more difficult for them to associate your future online activities with the previous profile they had constructed.
As for the persistent identifiers used by mobile apps, until a few years ago there was no way of doing the equivalent of clearing your cookies: many of the identifiers used to track your mobile app activities were based in hardware, such as the device’s serial number, IMEI, WiFi MAC address, and SIM card serial number. Many apps used (and still use) the Android ID for tracking purposes, which, while not based in hardware, can only be reset by performing a factory reset and deleting all of the device’s data. Thus, there was no easy way for users to do the equivalent of clearing their cookies.
However, this changed in 2013 with the creation of the “ad ID”: both Android and iOS unveiled a new persistent identifier based in software that provides the user with privacy controls to reset that identifier at will (similar to clearing cookies).
Of course, being able to reset the ad identifier is only a good privacy-preserving solution if it is the only identifier being collected from the device. Imagine the following situation:
- An app sends both the ad ID and the IMEI (a non-resettable hardware-based identifier) to a data broker.
- Concerned with her privacy, the user uses one of the above privacy settings panels to reset her phone’s ad ID.
- Later, when using a different app, the same data broker is sent the new ad ID alongside the IMEI.
- The data broker sees that while the ad IDs are different between these two transmissions, the IMEI is the same, and therefore they must have come from the same device. Knowing this, the data broker can then add the second transmission to the user’s existing profile.
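Put concretely, the broker-side merge in the scenario above might look like the following sketch. All identifiers and record values are made up for illustration:

```python
# Hypothetical sketch of the linking step: even after the user resets her
# ad ID, a broker that also receives the IMEI can merge the records.
profiles_by_imei = {}

def ingest(ad_id, imei, activity):
    # The resettable ad ID changes over time, but the hardware-based IMEI
    # never does, so the broker keys its profile on the IMEI instead.
    profile = profiles_by_imei.setdefault(imei, {"ad_ids": set(), "activity": []})
    profile["ad_ids"].add(ad_id)
    profile["activity"].append(activity)

ingest("ad-id-OLD", "imei-356938035643809", "app A session")
# The user resets her ad ID; a different app later reports the new one.
ingest("ad-id-NEW", "imei-356938035643809", "app B session")

# One profile, two ad IDs: the reset accomplished nothing.
print(len(profiles_by_imei))                               # 1 profile
print(profiles_by_imei["imei-356938035643809"]["ad_ids"])  # both ad IDs linked
```

This is exactly why the platform policies below forbid transmitting non-resettable identifiers alongside the ad ID.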
In this case, sending a non-resettable identifier alongside the ad ID completely undermines the privacy-preserving properties of the ad ID: resetting it does not prevent tracking. For this reason, both iOS and Android have policies that prohibit developers from transmitting other identifiers alongside the ad ID. For example, in 2017, it was major news that Uber’s app had violated iOS App Store privacy guidelines by collecting non-resettable persistent identifiers. Tim Cook personally threatened to have the Uber app removed from the store. Similarly, Google’s Play Store policy says that the ad ID cannot be transmitted alongside other identifiers without users’ explicit consent, and that for advertising purposes, the ad ID is the only identifier that can be used:
Association with personally-identifiable information or other identifiers. The advertising identifier must not be connected to personally-identifiable information or associated with any persistent device identifier (for example: SSAID, MAC address, IMEI, etc.) without explicit consent of the user.
Violations of Ad ID Policies
I queried the AppCensus database to check compliance with this policy. That is, are there apps violating it by transmitting the ad ID alongside other persistent identifiers to advertisers? When I performed this experiment last September, there were approximately 24k apps in our database that we had observed transmitting the ad ID. Of these, approximately 17k (i.e., ~70%) were transmitting the ad ID alongside other persistent identifiers. Based on the data recipients of some of the most popular offenders, these transmissions are clearly being used for advertising purposes:
| App Name | Installs | Data Types | Recipient |
| --- | --- | --- | --- |
| Clean Master – Antivirus, Cleaner & Booster | 1B | Ad ID + Android ID | t.appsflyer.com |
| Subway Surfers | 1B | Android ID | api.vungle.com |
| Flipboard: News For Our Time | 500M | Ad ID + Android ID | ad.flipboard.com |
| My Talking Tom | 500M | Ad ID + Android ID | m2m1.inner-active.mobi |
| Temple Run 2 | 500M | Ad ID + Android ID | live.chartboost.com |
| 3D Bowling | 100M | Ad ID + Android ID + IMEI | ws.tapjoyads.com |
| 8 Ball Pool | 100M | Ad ID + Android ID | ws.tapjoyads.com |
| Agar.io | 100M | Ad ID + Android ID | ws.tapjoyads.com |
| Angry Birds Classic | 100M | Android ID | ads.api.vungle.com |
| Audiobooks from Audible | 100M | Ad ID + Android ID | api.branch.io |
| Azar | 100M | Ad ID + Android ID | api.branch.io |
| B612 – Beauty & Filter Camera | 100M | Ad ID + Android ID | t.appsflyer.com |
| Banana Kong | 100M | Ad ID + Android ID | live.chartboost.com |
| Battery Doctor – Battery Life Saver & Battery Cooler | 100M | Ad ID + Android ID + IMEI | t.appsflyer.com |
| BeautyPlus – Easy Photo Editor & Selfie Camera | 100M | Ad ID + Android ID | t.appsflyer.com |
| Bus Rush | 100M | Ad ID + Android ID | ads.api.vungle.com |
| CamScanner – Phone PDF Creator | 100M | Ad ID + Android ID + IMEI | t.appsflyer.com |
| Cheetah Keyboard – Emoji & Stickers Keyboard | 100M | Ad ID + Android ID | t.appsflyer.com |
| Cooking Fever | 100M | Ad ID + Android ID | ws.tapjoyads.com |
| Cut The Rope Full FREE | 100M | Ad ID + Android ID | ws.tapjoyads.com |
These are just the 20 most popular apps violating this policy, sorted alphabetically. All of the domains in the right-most column belong either to advertising networks or to companies otherwise involved in tracking users’ interactions with ads (i.e., to use Google’s language, “any advertising purposes”). In fact, as of today, there are over 18k distinct apps transmitting the ad ID alongside other persistent identifiers.
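The check behind these numbers is conceptually simple: flag any observed transmission that pairs the ad ID with a non-resettable identifier. A minimal sketch, using invented observation records rather than the real AppCensus data:

```python
# Flag apps whose network transmissions pair the ad ID with another
# persistent identifier, violating the ad ID policy described above.
# The app names and records here are invented for illustration.
NON_RESETTABLE = {"android_id", "imei", "mac", "serial", "ssaid"}

observations = [
    {"app": "game.example.one",   "ids_sent": {"ad_id", "android_id"}},
    {"app": "news.example.two",   "ids_sent": {"ad_id"}},
    {"app": "tools.example.three", "ids_sent": {"ad_id", "imei", "android_id"}},
]

# A violation is any transmission containing the ad ID plus at least one
# identifier the user cannot reset.
violations = [
    o["app"] for o in observations
    if "ad_id" in o["ids_sent"] and o["ids_sent"] & NON_RESETTABLE
]
print(violations)  # → ['game.example.one', 'tools.example.three']
```

At scale, the same set-intersection test is run over the decoded network traffic of each app in the corpus.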
In September, our research group reported just under 17k apps to Google that were transmitting the ad ID alongside other identifiers. The data we gave them included the data types being transmitted and a list of the recipient domains, most of which belong to companies involved in mobile advertising. The majority of these domains have the word “ads” in the hostname, and looking at the traffic shows that they are used either to place ads in apps or to track user engagement with ads.
It has been five months since we submitted that report, and we have not heard anything from Google about whether it plans to address this pervasive problem. In the interim, more apps now appear to be violating Google’s policy. The problem with all of this is that Google provides users with privacy controls (see above image), but those controls don’t actually do anything: they only govern the ad ID, and we’ve shown that in the vast majority of cases apps collect other persistent identifiers in addition to the ad ID.
Germany’s Federal Cartel Office, or Bundeskartellamt, on Thursday banned Facebook from combining user data from its various platforms such as WhatsApp and Instagram without explicit user permission.
The decision, which comes as the result of a nearly three-year antitrust investigation into Facebook’s data gathering practices, also bans the social media company from gleaning user data from third-party sites unless they voluntarily consent.
“With regard to Facebook’s future data processing policy, we are carrying out what can be seen as an internal divestiture of Facebook’s data,” Bundeskartellamt President Andreas Mundt said in a release. “In [the] future, Facebook will no longer be allowed to force its users to agree to the practically unrestricted collection and assigning of non-Facebook data to their Facebook user accounts.”
Mundt noted that combining user data from various sources “substantially contributed to the fact that Facebook was able to build a unique database for each individual user and thus to gain market power.”
Experts agreed with the decision. “It is high time to regulate the internet giants effectively!” said Marc Al-Hames, general manager of German data protection technologies developer Cliqz GmbH. “Unregulated data capitalism inevitably creates unfair conditions.”
Al-Hames noted that apps like WhatsApp have become “indispensable for many young people,” who feel compelled to join if they want to be part of the social scene. “Social media create social pressure,” he said. “And Facebook exploits this mercilessly: Give me your data or you’re an outsider.”
He called the practice an abuse of dominant market position. “But that’s not all: Facebook monitors our activities regardless of whether we are a member of one of its networks or not. Even those who consciously renounce the social networks for the sake of privacy will still be spied out,” he said, adding that Cliqz and Ghostery stats show that “every fourth of our website visits are monitored by Facebook’s data collection technologies, so-called trackers.”
The Bundeskartellamt’s decision will prevent Facebook from collecting and using data without restriction. “Voluntary consent means that the use of Facebook’s services must [now] be subject to the users’ consent to their data being collected and combined in this way,” said Mundt. “If users do not consent, Facebook may not exclude them from its services and must refrain from collecting and merging data from different sources.”
The ban drew support and calls for it to be expanded to other companies.
“This latest move by Germany’s competition regulator is welcome,” said Morten Brøgger, CEO of secure collaboration platform Wire. “Compromising user privacy for profit is a risk no exec should be willing to take.”
Brøgger contends that Facebook has not fully understood digital privacy’s importance. “From emails suggesting cashing in on user data for money, to the infamous Cambridge Analytica scandal, the company is taking steps back in a world which is increasingly moving towards the protection of everyone’s data,” he said.
“The lesson here is that you cannot simply trust firms that rely on the exchange of data as its main offering,” Brøgger added, “and firms using Facebook-owned applications should have a rethink about the platforms they use to do business.”
Al-Hames said regulators shouldn’t stop with Facebook, which he called the number-two offender. “By far the most important data monopolist is Alphabet. With Google search, the Android operating system, the Play Store app sales platform and the Chrome browser, the internet giant collects data on virtually everyone in the Western world,” Al-Hames said. “And even those who want to get free by using alternative services stay trapped in Alphabet’s clutches: With a tracker reach of nearly 80 percent of all page loads Alphabet probably knows more about them than their closest friends or relatives. When it comes to our data, the top priority of the market regulators shouldn’t be Facebook, it should be Alphabet!”
Desperate for data on its competitors, Facebook has been secretly paying people to install a “Facebook Research” VPN that lets the company suck in all of a user’s phone and web activity, similar to Facebook’s Onavo Protect app that Apple banned in June and that was removed in August. Facebook sidesteps the App Store and pays teenagers and adults to download the Research app and give it root access to network traffic, in what may be a violation of Apple policy, so the social network can decrypt and analyze their phone activity, a TechCrunch investigation confirms.
Facebook admitted to TechCrunch it was running the Research program to gather data on usage habits.
Since 2016, Facebook has been paying users ages 13 to 35 up to $20 per month plus referral fees to sell their privacy by installing the iOS or Android “Facebook Research” app. Facebook even asked users to screenshot their Amazon order history page. The program is administered through beta testing services Applause, BetaBound and uTest to cloak Facebook’s involvement, and is referred to in some documentation as “Project Atlas” — a fitting name for Facebook’s effort to map new trends and rivals around the globe.
Seven hours after this story was published, Facebook told TechCrunch it would shut down the iOS version of its Research app in the wake of our report. But on Wednesday morning, an Apple spokesperson confirmed that Facebook violated its policies, and it had blocked Facebook’s Research app on Tuesday before the social network seemingly pulled it voluntarily (without mentioning it was forced to do so). You can read our full report on the development here.
An Apple spokesperson provided this statement. “We designed our Enterprise Developer Program solely for the internal distribution of apps within an organization. Facebook has been using their membership to distribute a data-collecting app to consumers, which is a clear breach of their agreement with Apple. Any developer using their enterprise certificates to distribute apps to consumers will have their certificates revoked, which is what we did in this case to protect our users and their data.”
Facebook’s Research program will continue to run on Android.
Facebook’s Research app requires users to ‘Trust’ it with extensive access to their data.

We asked Guardian Mobile Firewall’s security expert Will Strafach to dig into the Facebook Research app, and he told us that “If Facebook makes full use of the level of access they are given by asking users to install the Certificate, they will have the ability to continuously collect the following types of data: private messages in social media apps, chats from instant messaging apps – including photos/videos sent to others, emails, web searches, web browsing activity, and even ongoing location information by tapping into the feeds of any location tracking apps you may have installed.” It’s unclear exactly what data Facebook is concerned with, but it gets nearly limitless access to a user’s device once they install the app.
The strategy shows how far Facebook is willing to go and how much it’s willing to pay to protect its dominance — even at the risk of breaking the rules of Apple’s iOS platform on which it depends. Apple may have asked Facebook to discontinue distributing its Research app.
A more stringent punishment would be to revoke Facebook’s permission to offer employee-only apps. The situation could further chill relations between the tech giants. Apple’s Tim Cook has repeatedly criticized Facebook’s data collection practices. Facebook disobeying iOS policies to slurp up more information could become a new talking point.
“The fairly technical sounding ‘install our Root Certificate’ step is appalling,” Strafach tells us. “This hands Facebook continuous access to the most sensitive data about you, and most users are going to be unable to reasonably consent to this regardless of any agreement they sign, because there is no good way to articulate just how much power is handed to Facebook when you do this.”
Facebook’s surveillance app
Facebook first got into the data-sniffing business when it acquired Onavo for around $120 million in 2013. The VPN app helped users track and minimize their mobile data plan usage, but also gave Facebook deep analytics about what other apps they were using. Internal documents acquired by Charlie Warzel and Ryan Mac of BuzzFeed News reveal that Facebook was able to leverage Onavo to learn that WhatsApp was sending more than twice as many messages per day as Facebook Messenger. Onavo allowed Facebook to spot WhatsApp’s meteoric rise and justify paying $19 billion to buy the chat startup in 2014. WhatsApp has since tripled its user base, demonstrating the power of Onavo’s foresight.
Over the years since, Onavo clued Facebook in to what apps to copy, features to build and flops to avoid. By 2018, Facebook was promoting the Onavo app in a Protect bookmark of the main Facebook app in hopes of scoring more users to snoop on. Facebook also launched the Onavo Bolt app that let you lock apps behind a passcode or fingerprint while it surveils you, but Facebook shut down the app the day it was discovered following privacy criticism. Onavo’s main app remains available on Google Play and has been installed more than 10 million times.
The backlash heated up after security expert Strafach detailed in March how Onavo Protect was reporting to Facebook when a user’s screen was on or off, and its Wi-Fi and cellular data usage in bytes even when the VPN was turned off. In June, Apple updated its developer policies to ban collecting data about usage of other apps or data that’s not necessary for an app to function. Apple proceeded to inform Facebook in August that Onavo Protect violated those data collection policies and that the social network needed to remove it from the App Store, which it did, Deepa Seetharaman of the WSJ reported.
But that didn’t stop Facebook’s data collection.
TechCrunch recently received a tip that despite Onavo Protect being banished by Apple, Facebook was paying users to sideload a similar VPN app under the Facebook Research moniker from outside of the App Store. We investigated, and learned Facebook was working with three app beta testing services to distribute the Facebook Research app: BetaBound, uTest and Applause. Facebook began distributing the Research VPN app in 2016. It has been referred to as Project Atlas since at least mid-2018, around when backlash to Onavo Protect magnified and Apple instituted its new rules that prohibited Onavo. Previously, a similar program was called Project Kodiak. Facebook didn’t want to stop collecting data on people’s phone usage, and so the Research program continued in disregard of Apple’s ban on Onavo Protect.
Ads (shown below) for the program run by uTest on Instagram and Snapchat sought teens 13-17 years old for a “paid social media research study.” The sign-up page for the Facebook Research program administered by Applause doesn’t mention Facebook, but seeks users “Age: 13-35 (parental consent required for ages 13-17).” If minors try to sign up, they’re asked to get their parents’ permission with a form that reveals Facebook’s involvement and says “There are no known risks associated with the project, however you acknowledge that the inherent nature of the project involves the tracking of personal information via your child’s use of apps. You will be compensated by Applause for your child’s participation.” For kids short on cash, the payments could coerce them to sell their privacy to Facebook.
The Applause site explains what data could be collected by the Facebook Research app (emphasis mine):
“By installing the software, you’re giving our client permission to collect data from your phone that will help them understand how you browse the internet, and how you use the features in the apps you’ve installed . . . This means you’re letting our client collect information such as which apps are on your phone, how and when you use them, data about your activities and content within those apps, as well as how other people interact with you or your content within those apps. You are also letting our client collect information about your internet browsing activity (including the websites you visit and data that is exchanged between your device and those websites) and your use of other online services. There are some instances when our client will collect this information even where the app uses encryption, or from within secure browser sessions.”
Meanwhile, the BetaBound sign-up page with a URL ending in “Atlas” explains that “For $20 per month (via e-gift cards), you will install an app on your phone and let it run in the background.” It also offers $20 per friend you refer. That site also doesn’t initially mention Facebook, but the instruction manual for installing Facebook Research reveals the company’s involvement.
Facebook seems to have purposefully avoided TestFlight, Apple’s official beta testing system, which requires apps to be reviewed by Apple and is limited to 10,000 participants. Instead, the instruction manual reveals that users download the app from r.facebook-program.com and are told to install an Enterprise Developer Certificate and VPN and “Trust” Facebook with root access to the data their phone transmits. Apple requires that developers agree to only use this certificate system for distributing internal corporate apps to their own employees. Randomly recruiting testers and paying them a monthly fee appears to violate the spirit of that rule.
Once installed, users just had to keep the VPN running and sending data to Facebook to get paid. The Applause-administered program requested that users screenshot their Amazon orders page. This data could potentially help Facebook tie browsing habits and usage of other apps with purchase preferences and behavior. That information could be harnessed to pinpoint ad targeting and understand which types of users buy what.
TechCrunch commissioned Strafach to analyze the Facebook Research app and find out where it was sending data. He confirmed that data is routed to “vpn-sjc1.v.facebook-program.com” that is associated with Onavo’s IP address, and that the facebook-program.com domain is registered to Facebook, according to MarkMonitor. The app can update itself without interacting with the App Store, and is linked to the email address PeopleJourney@fb.com. He also discovered that the Enterprise Certificate first acquired in 2016 indicates Facebook renewed it on June 27th, 2018 — weeks after Apple announced its new rules that prohibited the similar Onavo Protect app.
“It is tricky to know what data Facebook is actually saving (without access to their servers). The only information that is knowable here is what access Facebook is capable of based on the code in the app. And it paints a very worrisome picture,” Strafach explains. “They might respond and claim to only actually retain/save very specific limited data, and that could be true, it really boils down to how much you trust Facebook’s word on it. The most charitable narrative of this situation would be that Facebook did not think too hard about the level of access they were granting to themselves . . . which is a startling level of carelessness in itself if that is the case.”
[Update: TechCrunch also found that Google’s Screenwise Meter surveillance app also breaks the Enterprise Certificate policy, though it does a better job of revealing the company’s involvement and how it works than Facebook does.]
“Flagrant defiance of Apple’s rules”
In response to TechCrunch’s inquiry, a Facebook spokesperson confirmed it’s running the program to learn how people use their phones and other services. The spokesperson told us “Like many companies, we invite people to participate in research that helps us identify things we can be doing better. Since this research is aimed at helping Facebook understand how people use their mobile devices, we’ve provided extensive information about the type of data we collect and how they can participate. We don’t share this information with others and people can stop participating at any time.”
Facebook’s spokesperson claimed that the Facebook Research app was in line with Apple’s Enterprise Certificate program, but didn’t explain how in the face of evidence to the contrary. They said Facebook first launched its Research app program in 2016. They tried to liken the program to a focus group and said Nielsen and comScore run similar programs, yet neither of those asks people to install a VPN or provide root access to the network. The spokesperson confirmed the Facebook Research program does recruit teens, but also other age groups from around the world. They claimed that Onavo and Facebook Research are separate programs, but admitted the same team supports both as an explanation for why their code was so similar.
However, Facebook’s claim that it doesn’t violate Apple’s Enterprise Certificate policy is directly contradicted by the terms of that policy. Those include that developers “Distribute Provisioning Profiles only to Your Employees and only in conjunction with Your Internal Use Applications for the purpose of developing and testing”. The policy also states that “You may not use, distribute or otherwise make Your Internal Use Applications available to Your Customers” unless under direct supervision of employees or on company premises. Given Facebook’s customers are using the Enterprise Certificate-powered app without supervision, it appears Facebook is in violation.
Seven hours after this report was first published, Facebook updated its position and told TechCrunch that it would shut down the iOS Research app. Facebook noted that the Research app was started in 2016 and was therefore not a replacement for Onavo Protect. However, they do share similar code and could be seen as twins running in parallel. A Facebook spokesperson also provided this additional statement:
“Key facts about this market research program are being ignored. Despite early reports, there was nothing ‘secret’ about this; it was literally called the Facebook Research App. It wasn’t ‘spying’ as all of the people who signed up to participate went through a clear on-boarding process asking for their permission and were paid to participate. Finally, less than 5 percent of the people who chose to participate in this market research program were teens. All of them with signed parental consent forms.”
Facebook did not publicly promote the Research VPN itself and used intermediaries that often didn’t disclose Facebook’s involvement until users had begun the signup process. While users were given clear instructions and warnings, the program never stressed the full extent of the data Facebook can collect through the VPN. A small fraction of the users paid may have been teens, but we stand by the newsworthiness of Facebook’s choice not to exclude minors from this data collection initiative.
Facebook disobeying Apple so directly and then pulling the app could hurt their relationship. “The code in this iOS app strongly indicates that it is simply a poorly re-branded build of the banned Onavo app, now using an Enterprise Certificate owned by Facebook in direct violation of Apple’s rules, allowing Facebook to distribute this app without Apple review to as many users as they want,” Strafach tells us. ONV prefixes and mentions of graph.onavo.com, “onavoApp://” and “onavoProtect://” custom URL schemes litter the app. “This is an egregious violation on many fronts, and I hope that Apple will act expeditiously in revoking the signing certificate to render the app inoperable.”
Facebook is particularly interested in what teens do on their phones, as the demographic has increasingly abandoned the social network in favor of Snapchat, YouTube and Facebook’s acquisition Instagram. Insights into the popularity among teens of the Chinese video music app TikTok and of meme sharing led Facebook to launch a clone called Lasso and begin developing a meme-browsing feature called LOL, TechCrunch first reported. But Facebook’s desire for data about teens riles critics at a time when the company has been battered in the press. Analysts on tomorrow’s Facebook earnings call should inquire about what other ways the company has to collect competitive intelligence now that it’s ceased to run the Research program on iOS.
Last year when Tim Cook was asked what he’d do in Mark Zuckerberg’s position in the wake of the Cambridge Analytica scandal, he said “I wouldn’t be in this situation . . . The truth is we could make a ton of money if we monetized our customer, if our customer was our product. We’ve elected not to do that.” Zuckerberg told Ezra Klein that he felt Cook’s comment was “extremely glib.”
Now it’s clear that even after Apple’s warnings and the removal of Onavo Protect, Facebook was still aggressively collecting data on its competitors via Apple’s iOS platform. “I have never seen such open and flagrant defiance of Apple’s rules by an App Store developer,” Strafach concluded. Now that Facebook has ceased the program on iOS and its Android future is uncertain, it may either have to invent new ways to surveil our behavior amidst a climate of privacy scrutiny, or be left in the dark.
Additional reporting by Zack Whittaker. Updated with comment from Facebook, and on Wednesday with a statement from Apple.
Apple on Wednesday warned investors that its revenue for the last three months of 2018 would not live up to previous estimates, or even come particularly close. The main culprit appears to be China, where the trade war and a broader economic slowdown contributed to plummeting iPhone sales. But CEO Tim Cook’s letter to investors pointed to a secondary thread as well, one that Apple customers, environmentalists, and even the company itself should view not as a liability but an asset: People are holding onto their iPhones longer.
That’s not just in China. Cook noted that iPhone upgrades were “not as strong as we thought they would be” in developed markets as well, citing “macroeconomic conditions,” a shift in how carriers price smartphones, a strong US dollar, and temporarily discounted battery replacements. He neglected to mention the simple fact that an iPhone can perform capably for years—and consumers are finally getting wise.
As recently as 2015, smartphone users on average upgraded their phone roughly every 24 months, says Cliff Maldonado, founder of BayStreet Research, which tracks the mobile industry. As of the fourth quarter of last year, that had jumped to at least 35 months. “You’re looking at people holding onto their devices an extra year,” Maldonado says. “It’s been considerable.”
A few factors contribute to the trend, chief among them the shift from buying phones on a two-year contract—heavily subsidized by the carriers—to installment plans in which the customer pays full freight. T-Mobile introduced the practice in the US in 2014, and by 2015 it had become the norm. The full effects, though, have only kicked in more recently. People still generally pay for their smartphone over two years; once they’re paid off, though, their monthly bill suddenly drops by, say, $25.
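The installment-plan dynamic is easy to make concrete. Here's a minimal sketch in Python, using hypothetical figures (a $600 handset financed over 24 months on a $70-a-month service plan; none of these numbers come from the carriers themselves) of how the bill changes once the phone is paid off:

```python
# Hypothetical figures: a $600 phone financed over 24 months,
# on top of a $70/month service plan.
PHONE_PRICE = 600
TERM_MONTHS = 24
SERVICE = 70.00

installment = PHONE_PRICE / TERM_MONTHS  # $25 per month toward the phone

def monthly_bill(month):
    """Total bill in a given month (1-indexed) of ownership."""
    return SERVICE + (installment if month <= TERM_MONTHS else 0)

print(f"Month 24: ${monthly_bill(24):.2f}")  # still paying off the phone
print(f"Month 25: ${monthly_bill(25):.2f}")  # bill drops by $25, with no new phone in sight
```

Under the old subsidized-contract model that $25 never visibly disappeared, which is part of why upgrading every two years felt free.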
The shift has also caused a sharp drop-off in carrier incentives. They turn out not to be worth it. “They’re actually encouraging that dynamic of holding your smartphone longer. It’s in their best interest,” Maldonado says. “It actually costs them to get you into a new phone, to do those promotions, to run the transaction and put it on their books and finance it.”
Bottom line: If your service is reliable and your iPhone still works fine, why go through the hassle?
“There’s not as many subsidies as there used to be from a carrier point of view,” Cook told CNBC Wednesday. “And where that didn’t all happen yesterday, if you’ve been out of the market for two or three years and you come back, it looks like that to you.”
Meanwhile, older iPhones work better, for longer, thanks to Apple itself. When Apple vice president Craig Federighi introduced iOS 12 in June at Apple’s Worldwide Developers Conference, he emphasized how much it improved the performance of older devices. Among the numbers he cited: The 2014 iPhone 6 Plus opens apps 40 percent faster with iOS 12 than it had with iOS 11, and its keyboard appears up to 50 percent faster than before. And while Apple’s battery scandal of a year ago was a black mark for the company, it at least reminded Apple owners that they didn’t necessarily need a new iPhone. Eligible iPhone owners found that a $29 battery replacement—it normally costs $79—made their iPhone 6 feel something close to new.
“There definitely has been a major shift in customer perception, after all the controversy,” says Kyle Wiens, founder of online repair community iFixit. “What it really did more than anything else was remind you that the battery on your phone really can be replaced. Apple successfully brainwashing the public into thinking the battery was something they never needed to think about led people to prematurely buy these devices.”
Combine all of that with the fact that new model iPhones—and Android phones for that matter—have lacked a killer feature, much less one that would inspire someone to spend $1,000 or more if they didn’t absolutely have to. “Phones used to be toys, and shiny objects,” Maldonado says. “Now they’re utilities. You’ve got to have it, and the joy of getting a new one is pretty minor. Facebook and email looks the same; the camera’s still great.”
In the near term, these dynamics aren’t ideal for Apple; its stock dropped more than 7 percent in after-hours trading following Wednesday’s news. But it’s terrific news for consumers, who have apparently realized that a smartphone does not have a two-year expiration date. That saves money in the long run. And pulling the throttle back on iPhone sales may turn out to be equally welcome news for the planet.
According to Apple’s most recent sustainability report, the manufacture of each Apple device generates on average 90 pounds of carbon emissions. Wiens suggests that the creation of each iPhone requires hundreds of pounds of raw materials.
Manufacturing electronics is environmentally intense, Wiens says. “We can’t live in a world where we’re making 3 billion new smartphones a year. We don’t have the resources for it. We have to reduce how many overall devices we’re making. There are lots of ways to do it, but it gets down to demand, and how many we’re buying. That’s not what Apple wants, but it’s what the environment needs.”
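Combining the two figures quoted above, Apple's roughly 90 pounds of carbon emissions per device and Wiens's hypothetical 3 billion new smartphones a year, gives a back-of-the-envelope sense of the scale (this extrapolation is illustrative, not a figure from either source):

```python
# Back-of-the-envelope: Apple reports ~90 lb of CO2 per device manufactured;
# Wiens's hypothetical world produces 3 billion new smartphones a year.
LB_PER_DEVICE = 90
DEVICES_PER_YEAR = 3_000_000_000
LB_PER_TON = 2_000  # US short ton

total_tons = LB_PER_DEVICE * DEVICES_PER_YEAR / LB_PER_TON
print(f"{total_tons:,.0f} tons of CO2 per year")
```

That works out to 135 million tons of CO2 annually, which is why stretching each device's lifespan matters more than any recycling program.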
Which raises a question: Why does Apple bother extending the lives of older iPhones? The altruistic answer comes from Lisa Jackson, who oversees the company’s environmental efforts.
“We also make sure to design and build durable products that last as long as possible,” Jackson said at Apple’s September hardware event. “Because they last longer, you can keep using them. And keeping using them is the best thing for the planet.”
Given a long enough horizon, Apple may see a financial benefit from less frequent upgrades as well. An iPhone that lasts longer keeps customers in the iOS ecosystem longer. That becomes even more important as the company places greater emphasis not on hardware but on services like Apple Music. It also offers an important point of differentiation from Android, whose fragmented ecosystem means even flagship devices rarely continue to be fully supported beyond two years.
“In reality, the big picture is still very good for Apple,” Maldonado says. Compared with Android, “Apple’s in a better spot, because the phones last longer.”
That’s cold comfort today and doesn’t help a whit with China. But news that people are holding onto their iPhones longer should be taken for what it really is: A sign of progress and a win for everyone. Even Apple.
Surprisingly, a number of students and Generation Y digital natives are turning against the social media giants.
BERKELEY, Calif. — A job at Facebook sounds pretty plum. The interns make around $8,000 a month, and an entry-level software engineer makes about $140,000 a year. The food is free. There’s a walking trail with indigenous plants and a juice bar.
But the tone among highly sought-after computer scientists about the social network is changing. On a recent night at the University of California, Berkeley, as a group of young engineers gathered to show off their tech skills, many said they would avoid taking jobs at the social network.
“I’ve heard a lot of employees who work there don’t even use it,” said Niky Arora, 19, an engineering student, who was recently invited to a Facebook recruiting event at the company’s headquarters in Menlo Park, Calif. “I just don’t believe in the product because like, Facebook, the baseline of everything they do is desire to show people more ads.”
Emily Zhong, 20, a computer science major, piped up. “Surprisingly, a lot of my friends now are like, ‘I don’t really want to work for Facebook,’” she said, citing “privacy stuff, fake news, personal data, all of it.”
“Before it was this glorious, magical thing to work there,” said Jazz Singh, 18, also studying computer science. “Now it’s like, just because it does what you want doesn’t mean it’s doing good.”
As Facebook has been rocked by scandal after scandal, some young engineers are souring on the company. Many are still taking jobs there, but those who do are doing it a little more quietly, telling their friends that they will work to change it from within or that they have carved out more ethical work at a company whose reputation has turned toxic.
Facebook, which employs more than 30,000 full-time workers around the world, said, “In 2018, we’ve hired more engineers than ever before.” The company added, “We continue to see strong engagement and excitement within the engineering community at the prospect of joining our company.”
The changing attitudes are happening beyond Facebook. Across Silicon Valley, tech recruiters said job applicants in general were asking more hard questions during interviews, wanting to know specifically what they would be asked to do at the company. Career coaches said they had tech employees reaching out to get tips on handling moral quandaries. The questions include “How do I avoid a project I disagree with?” and “How do I remind my bosses of the company mission statement?”
“Employees are wising up to the fact that you can have a mission statement on your website, but when you’re looking at how the company creates new products or makes decisions, the correlation between the two is not so tightly aligned,” said David Chie, the head of Palo Alto Staffing, a tech job placement service in Silicon Valley. “Everyone’s having this conversation.”
When engineers apply for jobs, they are also doing it differently.
“They do a lot more due diligence,” said Heather Johnston, Bay Area district president for the tech job staffing agency Robert Half. “Before, candidates were like: ‘Oh, I don’t want to do team interviews. I want a one-and-done.’” Now, she added, job candidates “want to meet the team.”
“They’re not just going to blindly take a company because of the name anymore,” she said.
Yet while many of the big tech companies have been hit by a change in public perception, Facebook seems uniquely tarred among young workers.
“I’ve had a couple of clients recently say they’re not as enthusiastic about Facebook because they’re frustrated with what they see happening politically or socially,” said Paul Freiberger, president of Shimmering Careers, a career counseling group based in San Mateo, Calif. “It’s privacy and political news, and concern that it’s going to be hard to correct these things from inside.”
Chad Herst, a leadership and career coach based in San Francisco since 2008, said that now, for the first time, he had clients who wanted to avoid working for big social media companies like Facebook or Twitter.
“They’re concerned about where democracy is going, that social media polarizes us, and they don’t want to be building it,” Mr. Herst said. “People really have been thinking about the mission of the company and what the companies are trying to achieve a little more.”
He said one client, a midlevel executive at Facebook, wanted advice on how to shift her group’s work to encourage users to connect offline as well. But she found resistance internally to her efforts.
“She was trying to figure out: ‘How do I politic this? How do I language this?’” Mr. Herst said. “And I was telling her to bring up some of Mark Zuckerberg’s past statements about connecting people.”
On the recent evening at the University of California, Berkeley, around 2,200 engineering students from around the country gathered for Cal Hacks 5.0 — a competition to build the best apps. The event spanned a weekend, so teenage competitors dragged pillows around with them. The hosts handed out 2,000 burritos as students registered.
It was also a hiring event. Recruiters from Facebook and Alphabet set up booths (free sunglasses from Facebook; $200 in credit to the Google Cloud platform from Alphabet).
In the auditorium, the head of Y Combinator, a start-up incubator and investment firm, gave opening remarks, recommending that young people avoid jobs in big tech.
“You get to program your life on a totally different scale,” said Michael Seibel, who leads Y Combinator. “The worst thing that can happen to you is you get a job at Google.” He called those jobs “$100,000-a-year welfare” — meaning, he said, that workers can get tethered to the paycheck and avoid taking risks.
The event then segued to a word from the sponsor, Microsoft. Justin Garrett, a Microsoft recruiter who on his LinkedIn profile calls himself a senior technical evangelist, stepped onstage, laughing a little.
“So, Michael’s a tough guy to follow, especially when you work for one of those big companies,” Mr. Garrett said. “He called it welfare. I like to call it tremendous opportunity.”
Then students flooded into the stadium, which was filled with long tables of computers where they would stay and compete. In the middle of the scrum, three friends joked around. Caleb Thomas, 21, was gently made fun of because he had accepted an internship at Facebook.
“Come on, guys,” Mr. Thomas said.
“These are the realities of how the business works,” said Samuel Resendez, 20, a computer science student at the University of Southern California.
It turned out Mr. Resendez had interned at Facebook in the summer. Olivia Brown, 20, head of Stanford’s Computer Science and Social Good club and an iOS intern at Mozilla, called him out on it. “But you still worked at Facebook, too,” she said.
“Well, at least I signed before Cambridge Analytica,” Mr. Resendez said, a little bashful about the data privacy and election manipulation scandal that rocked the company this year. “Ninety-five percent of what Facebook is doing is delivering memes.”
Ms. Brown said a lot of students criticize Facebook and talk about how they would not work there, but ultimately join. “Everyone cares about ethics in tech before they get a contract,” she said.
Ms. Brown said she thought that could change soon, though, as the social stigma of working for Facebook began outweighing the financial benefits.
“Defense companies have had this reputation for a long time,” she said. “Social networks are just getting that.”
- Back in April, the WhatsApp cofounder Jan Koum announced plans to leave Facebook.
- But he’s still showing up to the office once a month so he can continue to collect $450 million in Facebook stock he’s contractually due from when Facebook bought his company.
- It’s a high-dollar example of “rest and vest,” in which big tech companies pay senior employees who don’t do much work.
- Koum has already sold over $7 billion in Facebook stock.
The WhatsApp cofounder Jan Koum said in April that he planned to leave Facebook, which bought his company for $19 billion in 2014. He’s already sold $7.1 billion worth of Facebook shares.
But he’s still showing up to the office, The Wall Street Journal reports, to collect one last payday: $450 million in stock.
Koum is “resting and vesting,” in Silicon Valley lingo: a term for wealthy entrepreneurs and engineers with one foot out the door at big tech companies, who remain officially employed only until they can collect stock and options in quarterly or annual increments.
Usually, stock awards after a merger are distributed on a four-year vesting schedule — if you last all four years, you get your entire stock grant. Koum’s last vesting date is in November. He showed up at Facebook’s offices in mid-July, fulfilling a requirement of his employment contract, according to The Wall Street Journal.
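The mechanics of such a schedule can be sketched in a few lines of Python. This is a simplified model with hypothetical numbers, not Koum's actual grant terms: a grant that vests in equal annual tranches, forfeited for any anniversary the employee doesn't reach.

```python
def vested_shares(total_shares, years_employed, vesting_years=4):
    """Shares vested under a simple annual schedule: an equal tranche
    vests on each anniversary; unvested tranches are forfeited if the
    employee leaves before reaching them."""
    tranches = min(years_employed, vesting_years)
    return total_shares * tranches // vesting_years

# Hypothetical 20,000,000-share acquisition grant vesting over four years:
print(vested_shares(20_000_000, 3))  # three tranches vested so far
print(vested_shares(20_000_000, 4))  # the full grant after year four
```

The model makes the incentive obvious: walking away a few months before the final tranche can mean forfeiting a quarter of the grant, which is why showing up to the office once a month can be worth hundreds of millions of dollars.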
“Resting and vesting” is an open secret in Silicon Valley, Business Insider has reported. At some companies, the employees are called “coasters.” The HBO show “Silicon Valley” even spoofed it in an episode in which engineers hang out on a roof and don’t do any work.
“I’ve actually had a number of people, including today at Google X … send me pictures of themselves on a roof, kicking back doing nothing, with the hashtag ‘unassigned’ or ‘rest and vest.’ It’s something that really happens, and apparently, somewhat often,” Josh Brener, the actor who plays the lucky character who got to rest and vest in HBO’s “Silicon Valley,” told Business Insider last year.
From Business Insider’s report on the phenomenon:
“Facebook, for instance, has a fairly hush bonus program called ‘discretionary equity,’ a former Facebook engineer who received it said.
“DE is when the company hands an engineer a massive, extra chunk of restricted stock units, worth tens to hundreds of thousands of dollars. It’s a thank-you for a job well done. It also helps keep the person from jumping ship because DE vests over time. These are bonus grants that are signed by top executives, sometimes even CEO Mark Zuckerberg.”
Koum’s payday isn’t related to discretionary equity; it’s instead a result of the over 20 million restricted shares of Facebook he received when he sold WhatsApp. He has one more vesting day in August and one in November, according to filings with the Securities and Exchange Commission.
Koum reportedly decided to leave Facebook in the middle of a spat over how to integrate advertising into WhatsApp. A WhatsApp representative declined to comment, but The Journal reports that Koum is still employed at the social-networking giant.
When Koum left, he wrote that he was taking time off to collect “rare air-cooled Porsches” and play ultimate Frisbee.
How many Porsches can one buy with $450 million?