Tag Archives: Facebook

How Facebook Undermines Privacy Protections for Its 2 Billion WhatsApp Users

WhatsApp assures users that no one can see their messages — but the company has an extensive monitoring operation and regularly shares personal information with prosecutors.

 

Series: The Social Machine

How Facebook Plays by Its Own Set of Rules

Clarification, Sept. 8, 2021: A previous version of this story caused unintended confusion about the extent to which WhatsApp examines its users’ messages and whether it breaks the encryption that keeps the exchanges secret. We’ve altered language in the story to make clear that the company examines only messages from threads that have been reported by users as possibly abusive. It does not break end-to-end encryption.

When Mark Zuckerberg unveiled a new “privacy-focused vision” for Facebook in March 2019, he cited the company’s global messaging service, WhatsApp, as a model. Acknowledging that “we don’t currently have a strong reputation for building privacy protective services,” the Facebook CEO wrote that “I believe the future of communication will increasingly shift to private, encrypted services where people can be confident what they say to each other stays secure and their messages and content won’t stick around forever. This is the future I hope we will help bring about. We plan to build this the way we’ve developed WhatsApp.”

Zuckerberg’s vision centered on WhatsApp’s signature feature, which he said the company was planning to apply to Instagram and Facebook Messenger: end-to-end encryption, which converts all messages into an unreadable format that is only unlocked when they reach their intended destinations. WhatsApp messages are so secure, he said, that nobody else — not even the company — can read a word. As Zuckerberg had put it earlier, in testimony to the U.S. Senate in 2018, “We don’t see any of the content in WhatsApp.”
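To make the mechanism concrete, here is a minimal sketch of the idea behind end-to-end encryption: a single X25519 key agreement plus authenticated encryption, written with Swift’s CryptoKit. It is illustrative only. WhatsApp’s actual implementation (the Signal protocol) adds ratcheting session keys and much more, but the core property is the same: the relaying server only ever handles ciphertext.

```swift
import CryptoKit
import Foundation

// Each party generates a key pair on their own device; only public keys travel.
let alicePrivate = Curve25519.KeyAgreement.PrivateKey()
let bobPrivate   = Curve25519.KeyAgreement.PrivateKey()

// Alice derives a shared symmetric key from her private key and Bob's public key.
let aliceShared = try alicePrivate.sharedSecretFromKeyAgreement(with: bobPrivate.publicKey)
let sendKey = aliceShared.hkdfDerivedSymmetricKey(using: SHA256.self,
                                                  salt: Data(),
                                                  sharedInfo: Data(),
                                                  outputByteCount: 32)

// The message is sealed on Alice's phone; servers in between relay only ciphertext.
let sealed = try ChaChaPoly.seal(Data("see you at 8".utf8), using: sendKey)

// Bob derives the identical key from his private key and Alice's public key,
// so only his device can open the message.
let bobShared = try bobPrivate.sharedSecretFromKeyAgreement(with: alicePrivate.publicKey)
let receiveKey = bobShared.hkdfDerivedSymmetricKey(using: SHA256.self,
                                                   salt: Data(),
                                                   sharedInfo: Data(),
                                                   outputByteCount: 32)
let plaintext = try ChaChaPoly.open(sealed, using: receiveKey)
print(String(decoding: plaintext, as: UTF8.self)) // "see you at 8"
```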

 

WhatsApp emphasizes this point so consistently that a flag with a similar assurance automatically appears on-screen before users send messages: “No one outside of this chat, not even WhatsApp, can read or listen to them.”

Given those sweeping assurances, you might be surprised to learn that WhatsApp has more than 1,000 contract workers filling floors of office buildings in Austin, Texas, Dublin and Singapore. Seated at computers in pods organized by work assignments, these hourly workers use special Facebook software to sift through millions of private messages, images and videos. They pass judgment on whatever flashes on their screen — claims of everything from fraud or spam to child porn and potential terrorist plotting — typically in less than a minute.

The workers have access to only a subset of WhatsApp messages — those flagged by users and automatically forwarded to the company as possibly abusive. The review is one element in a broader monitoring operation in which the company also reviews material that is not encrypted, including data about the sender and their account.

Policing users while assuring them that their privacy is sacrosanct makes for an awkward mission at WhatsApp. A 49-slide internal company marketing presentation from December, obtained by ProPublica, emphasizes the “fierce” promotion of WhatsApp’s “privacy narrative.” It compares its “brand character” to “the Immigrant Mother” and displays a photo of Malala Yousafzai, who survived a shooting by the Taliban and became a Nobel Peace Prize winner, in a slide titled “Brand tone parameters.” The presentation does not mention the company’s content moderation efforts.

WhatsApp’s director of communications, Carl Woog, acknowledged that teams of contractors in Austin and elsewhere review WhatsApp messages to identify and remove “the worst” abusers. But Woog told ProPublica that the company does not consider this work to be content moderation, saying: “We actually don’t typically use the term for WhatsApp.” The company declined to make executives available for interviews for this article, but responded to questions with written comments. “WhatsApp is a lifeline for millions of people around the world,” the company said. “The decisions we make around how we build our app are focused around the privacy of our users, maintaining a high degree of reliability and preventing abuse.”

WhatsApp’s denial that it moderates content is noticeably different from what Facebook Inc. says about WhatsApp’s corporate siblings, Instagram and Facebook. The company has said that some 15,000 moderators examine content on Facebook and Instagram, neither of which is encrypted. It releases quarterly transparency reports that detail how many accounts Facebook and Instagram have “actioned” for various categories of abusive content. There is no such report for WhatsApp.

Deploying an army of content reviewers is just one of the ways that Facebook Inc. has compromised the privacy of WhatsApp users. Together, the company’s actions have left WhatsApp — the largest messaging app in the world, with two billion users — far less private than its users likely understand or expect. A ProPublica investigation, drawing on data, documents and dozens of interviews with current and former employees and contractors, reveals how, since purchasing WhatsApp in 2014, Facebook has quietly undermined its sweeping security assurances in multiple ways. (Two articles this summer noted the existence of WhatsApp’s moderators but focused on their working conditions and pay rather than their effect on users’ privacy. This article is the first to reveal the details and extent of the company’s ability to scrutinize messages and user data — and to examine what the company does with that information.)

Many of the assertions by content moderators working for WhatsApp are echoed by a confidential whistleblower complaint filed last year with the U.S. Securities and Exchange Commission. The complaint, which ProPublica obtained, details WhatsApp’s extensive use of outside contractors, artificial intelligence systems and account information to examine user messages, images and videos. It alleges that the company’s claims of protecting users’ privacy are false. “We haven’t seen this complaint,” the company spokesperson said. The SEC has taken no public action on it; an agency spokesperson declined to comment.

Facebook Inc. has also downplayed how much data it collects from WhatsApp users, what it does with it and how much it shares with law enforcement authorities. For example, WhatsApp shares metadata, unencrypted records that can reveal a lot about a user’s activity, with law enforcement agencies such as the Department of Justice. Some rivals, such as Signal, intentionally gather much less metadata to avoid incursions on their users’ privacy, and thus share far less with law enforcement. (“WhatsApp responds to valid legal requests,” the company spokesperson said, “including orders that require us to provide on a real-time going forward basis who a specific person is messaging.”)

WhatsApp user data, ProPublica has learned, helped prosecutors build a high-profile case against a Treasury Department employee who leaked confidential documents to BuzzFeed News that exposed how dirty money flows through U.S. banks.

Like other social media and communications platforms, WhatsApp is caught between users who expect privacy and law enforcement entities that effectively demand the opposite: that WhatsApp turn over information that will help combat crime and online abuse. WhatsApp has responded to this dilemma by asserting that it’s no dilemma at all. “I think we absolutely can have security and safety for people through end-to-end encryption and work with law enforcement to solve crimes,” said Will Cathcart, whose title is Head of WhatsApp, in a YouTube interview with an Australian think tank in July.

The tension between privacy and disseminating information to law enforcement is exacerbated by a second pressure: Facebook’s need to make money from WhatsApp. Since paying $22 billion to buy WhatsApp in 2014, Facebook has been trying to figure out how to generate profits from a service that doesn’t charge its users a penny.

That conundrum has periodically led to moves that anger users, regulators or both. The goal of monetizing the app was part of the company’s 2016 decision to start sharing WhatsApp user data with Facebook, something the company had told European Union regulators was technologically impossible. The same impulse spurred a controversial plan, abandoned in late 2019, to sell advertising on WhatsApp. And the profit-seeking mandate was behind another botched initiative in January: the introduction of a new privacy policy for user interactions with businesses on WhatsApp, allowing businesses to use customer data in new ways. That announcement triggered a user exodus to competing apps.

WhatsApp’s increasingly aggressive business plan is focused on charging companies for an array of services — letting users make payments via WhatsApp and managing customer service chats — that offer convenience but fewer privacy protections. The result is a confusing two-tiered privacy system within the same app where the protections of end-to-end encryption are further eroded when WhatsApp users employ the service to communicate with businesses.

The company’s December marketing presentation captures WhatsApp’s diverging imperatives. It states that “privacy will remain important.” But it also conveys what seems to be a more urgent mission: the need to “open the aperture of the brand to encompass our future business objectives.”


 

I. “Content Moderation Associates”

In many ways, the experience of being a content moderator for WhatsApp in Austin is identical to being a moderator for Facebook or Instagram, according to interviews with 29 current and former moderators. Mostly in their 20s and 30s, many with past experience as store clerks, grocery checkers and baristas, the moderators are hired and employed by Accenture, a huge corporate contractor that works for Facebook and other Fortune 500 behemoths.

The job listings advertise “Content Review” positions and make no mention of Facebook or WhatsApp. Employment documents list the workers’ initial title as “content moderation associate.” Pay starts around $16.50 an hour. Moderators are instructed to tell anyone who asks that they work for Accenture, and are required to sign sweeping non-disclosure agreements. Citing the NDAs, almost all the current and former moderators interviewed by ProPublica insisted on anonymity. (An Accenture spokesperson declined comment, referring all questions about content moderation to WhatsApp.)

When the WhatsApp team was assembled in Austin in 2019, Facebook moderators already occupied the fourth floor of an office tower on Sixth Street, adjacent to the city’s famous bar-and-music scene. The WhatsApp team was installed on the floor above, with new glass-enclosed work pods and nicer bathrooms that sparked a tinge of envy in a few members of the Facebook team. Most of the WhatsApp team scattered to work from home during the pandemic. Whether in the office or at home, they spend their days in front of screens, using a Facebook software tool to examine a stream of “tickets,” organized by subject into “reactive” and “proactive” queues.

Collectively, the workers scrutinize millions of pieces of WhatsApp content each week. Each reviewer handles upwards of 600 tickets a day; over an eight-hour shift, that works out to less than a minute per ticket. WhatsApp declined to reveal how many contract workers are employed for content review, but a partial staffing list reviewed by ProPublica suggests that, at Accenture alone, it’s more than 1,000. WhatsApp moderators, like their Facebook and Instagram counterparts, are expected to meet performance metrics for speed and accuracy, which are audited by Accenture.

Their jobs differ in other ways. Because WhatsApp’s content is encrypted, artificial intelligence systems can’t automatically scan all chats, images and videos, as they do on Facebook and Instagram. Instead, WhatsApp reviewers gain access to private content when users hit the “report” button on the app, identifying a message as allegedly violating the platform’s terms of service. This forwards five messages — the allegedly offending one along with the four previous ones in the exchange, including any images or videos — to WhatsApp in unscrambled form, according to former WhatsApp engineers and moderators. Automated systems then feed these tickets into “reactive” queues for contract workers to assess.
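The sketch below illustrates that reporting flow as described; every type and function name in it is hypothetical, invented for illustration rather than drawn from WhatsApp’s client code.

```swift
import Foundation

// Hypothetical types; not WhatsApp's actual client code.
struct StoredMessage {
    let senderID: String
    let sentAt: Date
    let plaintext: Data   // already decrypted on the recipient's device
}

/// Collects the flagged message plus up to four messages that preceded it,
/// mirroring the "five messages" behavior described above.
func buildAbuseReport(thread: [StoredMessage], flaggedIndex: Int) -> [StoredMessage] {
    let start = max(0, flaggedIndex - 4)
    return Array(thread[start...flaggedIndex])
}

// The resulting bundle is uploaded to the moderation queue in readable form.
// No key is broken: the reporting user's device already held the plaintext
// and chose to forward it.
```

The design point is that forwarding plaintext from an endpoint is compatible with end-to-end encryption; the decryption happens where it always does, on the recipient’s device.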

Artificial intelligence initiates a second set of queues — so-called proactive ones — by scanning unencrypted data that WhatsApp collects about its users and comparing it against suspicious account information and messaging patterns (a new account rapidly sending out a high volume of chats is evidence of spam), as well as terms and images that have previously been deemed abusive. The unencrypted data available for scrutiny is extensive. It includes the names and profile images of a user’s WhatsApp groups as well as their phone number, profile photo, status message, phone battery level, language and time zone, unique mobile phone ID and IP address, wireless signal strength and phone operating system, as well as a list of their electronic devices, any related Facebook and Instagram accounts, the last time they used the app and any previous history of violations.
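A toy version of the kind of metadata heuristic described here might look like the following; the field names and thresholds are invented for illustration and are not WhatsApp’s actual signals.

```swift
import Foundation

// Invented fields standing in for the unencrypted account data described above.
struct AccountMetadata {
    let accountAgeHours: Double
    let messagesLastHour: Int
    let distinctRecipientsLastHour: Int
    let priorViolations: Int
}

// A crude, illustrative scoring rule: new accounts blasting out chats look spammy.
func spamSuspicionScore(_ m: AccountMetadata) -> Double {
    var score = 0.0
    if m.accountAgeHours < 24 { score += 0.4 }            // very new account
    if m.messagesLastHour > 100 { score += 0.3 }          // high send volume
    if m.distinctRecipientsLastHour > 50 { score += 0.2 } // wide fan-out pattern
    if m.priorViolations > 0 { score += 0.1 }
    return min(score, 1.0)
}

// Accounts above a threshold would be routed into a "proactive" review queue.
let suspect = AccountMetadata(accountAgeHours: 3, messagesLastHour: 400,
                              distinctRecipientsLastHour: 250, priorViolations: 0)
if spamSuspicionScore(suspect) > 0.7 {
    print("enqueue for proactive review")
}
```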

The WhatsApp reviewers have three choices when presented with a ticket for either type of queue: Do nothing, place the user on “watch” for further scrutiny, or ban the account. (Facebook and Instagram content moderators have more options, including removing individual postings. It’s that distinction — the fact that WhatsApp reviewers can’t delete individual items — that the company cites as its basis for asserting that WhatsApp reviewers are not “content moderators.”)

WhatsApp moderators must make subjective, sensitive and subtle judgments, interviews and documents examined by ProPublica show. They examine a wide range of categories, including “Spam Report,” “Civic Bad Actor” (political hate speech and disinformation), “Terrorism Global Credible Threat,” “CEI” (child exploitative imagery) and “CP” (child pornography). Another set of categories addresses the messaging and conduct of millions of small and large businesses that use WhatsApp to chat with customers and sell their wares. These queues have such titles as “business impersonation prevalence,” “commerce policy probable violators” and “business verification.”

Moderators say the guidance they get from WhatsApp and Accenture relies on standards that can be simultaneously arcane and disturbingly graphic. Decisions about abusive sexual imagery, for example, can rest on an assessment of whether a naked child in an image appears adolescent or prepubescent, based on comparison of hip bones and pubic hair to a medical index chart. One reviewer recalled a grainy video in a political-speech queue that depicted a machete-wielding man holding up what appeared to be a severed head: “We had to watch and say, ‘Is this a real dead body or a fake dead body?’”

In late 2020, moderators were informed of a new queue for alleged “sextortion.” It was defined in an explanatory memo as “a form of sexual exploitation where people are blackmailed with a nude image of themselves which have been shared by them or someone else on the Internet.” The memo said workers would review messages reported by users that “include predefined keywords typically used in sextortion/blackmail messages.”

WhatsApp’s review system is hampered by impediments, including buggy language translation. The service has users in 180 countries, with the vast majority located outside the U.S. Even though Accenture hires workers who speak a variety of languages, for messages in some languages there’s often no native speaker on site to assess abuse complaints. That means using Facebook’s language-translation tool, which reviewers said could be so inaccurate that it sometimes labeled messages in Arabic as being in Spanish. The tool also offered little guidance on local slang, political context or sexual innuendo. “In the three years I’ve been there,” one moderator said, “it’s always been horrible.”

The process can be rife with errors and misunderstandings. Companies have been flagged for offering weapons for sale when they’re selling straight shaving razors. Bras can be sold, but if the marketing language registers as “adult,” the seller can be labeled a forbidden “sexually oriented business.” And a flawed translation tool set off an alarm when it detected kids for sale and slaughter, which, upon closer scrutiny, turned out to involve young goats intended to be cooked and eaten in halal meals.

The system is also undercut by the human failings of the people who instigate reports. Complaints are frequently filed to punish, harass or prank someone, according to moderators. In messages from Brazil and Mexico, one moderator explained, “we had a couple of months where AI was banning groups left and right because people were messing with their friends by changing their group names” and then reporting them. “At the worst of it, we were probably getting tens of thousands of those. They figured out some words the algorithm did not like.”

Other reports fail to meet WhatsApp standards for an account ban. “Most of it is not violating,” one of the moderators said. “It’s content that is already on the internet, and it’s just people trying to mess with users.” Still, each case can reveal up to five unencrypted messages, which are then examined by moderators.

The judgment of WhatsApp’s AI is less than perfect, moderators say. “There were a lot of innocent photos on there that were not allowed to be on there,” said Carlos Sauceda, who left Accenture last year after nine months. “It might have been a photo of a child taking a bath, and there was nothing wrong with it.” As another WhatsApp moderator put it, “A lot of the time, the artificial intelligence is not that intelligent.”

Facebook’s written guidance to WhatsApp moderators acknowledges many problems, noting “we have made mistakes and our policies have been weaponized by bad actors to get good actors banned. When users write inquiries pertaining to abusive matters like these, it is up to WhatsApp to respond and act (if necessary) accordingly in a timely and pleasant manner.” And if a user appeals a ban that was prompted by a user report, the appeal itself, according to one moderator, entails having a second moderator examine the user’s content.


 

II.

In public statements and on the company’s websites, Facebook Inc. is noticeably vague about WhatsApp’s monitoring process. The company does not provide a regular accounting of how WhatsApp polices the platform. WhatsApp’s FAQ page and online complaint form note that it will receive “the most recent messages” from a user who has been flagged. They do not, however, disclose how many unencrypted messages are revealed when a report is filed, or that those messages are examined by outside contractors. (WhatsApp told ProPublica it limits that disclosure to keep violators from “gaming” the system.)

By contrast, both Facebook and Instagram post lengthy “Community Standards” documents detailing the criteria their moderators use to police content, along with articles and videos about “the unrecognized heroes who keep Facebook safe” and announcements on new content-review sites. Facebook’s transparency reports detail how many pieces of content are “actioned” for each type of violation. WhatsApp is not included in this report.

When dealing with legislators, Facebook Inc. officials also offer few details — but are eager to assure them that they don’t let encryption stand in the way of protecting users from images of child sexual abuse and exploitation. For example, when members of the Senate Judiciary Committee grilled Facebook about the impact of encrypting its platforms, the company, in written responses to follow-up questions in January 2020, cited WhatsApp in boasting that it would remain responsive to law enforcement. “Even within an encrypted system,” one response noted, “we will still be able to respond to lawful requests for metadata, including potentially critical location or account information… We already have an encrypted messaging service, WhatsApp, that — in contrast to some other encrypted services — provides a simple way for people to report abuse or safety concerns.”

Sure enough, WhatsApp reported 400,000 instances of possible child-exploitation imagery to the National Center for Missing and Exploited Children in 2020, according to its head, Cathcart. That was ten times as many as in 2019. “We are by far the industry leaders in finding and detecting that behavior in an end-to-end encrypted service,” he said.

During his YouTube interview with the Australian think tank, Cathcart also described WhatsApp’s reliance on user reporting and its AI systems’ ability to examine account information that isn’t subject to encryption. Asked how many staffers WhatsApp employed to investigate abuse complaints from an app with more than two billion users, Cathcart didn’t mention content moderators or their access to encrypted content. “There’s a lot of people across Facebook who help with WhatsApp,” he explained. “If you look at people who work full time on WhatsApp, it’s above a thousand. I won’t get into the full breakdown of customer service, user reports, engineering, etc. But it’s a lot of that.”

In written responses for this article, the company spokesperson said: “We build WhatsApp in a manner that limits the data we collect while providing us tools to prevent spam, investigate threats, and ban those engaged in abuse, including based on user reports we receive. This work takes extraordinary effort from security experts and a valued trust and safety team that works tirelessly to help provide the world with private communication.” The spokesperson noted that WhatsApp has released new privacy features, including “more controls about how people’s messages can disappear” or be viewed only once. He added, “Based on the feedback we’ve received from users, we’re confident people understand when they make reports to WhatsApp we receive the content they send us.”


 

III. “Deceiving Users” About Personal Privacy

Since the moment Facebook announced plans to buy WhatsApp in 2014, observers wondered how the service, known for its fervent commitment to privacy, would fare inside a corporation known for the opposite. Zuckerberg had become one of the wealthiest people on the planet by using a “surveillance capitalism” approach: collecting and exploiting reams of user data to sell targeted digital ads. Facebook’s relentless pursuit of growth and profits has generated a series of privacy scandals in which it was accused of deceiving customers and regulators.

By contrast, WhatsApp knew little about its users apart from their phone numbers and shared none of that information with third parties. WhatsApp ran no ads, and its co-founders, Jan Koum and Brian Acton, both former Yahoo engineers, were hostile to them. “At every company that sells ads,” they wrote in 2012, “a significant portion of their engineering team spends their day tuning data mining, writing better code to collect all your personal data, upgrading the servers that hold all the data and making sure it’s all being logged and collated and sliced and packed and shipped out,” adding: “Remember, when advertising is involved you the user are the product.” At WhatsApp, they noted, “your data isn’t even in the picture. We are simply not interested in any of it.”

Zuckerberg publicly vowed in a 2014 keynote speech that he would keep WhatsApp “exactly the same.” He declared, “We are absolutely not going to change plans around WhatsApp and the way it uses user data. WhatsApp is going to operate completely autonomously.”

In April 2016, WhatsApp completed its long-planned adoption of end-to-end encryption, which helped establish the app as a prized communications platform in 180 countries, including many where text messages and phone calls are cost-prohibitive. International dissidents, whistleblowers and journalists also turned to WhatsApp to escape government eavesdropping.

Four months later, however, WhatsApp disclosed it would begin sharing user data with Facebook — precisely what Zuckerberg had said would not happen — a move that cleared the way for an array of future revenue-generating plans. The new WhatsApp terms of service said the app would share information such as users’ phone numbers, profile photos, status messages and IP addresses for the purposes of ad targeting, fighting spam and abuse and gathering metrics. “By connecting your phone number with Facebook’s systems,” WhatsApp explained, “Facebook can offer better friend suggestions and show you more relevant ads if you have an account with them.”

Such actions were increasingly bringing Facebook into the crosshairs of regulators. In May 2017, European Union antitrust regulators fined the company 110 million euros (about $122 million) for falsely claiming three years earlier that it would be impossible to link the user information between WhatsApp and the Facebook family of apps. The EU concluded that Facebook had “intentionally or negligently” deceived regulators. Facebook insisted its false statements in 2014 were not intentional, but didn’t contest the fine.

By the spring of 2018, the WhatsApp co-founders, now both billionaires, were gone. Acton, in what he later described as an act of “penance” for the “crime” of selling WhatsApp to Facebook, gave $50 million to a foundation backing Signal, a free encrypted messaging app that would emerge as a WhatsApp rival. (Acton’s donor-advised fund has also given money to ProPublica.)

Meanwhile, Facebook was under fire for its security and privacy failures as never before. The pressure culminated in a landmark $5 billion fine by the Federal Trade Commission in July 2019 for violating a previous agreement to protect user privacy. The fine was almost 20 times greater than any previous privacy-related penalty, according to the FTC, and Facebook’s transgressions included “deceiving users about their ability to control the privacy of their personal information.”

The FTC announced that it was ordering Facebook to take steps to protect privacy going forward, including for WhatsApp users: “As part of Facebook’s order-mandated privacy program, which covers WhatsApp and Instagram, Facebook must conduct a privacy review of every new or modified product, service, or practice before it is implemented, and document its decisions about user privacy.” Compliance officers would be required to generate a “quarterly privacy review report” and share it with the company and, upon request, the FTC.

Facebook agreed to the FTC’s fine and order. Indeed, those negotiations were already underway when, just four months earlier, Zuckerberg announced his new commitment to privacy.

By that point, WhatsApp had begun using Accenture and other outside contractors to hire hundreds of content reviewers. But the company was eager not to step on its larger privacy message — or spook its global user base. It said nothing publicly about its hiring of contractors to review content.


 

IV. “We Kill People Based on Metadata”

Even as Zuckerberg was touting Facebook Inc.’s new commitment to privacy in 2019, he didn’t mention that WhatsApp was apparently sharing more of its users’ metadata than ever with its parent company — and with law enforcement.

To the lay ear, the term “metadata” can sound abstract, a word that evokes the intersection of literary criticism and statistics. To use an old, pre-digital analogy, metadata is the equivalent of what’s written on the outside of an envelope — the names and addresses of the sender and recipient and the postmark reflecting where and when it was mailed — while the “content” is what’s written on the letter sealed inside the envelope. So it is with WhatsApp messages: The content is protected, but the envelope reveals a multitude of telling details (as noted: time stamps, phone numbers and much more).
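Modeled as data, the “envelope” could be as simple as the following illustrative record, assembled from unencrypted fields this article identifies; it is a sketch, not an actual WhatsApp schema.

```swift
import Foundation

// Illustrative only: the kind of "envelope" information that stays readable
// even when the message body itself is end-to-end encrypted.
struct MessageEnvelope {
    let senderPhoneNumber: String     // who mailed it
    let recipientPhoneNumber: String  // who it is addressed to
    let sentAt: Date                  // the "postmark": when it was sent
    let senderIPAddress: String       // roughly where it was mailed from
    let deviceID: String              // which phone produced it
    // None of these fields requires opening the sealed "letter" inside.
}
```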

Those in the information and intelligence fields understand how crucial this information can be. It was metadata, after all, that the National Security Agency was gathering about millions of Americans not suspected of a crime, prompting a global outcry when it was exposed in 2013 by former NSA contractor Edward Snowden. “Metadata absolutely tells you everything about somebody’s life,” former NSA general counsel Stewart Baker once said. “If you have enough metadata, you don’t really need content.” In a symposium at Johns Hopkins University in 2014, Gen. Michael Hayden, former director of both the CIA and NSA, went even further: “We kill people based on metadata.”

U.S. law enforcement has used WhatsApp metadata to help put people in jail. ProPublica found more than a dozen instances in which the Justice Department sought court orders for the platform’s metadata since 2017. These represent a fraction of overall requests, known as pen register orders (a phrase borrowed from the technology used to track numbers dialed by landline telephones), as many more are kept from public view by court order. U.S. government requests for data on outgoing and incoming messages from all Facebook platforms increased by 276% from the first half of 2017 to the second half of 2020, according to Facebook Inc. statistics (which don’t break out the numbers by platform). The company’s rate of handing over at least some data in response to such requests has risen from 84% to 95% during that period.

It’s not clear exactly what government investigators have been able to gather from WhatsApp, as the results of those orders, too, are often kept from public view. Internally, WhatsApp calls such requests for information about users “prospective message pairs,” or PMPs. These provide data on a user’s messaging patterns in response to requests from U.S. law enforcement agencies, as well as those in at least three other countries — the United Kingdom, Brazil and India — according to a person familiar with the matter who shared this information on condition of anonymity. Law enforcement requests from other countries might only receive basic subscriber profile information.

WhatsApp metadata was pivotal in the arrest and conviction of Natalie “May” Edwards, a former Treasury Department official with the Financial Crimes Enforcement Network, for leaking confidential banking reports about suspicious transactions to BuzzFeed News. The FBI’s criminal complaint detailed hundreds of messages between Edwards and a BuzzFeed reporter using an “encrypted application,” which interviews and court records confirmed was WhatsApp. “On or about August 1, 2018, within approximately six hours of the Edwards pen becoming operative — and the day after the July 2018 Buzzfeed article was published — the Edwards cellphone exchanged approximately 70 messages via the encrypted application with the Reporter-1 cellphone during an approximately 20-minute time span between 12:33 a.m. and 12:54 a.m.,” FBI Special Agent Emily Eckstut wrote in her October 2018 complaint. Edwards and the reporter used WhatsApp because Edwards believed the platform to be secure, according to a person familiar with the matter.

Edwards was sentenced on June 3 to six months in prison after pleading guilty to a conspiracy charge and reported to prison last week. Edwards’ attorney declined to comment, as did representatives from the FBI and the Justice Department.

WhatsApp has for years downplayed how much unencrypted information it shares with law enforcement, largely limiting mentions of the practice to boilerplate language buried deep in its terms of service. It does not routinely keep permanent logs of who users are communicating with and how often, but company officials confirmed they do turn on such tracking at their own discretion — even for internal Facebook leak investigations — or in response to law enforcement requests. The company declined to tell ProPublica how frequently it does so.

The privacy page for WhatsApp assures users that they have total control over their own metadata. It says users can “decide if only contacts, everyone, or nobody can see your profile photo,” their status updates, or when they last opened the app. Regardless of the settings a user chooses, WhatsApp collects and analyzes all of that data — a fact not mentioned anywhere on the page.


 

V. “Opening the Aperture to Encompass Business Objectives”

The conflict between privacy and security on encrypted platforms seems to be only intensifying. Law enforcement and child safety advocates have urged Zuckerberg to abandon his plan to encrypt all of Facebook’s messaging platforms. In June 2020, three Republican senators introduced the “Lawful Access to Encrypted Data Act,” which would require tech companies to assist in providing access to even encrypted content in response to law enforcement warrants. For its part, WhatsApp recently sued the Indian government to block its requirement that encrypted apps provide “traceability” — a method to identify the sender of any message deemed relevant to law enforcement. WhatsApp has fought similar demands in other countries.

Other encrypted platforms take a vastly different approach to monitoring their users than WhatsApp. Signal employs no content moderators, collects far less user and group data, allows no cloud backups and generally rejects the notion that it should be policing user activities. It submits no child exploitation reports to NCMEC.

Apple has touted its commitment to privacy as a selling point. Its iMessage system displays a “report” button only to alert the company to suspected spam, and the company has made just a few hundred annual reports to NCMEC, all of them originating from scanning outgoing email, which is unencrypted.

But Apple recently took a new tack, and appeared to stumble along the way. Amid intensifying pressure from Congress, in August the company announced a complex new system for identifying child-exploitative imagery on users’ iCloud backups. Apple insisted the new system poses no threat to private content, but privacy advocates accused the company of creating a backdoor that potentially allows authoritarian governments to demand broader content searches, which could result in the targeting of dissidents, journalists or other critics of the state. On Sept. 3, Apple announced it would delay implementation of the new system.

Still, it’s Facebook that seems to face the most constant skepticism among major tech platforms. It is using encryption to market itself as privacy-friendly, while saying little about the other ways it collects data, according to Lloyd Richardson, the director of IT at the Canadian Centre for Child Protection. “This whole idea that they’re doing it for personal protection of people is completely ludicrous,” Richardson said. “You’re trusting an app owned and written by Facebook to do exactly what they’re saying. Do you trust that entity to do that?” (On Sept. 2, Irish authorities announced that they are fining WhatsApp 225 million euros, about $267 million, for failing to properly disclose how the company shares user information with other Facebook platforms. WhatsApp is contesting the finding.)

Facebook’s emphasis on promoting WhatsApp as a paragon of privacy is evident in the December marketing document obtained by ProPublica. The “Brand Foundations” presentation says it was the product of a 21-member global team across all of Facebook, involving a half-dozen workshops, quantitative research, “stakeholder interviews” and “endless brainstorms.” Its aim: to offer “an emotional articulation” of WhatsApp’s benefits, “an inspirational toolkit that helps us tell our story,” and a “brand purpose to champion the deep human connection that leads to progress.” The marketing deck identifies a feeling of “closeness” as WhatsApp’s “ownable emotional territory,” saying the app delivers “the closest thing to an in-person conversation.”

WhatsApp should portray itself as “courageous,” according to another slide, because it’s “taking a strong, public stance that is not financially motivated on things we care about,” such as defending encryption and fighting misinformation. But the presentation also speaks of the need to “open the aperture of the brand to encompass our future business objectives. While privacy will remain important, we must accommodate for future innovations.”

WhatsApp is now in the midst of a major drive to make money. It has experienced a rocky start, in part because of broad suspicions of how WhatsApp will balance privacy and profits. An announced plan to begin running ads inside the app didn’t help; it was abandoned in late 2019, just days before it was set to launch. Early this January, WhatsApp unveiled a change in its privacy policy — accompanied by a one-month deadline to accept the policy or get cut off from the app. The move sparked a revolt, impelling tens of millions of users to flee to rivals such as Signal and Telegram.

The policy change focused on how messages and data would be handled when users communicate with a business in the ever-expanding array of WhatsApp Business offerings. Companies could now store their chats with users and use information about users for marketing purposes, including targeting them with ads on Facebook or Instagram.

Elon Musk tweeted “Use Signal,” and WhatsApp users rebelled. Facebook delayed for three months the requirement for users to approve the policy update. In the meantime, it struggled to convince users that the change would have no effect on the privacy protections for their personal communications, with a slightly modified version of its usual assurance: “WhatsApp cannot see your personal messages or hear your calls and neither can Facebook.” Just as when the company first bought WhatsApp years before, the message was the same: Trust us.

Correction

Sept. 10, 2021: This story originally stated incorrectly that Apple’s iMessage system has no “report” button. The iMessage system does have a report button, but only for suspected spam (not for suspected abusive content).

https://www.propublica.org/article/how-facebook-undermines-privacy-protections-for-its-2-billion-whatsapp-users

Apple wants to protect privacy — Facebook wants to ‘inflict pain’

Facebook, Mark Zuckerberg, literally wants to inflict pain on Apple, on Tim Cook. To make them hurt. To lobby the government against them, to claim anti-trust, to do everything they can to paint Apple dirty. Why? Because Apple wants to give us, the customers, the users, the ability to choose whether or not Facebook gets to track us outside their own apps, across other apps, even across the web. Apple considers this simple level of privacy and dignity a fundamental human right. And… Facebook… well, Facebook seems intent on seeing it as an existential threat.

App Tracking Transparency

Starting in iOS 14.5, if an app wants to track your activities in other apps and on the web — well, it absolutely still can; it just has to ask your permission first. That’s it.

It’s called App Tracking Transparency, and it means that, if you’re in the Facebook app, and you’re in your favorite knitting group or whatever, talking about all the knitting, all the knitting, Facebook can serve you personalized ads about knitting, because they know you’re more likely to click on that than on… something random. And that’s all fine. That’s all 1st-party, meaning all happening in the same app, and nothing about that is changing. Not at all.

If you leave the Facebook app, and then go to Lego.com and then jeep.com, open a journaling app, your to-do list, play a couple of games, and then go back to Facebook, well, normally, Facebook tries to follow you across all those apps and websites as well, across anything that uses any of their software plugins or social hooks, so that they can serve you ads based on what you do in those apps and sites as well. And this is what’s changing, at least a very tiny little bit. This 3rd-party tracking. And all that’s changing is that Apple wants Facebook — or any app for that matter — to ask your permission before tracking you. That’s literally it.

Any app that wants to share your data with another app or service, or sell your activity to a data broker, can still do it. They simply have to ask you first.
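On the developer side, that permission request is a single system call. The sketch below uses the real App Tracking Transparency API introduced with iOS 14.5: until the user agrees, the advertising identifier (IDFA) that underpins cross-app tracking reads as all zeros.

```swift
import AppTrackingTransparency
import AdSupport

// Minimal App Tracking Transparency flow: the app must ask before it can
// read the advertising identifier used for cross-app (3rd-party) tracking.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // User said yes: the IDFA is available for cross-app tracking.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking allowed, IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            // User said no (or hasn't answered): the IDFA reads as all zeros.
            print("Tracking not allowed")
        @unknown default:
            print("Tracking not allowed")
        }
    }
}
```

The only part of the dialog an app controls is its one-line justification string, supplied via the NSUserTrackingUsageDescription key in Info.plist; the rest of the wording is fixed by the system.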

1st vs. 3rd Party Tracking

Facebook iOS 14 tracking prompt. (Source: MacRumors)

It doesn’t even apply to other apps the same company owns. So Facebook can still 1st party track us across the big blue app and Facebook.com, Instagram, WhatsApp, Oculus, Messenger, any other app or website they own. Which is like half the social web at this point. It’s only if they want to track us across apps and websites they don’t own that they have to ask.

It’s no different than what other apps have had to ask before they access our photos or contacts or camera, or our physical location; all this means is that they now have to ask us before they can monitor our digital location as well.

Because, just like we’re concerned an app might steal our private photos, spam our contacts, listen in or spy on us with the camera or mic, or stalk us and sell our location in the real world, we’re increasingly concerned about apps stalking us in the digital world.

It’s why we see so many conspiracy theories about apps like Facebook or Instagram using the mic to listen in to our conversations — because they’re so damn good at serving us targeted ads that we think they must be all up in our brainstems to do it.

But they’re not. They’re just… that… damn… good… That damn good at profiling us based on our behavior so they can target us with those ads. And again, Apple isn’t saying they can’t do that anymore, that they can’t track our digital activity. Just like Apple isn’t saying, apps can’t edit our photos or find our friends or transmit our voices or faces across the internet or give us turn-by-turn traffic directions. All Apple’s saying is… like with all those other apps — they simply have to ask us first.

Some people will be fine with it. We’re getting the ads anyway, so they may prefer those ads be as personalized as possible. Others won’t. They’ll find it creepy and demand it stop. And now, for the first time, we’ll all get what we want.

Except for Facebook, which seems to think giving us a choice is wrong. Probably because they’re worried if we’re given a choice, we’ll choose to block them. To say no.

Make the case

Facebook. (Source: iMore)

Rather than making a case for us to say yes, to argue the value they can deliver, Facebook is taking out ads in newspapers, lobbying governments, claiming anti-trust violations, saying this will hurt small apps and small business — as if any of them, from the biggest tech companies to the smallest online merchants, owned our data and had a greater right to it than we do. As if it belonged to them, not us. By divine right.

Now, some people are confusing and conflating how App Tracking Transparency applies to Apple’s own apps. Intentionally or accidentally spreading disinformation about Apple having a double standard, not playing fair, giving themselves a separate setting. And… they’re actually right. But not really. Apple’s standard here isn’t double — it’s higher.

That separate setting doesn’t stop Apple from doing 3rd-party tracking or serving personalized ads based on your activity elsewhere because Apple doesn’t do that… at all… to begin with. Not any of it. What that second setting does is stop Apple from serving 1st-party ads. Like, suggested apps in the App Store. The equivalent of Facebook serving you that knitting ad while you’re in the Facebook knitting group.

And that’s the reason it’s a second, separate setting. Because it’s legacy, but also because the new one applies to all apps. The old one, sadly, at this point, only to Apple. And conflating 3rd and 1st party tracking in the same interface panel — well, that’s what would be really confusing.

Other people are saying the wording on the popup is unfair. That “Allow Facebook to Track Your Activity Across Other Companies’ Apps and Websites” is scary and chilling. That it should be something closer to “Allow Facebook to Serve You Personalized Ads.”

Which is such a steaming pile of poop emojis. And everyone knows it. Because personalizing ads isn’t all they can do with that permission. It’s not all they can do with the access, far from it. And everyone knows that as well. It’s like… a giant Facebook Thirst Trap, and they think we’re all going to fall for it.

Asked and answered

Mark Zuckerberg in front of the Facebook logo. (Source: iMore)

See, photo apps don’t get to ask for permission to apply filters, contacts apps to find friends, conferencing apps to place video calls, location services for turn-by-turn. They have to ask for full access. For blanket permissions. Because that’s what they get. And once they have it, they can steal our photos, spam our contacts, record what we’re doing, or sell our location to collection agents because that’s the access we’ve given them. So they don’t get to lie about the limitations, cherry-pick the most benign use cases, diminish or try and dismiss the very real risk of an app not just serving us personalized ads but selling our online activity to data brokers. We get to know the full scope, so we get to make the most informed decision.

Even then, Apple’s not stopping any of that anyway. All they’re doing is requiring Facebook and any other app to ask us first and then to respect our decision.

Apple can’t stop all of it anyway. All they can do is block the iOS-specific ad identifier. Not all of Facebook or any other service’s software plugins or web hooks. All they can do is hope Facebook and others honor our choice and cut that stuff out — out of their own accord. Based on the honor system.

Even that — the honor system — seems to be too much for Facebook. Because it’s not ending Facebook or any small apps or businesses, like at all. That’s absurd. They’re too busy doing that themselves with Cambridge Analytica, Onavo VPN, algorithmic malfeasance, betraying WhatsApp and Oculus login promises, and the list goes on and on. If anything, Apple is prompting them to clean up their act. Encouraging them to do the most minimally decent, user-centric thing imaginable so they can start regaining our trust.

Source: https://www.imore.com/apple-wants-protect-privacy-facebook-wants-inflict-pain

The mass surveillance of society has made companies extremely wealthy

The Facebook news ban revealed how problematic it is to rely on corporations to provide fundamental public services

By business reporter Gareth Hutchens

Graphic: two people on laptops in front of the Facebook logo. Facebook harvests our personal data in unimaginable quantities, Gareth Hutchens writes. (Reuters: Dado Ruvic)

The fog lifted for a moment.

Last week, when Facebook blocked Australians from viewing and sharing “news content” on its platform, we saw what role it plays in Australian society.

Community groups, charities, sport clubs, arts centres, unions and emergency services all rely on the social media giant.

Its platform plays the role of an important public messaging board.

But in a country with so little civil society infrastructure, our heavy reliance on a corporation to provide such a fundamental public service is deeply problematic.

Facebook, Inc. doesn’t care about your fundraiser or political protest.

It couldn’t care less about your art exhibition.

What it cares about is your personal data, which it harvests in unimaginable quantities.

And the methods it uses to keep its 2.7 billion monthly active users “engaged” on its website (so it can keep learning more about them) are also deeply problematic.

Jaron Lanier, one of the founders of the field of virtual reality, has been warning about social media and tech giants for years.

“Everyone has been placed under a level of surveillance straight out of a dystopian science fiction novel,” he wrote in 2018 about the technological architecture created by these companies.

“Spying is accomplished mostly through connected personal devices — especially, for now, smartphones — that people keep practically glued to their bodies.

“Data is gathered about each person’s communications, interests, movements, contact with others, emotional reactions to circumstances, facial expressions, purchases, vital signs: an ever-growing, boundless variety of data.”

Mr Lanier says the ocean of personal data these companies extract from the internet is turned into behavioural data that allows them to predict and manipulate our behaviour.

 
Video (57 seconds): “Facebook was wrong”: Josh Frydenberg criticises restrictions on Australian news.

“[These] platforms have proudly reported on experimenting with making people sad, changing voter turnout, and reinforcing brand loyalty,” he said.

Just one example: in 2014, Facebook executives apologised after a scientific paper revealed the company had conducted secret psychological tests on 700,000 users without their knowledge, attempting to manipulate their emotions to see what effect it would have on the status updates they posted or on how they used Facebook’s “like” button.

Surveillance capitalism

It’s worth remembering what Facebook is.

It is a member of a group of companies that are engaged in something called “surveillance capitalism”.

According to Professor Shoshana Zuboff, the author who coined the term, surveillance capitalism refers to the “new economic order” that has emerged in the age of the internet and smartphone.

She says the companies that practice it lay claim to our personal information, our “data”, as “free raw material” to be aggressively harvested.

Some of the data they collect is used for product or service improvement, but the rest is treated as a proprietary “behavioural surplus”.

That surplus data is then fed into machine intelligence, which turns it into “prediction products” that “anticipate what you will do now, soon and later”.

According to Professor Zuboff, social media companies trade those “prediction products” in a new kind of marketplace for behavioural predictions which she calls “behavioural futures markets”.

“Surveillance capitalists have grown immensely wealthy from these trading operations, for many companies are eager to lay bets on our future behaviour,” she wrote in her 2019 book, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power.

“The competitive dynamics of these new markets drive surveillance capitalists to acquire ever-more-predictive sources of behavioural surplus: our voices, personalities, and emotions.

“Surveillance capitalists discovered that the most-predictive behavioural data come from intervening in the state of play in order to nudge, coax, tune, and herd behaviour towards profitable outcomes.

“It has become difficult to escape this bold market project, whose tentacles reach from the gentle herding of innocent Pokémon Go players to eat, drink, and purchase in the restaurants, bars, fast-food joints, and shops that pay to play in its behavioural futures markets to the ruthless expropriation of surplus from Facebook profiles for the purposes of shaping individual behaviour, whether it’s buying pimple cream at 5:45pm on a Friday, clicking ‘yes’ on an offer of new running shoes as the endorphins race through your brain after your long Sunday morning run, or voting next week.

“Just as industrial capitalism was driven to the continuous intensification of the means of production, so surveillance capitalists and their market players are now locked into the continuous intensification of the means of behavioural modification and the gathering might of instrumentarian power.”

Facebook CEO Mark Zuckerberg gestures with his arms and smiles as he speaks. Mark Zuckerberg’s Facebook is a member of a group of companies engaged in “surveillance capitalism”. (AP: Trent Nelson via The Salt Lake Tribune)

Google invented surveillance capitalism

Professor Zuboff says Google invented and perfected surveillance capitalism in the early 2000s “in much the same way that a century ago General Motors invented and perfected managerial capitalism”.

“Google was the pioneer of surveillance capitalism in thought and practice, the deep pocket for research and development, and the trailblazer in experimentation and implementation, but it is no longer the only actor on this path,” she wrote.

“Surveillance capitalism quickly spread to Facebook and later to Microsoft. Evidence suggests that Amazon has veered in this direction, and it is a constant challenge to Apple, both as an external threat and as a source of internal debate and conflict.”

She published those words in 2019.

A little later that year, the Guardian described the book as an “epoch-defining international bestseller, drawing comparisons to Rachel Carson’s Silent Spring”.

The mass surveillance of society has made companies extremely wealthy

One of the points Professor Zuboff has repeatedly made about surveillance capitalism is how profitable it is for the companies that practice it.

The ocean of personal data they hoover up is turned into unimaginable wealth and power, making the companies more powerful than nation-states.

It helps to explain why those tech companies have come to dominate stock markets.

A screenshot of the ABC News page on Facebook showing no posts. News organisations including the ABC have been impacted, along with community groups, charities, sport clubs, arts centres, unions, emergency services and more. (Supplied)

Last year, when researchers at the International Monetary Fund tried to figure out why there seemed to be a large disconnect between stock markets and the real world during one of the worst global recessions in memory, one thesis they considered was that the outsize influence of the big five tech companies — Google, Facebook, Microsoft, Amazon and Apple, which accounted for 22 per cent of the market capitalisation on US stock markets — was making US financial markets appear healthier than they were.

At any rate, it comes back to the question of what type of organisation should be running a country’s quasi-public messaging board.

Are we happy to leave it to surveillance capitalists to run a “public good” of that kind?

Source: https://www.abc.net.au/news/2021-02-21/when-facebook-banned-news-australia-we-saw-role-it-plays/13175698

Facebook’s devastating display of defiance is vintage Zuckerberg

Facebook’s decision to ban legitimate news from being shared in the middle of a global pandemic is a breathtaking display of defiance. It is also entirely consistent with the social media behemoth’s belligerent corporate character.

The move – which inadvertently resulted in the Facebook pages of health departments in Queensland, WA and the ACT being wiped just before a critical vaccine rollout began – shocked the Australian media and political establishment. But, in hindsight, nobody should have been surprised. This was vintage Zuckerberg. You don’t blitzscale your way from Harvard dorm room to trillion-dollar titan in the space of a few years without putting lots of noses out of joint.

Facebook CEO Mark Zuckerberg arrives to testify before a joint hearing of Congress. (Credit: AP)

The Australian government’s media bargaining code, which is at the centre of the dispute, has been endlessly debated over the past year. Media companies say they should be paid for producing journalism that benefits the platforms, but they lack the bargaining power to extract any value for it. Tech giants claim they do not really benefit from the existence of news, that news represents a small part of the overall activity on their platforms, and since they actually send these news organisations free traffic they shouldn’t be paying them anything.

There are merits to both sides of the argument.

Yet there is little doubt stronger regulation of Google and Facebook is urgently needed. The two companies have scarily dominant positions in their respective markets of search and social media, and also an entrenched duopoly in digital advertising. Meanwhile, their ascent has coincided with a host of societal problems ranging from rising misinformation and fake news, to a troubling surge in online conspiracy theories and growing internet addiction.

The media bargaining code attempts to address the digital duopoly’s market dominance by using the threat of arbitration to force Google and Facebook to strike commercial deals with media companies. Could there have been a more straightforward solution? A digital platform tax or levy may have been cleaner and simpler and has existing parallels elsewhere in the economy.

There are already taxes on addictive and harmful products (think cigarette excise), and levies on disruptive new market entrants that are used to compensate legacy incumbents also exist (for example, the levies on Uber rides that are distributed to taxi licence holders).

Regardless, the debate about the merits of the media bargaining code in Australia has now become moot. The bill to bring the code into law has sailed through the lower house of Parliament and is all but certain to be passed by the Senate. Facebook is effectively saying that the overwhelming majority of elected officials in a sovereign parliament are wrong.

It is possible that a news-free Facebook could be positive for society and the media industry in the medium term. But at this fragile moment in history – a once in a century health crisis coupled with a fake news epidemic – for the primary gateway to information for millions of people to block critical information from being shared was chillingly irresponsible.

Throughout its relatively short history, Facebook has pursued a win-at-all-costs, take-no-prisoners approach to business. It has also shown little regard for the wreckage it has left behind. For many years its official corporate mantra was “move fast and break things”.

When a potential competitor emerges, Facebook either buys it (as it did with WhatsApp and Instagram) or copies its key features (as it has done with Snapchat and TikTok).

Facebook has pursued a win-at-all-costs, take-no-prisoners approach to business. Credit: Bloomberg

It has repeatedly abused the privacy of its users and demonstrated a shocking ineptitude at thwarting the misinformation and conspiracy theories that have flourished on its platform, which are now demonstrably weakening democracies.

The spat over the media bargaining code highlights the fiendishly complex task governments face in regulating digital giants with operations that span the globe, billions of users and perhaps unrivalled power.

Tech proponents argue Australia’s regulation is deeply flawed – and to an extent they may have a point. But there is flawed regulation all across the economy. Most wildly profitable and dominant companies (even Google) begrudgingly accept these kinds of impositions as part of their social licence to operate, a cost of doing business. Not Facebook.

Mark Zuckerberg’s middle finger to the Australian government has been noticed all around the world. Already Canada is signalling it will copy the media code, while Europe (which has tried repeatedly to force the digital giants to pay news organisations, with much less success than Australia) is likely to follow.

Facebook has repeatedly shown it does not mind a scrap. But this may be its biggest fight yet, and it is only just beginning.

Source: https://www.smh.com.au/business/companies/facebook-s-devastating-display-of-defiance-is-vintage-zuckerberg-20210219-p5741b.html

How you farewell a Facebook account. And what you can do next

If the lack of news is a deal-breaker for your use of Facebook, how can you delete your account – and what are the consequences?

 

With Facebook blocking all news pages and links from its Australian service, some people will be weighing up how they’ll continue to use the social media platform.

Facebook is ubiquitous, and for many of us serves as a link to our friends, family, events, photos and memories. After Facebook’s snap decision on Thursday to block Australians from seeing news articles on its platform, some users began experimenting with loopholes to continue sharing news, even resorting to breaking up the text in creative ways or using pictures of cats when posting news stories, to throw Facebook off the scent. But in the hours since, those loopholes appear to have been closed.

Is the lack of news a deal-breaker for your use of Facebook? If so, how will you go about deleting your account – and what are the consequences? And are there good alternatives for services that serve news to you?

How will I get my news?

If you previously relied mostly on Facebook for news, it’s time to find an alternative, and the service(s) you choose will depend on how you like to consume your content.

If you’re moving to a new social media network, Twitter is an obvious choice. On Twitter, as with Facebook, you get to pick your friends, companies, personalities and outlets, and see their updates in a feed. A lot of news outlets post the same stories to Facebook and Twitter, and may even be more active on the latter now that Facebook is out. One advantage of Twitter is you can follow a wide variety of news without crowding your feed too much. For example, you can save curated lists of people and outlets, say, by topic or friend group, to keep things separated. Or you can save specific searches so you’re always up to date on a specific topic or hashtag (those little phrases starting with # that people use to categorise comments, like #auspol for Australian politics).

 

You could also try Reddit or Discord, if you’re more into discussing the news with a like-minded community.

If you’re sticking with Facebook to keep up with friends, you might just want a straight news service or aggregator to get the latest headlines. Google News is available on every type of device and is good for either skimming the headlines or diving deep into a topic. It has curated “top stories”, suggestions based on your tastes, and you can save favourite sources and topics to a custom feed. On mobile phones, a News Showcase feature lets you read some usually paywalled stories for free. Apple News is similar if you solely use Apple devices, though its premium offering Apple News+ is more curated and you need to pay for it.

For a more DIY option you can collect RSS feeds, which show you every article published on a given website, but they can be messy. Some more advanced RSS reading services, like Feedly, make it easier to create your own news service.
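If you want to see how simple the RSS approach is under the hood, here is a minimal sketch in Python using the third-party feedparser library. The feed URLs are placeholders, not real addresses; substitute the feeds of the outlets you actually follow (most news sites list their feed addresses in the footer or help pages).

    # A minimal DIY news reader, assuming the third-party feedparser
    # library is installed (pip install feedparser). The feed URLs are
    # placeholders; substitute the outlets you actually follow.
    import feedparser

    FEEDS = [
        "https://example.com/news/rss.xml",   # placeholder feed URL
        "https://example.org/topics/rss",     # placeholder feed URL
    ]

    def latest_headlines(urls, per_feed=5):
        # Fetch each feed and yield (source, title, link) tuples.
        for url in urls:
            feed = feedparser.parse(url)
            source = feed.feed.get("title", url)
            for entry in feed.entries[:per_feed]:
                yield source, entry.get("title", ""), entry.get("link", "")

    for source, title, link in latest_headlines(FEEDS):
        print(f"[{source}] {title}\n    {link}")

Run on a schedule, something like this gives you a no-frills, algorithm-free headline digest, which is essentially what services like Feedly wrap in a friendlier interface.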

Finally, you can always go directly to the outlets you like. Bookmark the topic pages on websites you’re interested in, or many news outlets also offer newsletters, podcasts and apps to make accessing news more convenient.

What happens to my photos and posts if I delete Facebook?

If you’ve been on the social network for years you might wonder what the repercussions would be if you deleted that app and nuked your account. And the truth is, depending on how you’ve used it, there can be consequences.

 

Completely deleting your Facebook account will delete all the posts and photos you’ve shared on the service, and remove you from conversations and posts on other people’s Facebook feeds. You will no longer be able to use Facebook Messenger or access any conversations you had there.

If you used Facebook to sign up to other services, such as Spotify or Instagram, you may find it difficult to access them once your account is deleted. Facebook hardware products, such as Portal smart displays and Oculus VR (virtual reality) headsets, require a Facebook account for most functions. In the case of Oculus, you could lose any games you paid for if you delete Facebook.

After 30 days your Facebook account data becomes unrecoverable, although Facebook says it may take 90 days until all your data is gone from its servers.

So how do I do it without losing all my stuff?

For a less nuclear option you can “deactivate” your account, in which case the company keeps your data and you can still use Messenger. Other apps and websites can still log you in with Facebook, and you can reinstate your account in the future.

So if you’re removing yourself from Facebook, you first have to decide whether you’d like the option to come back later. If you do, you should choose a deactivation. If not, you want a deletion. Either way you will go to the same place.

How do you delete or deactivate a Facebook account?

On a computer:

  1. Log in to Facebook and hit the triangle at the top right of the page.
  2. Click on Settings and Privacy, and then Settings.
  3. Click on Your Facebook Information, and then Deactivation or Deletion.

On the mobile app:

  1. Tap the three horizontal lines at the bottom (iPhone) or top (Android) right of the screen.
  2. Scroll down and tap Settings and Privacy, and then Settings.
  3. Scroll down and tap Account Ownership and Control, then Deactivation and Deletion. See below for how to recover your old posts, including photos.

Deactivation is as simple as entering your password and confirming a few times, but if you’re deleting your account and want to keep your stuff there are a few loose ends to tie up first.

When leaving Facebook, you have a choice of a deactivation where Facebook keeps all your data, or a total deletion that locks you out for good.

Facebook can send your photos and videos directly to another service, such as Dropbox or Google Photos. Or, alternatively, you can download and store any or all information from your Facebook account. This can take some time if you want to keep everything, as it might include years of posts, photos, videos, comments, messages, event details and group discussions, marketplace listings, location information and advertising data. To do either of these things, follow the steps above but at step three choose Transfer a Copy of Your Photos, or Download Your Information.
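For the technically inclined: if you request the download in JSON format, the archive you receive is just a zip file full of JSON documents, so it is easy to inspect offline. The sketch below is illustrative only; the archive file name is hypothetical, and the file names inside the export change over time, so it simply inventories whatever JSON files are present.

    # A rough sketch of inspecting a Facebook "Download Your Information"
    # export, assuming you requested it in JSON format. The archive file
    # name is hypothetical; file names inside the export vary over time,
    # so this just inventories whatever JSON files are present.
    import json
    import zipfile

    def inventory(archive_path):
        # List each JSON file in the export and its number of top-level items.
        with zipfile.ZipFile(archive_path) as archive:
            for name in archive.namelist():
                if not name.endswith(".json"):
                    continue
                with archive.open(name) as f:
                    data = json.load(f)
                count = len(data) if isinstance(data, (list, dict)) else 1
                print(f"{name}: {count} top-level item(s)")

    inventory("facebook-export.zip")  # hypothetical file name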

How do you access Instagram if you’ve ditched Facebook?

Next, you’ll want to make sure you can still access other services. You can keep using Instagram after a Facebook deletion but you may need to make some changes. Before deleting Facebook, go to Instagram’s settings, hit Accounts Center, then Logging in Across Accounts, and make sure it’s turned off. If you originally signed up to Instagram via Facebook, this will prompt you to create a password. Now your Instagram and Facebook accounts are separated – but be aware they belong to the same company and do share your data.

 

As for non-Facebook apps and services you used Facebook to sign up for, most will have an option in their settings to choose a different login or unlink from Facebook. If you’re unsure if this applies to any services you use, go to Facebook’s settings and hit Apps and Websites to see a list of services you’ve linked to Facebook.

What are some other services for sharing photos?

Google Photos and Apple iCloud are services you may already be using to back up pics from your phone. But you can also use them to share pictures with others, tag people and make comments. If you specifically want to share photos of the kids, you can set up shared folders in Google Photos that do this automatically. Tinybeans is another good app specifically made for sharing photos of kids with family members and friends.

If you’re deleting Facebook entirely and want a Messenger replacement, Signal is probably closest since it’s secure and has seamless integration between mobile and web. You could say the same for WhatsApp, but if you’re completely expunging Facebook from your life that’s a no-go. If you need all the goofy stickers and video chat features, your phone’s default iMessage or Android Messages is about as good as you’ll get.

Groups and events are the hardest Facebook features to replace, as it can feel like you’re going to miss out if you’re not on Facebook. But there are alternatives; just make sure you have a phone number and/or active email for each of your friends before you leave. Paperless Post is a good service that lets you create events, send invites and track RSVPs, and you can always create a group chat on your messaging platform of choice.

Source: https://www.smh.com.au/technology/how-you-farewell-a-facebook-account-and-what-you-can-do-next-20210219-p573wy.html

It’s time to unfriend Facebook when it resorts to starving us of news

 


 

The 30 per cent of Australians who rely on Facebook as their primary source of news will have to find it elsewhere or live a fact-free life following the Big Tech behemoth’s decision on Thursday to purge journalism from its site.

Overnight, Facebook has removed its users’ access to any site that smells like news: not only major local mastheads such as The Sydney Morning Herald and The Age, but also specialist sites like The Conversation and global leaders such as The New York Times.

News blackout … Facebook is ignoring the public interest while acting in self-interest. Credit: iStock

It also seems Fire and Rescue NSW, the Bureau of Meteorology, MS Research Australia, Doctors without Borders and state health departments are among many placed on the blacklist, showing the scope of the Mark Zuckerberg edict from Silicon Valley.

This is an arrogant and reckless move that will be dangerous for all Australians who are relying on an evidence-based response to a global pandemic, but also self-destructive to Facebook. While Facebook argues it does not make much money from news in its network, it is wilfully turning a blind eye to its value. News provides the facts and evidence to anchor what it claims is a ubiquitous digital experience.

If there was ever any doubt about Facebook’s cavalier attitude to the network of users it has created, this news blackout is definitive. To Facebook, we are all merely pieces of data to be observed, exploited and monetised. As citizens we are worthless.

By rejecting the decisions of our elected representatives to implement the findings of the Australian Competition and Consumer Commission’s review of its monopoly power, Facebook is asserting its commercial interests should prevail over the public interest. Indeed, Facebook seems more comfortable with its networks supporting despots and dictatorships by algorithmically fomenting division than respecting a government working in support of democracy.

This decision was made hours after our elected leaders from across the political spectrum endorsed the work of experts to deliver a significant reform that will make our democracy stronger.

The News Media Bargaining Code, the brainchild of the ACCC and its chairman Rod Sims, was a systemic response to the monopoly power that Google and Facebook exert over advertising and its impact on public interest journalism.

 

Under Australian law there is now a legal mechanism to place a value on fact-based news within the digital platforms that have come to dominate our online world with their algorithmically powered engines of division, distortion and denial.

The spectre of the code – with its global precedent – has already begun to do its job. Google has rushed to finalise premium-content deals with media organisations. These deals will not only make the Australian media, which has shed more than 5000 jobs in the past decade, stronger; they will also help address the built-in weaknesses of digital platforms that refuse to discriminate fact from fiction.

And they were only the first step in the program of digital platform reform that the ACCC has laid out to address the power of the Google/Facebook monopoly.

 

A review of privacy laws is currently under way, looking at the way Australians’ personal information is collected and monetised by online platforms with a view to designing consumer rights and protections. A separate process is focussing on the responsibilities social media should have to address harmful misinformation and disinformation, dispelling for good the myth that they are platforms with no broader social obligations for the harm they cause.

There’s also a review of the creepy world of ad-tech, where automated, virtual trading floors are running real-time auctions for our attention every time we visit a news page.

But this sort of exercise in democratic reform is a red line for Facebook, which believes its network is stronger than our public institutions.

Australians need to respond with our mouses. We need to unfriend Facebook and find alternative places to connect and collaborate, free of its surveillance models and reckless self-interest.

Peter Lewis is the director of the Centre for Responsible Technology.

Source: https://www.smh.com.au/national/it-s-time-to-unfriend-facebook-when-it-resorts-to-starving-us-of-news-20210218-p573lt.html

 

Is it time to leave WhatsApp – and is Signal the answer?

 

The Facebook-owned messaging service has been hit by a global backlash over privacy. Many users are migrating to Signal or Telegram. Should you join them?

WhatsApp, Signal and Telegram: three leading choices for messaging services. Photograph: Rafael Henrique/Sopa Images/Rex/Shutterstock
 

Earlier this month, WhatsApp issued a new privacy policy along with an ultimatum: accept these new terms, or delete WhatsApp from your smartphone. But the new privacy policy wasn’t particularly clear, and it was widely misinterpreted to mean WhatsApp would be sharing more sensitive personal data with its parent company Facebook. Unsurprisingly, it prompted a fierce backlash, with many users threatening to stop using the service.

WhatsApp soon issued a clarification, explaining that the new policy only affects the way users’ accounts interact with businesses (ie not with their friends) and does not mandate any new data collection. The messaging app also delayed the introduction of the policy by three months. Crucially, WhatsApp said, the new policy doesn’t affect the content of your chats, which remain protected by end-to-end encryption – the “gold standard” of security that means no one, not even WhatsApp, Facebook or the authorities, can view the content of messages.
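For readers curious what that “gold standard” means in practice, here is a toy sketch using the PyNaCl library that shows the core property of public-key, end-to-end encryption: the private keys live only on the two endpoints, so anything relaying the message sees only ciphertext. This is an illustration of the principle, not WhatsApp’s actual protocol (the Signal protocol it uses adds ratcheting keys for forward secrecy).

    # Toy illustration of end-to-end encryption with PyNaCl
    # (pip install pynacl). This is not WhatsApp's actual protocol;
    # the Signal protocol it uses adds key ratcheting for forward
    # secrecy. It shows the core property: private keys stay on the
    # two devices, so a relaying server sees only ciphertext.
    from nacl.public import PrivateKey, Box

    # Each phone generates its own keypair; private keys never leave the device.
    alice_secret = PrivateKey.generate()
    bob_secret = PrivateKey.generate()

    # Alice encrypts for Bob with her private key and his public key.
    sending_box = Box(alice_secret, bob_secret.public_key)
    ciphertext = sending_box.encrypt(b"See you at 8?")

    # The server relays ciphertext it cannot read. Bob decrypts with
    # his private key and Alice's public key.
    receiving_box = Box(bob_secret, alice_secret.public_key)
    print(receiving_box.decrypt(ciphertext))  # b'See you at 8?'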

 

But the damage had already been done. The bungled communication attempts have raised awareness that WhatsApp does collect a lot of data, and some of this could be shared with Facebook. The BBC reported that Signal was downloaded 246,000 times worldwide in the week before WhatsApp announced the change on 4 January, and 8.8m times the week after.

WhatsApp does share some data with Facebook, including phone numbers and profile names, but this has been happening for years. WhatsApp has stated that in the UK and EU the update does not share further data with Facebook – because of strict privacy regulation, the General Data Protection Regulation (GDPR). The messaging app doesn’t gather the content of your chats, but it does collect the metadata attached to them – such as the sender, the time a message was sent and who it was sent to. This can be shared with “Facebook companies”.

Facebook’s highly criticised data collection ethos has eroded trust in the social network. Its practices can put vulnerable people at risk, says Emily Overton, a data protection expert and managing director of RMGirl. She cites the example of Facebook’s “people you may know” algorithm exposing sex workers’ real names to their clients – despite both parties taking care to set up fake identities. “The more data they profile, the more they put people in vulnerable positions at risk.”

And the social network isn’t known for keeping promises. When Facebook bought WhatsApp in 2014, it pledged to keep the two services separate. Yet only a few years later, Facebook announced aims to integrate the messaging systems of Facebook, Instagram and WhatsApp. This appears to have stalled owing to technical and regulatory difficulties around encryption, but it’s still the long-term plan.


Why are people choosing Signal over Telegram?

Signal, a secure messaging app recommended by authorities such as the Electronic Frontier Foundation and Edward Snowden, has been the main beneficiary of the WhatsApp exodus. Another messaging app, Telegram, has also experienced an uptick in downloads, but Signal has been topping the charts on the Apple and Android app stores.

Signal benefits from being the most similar to WhatsApp in terms of features, while Telegram has had problems as a secure and private messaging app, with its live location feature recently coming under fire for privacy infringements. Crucially, Telegram is not end-to-end encrypted by default, instead storing your data in the cloud. Signal is end-to-end encrypted, collects less data than Telegram and stores messages on your device rather than in the cloud.


Does Signal have all the features I am used to and why is it more private?

Yes, Signal has most of the features you are used to on WhatsApp, such as stickers and emojis. You can set up and name groups, and it’s easy to send a message: just bring up the pen sign in the right-hand corner.

Signal has a desktop app, and you can voice and video chat with up to eight people. Like WhatsApp, Signal uses your phone number as your identity, something that has concerned some privacy and security advocates. However, the company has introduced PIN codes in the hope of moving to a more secure and private way of identifying users in the future.

As well as being end-to-end encrypted, both WhatsApp and Signal have a “disappearing messages” feature for additional privacy. The major difference is how each app is funded. WhatsApp is owned by Facebook, whose business model is based on advertising. Signal is privacy focused and has no desire to analyse, share or profit from users’ private information, says Jake Moore, cybersecurity specialist at ESET.

Signal is supported by the non-profit Signal Foundation, set up in 2018 by WhatsApp co-founder Brian Acton and security researcher (and Signal Messenger CEO) Moxie Marlinspike, who created an encryption protocol that is used by several messaging services, including WhatsApp and Skype as well as Signal itself. Acton, who left Facebook in 2017 after expressing concerns over how the company operated, donated an initial $50m to Signal, and the open-source app is now funded by the community. Essentially that means developers across the world will continually work on it and fix security issues as part of a collaborative effort, making the app arguably more secure.

But there are concerns over whether Signal can maintain this free model as its user base increases into the tens, and potentially hundreds, of millions. Signal is adamant it can continue to offer its service for free. “As a non-profit, we simply need to break even,” says Aruna Harder, the app’s COO.

Signal is exclusively supported by grants and donations, says Acton. “We believe that millions of people value privacy enough to sustain it, and we’re here to demonstrate that there is an alternative to the ad-based business models that exploit user privacy.”


I want to move to Signal. How do you persuade WhatsApp groups to switch?

The momentum away from WhatsApp does appear to be building, and you may find more of your friends have switched to Signal already. But persuading a larger contact group can be more challenging.

Overton has been using Signal for several years and says all her regular contacts use the app. “Even when dating online, I ask the person I want to go on a date with to download Signal, or they don’t get my number.”

Some Signal advocates have already begun to migrate their groups over from WhatsApp. Jim Creese, a security expert, is moving a neighbourhood text group of 100 people to Signal. He is starting with a smaller sub-group of 20, some of whom struggle with technology. Creese says most are ambivalent about switching “as long as the new method isn’t more difficult”.

He advises anyone who’s moving groups across apps to focus on the “why” first. “Explain the reasons for the change, how it is likely to affect them, and the benefits. Don’t rush the process. While WhatsApp might not be where you want to be today, there’s no emergency requiring an immediate move.”

Moore thinks the shift away from WhatsApp will continue to gain momentum, but he says it will take time to move everyone across. Until then, it’s likely you will need to keep both WhatsApp and Signal on your phone.

Moore is in the process of moving a family chat to Signal, for the second time. “When I originally tried, one family member didn’t understand my concerns and thought I was being overcautious.

“However, the recent news has helped him understand the potential issues and why moving isn’t such a bad idea. The next hurdle will be getting my mother to download a new app and use it for the first time without me physically assisting her.”

Source: https://www.theguardian.com/technology/2021/jan/24/is-it-time-to-leave-whatsapp-and-is-signal-the-answer

WhatsApp Has Shared Your Data With Facebook for Years, Actually

A pop-up notification has alerted the messaging app’s users to a practice that’s been in place since 2016.

Your encrypted messages are still safe, but it’s a rude awakening for many WhatsApp users. Photograph: Noam Galai/Getty Images

Since Facebook acquired WhatsApp in 2014, users have wondered and worried about how much data would flow between the two platforms. Many of them experienced a rude awakening this week, as a new in-app notification raised awareness about a step WhatsApp actually took to share more with Facebook back in 2016.

On Monday, WhatsApp updated its terms of use and privacy policy, primarily to expand on its practices around how WhatsApp business users can store their communications. A pop-up has been notifying users that as of February 8, the app’s privacy policy will change and they must accept the terms to keep using the app. As part of that privacy policy refresh, WhatsApp also removed a passage about opting out of sharing certain data with Facebook: “If you are an existing user, you can choose not to have your WhatsApp account information shared with Facebook to improve your Facebook ads and products experiences.”

Some media outlets and confused WhatsApp users understandably assumed that this meant WhatsApp had finally crossed a line, requiring data-sharing with no alternative. But in fact the company says that the privacy policy deletion simply reflects how WhatsApp has shared data with Facebook since 2016 for the vast majority of its now 2 billion-plus users.

When WhatsApp launched a major update to its privacy policy in August 2016, it started sharing user information and metadata with Facebook. At that time, the messaging service offered its billion existing users 30 days to opt out of at least some of the sharing. If you chose to opt out at the time, WhatsApp will continue to honor that choice. The feature is long gone from the app settings, but you can check whether you’re opted out through the “Request account info” function in Settings. 

Meanwhile, the billion-plus users WhatsApp has added since 2016, along with anyone who missed that opt-out window, have had their data shared with Facebook all this time. WhatsApp emphasized to WIRED that this week’s privacy policy changes do not actually impact WhatsApp’s existing practices or behavior around sharing data with Facebook. 

“Our updated Terms and Privacy Policy provide more information on how we process your data, and our commitment to privacy,” WhatsApp wrote on Monday. “As part of the Facebook Companies, WhatsApp partners with Facebook to offer experiences and integrations across Facebook’s family of apps and products.”


None of this has at any point impacted WhatsApp’s marquee feature: end-to-end encryption. Messages, photos, and other content you send and receive on WhatsApp can only be viewed on your smartphone and the devices of the people you choose to message with. WhatsApp and Facebook itself can’t access your communications. In fact, Facebook CEO Mark Zuckerberg has repeatedly affirmed his commitment to expanding end-to-end encryption offerings as part of tying the company’s different communication platforms together. But that doesn’t mean there isn’t still a trove of other data WhatsApp can collect and share about how you use the app. The company says it collects user information “to operate, provide, improve, understand, customize, support, and market our Services.”

In practice, this means that WhatsApp shares a lot of intel with Facebook, including account information like your phone number, logs of how long and how often you use WhatsApp, information about how you interact with other users, device identifiers, and other device details like IP address, operating system, browser details, battery health information, app version, mobile network, language and time zone. Transaction and payment data, cookies, and location information are also all fair game to share with Facebook, depending on the permissions you grant WhatsApp in the first place.

“WhatsApp is great for protecting the privacy of your message content,” says Johns Hopkins University cryptographer Matthew Green. “But it feels like the privacy of everything else you do is up for grabs.”
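Green’s point is easier to see with a concrete picture. The sketch below imagines the “envelope” around an encrypted message; the field names are invented for the example, not Facebook’s actual schema, but they mirror the categories listed above.

    # An illustration, not Facebook's actual schema: the message body
    # stays encrypted, but the envelope around it does not. All field
    # names here are invented for the example.
    from dataclasses import dataclass

    @dataclass
    class MessageEnvelope:
        sender_phone: str      # account identifier
        recipient_phone: str   # who you interact with
        timestamp: str         # when the message was sent
        device_ip: str         # network-level detail
        app_version: str       # device and app detail
        ciphertext: bytes      # unreadable without the endpoints' keys

    envelope = MessageEnvelope(
        sender_phone="+15550100",
        recipient_phone="+15550199",
        timestamp="2021-01-11T09:15:00Z",
        device_ip="203.0.113.7",
        app_version="2.21.1",
        ciphertext=b"\x9f\x02\xa4",
    )
    # A server that can never read the ciphertext can still build a
    # rich picture of who talks to whom, when, and from where.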

Facebook purchased WhatsApp in 2014 and noted at the time that it and the company’s chat platform Messenger would operate as “standalone” products. The slow shift toward integration has been controversial internally, and may have contributed to the departure in late 2017 and 2018, respectively, of WhatsApp cofounders Brian Acton and Jan Koum. A few months after leaving, Acton cofounded the nonprofit Signal Foundation. The organization maintains and develops the open source Signal Protocol, which WhatsApp and the secure messaging app Signal, among others, use to implement end-to-end encryption.

“Today privacy is becoming a much more mainstream discussion,” Acton said at the WIRED25 conference in 2019. “People are asking questions about privacy, and they want security and privacy built into the terms of service.”

Though this week’s WhatsApp privacy policy revisions don’t actually alter the messaging service’s behavior, it’s significant that users may have thought the company was offering an opt-out option all these years that didn’t actually exist. A level of data-sharing that some users disagree with and even fear has already been going on. Given the reality that Facebook has owned WhatsApp for the better part of a decade, this clarification seems to some like simply reckoning with the inevitable.

“I don’t trust any product made by Facebook,” says Evan Greer, deputy director of the digital rights group Fight for the Future. “Their business model is surveillance. Never forget that.”

Source: https://www.wired.com/story/whatsapp-facebook-data-share-notification/

Sex, Beer, and Coding: Inside Facebook’s Wild Early Days

Adam Fisher @ Wired Magazine Source

When the young Mark Zuckerberg moved to Palo Alto in 2004, he and his buddies built a corporate proto-culture that continues to influence the company today.

Mark Zuckerberg and his cofounders moved from Harvard to Palo Alto, California, in March 2004. The whole enterprise began as something of a lark. Scott Beale

This story is excerpted from Valley of Genius, by Adam Fisher.

Everyone who has seen The Social Network knows the story of Facebook’s founding. It was at Harvard in the spring semester of 2004. What people tend to forget, however, is that Facebook was only based in Cambridge for a few short months. Back then it was called TheFacebook.com, and it was a college-specific carbon copy of Friendster, a pioneering social network based in Silicon Valley.

Mark Zuckerberg’s knockoff site was a hit on campus, and so he and a few school chums decided to move to Silicon Valley after finals and spend the summer there rolling Facebook out to other colleges, nationwide. The Valley was where the internet action was. Or so they thought.

In Silicon Valley during the mid-aughts the conventional wisdom was that the internet gold rush was largely over. The land had been grabbed. The frontier had been settled. The web had been won. Hell, the boom had gone bust three years earlier. Yet nobody ever bothered to send the memo to Mark Zuckerberg—because at the time, Zuck was a nobody: an ambitious teenaged college student obsessed with the computer underground. He knew his way around computers, but other than that, he was pretty clueless—when he was still at Harvard someone had to explain to him that internet sites like Napster were actually businesses, built by corporations.


But Zuckerberg could hack, and that fateful summer he ended up meeting a few key Silicon Valley players who would radically change the direction of what was, at the time, a company in name only. For this oral history of those critical months back in 2004 and 2005, I interviewed all the key players and talked to a few other figures who had insight into the founding era. What emerged, as you’ll see, is a portrait of a corporate proto-culture that continues to exert an influence on Facebook today. The whole enterprise began as something of a lark: an un-corporation, an excuse for a summer of beer pong and code sprints. Indeed, Zuckerberg’s first business cards read, “I’m CEO … bitch.” The brogrammer ’tude was a joke … or was it?

Zuckerberg, photographed in March 2006 at the headquarters of Facebook in Palo Alto. His first business card read “I’m CEO … bitch.”

Elena Dorfman/Redux


Sean Parker (cofounder of Napster and first president of Facebook): The dotcom era sort of ended with Napster, then there’s the dotcom bust, which leads to the social media era.

Steven Johnson (noted author and cultural commentator): At the time, the web was fundamentally a literary metaphor: “pages”—and then these hypertext links between pages. There was no concept of the user; that was not part of the metaphor at all.

Mark Pincus (co-owner of the fundamental social media patent): I mark Napster as the beginning of the social web—people, not pages. For me that was the breakthrough moment, because I saw that the internet could be this completely distributed peer-to-peer network. We could disintermediate those big media companies and all be connected to each other.

Steven Johnson: To me it really started with blogging in the early 2000s. You started to have these sites that were oriented around a single person’s point of view. It suddenly became possible to imagine, Oh, maybe there’s another element here that the web could also be organized around? Like I trust these five people, I’d like to see what they are suggesting. And that’s kind of what early blogging was like.

Ev Williams (founder of Blogger, Twitter, and Medium): Blogs then were link heavy and mostly about the internet. “We’re on the internet writing about the internet, and then linking to more of the internet, and isn’t that fun?”

Steven Johnson: You would pull together a bunch of different voices that would basically recommend links to you, and so there was a personal filter.

Mark Pincus: In 2002 Reid Hoffman and I started brainstorming: What if the web could be like a great cocktail party? Where you can walk away with these amazing leads, right? And what’s a good lead? A good lead is a job, an interview, a date, an apartment, a house, a couch.

And so Reid and I started saying, “Wow, this people web could actually generate something more valuable than Google, because you’re in this very, very highly vetted community that has some affinity to each other, and everyone is there for a reason, so you have trust.” The signal-to-noise ratio could be very high. We called it Web 2.0, but nobody wanted to hear about it, because this was in the nuclear winter of the consumer internet.

Sean Parker: So during the period between 2000 and 2004, kind of leading up to Facebook, there is this feeling that everything that there was to be done with the internet has already been done. The absolute bottom is probably around 2002. PayPal goes public in 2002, and it’s the only consumer internet IPO. So there’s this weird interim period where there’s a total of only six companies funded or something like that. Plaxo was one of them. Plaxo was a proto–social network. It’s this in-between thing: some kind of weird fish with legs.

Aaron Sittig (graphic designer who invented the Facebook “like”): Plaxo is the missing link. Plaxo was the first viral growth company to really succeed intentionally. This is when we really started to understand viral growth.

Sean Parker: The most important thing I ever worked on was developing algorithms for optimizing virality at Plaxo.

Aaron Sittig: Viral growth is when people using the product spreads the product to other people—that’s it. It’s not people deciding to spread the product because they like it. It’s just people in the natural course of using the software to do what they want to do, naturally spreading it to other people.
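Sittig’s definition maps onto a simple, standard growth model. The toy simulation below is not Plaxo’s actual algorithm, whose details were never published; it just computes the so-called viral coefficient k, the number of new users each existing user brings in. Above 1.0, every wave of users recruits a bigger one.

    # A standard viral-coefficient toy model, not Plaxo's actual
    # (unpublished) algorithm. k = invites sent per user * conversion
    # rate; when k > 1, each wave of users recruits a bigger one.
    def simulate_viral_growth(seed_users, invites_per_user, conversion_rate, generations):
        k = invites_per_user * conversion_rate  # the viral coefficient
        total = new = float(seed_users)
        for g in range(1, generations + 1):
            new = new * k            # users recruited by the previous wave
            total += new
            print(f"generation {g}: {new:.0f} new users, {total:.0f} total")
        return total

    # With 4 invites per user and a 30% conversion rate, k = 1.2,
    # so growth compounds; below k = 1.0 the product fizzles out.
    simulate_viral_growth(1000, 4, 0.3, generations=5)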

Sean Parker: There was an evolution that took place from the sort of earliest proto–social network, which is probably Napster, to Plaxo, which only sort of resembled a social network but had many of the characteristics of one, then to LinkedIn, MySpace, and Friendster, then to this modern network which is Facebook.

Ezra Callahan (one of Facebook’s very first employees): In the early 2000s, Friendster gets all the early adopters, has a really dense network, has a lot of activity, and then just hits this breaking point.

Aaron Sittig: There was this big race going on and Friendster had really taken off, and it really seemed like Friendster had invented this new thing called “social networking,” and they were the winner, the clear winner. And it’s not entirely clear what happened, but the site just started getting slower and slower and at some point it just stopped working.

Ezra Callahan: And that opens the door for MySpace.

Ev Williams: MySpace was a big deal at the time.

Sean Parker: It was a complicated time. MySpace had very quickly taken over the world from Friendster. They’d seized the mantle. So Friendster was declining, MySpace was ascending.

Scott Marlette (programmer who put photo tagging on Facebook): MySpace was really popular, but then MySpace had scaling trouble, too.

Aaron Sittig: Then pretty much unheralded and not talked about much, Facebook launched in February of 2004.

Dustin Moskovitz (Zuckerberg’s original right-hand man): Back then there was a really common problem that now seems trivial. It was basically impossible to think of a person by name and go and look up their picture. All of the dorms at Harvard had individual directories called face books—some were printed, some were online, and most were only available to the students of that particular dorm. So we decided to create a unified version online and we dubbed it “The Facebook” to differentiate it from the individual ones.

Zuckerberg, left, cofounded Facebook with his Harvard roommate, Dustin Moskovitz, center. Sean Parker, right, joined the company as president in 2004. The trio was photographed in the company’s Palo Alto office in May 2005.

Jim Wilson/New York Times/Redux

Mark Zuckerberg (Facebook’s founder and current CEO): And within a couple weeks, a few thousand people had signed up. And we started getting emails from people at other colleges asking for us to launch it at their schools.

Ezra Callahan: Facebook launched at the Ivy Leagues originally, and it wasn’t because they were snooty, stuck-up kids who only wanted to give things to the Ivy Leagues. It was because they had this intuition that people who go to the Ivy Leagues are more likely to be friends with kids at other Ivy League schools.

Aaron Sittig: When Facebook launched at Berkeley, the rules of socializing just totally transformed. When I started at Berkeley, the way you found out about parties was you spent all week talking to people figuring out what was interesting, and then you’d have to constantly be in contact. With Facebook there, knowing what was going on on the weekend was trivial. It was just all laid out for you.

Facebook came to the Stanford campus—in the heart of Silicon Valley—quite early: March 2004.


Sean Parker: My roommates in Portola Valley were all going to Stanford.

Ezra Callahan: So I was a year out of Stanford, I graduated Stanford in 2003, and me and four of my college friends rented a house for that year just near the campus, and we had an extra bedroom available, and so we advertised around on a few Stanford email lists to find a roommate to move into that house with us. We got a reply from this guy named Sean Parker. He ended up moving in with us pretty randomly, and we discovered that while Napster had been a cultural phenomenon, it didn’t make him any money.

Sean Parker: And so the girlfriend of one of my roommates was using a product, and I was like, “You know, that looks a lot like Friendster or MySpace.” She’s like, “Oh yes, well, nobody in college uses MySpace.” There was something a little rough about MySpace.

Mark Zuckerberg: So MySpace had almost a third of their staff monitoring the pictures that got uploaded for pornography. We hardly ever have any pornography uploaded. The reason is that people use their real names on Facebook.

Adam D’Angelo (Zuckerberg’s high school hacking buddy): Real names are really important.

Aaron Sittig: We got this clear early on because of something that was established as a community principle at the Well: You own your own words. And we took it farther than the Well. We always had everything be traceable back to a specific real person.

Stewart Brand (founder of the Well, the first important social networking site): The Well could have gone that route, but we did not. That was one of the mistakes we made.

Mark Zuckerberg: And I think that that’s a really simple social solution to a possibly complex technical issue.

Ezra Callahan: In this early period, it’s a fairly hacked-together, simple website: just basic web forms, because that’s what Facebook profiles are.

Ruchi Sanghvi (coder who created Facebook’s Newsfeed): There was a little profile pic, and it said things like, “This is my profile” and “See my friends,” and there were three or four links and one or two other boxes below that.

Aaron Sittig: But I was really impressed by how focused and clear their product was. Small details—like when you went to your profile, it really clearly said, “This is you,” because social networking at the time was really, really hard to understand. So there was a maturity in the product that you don’t typically see until a product has been out there for a couple of years and been refined.

Sean Parker: So I see this thing, and I emailed some email address at Facebook, and I basically said, “I’ve been working with Friendster for a while, and I’d just like to meet you guys and see if maybe there’s anything to talk about.” And so we set up this meeting in New York—I have no idea why it was in New York—and Mark and I just started talking about product design and what I thought the product needed.

Aaron Sittig: I got a call from Sean Parker and he said, “Hey, I’m in New York. I just met with this kid Mark Zuckerberg, who is very smart, and he’s the guy building Facebook, and they say they have a ‘secret feature’ that’s going to launch that’s going to change everything! But he won’t tell me what it is. It’s driving me crazy. I can’t figure out what it is. Do you know anything about this? Can you figure it out? What do you think it could be?” And so we spent a little time talking about it, and we couldn’t really figure out what their “secret feature” that was going to change everything was. We got kind of obsessed about it.

Two months after meeting Sean Parker, Mark Zuckerberg moved to Silicon Valley with the idea of turning his dorm-room project into a real business. Accompanying him were his cofounder and consigliere, Dustin Moskovitz, and a couple of interns.

Mark Zuckerberg: Palo Alto was kind of like this mythical place where all the tech used to come from. So I was like, I want to check that out.

Ruchi Sanghvi: I was pretty surprised when I heard Facebook moved to the Bay Area, I thought they were still at Harvard working out of the dorms.

Zuckerberg recruited fellow Harvard student Chris Hughes in the early days of Facebook to help make suggestions about the fledgling service. The two were photographed at Eliot House in May 2004.

Rick Friedman/Getty Images


Ezra Callahan: Summer of 2004 is when that fateful series of events took place: that legendary story of Sean running into the Facebook cofounders on the street, having met them a couple months earlier on the East Coast. That meeting happened a week after we all moved out of the house we had been living in together. Sean was crashing with his girlfriend’s parents.

Sean Parker: I was walking outside the house, and there was this group of kids walking toward me—they were all wearing hoodies and they looked like they were probably pot-smoking high-school kids just out making trouble, and I hear my name. I’m like, Oh, it’s coincidence, and I hear my name again and I turn around and it’s like, “Sean, what are you doing here?”

It took me about 30 seconds to figure out what was going on, and I finally realize that it’s Mark and Dustin and a couple of other people, too. So I’m like, “What are you guys doing here?” And they’re like, “We live right there.” And I’m like, “That’s really weird, I live right here!” This is just super weird.

Aaron Sittig: I get a call from Sean and he’s telling me, “Hey, you won’t believe what’s just happened.” And Sean said, “You’ve got to come over and meet these guys. Just leave right now. Just come over and meet them!”

Sean Parker: And so I don’t even know what happened from there, other than that it just became very convenient for me to go swing by the house. It wasn’t even a particularly formal relationship.

Aaron Sittig: So I went over and met them, and I was really impressed by how focused they were as a group. They’d occasionally relax and go do their thing, but for the most part they spent all their time sitting at a kitchen table with their laptops open. I would go visit their place a couple times a week, and that was always where I’d find them, just sitting around the kitchen table working, constantly, to keep their product growing.

All Mark wanted to do was either make the product better, or take a break and relax so that you could get enough energy to go work on the product more. That’s it. They never left that house except to go watch a movie.

Ezra Callahan: The early company culture was very, very loose. It felt like a project that’s gotten out of control and has this amazing business potential. Imagine your freshman dorm running a business, that’s really what it felt like.

Mark Zuckerberg: Most businesses aren’t like a bunch of kids living in a house, doing whatever they want, not waking up at a normal time, not going into an office, hiring people by, like, bringing them into your house and letting them chill with you for a while and party with you and smoke with you.

Ezra Callahan: The living room was the office with all these monitors and workstations set up everywhere and just whiteboards as far as the eye can see.

At the time Mark Zuckerberg was obsessed with file sharing, and the grand plan for his Silicon Valley summer was to resurrect Napster. It would rise again, but this time as a feature inside of Facebook. The name of Zuckerberg’s pet project? Wirehog.

Aaron Sittig: Wirehog was the secret feature that Mark had promised was going to change everything. Mark had gotten convinced that what would make Facebook really popular and just sort of cement its position at schools was a way to send files around to other people—mostly just to trade music.

Mark Pincus: They built in this little thing that looked like Napster—you could see what music files someone had on their computer.

Ezra Callahan: This is at a time when we have just watched Napster get completely terminated by the courts and the entertainment industry is starting to sue random individuals for sharing files. The days of the Wild West were clearly ending.

Aaron Sittig: It’s important to remember that Wirehog was happening at a time where you couldn’t even share photos on your Facebook page. Wirehog was going to be the solution for sharing photos with other people. You could have a box on your profile and people could go there to get access to all your photos that you were sharing—or whatever files you were sharing. It might be audio files, it might be video files, it might be photos of their vacation.

Ezra Callahan: But at the end of the day it’s just a file-sharing service. When I joined Facebook, most people had already kind of come around to the idea that unless some new use comes up for Wirehog that we haven’t thought of, it’s just a liability. “We’re going to get sued someday, so what’s the point?” That was the mentality.

Mark Pincus: I was kind of wondering why Sean wanted to go anywhere near music again.

Aaron Sittig: My understanding was that some of Facebook’s lawyers advised that it would be a bad idea. And that work on Wirehog was kind of abandoned just as Facebook user growth started to grow really quickly.

Ezra Callahan: They had this insane demand to join. It’s still only at a hundred schools, but everyone in college has already heard of this, at all schools across the country. The usage numbers were already insane. Everything on the whiteboards was just all stuff related to what schools were going to launch next. The problem was very singular. It was simply, “How do we scale?”


Aaron Sittig: Facebook would launch at a school, and within one day they would have 70 percent of undergrads signed up. At the time, nothing had ever grown as fast as Facebook.

Ezra Callahan: It did not seem inevitable that we were going to succeed, but the scope of what success looked like was becoming clear. Dustin was already talking about being a billion-dollar company. They had that ambition from the very beginning. They were very confident: two 19-year-old cocky kids.

Mark Zuckerberg: We just all kind of sat around one day and were like, “We’re not going back to school, are we?” Nahhhh.

Ezra Callahan: The hubris seemed pretty remarkable.

David Choe (noted graffiti artist): And Sean is a skinny, nerdy kid and he’s like, “I’m going to go raise money for Facebook. I’m going to bend these fuckers’ minds.” And I’m like, “How are you going to do that?” And he transformed himself into an alpha male. He got like a fucking super-sharp haircut. He started working out every day, got a tan, got a nice suit. And he goes in these meetings and he got the money!

Mark Pincus: So it’s probably like September or October of 2004, and I’m at Tribe’s offices in this dusty converted brick building in Potrero Hill—the idea of Tribe.net was like Friendster meets Craigslist—and we’re in our conference room, and Sean says he’s bringing the Facebook guy in. And he brings Zuck in, and Zuck is in a pair of sweatpants, and these Adidas flip-flops that he wore, and he’s so young looking and he’s sitting there with his feet up on the table, and Sean is talking really fast about all the things Facebook is going to do and grow and everything else, and I was mesmerized.

Because I’m doing Tribe, and we are not succeeding, we’ve plateaued and we’re hitting our head against the wall trying to figure out how to grow, and here’s this kid, who has this simple idea, and he’s just taking off! I was kind of in awe already of what they had accomplished, and maybe a little annoyed by it. Because they did something simpler and quicker and with less, and then I remember Sean got on the computer in my office, and he pulled up The Facebook, and he starts showing it to me, and I had never been able to be on it, because it’s college kids only, and it was amazing.

People are putting up their phone numbers and home addresses and everything about themselves and I was like, I can’t believe it! But it was because they had all this trust. And then Sean put together an investment round quickly, and he had advised Zuck to, I think, take $500,000 from Peter Thiel, and then $38,000 each from me and Reid Hoffman. Because we were basically the only other people doing anything in social networking. It was a very, very small little club at the time.

Ezra Callahan: By December it’s—I wouldn’t say it’s like a more professional atmosphere, but all the kids that Mark and Dustin were hanging out with are either back at school back East or back at Stanford, and work has gotten a little more serious for them. They are working more than they were that first summer. We don’t move into an office until February of 2005. And right as we were signing the lease, Sean just randomly starts saying, “Dude! I know this street artist guy. We’re going to come in and have him totally do it up.”

David Choe: I was like, “If you want me to paint the entire building it’s going to be $60,000.” Sean’s like, “Do you want cash or do you want stock?”

Ezra Callahan: He pays David Choe in Facebook shares.

David Choe: I didn’t give a shit about Facebook or even know what it was. You had to have a college email to get on there. But I like to gamble, you know? I believed in Sean. I’m like, This kid knows something and I am going to bet my money on him.

Ezra Callahan: So then we move in, and when you first saw this graffiti it was like, “Holy shit, what did this guy do to the office?” The office was on the second floor, so as you walk in you immediately have to walk up some stairs, and on the big 10-foot-high wall facing you is just this huge buxom woman with enormous breasts wearing this Mad Max–style costume riding a bulldog.

It’s the most intimidating, totally inappropriate thing. “God damn it, Sean! What did you do?” It’s not so much that we set out to paint that, because that was the culture. It was more that Sean just did it, and that set a tone for us. A huge-breasted warrior woman riding a bulldog is the first thing you see as you come in the office, so like, get ready for that!

Ruchi Sanghvi: Yes, the graffiti was a little racy, but it was different, it was vibrant, it was alive. The energy was just so tangible.

Katie Geminder (project manager for early Facebook): I liked it, but it was really intense. There was certain imagery in there that was very sexually charged, which I didn’t really care about but that could be considered a little bit hostile, and I think we took care of some of the more provocative ones.

Ezra Callahan: I don’t think it was David Choe, I think it was Sean’s girlfriend who painted this explicit, intimate lesbian scene in the woman’s restroom of two completely naked women intertwined and cuddling with each other—not graphic, but certainly far more suggestive than what one would normally see in a women’s bathroom in an office. That one only actually lasted a few weeks.

Max Kelly (Facebook’s first cyber-security officer): There was a four-inch by four-inch drawing of someone getting fucked. One of the customer service people complained that it was “sexual in nature,” which, given what they were seeing every day, I’m not sure why they would complain about this. But I ended up going to a local store and buying a gold paint pen and defacing the graffiti—just a random design— so it didn’t show someone getting fucked.

Jeff Rothschild (investor turned Facebook employee): It was wild, but I thought that it was pretty cool. It looked a lot more like a college dorm or fraternity than it did a company.

Katie Geminder: There were blankets shoved in the corner and video games everywhere, and Nerf toys and Legos, and it was kind of a mess.

Jeff Rothschild: There’s a PlayStation. There’s a couple of old couches. It was clear people were sleeping there.

Karel Baloun (one of the earliest Facebook programmers): I’d probably stay there two or three nights a week. I won an award for “most likely to be found under your desk” at one of the employee gatherings.

Jeff Rothschild: They had a bar, a whole shelf with liquor, and after a long day people might have a drink.

Ezra Callahan: There’s a lot of drinking in the office. There would be mornings when I would walk in and hear beer cans move as I opened the door, and the office smells of stale beer and is just trashed.

Ruchi Sanghvi: They had a keg. There was some camera technology built on top of the keg. It basically detected presence and posted about who was present at the keg—so it would take your picture when you were at the keg, and post some sort of thing saying “so-and-so is at the keg.” The keg is patented.

Ezra Callahan: When we first moved in, the office door had this lock we couldn’t figure out, but the door would automatically unlock at 9 am every morning. I was the guy that had to get to the office by 9 to make sure nobody walked in and just stole everything, because no one else was going to get there before noon. All the Facebook guys are basically nocturnal.

Katie Geminder: These kids would come in—and I mean kids, literally they were kids—they’d come into work at 11 or 12.

Ruchi Sanghvi: Sometimes I would walk to work in my pajamas and that would be totally fine. It felt like an extension of college; all of us were going through the same life experiences at the same time. Work was fantastic. It was so interesting. It didn’t feel like work. It felt like we were having fun all the time.

Ezra Callahan: You’re hanging out. You’re drinking with your coworkers. People start dating within the office …


Ruchi Sanghvi: We found our significant others while we were at Facebook. All of us eventually got married. Now we’re in this phase where we’re having children.

Katie Geminder: If you look at the adults that worked at Facebook during those first few years—like, anyone over the age of 30 that was married—and you do a survey, I tell you that probably 75 percent of them are divorced.

Max Kelly: So, lunch would happen. The caterer we had was mentally unbalanced and you never knew what the fuck was going to show up in the food. There were worms in the fish one time. It was all terrible. Usually, I would work until about 3 in the afternoon and then I’d do a circuit through the office to try and figure out what the fuck was going to happen that night. Who was going to launch what? Who was ready? What rumors were going on? What was happening?

Steve Perlman (Silicon Valley veteran who started in the Atari era): We shared a break room with Facebook. We were building hardware: a facial capture technology. The Facebook guys were doing some HTML thing. They would come in late in the morning. They’d have a catered lunch. Then they leave usually by mid-afternoon. I’m like, man, that is the life! I need a startup like that. You know? And the only thing any of us could think about Facebook was: Really nice people but never going to go anywhere.

Max Kelly: Around 4 I’d have a meeting with my team, saying “here’s how we’re going to get fucked tonight.” And then we’d go to the bar. Between like 5 and 8-ish people would break off and go to different bars up and down University Avenue, have dinner, whatever.

Ruchi Sanghvi: And we would all sit together and have these intellectual conversations: “Hypothetically, if this network was a graph, how would you weight the relationship between two people? How would you weight the relationship between a person and a photo? What does that look like? What would this network eventually look like? What could we do with this network if we actually had it?”

Sean Parker: The “social graph” is a math concept from graph theory, but it was a way of trying to explain to people who were kind of academic and mathematically inclined that what we were building was not a product so much as it was a network composed of nodes with a lot of information flowing between those nodes. That’s graph theory. Therefore we’re building a social graph. It was never meant to be talked about publicly. It was a way of articulating to somebody with a math background what we were building.

Ruchi Sanghvi: In retrospect, I can’t believe we had those conversations back then. It seems like such a mature thing to be doing. We would sit around and have these conversations and they weren’t restricted to certain members of the team; they weren’t tied to any definite outcome. It was purely intellectual and was open to everyone.
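
For the technically inclined: the weighting questions Sanghvi describes map directly onto the weighted graph Parker invokes, with people and photos as nodes and relationships as scored edges. Here is a minimal, purely illustrative sketch of that idea; the names, weights, and scoring rule are invented for this sketch, not anything Facebook actually built.

```python
# A toy illustration of the "social graph": nodes for people (and photos),
# weighted edges for relationships. All names and weights are invented.
from collections import defaultdict

class SocialGraph:
    def __init__(self):
        # adjacency map: node -> {neighbor: edge weight}
        self.edges = defaultdict(dict)

    def connect(self, a, b, weight):
        """Record a weighted, symmetric relationship between two nodes."""
        self.edges[a][b] = weight
        self.edges[b][a] = weight

    def affinity(self, a, b):
        """Direct edge weight, or 0 if the nodes are unconnected."""
        return self.edges[a].get(b, 0.0)

graph = SocialGraph()
graph.connect("alice", "bob", 0.9)       # close friends: high weight
graph.connect("alice", "photo_42", 0.5)  # a person-to-photo edge, as in the quote
print(graph.affinity("alice", "bob"))    # 0.9
```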

Max Kelly: People were still drinking the whole time, like all night, but starting around 9, it really starts solidifying: “What are we going to release tonight? Who’s ready to go? Who’s not ready to go?” By about 11-ish we’d know what we were going to do that night.

Katie Geminder: There was an absence of process that was mind-blowing. There would be engineers working stealthily on something that they were passionate about. And then they’d ship it in the middle of the night. No testing—they would just ship it.

Ezra Callahan: Most websites have these very robust testing platforms so that they can test changes. That’s not how we did it.

Ruchi Sanghvi: With the push of a button you could push out code to the live site, because we truly believed in this philosophy of “move fast and break things.” So you shouldn’t have to wait to do it once a week, and you shouldn’t have to wait to do it once a day. If your code was ready you should be able to push it out live to users. And that was obviously a nightmare.

Katie Geminder: Can our servers stand up to something? Or security: How about testing a feature for security holes? It really was just shove it out there and see what happens.

Jeff Rothschild: That’s the hacker mentality: You just get it done. And it worked great when you had 10 people. By the time we got to 20, or 30, or 40, I was spending a lot of time trying to keep the site up. And so we had to develop some level of discipline.

Ruchi Sanghvi: So then we would only push out code in the middle of the night, and that’s because if we broke things it wouldn’t impact that many people. But it was terrible because we were up until like 3 or 4 am every night, because the act of pushing just took everybody who had committed any code to be present in case anything broke.

Max Kelly: Around 1 am, we’d know either we’re fucked or we’re good. If we were good, everyone would be like “whoopee” and might be able to sleep for a little while. If we were fucked then we were like, “OK, now we’ve got to try and claw this thing back or fix it.”

Katie Geminder: 2 am: That was when shit happened.

Ruchi Sanghvi: Then another push, and this would just go on and on and on and on and on until like 3 or 4 or 5 am in the night.

Max Kelly: If 4 am rolled around and we couldn’t fix it, I’d be like, “We’re going to try and revert it.” Which meant basically my team would be up till 6 am. So, go to bed somewhere between 4 and 6, and then repeat every day for like nine months. It was crazy.

Jeff Rothschild: It was seven days a week. I was on all the time. I would drink a large glass of water before I went to sleep to assure that I’d wake up in two hours so I could go check everything and make sure that we hadn’t broken it in the meantime. It was all day, all night.

Katie Geminder: That was very challenging for someone who was trying to actually live an adult life with, like, a husband. There was definitely a feeling that because you were older and married and had a life outside of work that you weren’t committed.

Mark Zuckerberg: Why are most chess masters under 30? Young people just have simpler lives. We may not own a car. We may not have family … I only own a mattress.

Katie Geminder: Imagine being over 30 and hearing your boss say that!

Mark Zuckerberg: Young people are just smarter.

Ruchi Sanghvi: We were so young back then. We definitely had tons of energy and we could do it, but we weren’t necessarily the most efficient team by any means whatsoever. It was definitely frustrating for senior leadership, because a lot of the conversations happened at night when they weren’t around, and then the next morning they would come in to all of these changes that happened at night. But it was fun when we did it.


Ezra Callahan: For the first few hundred employees, almost all of them were already friends with someone working at the company, both within the engineering circle and also the user support people. It’s a lot of recent grads. When we move into the office was when the dorm room culture starts to really stick out and also starts to break a little bit. It has a dorm room feeling, but it’s not completely dominated by college kids. The adults are coming in.

Jeff Rothschild: I joined in May 2005. On the sidewalk outside the office was the menu board from a pizza parlor. It was a caricature of a chef with a blackboard below it, and the blackboard had a list of jobs. This was the recruiting effort.

Sean Parker: At the time there was a giant sucking sound in the universe, and it was called Google. All the great engineers were going to Google.

Kate Losse (early customer service rep): I don’t think I could have stood working at Google. To me Facebook seemed much cooler than Google, not because Facebook was necessarily like the coolest. It’s just that Google at that point already seemed nerdy in an uninteresting way, whereas like Facebook had a lot of people who didn’t actually want to come off as nerds. Facebook was a social network, so it has to have some social components that are like really normal American social activities—like beer pong.

Katie Geminder: There was a house down the street from the office where five or six of the engineers lived that was one ongoing beer pong party. It was like a boys’ club—although it wasn’t just boys.

Terry Winograd (noted Stanford computer-science professor): The way I would put it is that Facebook is more of an undergraduate culture and Google is more of a graduate student culture.

Jeff Rothschild: Before I walked in the door at Facebook, I thought these guys had created a dating site. It took me probably a week or two before I really understood what it was about. Mark, he used to tell us that we are not a social network. He would insist: “This is not a social network. We’re a social utility for people you actually know.”

MySpace was about building an online community among people who had similar interests. We might look the same because at some level it has the same shape, but what it accomplishes for the individual is solving a different problem. We were trying to improve the efficiency of communication among friends.

Max Kelly: Mark sat down with me and described to me what he saw Facebook being. He said, “It’s about connecting people and building a system where everyone who makes a connection to your life that has any value is preserved for as long as you want it to be preserved. And it doesn’t matter where you are, or who you’re with, or how your life changes: because you’re always in connection with the people that matter the most to you, and you’re always able to share with them.”

I heard that, and I thought, I want to be a part of this. I want to make this happen. Back in the ’90s all of us were utopian about the internet. This was almost a harkening back to the beautiful internet where everyone would be connected and everyone could share and there was no friction to doing that. Facebook sounded to me like the same thing. Mark was too young to know that time, but I think he intrinsically understood what the internet was supposed to be in the ’80s and in the ’90s. And here I was hearing the same story again and conceivably having the ability to help pull it off. That was very attractive.

Aaron Sittig: So in the summer of 2005 Mark sat us all down and he said, “We’re going to do five things this summer.” He said, “We’re redesigning the site. We’re doing a thing called News Feed, which is going to tell you everything your friends are doing on the site. We’re going to launch Photos, we’re going to redo Parties and turn it into Events, and we’re going to do a local-businesses product.” And we got one of those things done, we redesigned the site. Photos was my next project.

Ezra Callahan: The product at Facebook at the time is dead simple: profiles. There is no News Feed, there was a very weak messaging system. They had a very rudimentary events product you could use to organize parties. And almost no other functions to speak of. There’s no photos on the website, other than your profile photo. There’s nothing that tells you when anything on the site has changed. You find out somebody changed their profile picture by obsessively going to their profile and noticing, Oh, the picture changed.

Aaron Sittig: We had some people that were changing their profile picture once an hour, just as a way of sharing photos of themselves.

Scott Marlette: At the time photos was the number-one most requested feature. So, Aaron and I go into a room and whiteboard up some wireframes for some pages and decide on what data needs to get stored. In a month we had a nearly fully functioning prototype internally to play with. It was very simple. It was: You post a photo, it goes in an album, you have a set of albums, and then you can tag people in the photos.
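
Marlette’s description implies a simple nested data model: users own albums, albums hold photos, photos carry tags. The sketch below is a loose, hypothetical rendering of it; the class and field names are invented here and are not Facebook’s actual schema.

```python
# A minimal sketch of the photo/album/tag model Marlette describes.
# Field names are illustrative guesses, not Facebook's real schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Photo:
    url: str
    tagged_users: List[str] = field(default_factory=list)  # IDs of people tagged

@dataclass
class Album:
    title: str
    photos: List[Photo] = field(default_factory=list)

@dataclass
class User:
    user_id: str
    albums: List[Album] = field(default_factory=list)

# Post a photo into an album and tag a friend in it.
me = User("ezra")
party = Album("Housewarming")
me.albums.append(party)
pic = Photo("https://example.com/pic1.jpg")
party.photos.append(pic)
pic.tagged_users.append("dustin")  # tagging is what drives the notification email
```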

Jeff Rothschild: Aaron had the insight to do tagging, which was a tremendously valuable insight. It was really a game changer.

Aaron Sittig: We thought the key feature is going to be saying who is in the photo. We weren’t sure if this was really going to be that successful; we just felt good about it.

Facebook Photos went live in October 2005. There were about 5 million users, virtually all of them college students.

Scott Marlette: We launched it at Harvard and Stanford first, because that’s where our friends were.

Zuckerberg started coding while growing up in Dobbs Ferry, New York, where he was raised by his parents, Edward and Karen, along with his sisters Randi, left, and Arielle, right. (Photo: Sherry Tesler/New York Times/Redux)

Aaron Sittig: We had built this program that would fill up a TV screen and show us everything that was being uploaded to the service, and then we flicked it on and waited for photos to start coming in. And the first photos that came in were Windows wallpapers: Someone had just uploaded all their wallpaper files from their Windows directory, which was a big disappointment, like, Oh no, maybe people don’t get it? Maybe this is not going to work?

But the next photos were of a guy hanging out with his friends, and then the next photos after that were a bunch of girls in different arrangements: three girls together, these four girls together, two of them together, just photos of them hanging out at parties, and then it just didn’t stop.

Max Kelly: You were at every wedding, you were at every bar mitzvah, you were seeing all this awesome stuff, and then there’s a dick. So, it was kind of awesome and shitty at the same time.

Aaron Sittig: Within the first day someone had uploaded and tagged themselves in 700 photos, and it just sort of took off from there.

Jeff Rothschild: Inside of three months, we were delivering more photos than any other website on the internet. Now you have to ask yourself: Why? And the answer was tagging. There isn’t anyone who could get an email message that said, “Someone has uploaded a photo of you to the internet”—and not go take a look. It’s just human nature.

Ezra Callahan: The single greatest growth mechanism ever was photo tagging. It shaped all of the rest of the product decisions that got made. It was the first time that there was a real fundamental change to how people used Facebook, the pivotal moment when the mindset of Facebook changes and the idea for News Feed starts to germinate and there is now a reason to see how this expands beyond college.


Jeff Rothschild: The News Feed project was started in the fall of 2005 and delivered in the fall of 2006.

Dustin Moskovitz: News Feed is the concept of viral distribution, incarnate.

Ezra Callahan: News Feed is what Facebook fundamentally is today.

Sean Parker: Originally it was called “What’s New,” and it was just a feed of all of the things that were happening in the network—really just a collection of status updates and profile changes that were occurring.

Katie Geminder: It was an aggregation, a collection of all those stories, with some logic built into it because we couldn’t show you everything that was going on. There were sort of two streams: things you were doing and things the rest of your network was doing.

Ezra Callahan: So News Feed is the first time that your homepage, rather than being static and boring and useless, is going to be this constantly updating “newspaper,” so to speak, of stuff happening on Facebook around you that we think you’ll care about.

Ruchi Sanghvi: And it was a fascinating idea, because normally when you think of newspapers, they have this editorialized content where they decide what they want to say, what they want to print, and they do it the previous night, and then they send these papers out to thousands if not hundreds of thousands of people. But in the case of Facebook, we were building 10 million different newspapers, because each person had a personalized version of it.

Ezra Callahan: It really was the first monumental product-engineering feat. The amount of data it had to deal with: all these changes and how to propagate that on an individual level.

Ruchi Sanghvi: We were working on it off and on for a year and a half.

Ezra Callahan: … and then the intelligence side of all this stuff: How do we surface the things that you’ll care about most? These are very hard problems engineering-wise.

Ruchi Sanghvi: Without realizing it, we ended up building one of the largest distributed systems in software at that point in time. It was pretty cutting-edge.
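
To make the “ten million newspapers” idea concrete, here is a toy sketch of one common approach: fan each new story out to friends’ feeds at write time, then rank each reader’s feed by affinity. Facebook’s actual News Feed backend was, as Sanghvi notes, a large distributed system; every name and number below is invented for illustration.

```python
# Toy fan-out-on-write feed: each published story is copied into every
# friend's inbox with a relevance score, then sorted at read time.
from collections import defaultdict

friends = {"alice": ["bob", "carol"], "bob": ["alice"], "carol": ["alice"]}
affinity = {("bob", "alice"): 0.9, ("carol", "alice"): 0.4}  # reader -> author

inboxes = defaultdict(list)  # reader -> list of (score, story)

def publish(author, story):
    """Fan a story out to every friend's personal feed."""
    for reader in friends.get(author, []):
        score = affinity.get((reader, author), 0.1)
        inboxes[reader].append((score, story))

def feed(reader, limit=10):
    """One reader's 'newspaper': their stories, highest affinity first."""
    return [story for _, story in sorted(inboxes[reader], reverse=True)[:limit]]

publish("alice", "Alice changed her profile picture")
print(feed("bob"))  # ['Alice changed her profile picture']
```

The hard problems Callahan mentions show up exactly here: at Facebook’s scale, both the fan-out writes and the ranking logic have to be spread across many machines, which is how the team ended up building one of the larger distributed systems of the era.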

Ezra Callahan: We have it in-house and we play with it for weeks and weeks—which is really unusual.

Katie Geminder: So I remember being like, “OK, you guys, we have to do some level of user research,” and I finally convinced Zuck that we should bring users into a lab and sit behind the glass and watch our users using the product. And it took so much effort for me to get Dustin and Zuck and other people to go and actually watch this. They thought this was a waste of time. They were like, “No, our users are stupid.” Literally those words came out of somebody’s mouth.

Ezra Callahan: It’s the very first time we actually bring in outside people to test something for us, and their reaction, their initial reaction is clear. People are just like, “Holy shit, like, I shouldn’t be seeing this, like this doesn’t feel right,” because immediately you see this person changed their profile picture, this person did this, this person did that, and your first instinct is Oh my God! Everybody can see this about me! Everyone knows everything I’m doing on Facebook.

Max Kelly: But News Feed made perfect sense to all of us, internally. We all loved it.

Ezra Callahan: So in-house we have this idea that this isn’t going to go right: This is too jarring a change, it needs to be rolled out slowly, we need to warm people up to this—and Mark is just firmly committed. “We’re just going to do this. We’re just going to launch. It’s like ripping off a Band-Aid.”

Ruchi Sanghvi: We pushed the product in the dead of the night, we were really excited, we were celebrating, and then the next morning we woke up to all this pushback. I had written this blog post, “Facebook Gets a Facelift.”

Katie Geminder: We wrote a little letter, and at the bottom of it we put a button. And the button said, “Awesome!” Not like, “OK.” It was, “Awesome!” That’s just rude. I wish I had a screenshot of that. Oh man! And that was it. You landed on Facebook and you got the feature. We gave you no choice and not a great explanation and it scared people.

Jeff Rothschild: People were rattled because it just seemed like it was exposing information that hadn’t been visible before. In fact, that wasn’t the case. Everything shown in News Feed was something people put on the site that would have been visible to everyone if they had gone and visited that profile.

Ruchi Sanghvi: Users were revolting. They were threatening to boycott the product. They felt that they had been violated, and that their privacy had been violated. There were students organizing petitions. People had lined up outside the office. We hired a security guard.

Katie Geminder: There were camera crews outside. There were protests: “Bring back the old Facebook!” Everyone hated it.

Jeff Rothschild: There was such a violent reaction to it. We had people marching on the office. A Facebook group was organized protesting News Feed and inside of two days, a million people joined.

Ruchi Sanghvi: There was another group that was about how “Ruchi is the devil,” because I had written that blog post.

Max Kelly: The user base fought it every step of the way and would pound us, pound Customer Service, and say, “This is fucked up! This is terrible!”

Ezra Callahan: We’re getting emails from relatives and friends. They’re like, “What did you do? This is terrible! Change it back.”

Katie Geminder: We were sitting in the office and the protests were going on outside and it was, “Do we roll it back? Do we roll it back!?”

Ruchi Sanghvi: Now under usual circumstances if about 10 percent of your user base starts to boycott the product, you would shut it down. But we saw a very unusual pattern emerge.

Max Kelly: Even the same people who were telling us that this is terrible, we’d look at their user stream and be like: You’re fucking using it constantly! What are you talking about?

Ruchi Sanghvi: Despite the fact that there were these revolts and these petitions and people were lined up outside the office, they were digging the product. They were actually using it, and they were using it twice as much as before News Feed.

Ezra Callahan: It was just an emotionally devastating few days for everyone at the company. Especially for the set of people who had been waving their arms saying, “Don’t do this! Don’t do this!” because they feel like, “This is exactly what we told you was going to happen!”

Ruchi Sanghvi: Mark was on his very first press tour on the East Coast, and the rest of us were in the Palo Alto office dealing with this and looking at these logs and seeing the engagement and trying to communicate that “It’s actually working!,” and to just try a few things before we chose to shut it down.

Katie Geminder: We had to push some privacy features right away to quell the storm.

Ruchi Sanghvi: We asked everyone to give us 24 hours.

Katie Geminder: We built this janky privacy “audio mixer” with these little slider bars where you could turn things on and off. It was beautifully designed—it looked gorgeous—but it was irrelevant.

Jeff Rothschild: I don’t think anyone ever used it.

Ezra Callahan: But it gets added and eventually the immediate reaction subsides and people realize that the News Feed is exactly what they wanted, this feature is exactly right, this just made Facebook a thousand times more useful.

Katie Geminder: Like Photos, News Feed was just—boom!—a major change in the product and one of those sea changes that just leveled it up.

Jeff Rothschild: Our usage just skyrocketed on the launch of News Feed. About the same time we also opened the site up to people who didn’t have a .edu address.

Ezra Callahan: Once it opens to the public, it’s becoming clear that Facebook is on its way to becoming the directory of all the people in the world.

Jeff Rothschild: Those two things together—that was the inflection point where Facebook became a massively used product. Prior to that we were a niche product for high-school and college students.

Mark Zuckerberg: Domination!

Ruchi Sanghvi: “Domination” was a big mantra of Facebook back in the day.

Max Kelly: I remember company meetings where we were chanting “dominate.”

Ezra Callahan: We had company parties all the time, and for a period in 2005, all Mark’s toasts at the company parties would end with “Domination!”

Mark Zuckerberg: Domination!!


Max Kelly: I especially remember the meeting where we tore up the Yahoo offer.

Mark Pincus: In 2006 Yahoo offered Facebook $1.2 billion, I think it was, and it seemed like a breathtaking offer at the time, and it was difficult to imagine them not taking it. Everyone had seen Napster flame out, Friendster flame out, MySpace flame out, so to be a company with no revenues, and a credible company offers a billion-two, and to say no to that? You have to have a lot of respect for founders who say no to these offers.

Dustin Moskovitz: I was sure the product would suffer in a big way if Yahoo bought us. And Sean was telling me that 90 percent of all mergers end in failure.

Mark Pincus: Luckily, for Zuck, and history, Yahoo’s stock went down, and they wouldn’t change the offer. They said that the offer is a fixed number of shares, and so the offer dropped to like $800 million, and I think probably emotionally Zuck didn’t want to do it and it gave him a clear out. If Yahoo had said, “No problem, we’ll back that up with cash or stock to make it $1.2 billion,” it might have been a lot harder for Zuck to say no, and maybe Facebook would be a little division of Yahoo today.

Max Kelly: We literally tore the Yahoo offer up and stomped on it as a company! We were like, “Fuck those guys, we are going to own them!” That was some malice-ass bullshit.

Mark Zuckerberg: Domination!!!

Kate Losse: He had kind of an ironic way of saying it. It wasn’t a totally flat, scary “domination.” It was funny. It’s only when you think about a much bigger scale of things that you’re like, Hmmmm: Are people aware that their interactions are being architected by a group of people who have a certain set of ideas about how the world works and what’s good?

Ezra Callahan: “How much was the direction of the internet influenced by the perspective of 19-, 20-, 21-year-old well-off white boys?” That’s a real question that sociologists will be studying forever.

Kate Losse: I don’t think most people really think about the impact that the values of a few people now have on everyone.

Steven Johnson: I think there’s legitimate debate about this. Facebook has certainly contributed to some echo chamber problems and political polarization problems, but I spent a lot of time arguing that the internet is less responsible for that than people think.

Mark Pincus: Maybe I’m too close to it all, but I think that when you pull the camera back, none of us really matter that much. I think the internet is following a path to where the internet wants to go. We’re all trying to figure out what consumers want, and if what people want is this massive echo chamber and this vain world of likes, someone is going to give it to them, and they’re going to be the one who wins, and the ones who don’t, won’t.

Steve Jobs: I don’t see anybody other than Facebook out there—they’re dominating.

Mark Pincus: So I don’t exactly think that a bunch of college boys shaped the internet. I just think they got there first.

Mark Zuckerberg: Domination!!!!

Ezra Callahan: So, it’s not until we have a full-time general counsel on board who finally says, “Mark, for the love of God: You cannot use the word domination anymore,” that he stops.

Sean Parker: Once you are dominant, then suddenly it becomes an anticompetitive term.

Steven Johnson: It took the internet 30 years to get to 1 billion users. It took Facebook 10 years. The crucial thing about Facebook is that it’s not a service or an app—it’s a fundamental platform, on the same scale as the internet itself.

Steve Jobs: I admire Mark Zuckerberg. I only know him a little bit, but I admire him for not selling out—for wanting to make a company. I admire that a lot.


Author’s Note:

The written language is very different from the spoken word. And so, I’ve taken the liberty of correcting slips of the tongue, dividing streams of consciousness into sentences, ordering sentences into paragraphs, and eliminating redundancies. The point is not to polish and make what was originally spoken read as if it were written, but rather to make the verbatim transcripts of what was actually said readable in the first place.

That said, I’ve been careful to retain the rhythms of speech and quirks of language of everyone interviewed for this article intact, so that what you hear in your mind’s ear as you read is true in every sense of the word: true to life, true to the transcript, and true to the speakers’ intended meaning.

The vast majority of the words found in this article originated in interviews that were given to me especially for this article. Where that wasn’t possible I tried, with some success, to unearth previously unpublished interviews and quote from them. And in a few cases I’ve resorted to quoting from interviews that have been published before.

Mark Zuckerberg’s quotes were uttered at a guest lecture he gave to Harvard’s Introduction to Computer Science class in 2005 and in an interview he gave to the Harvard Crimson in February that same year. Dustin Moskovitz’s quotes were taken from a keynote address at the Alliance of Youth Movements Summit in December of 2008 and from David Kirkpatrick’s authoritative history, The Facebook Effect. David Choe’s comments were made on The Howard Stern Show in March 2016. Steve Jobs made his remarks to his biographer, Walter Isaacson. The interview was aired on 60 Minutes soon after Jobs died in 2011.


This story is excerpted from Valley of Genius, by Adam Fisher.

Facebook knows so much about its users that it can link their accounts, even when created under different names, from different devices.

Source: https://www.wired.com/story/instagram-unlink-account-wont-unlink-facebook/

The settings on Instagram include a page devoted to the “Linked Accounts” feature. As you might expect, it displays … your linked accounts. Users have the option to connect to Twitter, Tumblr, and, of course, Instagram’s parent company, Facebook, among others.

At first glance, the feature appears pretty straightforward—apps that aren’t linked are shown in gray, linked apps appear in color. When it comes to Facebook, however, the feature may be misleading.

Like other platforms shown under the “Linked Accounts” menu on Instagram, the option to link your Facebook profile is ostensibly disabled by default. Users must tap the app’s grayed out logo and sign in before Instagram displays the two as connected. Once two profiles are connected, an option to “Unlink Account” appears in Instagram settings. Clicking there brings up a warning: “Unlinking makes it harder to get access to your Instagram account if you get locked out.”

Common sense suggests that if you unlink a Facebook account from your Instagram profile, you’ve unlinked that Facebook account from your Instagram profile. But like many things Facebook, common sense does not exactly apply here. Clicking Unlink Account does not actually unlink a Facebook account from Instagram, a Facebook spokesperson told WIRED, because it isn’t possible to separate the two. Even if a user never explicitly linked their Facebook and Instagram profiles, they are intrinsically connected—Finstagrams be damned—and will continue to be, regardless of how many times you mash “Unlink Account.”

That’s because the wealth of data that Facebook collects through its multiple services is more than enough to properly identify users’ various accounts and link them to one another. Even in cases where a different name, email address, or device was used to create each account—be it a throwaway WhatsApp profile, stalker Instagram account, or joke Facebook profile—Facebook often is able to suss out who is actually behind the account and whether they have accounts on other Facebook-owned apps.

“Because Facebook and Instagram share infrastructure, systems and technology, we connect information about your activities across our services based on a variety of signals,” a Facebook spokesperson told WIRED. “Linking or unlinking your accounts in the app doesn’t affect this.”
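
Facebook has not disclosed how this matching actually works. As a purely hypothetical illustration of what linking accounts “based on a variety of signals” could look like, each account can be treated as a bag of signals and joined to another when enough of them overlap; the signal names and threshold below are invented, not Facebook’s method.

```python
# Hypothetical signal-based account matching: link two accounts if they
# share enough identifying signals (device IDs, emails, phone numbers).
from dataclasses import dataclass
from typing import Set

@dataclass
class Account:
    service: str       # e.g., "facebook", "instagram"
    handle: str
    signals: Set[str]  # hashed device IDs, emails, phone numbers, etc.

def likely_same_person(a: Account, b: Account, threshold: int = 2) -> bool:
    """Treat two accounts as the same person if they share >= threshold signals."""
    return len(a.signals & b.signals) >= threshold

fb = Account("facebook", "jane.doe", {"device:abc123", "email:j@x.com", "phone:555"})
finsta = Account("instagram", "totally_not_jane", {"device:abc123", "phone:555"})
print(likely_same_person(fb, finsta))  # True: shared device and phone number
```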

The disclosure comes as Facebook moves to integrate previously independent apps such as Instagram and WhatsApp. Messenger, Instagram, and WhatsApp are being combined into one mega-chat app (problematic enough on its own), while Instagram and WhatsApp have been rechristened as “Instagram from Facebook” and “WhatsApp from Facebook.”

But even as the apps are being woven more tightly together, they’re not all equal in the minds of Facebook executives. The Linked Accounts feature on Instagram appears designed to funnel traffic to Facebook, where user growth has flatlined, as Instagram’s growth continues apace. Meanwhile, Facebook last year made a contentious decision to stop funneling traffic to Instagram.

The spokesperson said Facebook began linking accounts behind the scenes based on data it had gathered about users shortly after it acquired Instagram in 2012. The spokesperson said that Facebook collects and connects this information about users’ activities in order to give users a “personalized experience” across all of the apps under the company’s umbrella, like more precisely targeted ads or in-app recommendations based on an amalgamation of the user’s cross-platform activities.

For users who thought they could keep various accounts separate, the realities of this “personalized experience” can prove frustrating. The spokesperson noted that Facebook could use this data to suggest that a user join a Facebook group that includes people that they follow on Instagram or chat with over Messenger. That could pose privacy concerns for users who want their activity on an unlinked Instagram account isolated from their prime Facebook profile.

The connections among these accounts pose additional challenges on the back end. Some users that set out to create Finstagrams complain that they’ve found their new accounts linked to their prime Facebook profiles, resulting in all of their friends, half-acquaintances, and distant relatives receiving a notification to follow their supposedly private Finsta.

Six Instagram users queried by WIRED said that, though they either did not recall ever linking their Facebook and Instagram accounts or explicitly unlinked the two, they are still served notifications that can only be dismissed by clicking the “Open Facebook” button inside the Instagram app. Despite the fact that their accounts are not explicitly linked, clicking the button brings them to either the Facebook app or a logged-in mobile web version of the site.

Asked about the issue, a Facebook spokesperson at first said it was a bug, then later described it as a feature. Regardless of whether an Instagram user has elected to link their Facebook profile, so long as they have an account, the company has linked the two internally, and tapping “Open Facebook” in Instagram will take them to the associated account, the spokesperson said. “It’s just one of the ways that we can help people to understand that Facebook is there,” the spokesperson said.

All users will likely see a notification bubble in Instagram which can only be dismissed by clicking Open Facebook. However, the number of notifications served to users who haven’t linked their Facebook accounts will effectively be made up.

“With an unlinked account … it’s not an accurate representation of what your actual number of Facebook notifications are,” the spokesperson explained. Tapping the Open Facebook button, the spokesperson said, “will again either open the app if you have it or just open you onto the web page.”

The Facebook spokesperson says the company began testing the Open Facebook feature in June 2018 and introduced it to some users in August 2018. The spokesperson wasn’t sure whether the Open Facebook feature was currently the default for all users, or whether it was still being rolled out to all users.