
Sex, Beer, and Coding: Inside Facebook’s Wild Early Days

Adam Fisher | Wired Magazine

When the young Mark Zuckerberg moved to Palo Alto in 2004, he and his buddies built a corporate proto-culture that continues to influence the company today.

Mark Zuckerberg and his cofounders moved from Harvard to Palo Alto, California, in March 2004. The whole enterprise began as something of a lark. (Scott Beale)

This story is excerpted from Valley of Genius, by Adam Fisher.

Everyone who has seen The Social Network knows the story of Facebook’s founding. It was at Harvard in the spring semester of 2004. What people tend to forget, however, is that Facebook was only based in Cambridge for a few short months. Back then it was called TheFacebook.com, and it was a college-specific carbon copy of Friendster, a pioneering social network based in Silicon Valley.

Mark Zuckerberg’s knockoff site was a hit on campus, and so he and a few school chums decided to move to Silicon Valley after finals and spend the summer there rolling Facebook out to other colleges, nationwide. The Valley was where the internet action was. Or so they thought.

In Silicon Valley during the mid-aughts the conventional wisdom was that the internet gold rush was largely over. The land had been grabbed. The frontier had been settled. The web had been won. Hell, the boom had gone bust three years earlier. Yet nobody ever bothered to send the memo to Mark Zuckerberg—because at the time, Zuck was a nobody: an ambitious teenaged college student obsessed with the computer underground. He knew his way around computers, but other than that, he was pretty clueless—when he was still at Harvard someone had to explain to him that internet sites like Napster were actually businesses, built by corporations.

Excerpted from Valley of Genius by Adam Fisher. Copyright © 2018. Available on Amazon and from Twelve Books, an imprint of Hachette Book Group, Inc.

But Zuckerberg could hack, and that fateful summer he met a few key Silicon Valley players who would radically change the direction of what was, at the time, a company in name only. For this oral history of those critical months back in 2004 and 2005, I interviewed all the key players and talked to a few other figures who had insight into the founding era. What emerged, as you’ll see, is a portrait of a corporate proto-culture that continues to exert an influence on Facebook today. The whole enterprise began as something of a lark: it was an un-corporation, an excuse for a summer of beer pong and code sprints. Indeed, Zuckerberg’s first business cards read, “I’m CEO … bitch.” The brogrammer ’tude was a joke … or was it?

Zuckerberg, photographed in March 2006 at the headquarters of Facebook in Palo Alto. His first business card read “I’m CEO … bitch.” (Elena Dorfman/Redux)


Sean Parker (cofounder of Napster and first president of Facebook): The dotcom era sort of ended with Napster, then there’s the dotcom bust, which leads to the social media era.

Steven Johnson (noted author and cultural commentator): At the time, the web was fundamentally a literary metaphor: “pages”—and then these hypertext links between pages. There was no concept of the user; that was not part of the metaphor at all.

Mark Pincus (co-owner of the fundamental social media patent): I mark Napster as the beginning of the social web—people, not pages. For me that was the breakthrough moment, because I saw that the internet could be this completely distributed peer-to-peer network. We could disintermediate those big media companies and all be connected to each other.

Steven Johnson: To me it really started with blogging in the early 2000s. You started to have these sites that were oriented around a single person’s point of view. It suddenly became possible to imagine, Oh, maybe there’s another element here that the web could also be organized around? Like I trust these five people, I’d like to see what they are suggesting. And that’s kind of what early blogging was like.

Ev Williams (founder of Blogger, Twitter, and Medium): Blogs then were link heavy and mostly about the internet. “We’re on the internet writing about the internet, and then linking to more of the internet, and isn’t that fun?”

Steven Johnson: You would pull together a bunch of different voices that would basically recommend links to you, and so there was a personal filter.

Mark Pincus: In 2002 Reid Hoffman and I started brainstorming: What if the web could be like a great cocktail party? Where you can walk away with these amazing leads, right? And what’s a good lead? A good lead is a job, an interview, a date, an apartment, a house, a couch.

And so Reid and I started saying, “Wow, this people web could actually generate something more valuable than Google, because you’re in this very, very highly vetted community that has some affinity to each other, and everyone is there for a reason, so you have trust.” The signal-to-noise ratio could be very high. We called it Web 2.0, but nobody wanted to hear about it, because this was in the nuclear winter of the consumer internet.

Sean Parker: So during the period between 2000 and 2004, kind of leading up to Facebook, there is this feeling that everything that there was to be done with the internet has already been done. The absolute bottom is probably around 2002. PayPal goes public in 2002, and it’s the only consumer internet IPO. So there’s this weird interim period where there’s a total of only six companies funded or something like that. Plaxo was one of them. Plaxo was a proto–social network. It’s this in-between thing: some kind of weird fish with legs.

Aaron Sittig (graphic designer who invented the Facebook “like”): Plaxo is the missing link. Plaxo was the first viral growth company to really succeed intentionally. This is when we really started to understand viral growth.

Sean Parker: The most important thing I ever worked on was developing algorithms for optimizing virality at Plaxo.

Aaron Sittig: Viral growth is when people using the product spreads the product to other people—that’s it. It’s not people deciding to spread the product because they like it. It’s just people in the natural course of using the software to do what they want to do, naturally spreading it to other people.

Sean Parker: There was an evolution that took place from the sort of earliest proto–social network, which is probably Napster, to Plaxo, which only sort of resembled a social network but had many of the characteristics of one, then to LinkedIn, MySpace, and Friendster, then to this modern network which is Facebook.

Ezra Callahan (one of Facebook’s very first employees): In the early 2000s, Friendster gets all the early adopters, has a really dense network, has a lot of activity, and then just hits this breaking point.

Aaron Sittig: There was this big race going on and Friendster had really taken off, and it really seemed like Friendster had invented this new thing called “social networking,” and they were the winner, the clear winner. And it’s not entirely clear what happened, but the site just started getting slower and slower and at some point it just stopped working.

Ezra Callahan: And that opens the door for MySpace.

Ev Williams: MySpace was a big deal at the time.

Sean Parker: It was a complicated time. MySpace had very quickly taken over the world from Friendster. They’d seized the mantle. So Friendster was declining, MySpace was ascending.

Scott Marlette (programmer who put photo tagging on Facebook): MySpace was really popular, but then MySpace had scaling trouble, too.

Aaron Sittig: Then pretty much unheralded and not talked about much, Facebook launched in February of 2004.

Dustin Moskovitz (Zuckerberg’s original right-hand man): Back then there was a really common problem that now seems trivial. It was basically impossible to think of a person by name and go and look up their picture. All of the dorms at Harvard had individual directories called face books—some were printed, some were online, and most were only available to the students of that particular dorm. So we decided to create a unified version online and we dubbed it “The Facebook” to differentiate it from the individual ones.

Zuckerberg, left, cofounded Facebook with his Harvard roommate, Dustin Moskovitz, center. Sean Parker, right, joined the company as president in 2004. The trio was photographed in the company’s Palo Alto office in May 2005. (Jim Wilson/New York Times/Redux)

Mark Zuckerberg (Facebook’s founder and current CEO): And within a couple weeks, a few thousand people had signed up. And we started getting emails from people at other colleges asking for us to launch it at their schools.

Ezra Callahan: Facebook launched at the Ivy Leagues originally, and it wasn’t because they were snooty, stuck-up kids who only wanted to give things to the Ivy Leagues. It was because they had this intuition that people who go to the Ivy Leagues are more likely to be friends with kids at other Ivy League schools.

Aaron Sittig: When Facebook launched at Berkeley, the rules of socializing just totally transformed. When I started at Berkeley, the way you found out about parties was you spent all week talking to people figuring out what was interesting, and then you’d have to constantly be in contact. With Facebook there, knowing what was going on on the weekend was trivial. It was just all laid out for you.

Facebook came to the Stanford campus—in the heart of Silicon Valley—quite early: March 2004.


Sean Parker: My roommates in Portola Valley were all going to Stanford.

Ezra Callahan: So I was a year out of Stanford, I graduated Stanford in 2003, and me and four of my college friends rented a house for that year just near the campus, and we had an extra bedroom available, and so we advertised around on a few Stanford email lists to find a roommate to move into that house with us. We got a reply from this guy named Sean Parker. He ended up moving in with us pretty randomly, and we discovered that while Napster had been a cultural phenomenon, it didn’t make him any money.

Sean Parker: And so the girlfriend of one of my roommates was using a product, and I was like, “You know, that looks a lot like Friendster or MySpace.” She’s like, “Oh yes, well, nobody in college uses MySpace.” There was something a little rough about MySpace.

Mark Zuckerberg: So MySpace had almost a third of their staff monitoring the pictures that got uploaded for pornography. We hardly ever have any pornography uploaded. The reason is that people use their real names on Facebook.

Adam D’Angelo (Zuckerberg’s high school hacking buddy): Real names are really important.

Aaron Sittig: We got this clear early on because of something that was established as a community principle at the Well: You own your own words. And we took it farther than the Well. We always had everything be traceable back to a specific real person.

Stewart Brand (founder of the Well, the first important social networking site): The Well could have gone that route, but we did not. That was one of the mistakes we made.

Mark Zuckerberg: And I think that that’s a really simple social solution to a possibly complex technical issue.

Ezra Callahan: In this early period, it’s a fairly hacked-together, simple website: just basic web forms, because that’s what Facebook profiles are.

Ruchi Sanghvi (coder who created Facebook’s Newsfeed): There was a little profile pic, and it said things like, “This is my profile” and “See my friends,” and there were three or four links and one or two other boxes below that.

Aaron Sittig: But I was really impressed by how focused and clear their product was. Small details—like when you went to your profile, it really clearly said, “This is you,” because social networking at the time was really, really hard to understand. So there was a maturity in the product that you don’t typically see until a product has been out there for a couple of years and been refined.

Sean Parker: So I see this thing, and I emailed some email address at Facebook, and I basically said, “I’ve been working with Friendster for a while, and I’d just like to meet you guys and see if maybe there’s anything to talk about.” And so we set up this meeting in New York—I have no idea why it was in New York—and Mark and I just started talking about product design and what I thought the product needed.

Aaron Sittig: I got a call from Sean Parker and he said, “Hey, I’m in New York. I just met with this kid Mark Zuckerberg, who is very smart, and he’s the guy building Facebook, and they say they have a ‘secret feature’ that’s going to launch that’s going to change everything! But he won’t tell me what it is. It’s driving me crazy. I can’t figure out what it is. Do you know anything about this? Can you figure it out? What do you think it could be?” And so we spent a little time talking about it, and we couldn’t really figure out what their “secret feature” that was going to change everything was. We got kind of obsessed about it.

Two months after meeting Sean Parker, Mark Zuckerberg moved to Silicon Valley with the idea of turning his dorm‐room project into a real business. Accompanying him were his cofounder and consigliere, Dustin Moskovitz, and a couple of interns.

Mark Zuckerberg: Palo Alto was kind of like this mythical place where all the tech used to come from. So I was like, I want to check that out.

Ruchi Sanghvi: I was pretty surprised when I heard Facebook moved to the Bay Area, I thought they were still at Harvard working out of the dorms.

Zuckerberg recruited fellow Harvard student Chris Hughes in the early days of Facebook to help make suggestions about the fledgling service. The two were photographed at Eliot House in May 2004. (Rick Friedman/Getty Images)


Ezra Callahan: Summer of 2004 is when that fateful series of events took place: that legendary story of Sean running into the Facebook cofounders on the street, having met them a couple months earlier on the East Coast. That meeting happened a week after we all moved out of the house we had been living in together. Sean was crashing with his girlfriend’s parents.

Sean Parker: I was walking outside the house, and there was this group of kids walking toward me—they were all wearing hoodies and they looked like they were probably pot-smoking high-school kids just out making trouble, and I hear my name. I’m like, Oh, it’s coincidence, and I hear my name again and I turn around and it’s like, “Sean, what are you doing here?”

It took me about 30 seconds to figure out what was going on, and I finally realize that it’s Mark and Dustin and a couple of other people, too. So I’m like, “What are you guys doing here?” And they’re like, “We live right there.” And I’m like, “That’s really weird, I live right here!” This is just super weird.

Aaron Sittig: I get a call from Sean and he’s telling me, “Hey, you won’t believe what’s just happened.” And Sean said, “You’ve got to come over and meet these guys. Just leave right now. Just come over and meet them!”

Sean Parker: And so I don’t even know what happened from there, other than that it just became very convenient for me to go swing by the house. It wasn’t even a particularly formal relationship.

Aaron Sittig: So I went over and met them, and I was really impressed by how focused they were as a group. They’d occasionally relax and go do their thing, but for the most part they spent all their time sitting at a kitchen table with their laptops open. I would go visit their place a couple times a week, and that was always where I’d find them, just sitting around the kitchen table working, constantly, to keep their product growing.

All Mark wanted to do was either make the product better, or take a break and relax so that you could get enough energy to go work on the product more. That’s it. They never left that house except to go watch a movie.

Ezra Callahan: The early company culture was very, very loose. It felt like a project that’s gotten out of control and has this amazing business potential. Imagine your freshman dorm running a business, that’s really what it felt like.

Mark Zuckerberg: Most businesses aren’t like a bunch of kids living in a house, doing whatever they want, not waking up at a normal time, not going into an office, hiring people by, like, bringing them into your house and letting them chill with you for a while and party with you and smoke with you.

Ezra Callahan: The living room was the office with all these monitors and workstations set up everywhere and just whiteboards as far as the eye can see.

At the time Mark Zuckerberg was obsessed with file sharing, and the grand plan for his Silicon Valley summer was to resurrect Napster. It would rise again, but this time as a feature inside of Facebook. The name of Zuckerberg’s pet project? Wirehog.

Aaron Sittig: Wirehog was the secret feature that Mark had promised was going to change everything. Mark had gotten convinced that what would make Facebook really popular and just sort of cement its position at schools was a way to send files around to other people—mostly just to trade music.

Mark Pincus: They built in this little thing that looked like Napster—you could see what music files someone had on their computer.

Ezra Callahan: This is at a time when we have just watched Napster get completely terminated by the courts and the entertainment industry is starting to sue random individuals for sharing files. The days of the Wild West were clearly ending.

Aaron Sittig: It’s important to remember that Wirehog was happening at a time where you couldn’t even share photos on your Facebook page. Wirehog was going to be the solution for sharing photos with other people. You could have a box on your profile and people could go there to get access to all your photos that you were sharing—or whatever files you were sharing. It might be audio files, it might be video files, it might be photos of their vacation.

Ezra Callahan: But at the end of the day it’s just a file-sharing service. When I joined Facebook, most people had already kind of come around to the idea that unless some new use comes up for Wirehog that we haven’t thought of, it’s just a liability. “We’re going to get sued someday, so what’s the point?” That was the mentality.

Mark Pincus: I was kind of wondering why Sean wanted to go anywhere near music again.

Aaron Sittig: My understanding was that some of Facebook’s lawyers advised that it would be a bad idea. And that work on Wirehog was kind of abandoned just as Facebook user growth started to grow really quickly.

Ezra Callahan: They had this insane demand to join. It’s still only at a hundred schools, but everyone in college has already heard of this, at all schools across the country. The usage numbers were already insane. Everything on the whiteboards was just all stuff related to what schools were going to launch next. The problem was very singular. It was simply, “How do we scale?”


Aaron Sittig: Facebook would launch at a school, and within one day they would have 70 percent of undergrads signed up. At the time, nothing had ever grown as fast as Facebook.

Ezra Callahan: It did not seem inevitable that we were going to succeed, but the scope of what success looked like was becoming clear. Dustin was already talking about being a billion-dollar company. They had that ambition from the very beginning. They were very confident: two 19-year-old cocky kids.

Mark Zuckerberg: We just all kind of sat around one day and were like, “We’re not going back to school, are we?” Nahhhh.

Ezra Callahan: The hubris seemed pretty remarkable.

David Choe (noted graffiti artist): And Sean is a skinny, nerdy kid and he’s like, “I’m going to go raise money for Facebook. I’m going to bend these fuckers’ minds.” And I’m like, “How are you going to do that?” And he transformed himself into an alpha male. He got like a fucking super-sharp haircut. He started working out every day, got a tan, got a nice suit. And he goes in these meetings and he got the money!

Mark Pincus: So it’s probably like September or October of 2004, and I’m at Tribe’s offices in this dusty converted brick building in Potrero Hill—the idea of Tribe.net was like Friendster meets Craigslist—and we’re in our conference room, and Sean says he’s bringing the Facebook guy in. And he brings Zuck in, and Zuck is in a pair of sweatpants, and these Adidas flip-flops that he wore, and he’s so young looking and he’s sitting there with his feet up on the table, and Sean is talking really fast about all the things Facebook is going to do and grow and everything else, and I was mesmerized.

Because I’m doing Tribe, and we are not succeeding, we’ve plateaued and we’re hitting our head against the wall trying to figure out how to grow, and here’s this kid, who has this simple idea, and he’s just taking off! I was kind of in awe already of what they had accomplished, and maybe a little annoyed by it. Because they did something simpler and quicker and with less, and then I remember Sean got on the computer in my office, and he pulled up The Facebook, and he starts showing it to me, and I had never been able to be on it, because it’s college kids only, and it was amazing.

People are putting up their phone numbers and home addresses and everything about themselves and I was like, I can’t believe it! But it was because they had all this trust. And then Sean put together an investment round quickly, and he had advised Zuck to, I think, take $500,000 from Peter Thiel, and then $38,000 each from me and Reid Hoffman. Because we were basically the only other people doing anything in social networking. It was a very, very small little club at the time.

Ezra Callahan: By December it’s—I wouldn’t say it’s like a more professional atmosphere, but all the kids that Mark and Dustin were hanging out with are either back at school back East or back at Stanford, and work has gotten a little more serious for them. They are working more than they were that first summer. We don’t move into an office until February of 2005. And right as we were signing the lease, Sean just randomly starts saying, “Dude! I know this street artist guy. We’re going to come in and have him totally do it up.”

David Choe: I was like, “If you want me to paint the entire building it’s going to be $60,000.” Sean’s like, “Do you want cash or do you want stock?”

Ezra Callahan: He pays David Choe in Facebook shares.

David Choe: I didn’t give a shit about Facebook or even know what it was. You had to have a college email to get on there. But I like to gamble, you know? I believed in Sean. I’m like, This kid knows something and I am going to bet my money on him.

Ezra Callahan: So then we move in, and when you first saw this graffiti it was like, “Holy shit, what did this guy do to the office?” The office was on the second floor, so as you walk in you immediately have to walk up some stairs, and on the big 10-foot-high wall facing you is just this huge buxom woman with enormous breasts wearing this Mad Max–style costume riding a bulldog.

It’s the most intimidating, totally inappropriate thing. “God damn it, Sean! What did you do?” It’s not so much that we set out to paint that, because that was the culture. It was more that Sean just did it, and that set a tone for us. A huge-breasted warrior woman riding a bulldog is the first thing you see as you come in the office, so like, get ready for that!

Ruchi Sanghvi: Yes, the graffiti was a little racy, but it was different, it was vibrant, it was alive. The energy was just so tangible.

Katie Geminder (project manager for early Facebook): I liked it, but it was really intense. There was certain imagery in there that was very sexually charged, which I didn’t really care about but that could be considered a little bit hostile, and I think we took care of some of the more provocative ones.

Ezra Callahan: I don’t think it was David Choe, I think it was Sean’s girlfriend who painted this explicit, intimate lesbian scene in the women’s restroom of two completely naked women intertwined and cuddling with each other—not graphic, but certainly far more suggestive than what one would normally see in a women’s bathroom in an office. That one only actually lasted a few weeks.

Max Kelly (Facebook’s first cyber-security officer): There was a four-inch by four-inch drawing of someone getting fucked. One of the customer service people complained that it was “sexual in nature,” which, given what they were seeing every day, I’m not sure why they would complain about this. But I ended up going to a local store and buying a gold paint pen and defacing the graffiti—just a random design— so it didn’t show someone getting fucked.

Jeff Rothschild (investor turned Facebook employee): It was wild, but I thought that it was pretty cool. It looked a lot more like a college dorm or fraternity than it did a company.

Katie Geminder: There were blankets shoved in the corner and video games everywhere, and Nerf toys and Legos, and it was kind of a mess.

Jeff Rothschild: There’s a PlayStation. There’s a couple of old couches. It was clear people were sleeping there.

Karel Baloun (one of the earliest Facebook programmers): I’d probably stay there two or three nights a week. I won an award for “most likely to be found under your desk” at one of the employee gatherings.

Jeff Rothschild: They had a bar, a whole shelf with liquor, and after a long day people might have a drink.

Ezra Callahan: There’s a lot of drinking in the office. There would be mornings when I would walk in and hear beer cans move as I opened the door, and the office smells of stale beer and is just trashed.

Ruchi Sanghvi: They had a keg. There was some camera technology built on top of the keg. It basically detected presence and posted about who was present at the keg—so it would take your picture when you were at the keg, and post some sort of thing saying “so-and-so is at the keg.” The keg is patented.

Ezra Callahan: When we first moved in, the office door had this lock we couldn’t figure out, but the door would automatically unlock at 9 am every morning. I was the guy that had to get to the office by 9 to make sure nobody walked in and just stole everything, because no one else was going to get there before noon. All the Facebook guys are basically nocturnal.

Katie Geminder: These kids would come in—and I mean kids, literally they were kids—they’d come into work at 11 or 12.

Ruchi Sanghvi: Sometimes I would walk to work in my pajamas and that would be totally fine. It felt like an extension of college; all of us were going through the same life experiences at the same time. Work was fantastic. It was so interesting. It didn’t feel like work. It felt like we were having fun all the time.

Ezra Callahan: You’re hanging out. You’re drinking with your coworkers. People start dating within the office …


Ruchi Sanghvi: We found our significant others while we were at Facebook. All of us eventually got married. Now we’re in this phase where we’re having children.

Katie Geminder: If you look at the adults that worked at Facebook during those first few years—like, anyone over the age of 30 that was married—and you do a survey, I tell you that probably 75 percent of them are divorced.

Max Kelly: So, lunch would happen. The caterer we had was mentally unbalanced and you never knew what the fuck was going to show up in the food. There were worms in the fish one time. It was all terrible. Usually, I would work until about 3 in the afternoon and then I’d do a circuit through the office to try and figure out what the fuck was going to happen that night. Who was going to launch what? Who was ready? What rumors were going on? What was happening?

Steve Perlman (Silicon Valley veteran who started in the Atari era): We shared a break room with Facebook. We were building hardware: a facial capture technology. The Facebook guys were doing some HTML thing. They would come in late in the morning. They’d have a catered lunch. Then they leave usually by mid-afternoon. I’m like, man, that is the life! I need a startup like that. You know? And the only thing any of us could think about Facebook was: Really nice people but never going to go anywhere.

Max Kelly: Around 4 I’d have a meeting with my team, saying “here’s how we’re going to get fucked tonight.” And then we’d go to the bar. Between like 5 and 8-ish people would break off and go to different bars up and down University Avenue, have dinner, whatever.

Ruchi Sanghvi: And we would all sit together and have these intellectual conversations: “Hypothetically, if this network was a graph, how would you weight the relationship between two people? How would you weight the relationship between a person and a photo? What does that look like? What would this network eventually look like? What could we do with this network if we actually had it?”

Sean Parker: The “social graph” is a math concept from graph theory, but it was a way of trying to explain to people who were kind of academic and mathematically inclined that what we were building was not a product so much as it was a network composed of nodes with a lot of information flowing between those nodes. That’s graph theory. Therefore we’re building a social graph. It was never meant to be talked about publicly. It was a way of articulating to somebody with a math background what we were building.

Ruchi Sanghvi: In retrospect, I can’t believe we had those conversations back then. It seems like such a mature thing to be doing. We would sit around and have these conversations and they weren’t restricted to certain members of the team; they weren’t tied to any definite outcome. It was purely intellectual and was open to everyone.
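
What such a graph might look like in code is easy to sketch. The following is a minimal, purely illustrative Python model of a weighted “social graph” of the kind the team describes above; the node types, example weights, and ranking rule are assumptions for illustration, not anything from Facebook’s actual codebase.

```python
# Illustrative sketch only: a tiny weighted social graph.
# Node labels, weights, and the ranking rule are invented for this example.
from collections import defaultdict

class SocialGraph:
    def __init__(self):
        # adjacency map: node -> {neighbor: weight}
        self.edges = defaultdict(dict)

    def connect(self, a, b, weight=1.0):
        """Add an undirected, weighted edge between two nodes."""
        self.edges[a][b] = weight
        self.edges[b][a] = weight

    def neighbors(self, node):
        """Return a node's neighbors sorted by edge weight, strongest first."""
        return sorted(self.edges[node].items(), key=lambda kv: -kv[1])

g = SocialGraph()
g.connect(("person", "alice"), ("person", "bob"), weight=0.9)  # close friends
g.connect(("person", "alice"), ("photo", 42), weight=0.4)      # tagged in a photo
print(g.neighbors(("person", "alice")))
```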

Max Kelly: People were still drinking the whole time, like all night, but starting around 9, it really starts solidifying: “What are we going to release tonight? Who’s ready to go? Who’s not ready to go?” By about 11-ish we’d know what we were going to do that night.

Katie Geminder: There was an absence of process that was mind-blowing. There would be engineers working stealthily on something that they were passionate about. And then they’d ship it in the middle of the night. No testing—they would just ship it.

Ezra Callahan: Most websites have these very robust testing platforms so that they can test changes. That’s not how we did it.

Ruchi Sanghvi: With the push of a button you could push out code to the live site, because we truly believed in this philosophy of “move fast and break things.” So you shouldn’t have to wait to do it once a week, and you shouldn’t have to wait to do it once a day. If your code was ready you should be able to push it out live to users. And that was obviously a nightmare.

Katie Geminder: Can our servers stand up to something? Or security: How about testing a feature for security holes? It really was just shove it out there and see what happens.

Jeff Rothschild: That’s the hacker mentality: You just get it done. And it worked great when you had 10 people. By the time we got to 20, or 30, or 40, I was spending a lot of time trying to keep the site up. And so we had to develop some level of discipline.

Ruchi Sanghvi: So then we would only push out code in the middle of the night, and that’s because if we broke things it wouldn’t impact that many people. But it was terrible because we were up until like 3 or 4 am every night, because the act of pushing required everybody who had committed any code to be present in case anything broke.

Max Kelly: Around 1 am, we’d know either we’re fucked or we’re good. If we were good, everyone would be like “whoopee” and might be able to sleep for a little while. If we were fucked then we were like, “OK, now we’ve got to try and claw this thing back or fix it.”

Katie Geminder: 2 am: That was when shit happened.

Ruchi Sanghvi: Then another push, and this would just go on and on and on and on and on until like 3 or 4 or 5 am in the night.

Max Kelly: If 4 am rolled around and we couldn’t fix it, I’d be like, “We’re going to try and revert it.” Which meant basically my team would be up till 6 am. So, go to bed somewhere between 4 and 6, and then repeat every day for like nine months. It was crazy.

Jeff Rothschild: It was seven days a week. I was on all the time. I would drink a large glass of water before I went to sleep to assure that I’d wake up in two hours so I could go check everything and make sure that we hadn’t broken it in the meantime. It was all day, all night.

Katie Geminder: That was very challenging for someone who was trying to actually live an adult life with, like, a husband. There was definitely a feeling that because you were older and married and had a life outside of work that you weren’t committed.

Mark Zuckerberg: Why are most chess masters under 30? Young people just have simpler lives. We may not own a car. We may not have family … I only own a mattress.

Katie Geminder: Imagine being over 30 and hearing your boss say that!

Mark Zuckerberg: Young people are just smarter.

Ruchi Sanghvi: We were so young back then. We definitely had tons of energy and we could do it, but we weren’t necessarily the most efficient team by any means whatsoever. It was definitely frustrating for senior leadership, because a lot of the conversations happened at night when they weren’t around, and then the next morning they would come in to all of these changes that happened at night. But it was fun when we did it.


Ezra Callahan: For the first few hundred employees, almost all of them were already friends with someone working at the company, both within the engineering circle and also the user support people. It’s a lot of recent grads. When we move into the office was when the dorm room culture starts to really stick out and also starts to break a little bit. It has a dorm room feeling, but it’s not completely dominated by college kids. The adults are coming in.

Jeff Rothschild: I joined in May 2005. On the sidewalk outside the office was the menu board from a pizza parlor. It was a caricature of a chef with a blackboard below it, and the blackboard had a list of jobs. This was the recruiting effort.

Sean Parker: At the time there was a giant sucking sound in the universe, and it was called Google. All the great engineers were going to Google.

Kate Losse (early customer service rep): I don’t think I could have stood working at Google. To me Facebook seemed much cooler than Google, not because Facebook was necessarily like the coolest. It’s just that Google at that point already seemed nerdy in an uninteresting way, whereas like Facebook had a lot of people who didn’t actually want to come off as nerds. Facebook was a social network, so it has to have some social components that are like really normal American social activities—like beer pong.

Katie Geminder: There was a house down the street from the office where five or six of the engineers lived that was one ongoing beer pong party. It was like a boys’ club—although it wasn’t just boys.

Terry Winograd (noted Stanford computer-science professor): The way I would put it is that Facebook is more of an undergraduate culture and Google is more of a graduate student culture.

Jeff Rothschild: Before I walked in the door at Facebook, I thought these guys had created a dating site. It took me probably a week or two before I really understood what it was about. Mark, he used to tell us that we are not a social network. He would insist: “This is not a social network. We’re a social utility for people you actually know.”

MySpace was about building an online community among people who had similar interests. We might look the same because at some level it has the same shape, but what it accomplishes for the individual is solving a different problem. We were trying to improve the efficiency of communication among friends.

Max Kelly: Mark sat down with me and described to me what he saw Facebook being. He said, “It’s about connecting people and building a system where everyone who makes a connection to your life that has any value is preserved for as long as you want it to be preserved. And it doesn’t matter where you are, or who you’re with, or how your life changes: because you’re always in connection with the people that matter the most to you, and you’re always able to share with them.”

I heard that, and I thought, I want to be a part of this. I want to make this happen. Back in the ’90s all of us were utopian about the internet. This was almost a harkening back to the beautiful internet where everyone would be connected and everyone could share and there was no friction to doing that. Facebook sounded to me like the same thing. Mark was too young to know that time, but I think he intrinsically understood what the internet was supposed to be in the ’80s and in the ’90s. And here I was hearing the same story again and conceivably having the ability to help pull it off. That was very attractive.

Aaron Sittig: So in the summer of 2005 Mark sat us all down and he said, “We’re going to do five things this summer.” He said, “We’re redesigning the site. We’re doing a thing called News Feed, which is going to tell you everything your friends are doing on the site. We’re going to launch Photos, we’re going to redo Parties and turn it into Events, and we’re going to do a local-businesses product.” And we got one of those things done, we redesigned the site. Photos was my next project.

Ezra Callahan: The product at Facebook at the time is dead simple: profiles. There is no News Feed, there was a very weak messaging system. They had a very rudimentary events product you could use to organize parties. And almost no other functions to speak of. There’s no photos on the website, other than your profile photo. There’s nothing that tells you when anything on the site has changed. You find out somebody changed their profile picture by obsessively going to their profile and noticing, Oh, the picture changed.

Aaron Sittig: We had some people that were changing their profile picture once an hour, just as a way of sharing photos of themselves.

Scott Marlette: At the time photos was the number-one most requested feature. So, Aaron and I go into a room and whiteboard up some wireframes for some pages and decide on what data needs to get stored. In a month we had a nearly fully functioning prototype internally to play with. It was very simple. It was: You post a photo, it goes in an album, you have a set of albums, and then you can tag people in the photos.
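
A rough sketch of the data model Marlette describes—users own albums, albums hold photos, and photos carry tags of people—might look like the following. The class and field names here are hypothetical, chosen only to show the shape of the data, not Facebook’s actual schema.

```python
# Illustrative sketch of the photos/albums/tags data model described above.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Photo:
    photo_id: int
    url: str
    tagged_user_ids: List[int] = field(default_factory=list)

@dataclass
class Album:
    album_id: int
    owner_id: int
    title: str
    photos: List[Photo] = field(default_factory=list)

def tag_user(photo: Photo, user_id: int) -> None:
    """Tag a person in a photo; in a real system this would also notify them."""
    if user_id not in photo.tagged_user_ids:
        photo.tagged_user_ids.append(user_id)

album = Album(album_id=1, owner_id=100, title="Summer 2005")
album.photos.append(Photo(photo_id=7, url="beach.jpg"))
tag_user(album.photos[0], user_id=200)
```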

Jeff Rothschild: Aaron had the insight to do tagging, which was a tremendously valuable insight. It was really a game changer.

Aaron Sittig: We thought the key feature is going to be saying who is in the photo. We weren’t sure if this was really going to be that successful; we just felt good about it.

Facebook Photos went live in October 2005. There were about 5 million users, virtually all of them college students.

Scott Marlette: We launched it at Harvard and Stanford first, because that’s where our friends were.

Zuckerberg started coding while growing up in Dobbs Ferry, New York, where he was raised by his parents, Edward and Karen, along with his sisters Randi, left, and Arielle, right. (Sherry Tesler/New York Times/Redux)

Aaron Sittig: We had built this program that would fill up a TV screen and show us everything that was being uploaded to the service, and then we flicked it on and waited for photos to start coming in. And the first photos that came in were Windows wallpapers: Someone had just uploaded all their wallpaper files from their Windows directory, which was a big disappointment, like, Oh no, maybe people don’t get it? Maybe this is not going to work?

But the next photos were of a guy hanging out with his friends, and then the next photos after that were a bunch of girls in different arrangements: three girls together, these four girls together, two of them together, just photos of them hanging out at parties, and then it just didn’t stop.

Max Kelly: You were at every wedding, you were at every bar mitzvah, you were seeing all this awesome stuff, and then there’s a dick. So, it was kind of awesome and shitty at the same time.

Aaron Sittig: Within the first day someone had uploaded and tagged themselves in 700 photos, and it just sort of took off from there.

Jeff Rothschild: Inside of three months, we were delivering more photos than any other website on the internet. Now you have to ask yourself: Why? And the answer was tagging. There isn’t anyone who could get an email message that said, “Someone has uploaded a photo of you to the internet”—and not go take a look. It’s just human nature.

Ezra Callahan: The single greatest growth mechanism ever was photo tagging. It shaped all of the rest of the product decisions that got made. It was the first time that there was a real fundamental change to how people used Facebook, the pivotal moment when the mindset of Facebook changes and the idea for News Feed starts to germinate and there is now a reason to see how this expands beyond college.


Jeff Rothschild: The News Feed project was started in the fall of 2005 and delivered in the fall of 2006.

Dustin Moskovitz: News Feed is the concept of viral distribution, incarnate.

Ezra Callahan: News Feed is what Facebook fundamentally is today.

Sean Parker: Originally it was called “What’s New,” and it was just a feed of all of the things that were happening in the network—really just a collection of status updates and profile changes that were occurring.

Katie Geminder: It was an aggregation, a collection of all those stories, with some logic built into it because we couldn’t show you everything that was going on. There were sort of two streams: things you were doing and things the rest of your network was doing.

Ezra Callahan: So News Feed is the first time where now your homepage, rather than being static and boring and useless, is now going to be this constantly updating “newspaper,” so to speak, of stuff happening on Facebook around you that we think you’ll care about.

Ruchi Sanghvi: And it was a fascinating idea, because normally when you think of newspapers, they have this editorialized content where they decide what they want to say, what they want to print, and they do it the previous night, and then they send these papers out to thousands if not hundreds of thousands of people. But in the case of Facebook, we were building 10 million different newspapers, because each person had a personalized version of it.
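
The “one newspaper per person” idea can be illustrated with a toy example: collect recent actions by a user’s friends and rank them differently for each viewer. The scoring heuristic below (closer friends and fresher stories rank higher) is an assumption made only for illustration, not News Feed’s real ranking logic.

```python
# Illustrative sketch: build a personalized feed per viewer from friends' actions.
from datetime import datetime

actions = [
    {"actor": "bob",   "verb": "changed profile photo", "time": datetime(2006, 9, 5, 21, 0)},
    {"actor": "carol", "verb": "added a photo album",   "time": datetime(2006, 9, 5, 23, 30)},
    {"actor": "dave",  "verb": "joined a group",        "time": datetime(2006, 9, 4, 10, 0)},
]

friends_of = {"alice": {"bob", "carol"}}
affinity = {("alice", "bob"): 0.9, ("alice", "carol"): 0.5}  # invented closeness scores

def build_feed(viewer, now):
    ranked = []
    for a in actions:
        if a["actor"] not in friends_of.get(viewer, set()):
            continue  # only show stories from this viewer's friends
        hours_old = (now - a["time"]).total_seconds() / 3600
        score = affinity.get((viewer, a["actor"]), 0.1) / (1 + hours_old)  # closer + fresher ranks higher
        ranked.append((score, a))
    return [a for _, a in sorted(ranked, key=lambda x: -x[0])]

print(build_feed("alice", datetime(2006, 9, 6, 0, 0)))
```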

Ezra Callahan: It really was the first monumental product-engineering feat. The amount of data it had to deal with: all these changes and how to propagate that on an individual level.

Ruchi Sanghvi: We were working on it off and on for a year and a half.

Ezra Callahan: … and then the intelligence side of all this stuff: How do we surface the things that you’ll care about most? These are very hard problems engineering-wise.

Ruchi Sanghvi: Without realizing it, we ended up building one of the largest distributed systems in software at that point in time. It was pretty cutting-edge.

Ezra Callahan: We have it in-house and we play with it for weeks and weeks—which is really unusual.

Katie Geminder: So I remember being like, “OK, you guys, we have to do some level of user research,” and I finally convinced Zuck that we should bring users into a lab and sit behind the glass and watch our users using the product. And it took so much effort for me to get Dustin and Zuck and other people to go and actually watch this. They thought this was a waste of time. They were like, “No, our users are stupid.” Literally those words came out of somebody’s mouth.

Ezra Callahan: It’s the very first time we actually bring in outside people to test something for us, and their reaction, their initial reaction is clear. People are just like, “Holy shit, like, I shouldn’t be seeing this, like this doesn’t feel right,” because immediately you see this person changed their profile picture, this person did this, this person did that, and your first instinct is Oh my God! Everybody can see this about me! Everyone knows everything I’m doing on Facebook.

Max Kelly: But News Feed made perfect sense to all of us, internally. We all loved it.

Ezra Callahan: So in-house we have this idea that this isn’t going to go right: This is too jarring a change, it needs to be rolled out slowly, we need to warm people up to this—and Mark is just firmly committed. “We’re just going to do this. We’re just going to launch. It’s like ripping off a Band-Aid.”

Ruchi Sanghvi: We pushed the product in the dead of the night, we were really excited, we were celebrating, and then the next morning we woke up to all this pushback. I had written this blog post, “Facebook Gets a Facelift.”

Katie Geminder: We wrote a little letter, and at the bottom of it we put a button. And the button said, “Awesome!” Not like, “OK.” It was, “Awesome!” That’s just rude. I wish I had a screenshot of that. Oh man! And that was it. You landed on Facebook and you got the feature. We gave you no choice and not a great explanation and it scared people.

Jeff Rothschild: People were rattled because it just seemed like it was exposing information that hadn’t been visible before. In fact, that wasn’t the case. Everything shown in News Feed was something people put on the site that would have been visible to everyone if they had gone and visited that profile.

Ruchi Sanghvi: Users were revolting. They were threatening to boycott the product. They felt that they had been violated, and that their privacy had been violated. There were students organizing petitions. People had lined up outside the office. We hired a security guard.

Katie Geminder: There were camera crews outside. There were protests: “Bring back the old Facebook!” Everyone hated it.

Jeff Rothschild: There was such a violent reaction to it. We had people marching on the office. A Facebook group was organized protesting News Feed and inside of two days, a million people joined.

Ruchi Sanghvi: There was another group that was about how “Ruchi is the devil,” because I had written that blog post.

Max Kelly: The user base fought it every step of the way and would pound us, pound Customer Service, and say, “This is fucked up! This is terrible!”

Ezra Callahan: We’re getting emails from relatives and friends. They’re like, “What did you do? This is terrible! Change it back.”

Katie Geminder: We were sitting in the office and the protests were going on outside and it was, “Do we roll it back? Do we roll it back!?”

Ruchi Sanghvi: Now under usual circumstances if about 10 percent of your user base starts to boycott the product, you would shut it down. But we saw a very unusual pattern emerge.

Max Kelly: Even the same people who were telling us that this is terrible, we’d look at their user stream and be like: You’re fucking using it constantly! What are you talking about?

Ruchi Sanghvi: Despite the fact that there were these revolts and these petitions and people were lined up outside the office, they were digging the product. They were actually using it, and they were using it twice as much as before News Feed.

Ezra Callahan: It was just an emotionally devastating few days for everyone at the company. Especially for the set of people who had been waving their arms saying, “Don’t do this! Don’t do this!” because they feel like, “This is exactly what we told you was going to happen!”

Ruchi Sanghvi: Mark was on his very first press tour on the East Coast, and the rest of us were in the Palo Alto office dealing with this and looking at these logs and seeing the engagement and trying to communicate that “It’s actually working!,” and to just try a few things before we chose to shut it down.

Katie Geminder: We had to push some privacy features right away to quell the storm.

Ruchi Sanghvi: We asked everyone to give us 24 hours.

Katie Geminder: We built this janky privacy “audio mixer” with these little slider bars where you could turn things on and off. It was beautifully designed—it looked gorgeous—but it was irrelevant.

Jeff Rothschild: I don’t think anyone ever used it.

Ezra Callahan: But it gets added and eventually the immediate reaction subsides and people realize that the News Feed is exactly what they wanted, this feature is exactly right, this just made Facebook a thousand times more useful.

Katie Geminder: Like Photos, News Feed was just—boom!—a major change in the product and one of those sea changes that just leveled it up.

Jeff Rothschild: Our usage just skyrocketed on the launch of News Feed. About the same time we also opened the site up to people who didn’t have a .edu address.

Ezra Callahan: Once it opens to the public, it’s becoming clear that Facebook is on its way to becoming the directory of all the people in the world.

Jeff Rothschild: Those two things together—that was the inflection point where Facebook became a massively used product. Prior to that we were a niche product for high-school and college students.

Mark Zuckerberg: Domination!

Ruchi Sanghvi: “Domination” was a big mantra of Facebook back in the day.

Max Kelly: I remember company meetings where we were chanting “dominate.”

Ezra Callahan: We had company parties all the time, and for a period in 2005, all Mark’s toasts at the company parties would end with “Domination!”

Mark Zuckerberg: Domination!!


Max Kelly: I especially remember the meeting where we tore up the Yahoo offer.

Mark Pincus: In 2006 Yahoo offered Facebook $1.2 billion, I think it was, and it seemed like a breathtaking offer at the time, and it was difficult to imagine them not taking it. Everyone had seen Napster flame out, Friendster flame out, MySpace flame out, so to be a company with no revenues, and a credible company offers a billion-two, and to say no to that? You have to have a lot of respect for founders that say no to these offers.

Dustin Moskovitz: I was sure the product would suffer in a big way if Yahoo bought us. And Sean was telling me that 90 percent of all mergers end in failure.

Mark Pincus: Luckily, for Zuck, and history, Yahoo’s stock went down, and they wouldn’t change the offer. They said that the offer is a fixed number of shares, and so the offer dropped to like $800 million, and I think probably emotionally Zuck didn’t want to do it and it gave him a clear out. If Yahoo had said, “No problem, we’ll back that up with cash or stock to make it $1.2 billion,” it might have been a lot harder for Zuck to say no, and maybe Facebook would be a little division of Yahoo today.

Max Kelly: We literally tore the Yahoo offer up and stomped on it as a company! We were like, “Fuck those guys, we are going to own them!” That was some malice-ass bullshit.

Mark Zuckerberg: Domination!!!

Kate Losse: He had kind of an ironic way of saying it. It wasn’t a totally flat, scary “domination.” It was funny. It’s only when you think about a much bigger scale of things that you’re like, Hmmmm: Are people aware that their interactions are being architected by a group of people who have a certain set of ideas about how the world works and what’s good?

Ezra Callahan: “How much was the direction of the internet influenced by the perspective of 19-, 20-, 21-year-old well-off white boys?” That’s a real question that sociologists will be studying forever.

Kate Losse: I don’t think most people really think about the impact that the values of a few people now have on everyone.

Steven Johnson: I think there’s legitimate debate about this. Facebook has certainly contributed to some echo chamber problems and political polarization problems, but I spent a lot of time arguing that the internet is less responsible for that than people think.

Mark Pincus: Maybe I’m too close to it all, but I think that when you pull the camera back, none of us really matter that much. I think the internet is following a path to where the internet wants to go. We’re all trying to figure out what consumers want, and if what people want is this massive echo chamber and this vain world of likes, someone is going to give it to them, and they’re going to be the one who wins, and the ones who don’t, won’t.

Steve Jobs: I don’t see anybody other than Facebook out there—they’re dominating.

Mark Pincus: So I don’t exactly think that a bunch of college boys shaped the internet. I just think they got there first.

Mark Zuckerberg: Domination!!!!

Ezra Callahan: So, it’s not until we have a full-time general counsel on board who finally says, “Mark, for the love of God: You cannot use the word domination anymore,” that he stops.

Sean Parker: Once you are dominant, then suddenly it becomes an anticompetitive term.

Steven Johnson: It took the internet 30 years to get to 1 billion users. It took Facebook 10 years. The crucial thing about Facebook is that it’s not a service or an app—it’s a fundamental platform, on the same scale as the internet itself.

Steve Jobs: I admire Mark Zuckerberg. I only know him a little bit, but I admire him for not selling out—for wanting to make a company. I admire that a lot.


Author’s Note:

The written language is very different from the spoken word. And so, I’ve taken the liberty of correcting slips of the tongue, dividing streams of consciousness into sentences, ordering sentences into paragraphs, and eliminating redundancies. The point is not to polish and make what was originally spoken read as if it were written, but rather to make the verbatim transcripts of what was actually said readable in the first place.

That said, I’ve been careful to retain the rhythms of speech and quirks of language of everyone interviewed for this article intact, so that what you hear in your mind’s ear as you read is true in every sense of the word: true to life, true to the transcript, and true to the speakers’ intended meaning.

The vast majority of the words found in this article originated in interviews that were given to me especially for this article. Where that wasn’t possible I tried, with some success, to unearth previously unpublished interviews and quote from them. And in a few cases I’ve resorted to quoting from interviews that have been published before.

Mark Zuckerberg’s quotes were uttered at a guest lecture he gave to Harvard’s Introduction to Computer Science class in 2005 and in an interview he gave to the Harvard Crimson in February that same year. Dustin Moskovitz’s quotes were taken from a keynote address at the Alliance of Youth Movements Summit in December of 2008 and from David Kirkpatrick’s authoritative history, The Facebook Effect. David Choe’s comments were made on The Howard Stern Show in March 2016. Steve Jobs made his remarks to his biographer, Walter Isaacson. The interview was aired on 60 Minutes soon after Jobs died in 2011.


This story is excerpted from Valley of Genius, by Adam Fisher.

Facebook knows so much about its users that it can link their accounts, even when created under different names, from different devices.

Source: https://www.wired.com/story/instagram-unlink-account-wont-unlink-facebook/

The settings on Instagram include a page devoted to the “Linked Accounts” feature. As you might expect, it displays … your linked accounts. Users have the option to connect to Twitter, Tumblr, and, of course, Instagram’s parent company, Facebook, among others.

At first glance, the feature appears pretty straightforward—apps that aren’t linked are shown in gray, linked apps appear in color. When it comes to Facebook, however, the feature may be misleading.

Like other platforms shown under the “Linked Accounts” menu on Instagram, the option to link your Facebook profile is ostensibly disabled by default. Users must tap the app’s grayed out logo and sign in before Instagram displays the two as connected. Once two profiles are connected, an option to “Unlink Account” appears in Instagram settings. Clicking there brings up a warning: “Unlinking makes it harder to get access to your Instagram account if you get locked out.”

Common sense suggests that if you unlink a Facebook account from your Instagram profile, you’ve unlinked that Facebook account from your Instagram profile. But like many things Facebook, common sense does not exactly apply here. Clicking Unlink Account does not actually unlink a Facebook account from Instagram, a Facebook spokesperson told WIRED, because it isn’t possible to separate the two. Even if a user never explicitly linked their Facebook and Instagram profiles, they are intrinsically connected—Finstagrams be damned—and will continue to be, regardless of how many times you mash “Unlink Account.”

That’s because the wealth of data that Facebook collects through its multiple services is more than enough to properly identify users’ various accounts and link them to one another. Even in cases where a different name, email address, or device was used to create each account—be it a throwaway WhatsApp profile, stalker Instagram account, or joke Facebook profile—Facebook often is able to suss out who is actually behind the account and whether they have accounts on other Facebook-owned apps.

“Because Facebook and Instagram share infrastructure, systems and technology, we connect information about your activities across our services based on a variety of signals,” a Facebook spokesperson told WIRED. “Linking or unlinking your accounts in the app doesn’t affect this.”
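To make the idea of signal-based matching concrete, here is a minimal sketch of how two accounts created under different names might still be scored as belonging to the same person when they share identifying signals. The signal names, weights, and threshold below are invented for illustration; Facebook has not published how its actual matching pipeline works.

```python
# Illustrative sketch only: a toy scorer that "links" two account records when
# they share enough signals (device IDs, hashed contact info, IP history).
# The signal list, weights, and threshold are invented for this example.
from dataclasses import dataclass, field

@dataclass
class AccountRecord:
    name: str
    device_ids: set = field(default_factory=set)
    hashed_emails: set = field(default_factory=set)
    hashed_phones: set = field(default_factory=set)
    ip_addresses: set = field(default_factory=set)

# Weights reflect how identifying each signal is assumed to be (arbitrary values).
WEIGHTS = {"device_ids": 3.0, "hashed_emails": 2.5, "hashed_phones": 2.5, "ip_addresses": 1.0}
LINK_THRESHOLD = 3.0  # arbitrary cutoff for this sketch

def link_score(a: AccountRecord, b: AccountRecord) -> float:
    """Sum the weights of every signal type the two accounts share."""
    score = 0.0
    for signal, weight in WEIGHTS.items():
        if getattr(a, signal) & getattr(b, signal):  # any overlap at all
            score += weight
    return score

def likely_same_person(a: AccountRecord, b: AccountRecord) -> bool:
    return link_score(a, b) >= LINK_THRESHOLD

if __name__ == "__main__":
    main_profile = AccountRecord("Facebook profile", device_ids={"dev-123"},
                                 hashed_emails={"h1"}, ip_addresses={"203.0.113.7"})
    finsta = AccountRecord("throwaway Instagram", device_ids={"dev-123"},
                           hashed_emails={"h2"}, ip_addresses={"203.0.113.7"})
    print(likely_same_person(main_profile, finsta))  # True: same device + same IP
```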

The disclosure comes as Facebook moves to integrate previously independent apps such as Instagram and WhatsApp. Messenger, Instagram, and WhatsApp are being combined into one mega-chat app (problematic enough on its own), while Instagram and WhatsApp have been rechristened as “Instagram from Facebook” and “WhatsApp from Facebook.”

But even as the apps are being woven more tightly together, they’re not all equal in the minds of Facebook executives. The Linked Accounts feature on Instagram appears designed to funnel traffic to Facebook, where user growth has flatlined, as Instagram’s growth continues apace. Meanwhile, Facebook last year made a contentious decision to stop funneling traffic to Instagram.

The spokesperson said Facebook began linking accounts behind the scenes based on data it had gathered about users shortly after it acquired Instagram in 2012. The spokesperson said that Facebook collects and connects this information about users’ activities in order to give users a “personalized experience” across all of the apps under the company’s umbrella, like more precisely targeted ads or in-app recommendations based on an amalgamation of the user’s cross-platform activities.

For users who thought they could keep various accounts separate, the realities of this “personalized experience” can prove frustrating. The spokesperson noted that Facebook could use this data to suggest that a user join a Facebook group that includes people that they follow on Instagram or chat with over Messenger. That could pose privacy concerns for users who want their activity on an unlinked Instagram account isolated from their prime Facebook profile.

The connections among these accounts pose additional challenges on the back end. Some users that set out to create Finstagrams complain that they’ve found their new accounts linked to their prime Facebook profiles, resulting in all of their friends, half-acquaintances, and distant relatives receiving a notification to follow their supposedly private Finsta.

Six Instagram users queried by WIRED said that, though they either did not recall ever linking their Facebook and Instagram accounts or explicitly unlinked the two, they are still served notifications that can only be dismissed by clicking the “Open Facebook” button inside the Instagram app. Despite the fact that their accounts are not explicitly linked, clicking the button brings them to either the Facebook app or a logged-in mobile web version of the site.

Asked about the issue, a Facebook spokesperson at first said it was a bug, then later described it as a feature. Regardless of whether an Instagram user has elected to link their Facebook profile, so long as they have an account, the company has linked the two internally, and tapping “Open Facebook” in Instagram will take them to the associated account, the spokesperson said. “It’s just one of the ways that we can help people to understand that Facebook is there,” the spokesperson said.

All users will likely see a notification bubble in Instagram which can only be dismissed by clicking Open Facebook. However, the number of notifications served to users who haven’t linked their Facebook accounts will effectively be made up.

“With an unlinked account … it’s not an accurate representation of what your actual number of Facebook notifications are,” the spokesperson explained. Tapping the Open Facebook button, the spokesperson said, “will again either open the app if you have it or just open you onto the web page.”

The Facebook spokesperson says the company began testing the Open Facebook feature in June 2018 and introduced it to some users in August 2018. The spokesperson wasn’t sure whether the Open Facebook feature was currently the default for all users, or whether it was still being rolled out to all users.

15 Months of Fresh Hell Inside Facebook

Scandals. Backstabbing. Resignations. Record profits. Time Bombs. In early 2018, Mark Zuckerberg set out to fix Facebook. Here’s how that turned out:

The confusing rollout of “meaningful social interactions” (Facebook’s early-2018 overhaul of News Feed ranking)—marked by internal dissent, blistering external criticism, genuine efforts at reform, and foolish mistakes—set the stage for Facebook’s 2018. This is the story of that annus horribilis, based on interviews with 65 current and former employees. It’s ultimately a story about the biggest shifts ever to take place inside the world’s biggest social network. But it’s also about a company trapped by its own pathologies and, perversely, by the inexorable logic of its own recipe for success.

Facebook’s powerful network effects have kept advertisers from fleeing, and overall user numbers remain healthy if you include people on Insta­gram, which Facebook owns. But the company’s original culture and mission kept creating a set of brutal debts that came due with regularity over the past 16 months. The company floundered, dissembled, and apologized. Even when it told the truth, people didn’t believe it. Critics appeared on all sides, demanding changes that ranged from the essential to the contradictory to the impossible. As crises multiplied and diverged, even the company’s own solutions began to cannibalize each other. And the most crucial episode in this story—the crisis that cut the deepest—began not long after Davos, when some reporters from The New York Times, The Guardian, and Britain’s Channel 4 News came calling. They’d learned some troubling things about a shady British company called Cambridge Analytica, and they had some questions.


Source: https://www.wired.com/story/facebook-mark-zuckerberg-15-months-of-fresh-hell/

Facebook pays teens to install VPN that spies on them


Desperate for data on its competitors, Facebook has been secretly paying people to install a “Facebook Research” VPN that lets the company suck in all of a user’s phone and web activity, similar to Facebook’s Onavo Protect app that Apple banned in June and that was removed in August. Facebook sidesteps the App Store and pays teenagers and adults to download the Research app and give it root access to network traffic, in what may be a violation of Apple policy, so the social network can decrypt and analyze their phone activity, a TechCrunch investigation confirms.

Facebook admitted to TechCrunch it was running the Research program to gather data on usage habits.

Since 2016, Facebook has been paying users ages 13 to 35 up to $20 per month plus referral fees to sell their privacy by installing the iOS or Android “Facebook Research” app. Facebook even asked users to screenshot their Amazon order history page. The program is administered through beta testing services Applause, BetaBound and uTest to cloak Facebook’s involvement, and is referred to in some documentation as “Project Atlas” — a fitting name for Facebook’s effort to map new trends and rivals around the globe.

Seven hours after this story was published, Facebook told TechCrunch it would shut down the iOS version of its Research app in the wake of our report. But on Wednesday morning, an Apple spokesperson confirmed that Facebook violated its policies, and it had blocked Facebook’s Research app on Tuesday before the social network seemingly pulled it voluntarily (without mentioning it was forced to do so). You can read our full report on the development here.

An Apple spokesperson provided this statement. “We designed our Enterprise Developer Program solely for the internal distribution of apps within an organization. Facebook has been using their membership to distribute a data-collecting app to consumers, which is a clear breach of their agreement with Apple. Any developer using their enterprise certificates to distribute apps to consumers will have their certificates revoked, which is what we did in this case to protect our users and their data.”

Facebook’s Research program will continue to run on Android.

Facebook’s Research app requires users to ‘Trust’ it with extensive access to their data.

We asked Guardian Mobile Firewall’s security expert Will Strafach to dig into the Facebook Research app, and he told us that “If Facebook makes full use of the level of access they are given by asking users to install the Certificate, they will have the ability to continuously collect the following types of data: private messages in social media apps, chats from instant messaging apps – including photos/videos sent to others, emails, web searches, web browsing activity, and even ongoing location information by tapping into the feeds of any location tracking apps you may have installed.” It’s unclear exactly what data Facebook is concerned with, but it gets nearly limitless access to a user’s device once they install the app.

The strategy shows how far Facebook is willing to go and how much it’s willing to pay to protect its dominance — even at the risk of breaking the rules of Apple’s iOS platform on which it depends. Apple may have asked Facebook to discontinue distributing its Research app.

A more stringent punishment would be to revoke Facebook’s permission to offer employee-only apps. The situation could further chill relations between the tech giants. Apple’s Tim Cook has repeatedly criticized Facebook’s data collection practices. Facebook disobeying iOS policies to slurp up more information could become a new talking point.

Facebook’s Research program is referred to as Project Atlas on sign-up sites that don’t mention Facebook’s involvement

“The fairly technical sounding ‘install our Root Certificate’ step is appalling,” Strafach tells us. “This hands Facebook continuous access to the most sensitive data about you, and most users are going to be unable to reasonably consent to this regardless of any agreement they sign, because there is no good way to articulate just how much power is handed to Facebook when you do this.”

Facebook’s surveillance app

Facebook first got into the data-sniffing business when it acquired Onavo for around $120 million in 2013. The VPN app helped users track and minimize their mobile data plan usage, but also gave Facebook deep analytics about what other apps they were using. Internal documents acquired by Charlie Warzel and Ryan Mac of BuzzFeed News reveal that Facebook was able to leverage Onavo to learn that WhatsApp was sending more than twice as many messages per day as Facebook Messenger. Onavo allowed Facebook to spot WhatsApp’s meteoric rise and justify paying $19 billion to buy the chat startup in 2014. WhatsApp has since tripled its user base, demonstrating the power of Onavo’s foresight.

Over the years since, Onavo clued Facebook in to what apps to copy, features to build and flops to avoid. By 2018, Facebook was promoting the Onavo app in a Protect bookmark of the main Facebook app in hopes of scoring more users to snoop on. Facebook also launched the Onavo Bolt app that let you lock apps behind a passcode or fingerprint while it surveils you, but Facebook shut down the app the day it was discovered following privacy criticism. Onavo’s main app remains available on Google Play and has been installed more than 10 million times.

The backlash heated up after security expert Strafach detailed in March how Onavo Protect was reporting to Facebook when a user’s screen was on or off, and its Wi-Fi and cellular data usage in bytes even when the VPN was turned off. In June, Apple updated its developer policies to ban collecting data about usage of other apps or data that’s not necessary for an app to function. Apple proceeded to inform Facebook in August that Onavo Protect violated those data collection policies and that the social network needed to remove it from the App Store, which it did, Deepa Seetharaman of the WSJ reported.

But that didn’t stop Facebook’s data collection.

Project Atlas

TechCrunch recently received a tip that despite Onavo Protect being banished by Apple, Facebook was paying users to sideload a similar VPN app under the Facebook Research moniker from outside of the App Store. We investigated, and learned Facebook was working with three app beta testing services to distribute the Facebook Research app: BetaBound, uTest and Applause. Facebook began distributing the Research VPN app in 2016. It has been referred to as Project Atlas since at least mid-2018, around when the backlash to Onavo Protect intensified and Apple instituted its new rules that prohibited Onavo. Previously, a similar program was called Project Kodiak. Facebook didn’t want to stop collecting data on people’s phone usage, so the Research program continued in disregard of Apple’s ban on Onavo Protect.

Facebook’s Research App on iOS

Ads for the program run by uTest on Instagram and Snapchat sought teens 13-17 years old for a “paid social media research study.” The sign-up page for the Facebook Research program administered by Applause doesn’t mention Facebook, but seeks users “Age: 13-35 (parental consent required for ages 13-17).” If minors try to sign up, they’re asked to get their parents’ permission with a form that reveals Facebook’s involvement and says “There are no known risks associated with the project, however you acknowledge that the inherent nature of the project involves the tracking of personal information via your child’s use of apps. You will be compensated by Applause for your child’s participation.” For kids short on cash, the payments could coerce them to sell their privacy to Facebook.

The Applause site explains what data could be collected by the Facebook Research app:

“By installing the software, you’re giving our client permission to collect data from your phone that will help them understand how you browse the internet, and how you use the features in the apps you’ve installed . . . This means you’re letting our client collect information such as which apps are on your phone, how and when you use them, data about your activities and content within those apps, as well as how other people interact with you or your content within those apps. You are also letting our client collect information about your internet browsing activity (including the websites you visit and data that is exchanged between your device and those websites) and your use of other online services. There are some instances when our client will collect this information even where the app uses encryption, or from within secure browser sessions.”

Meanwhile, the BetaBound sign-up page with a URL ending in “Atlas” explains that “For $20 per month (via e-gift cards), you will install an app on your phone and let it run in the background.” It also offers $20 per friend you refer. That site also doesn’t initially mention Facebook, but the instruction manual for installing Facebook Research reveals the company’s involvement.

Facebook’s intermediary uTest ran ads on Snapchat and Instagram, luring teens to the Research program with the promise of money

 

Facebook seems to have purposefully avoided TestFlight, Apple’s official beta testing system, which requires apps to be reviewed by Apple and is limited to 10,000 participants. Instead, the instruction manual reveals that users download the app from r.facebook-program.com and are told to install an Enterprise Developer Certificate and VPN and “Trust” Facebook with root access to the data their phone transmits. Apple requires that developers agree to only use this certificate system for distributing internal corporate apps to their own employees. Randomly recruiting testers and paying them a monthly fee appears to violate the spirit of that rule.

Security expert Will Strafach found Facebook’s Research app contains lots of code from Onavo Protect, the Facebook-owned app Apple banned last year

Once installed, users just had to keep the VPN running and sending data to Facebook to get paid. The Applause-administered program requested that users screenshot their Amazon orders page. This data could potentially help Facebook tie browsing habits and usage of other apps with purchase preferences and behavior. That information could be harnessed to pinpoint ad targeting and understand which types of users buy what.

TechCrunch commissioned Strafach to analyze the Facebook Research app and find out where it was sending data. He confirmed that data is routed to “vpn-sjc1.v.facebook-program.com,” which is associated with Onavo’s IP address, and that the facebook-program.com domain is registered to Facebook, according to MarkMonitor. The app can update itself without interacting with the App Store, and is linked to the email address PeopleJourney@fb.com. He also discovered that the Enterprise Certificate, first acquired in 2016, indicates Facebook renewed it on June 27th, 2018 — weeks after Apple announced its new rules that prohibited the similar Onavo Protect app.
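For readers who want to reproduce the most basic of these checks, the sketch below resolves the hostname the app reportedly talks to and runs a standard whois lookup on the domain. The hostnames come from the article and may no longer resolve, the `whois` binary is assumed to be installed, and this is not the tooling Strafach actually used.

```python
# A minimal sketch: resolve the hostname the app reportedly contacts and look
# up the domain's registration with the standard `whois` command-line tool.
import socket
import subprocess

HOST = "vpn-sjc1.v.facebook-program.com"   # hostname cited in the article
DOMAIN = "facebook-program.com"

def resolve(host: str):
    """Return the IP the hostname resolves to, or None if it no longer resolves."""
    try:
        return socket.gethostbyname(host)
    except socket.gaierror:
        return None

def registrant(domain: str) -> str:
    """Run `whois` (must be on the PATH) and keep the registrant/registrar lines."""
    result = subprocess.run(["whois", domain], capture_output=True, text=True)
    keep = ("Registrant Organization", "Registrar:")
    return "\n".join(line.strip() for line in result.stdout.splitlines()
                     if line.strip().startswith(keep))

if __name__ == "__main__":
    print(HOST, "->", resolve(HOST))
    print(registrant(DOMAIN))
```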

“It is tricky to know what data Facebook is actually saving (without access to their servers). The only information that is knowable here is what access Facebook is capable of based on the code in the app. And it paints a very worrisome picture,” Strafach explains. “They might respond and claim to only actually retain/save very specific limited data, and that could be true, it really boils down to how much you trust Facebook’s word on it. The most charitable narrative of this situation would be that Facebook did not think too hard about the level of access they were granting to themselves . . . which is a startling level of carelessness in itself if that is the case.”

[Update: TechCrunch also found that Google’s Screenwise Meter surveillance app breaks the Enterprise Certificate policy, though it does a better job of revealing the company’s involvement and how it works than Facebook does.]

“Flagrant defiance of Apple’s rules”

In response to TechCrunch’s inquiry, a Facebook spokesperson confirmed it’s running the program to learn how people use their phones and other services. The spokesperson told us “Like many companies, we invite people to participate in research that helps us identify things we can be doing better. Since this research is aimed at helping Facebook understand how people use their mobile devices, we’ve provided extensive information about the type of data we collect and how they can participate. We don’t share this information with others and people can stop participating at any time.”

Facebook’s Research app requires Root Certificate access, through which Facebook can gather almost any piece of data transmitted by your phone

Facebook’s spokesperson claimed that the Facebook Research app was in line with Apple’s Enterprise Certificate program, but didn’t explain how in the face of evidence to the contrary. They said Facebook first launched its Research app program in 2016. They tried to liken the program to a focus group and said Nielsen and comScore run similar programs, yet neither of those asks people to install a VPN or provide root access to the network. The spokesperson confirmed the Facebook Research program does recruit teens but also other age groups from around the world. They claimed that Onavo and Facebook Research are separate programs, but admitted the same team supports both as an explanation for why their code was so similar.

Facebook’s Research program requested users screenshot their Amazon order history to provide it with purchase data

However, Facebook’s claim that it doesn’t violate Apple’s Enterprise Certificate policy is directly contradicted by the terms of that policy. Those include that developers “Distribute Provisioning Profiles only to Your Employees and only in conjunction with Your Internal Use Applications for the purpose of developing and testing”. The policy also states that “You may not use, distribute or otherwise make Your Internal Use Applications available to Your Customers” unless under direct supervision of employees or on company premises. Given Facebook’s customers are using the Enterprise Certificate-powered app without supervision, it appears Facebook is in violation.

Seven hours after this report was first published, Facebook updated its position and told TechCrunch that it would shut down the iOS Research app. Facebook noted that the Research app was started in 2016 and was therefore not a replacement for Onavo Protect. However, they do share similar code and could be seen as twins running in parallel. A Facebook spokesperson also provided this additional statement:

“Key facts about this market research program are being ignored. Despite early reports, there was nothing ‘secret’ about this; it was literally called the Facebook Research App. It wasn’t ‘spying’ as all of the people who signed up to participate went through a clear on-boarding process asking for their permission and were paid to participate. Finally, less than 5 percent of the people who chose to participate in this market research program were teens. All of them with signed parental consent forms.”

Facebook did not publicly promote the Research VPN itself and used intermediaries that often didn’t disclose Facebook’s involvement until users had begun the signup process. While users were given clear instructions and warnings, the program never stressed nor mentioned the full extent of the data Facebook can collect through the VPN. A small fraction of the users paid may have been teens, but we stand by the newsworthiness of Facebook’s choice not to exclude minors from this data collection initiative.

Facebook disobeying Apple so directly and then pulling the app could hurt their relationship. “The code in this iOS app strongly indicates that it is simply a poorly re-branded build of the banned Onavo app, now using an Enterprise Certificate owned by Facebook in direct violation of Apple’s rules, allowing Facebook to distribute this app without Apple review to as many users as they want,” Strafach tells us. ONV prefixes and mentions of graph.onavo.com, “onavoApp://” and “onavoProtect://” custom URL schemes litter the app. “This is an egregious violation on many fronts, and I hope that Apple will act expeditiously in revoking the signing certificate to render the app inoperable.”

Facebook is particularly interested in what teens do on their phones as the demographic has increasingly abandoned the social network in favor of Snapchat, YouTube and Facebook’s acquisition Instagram. Insights into the popularity with teens of the Chinese video music app TikTok and of meme sharing led Facebook to launch a clone called Lasso and begin developing a meme-browsing feature called LOL, TechCrunch first reported. But Facebook’s desire for data about teens riles critics at a time when the company has been battered in the press. Analysts on tomorrow’s Facebook earnings call should inquire about what other ways the company has to collect competitive intelligence now that it’s ceased to run the Research program on iOS.

Last year when Tim Cook was asked what he’d do in Mark Zuckerberg’s position in the wake of the Cambridge Analytica scandal, he said “I wouldn’t be in this situation . . . The truth is we could make a ton of money if we monetized our customer, if our customer was our product. We’ve elected not to do that.” Zuckerberg told Ezra Klein that he felt Cook’s comment was “extremely glib.”

Now it’s clear that even after Apple’s warnings and the removal of Onavo Protect, Facebook was still aggressively collecting data on its competitors via Apple’s iOS platform. “I have never seen such open and flagrant defiance of Apple’s rules by an App Store developer,” Strafach concluded. Now that Facebook has ceased the program on iOS and its Android future is uncertain, it may either have to invent new ways to surveil our behavior amidst a climate of privacy scrutiny, or be left in the dark.

Additional reporting by Zack Whittaker. Updated with comment from Facebook, and on Wednesday with a statement from Apple. 

Source: https://techcrunch.com/2019/01/29/facebook-project-atlas/

‘I Don’t Really Want to Work for Facebook.’ So Say Some Computer Science Students.

Surprisingly, a number of students and Generation Y digital natives are turning against the social media giants.


The Cal Hacks 5.0 competition drew students to the University of California, Berkeley, including, from left, Haitao Zhang, Ingrid Wu and Emily Hu, all students at Berkeley. Some students at the hackathon expressed a reluctance to work for big tech firms. Credit: Max Whittaker for The New York Times

BERKELEY, Calif. — A job at Facebook sounds pretty plum. The interns make around $8,000 a month, and an entry-level software engineer makes about $140,000 a year. The food is free. There’s a walking trail with indigenous plants and a juice bar.

But the tone among highly sought-after computer scientists about the social network is changing. On a recent night at the University of California, Berkeley, as a group of young engineers gathered to show off their tech skills, many said they would avoid taking jobs at the social network.

“I’ve heard a lot of employees who work there don’t even use it,” said Niky Arora, 19, an engineering student, who was recently invited to a Facebook recruiting event at the company’s headquarters in Menlo Park, Calif. “I just don’t believe in the product because like, Facebook, the baseline of everything they do is desire to show people more ads.”

Emily Zhong, 20, a computer science major, piped up. “Surprisingly, a lot of my friends now are like, ‘I don’t really want to work for Facebook,’” she said, citing “privacy stuff, fake news, personal data, all of it.”

“Before it was this glorious, magical thing to work there,” said Jazz Singh, 18, also studying computer science. “Now it’s like, just because it does what you want doesn’t mean it’s doing good.”

As Facebook has been rocked by scandal after scandal, some young engineers are souring on the company. Many are still taking jobs there, but those who do are doing it a little more quietly, telling their friends that they will work to change it from within or that they have carved out more ethical work at a company whose reputation has turned toxic.

Facebook, which employs more than 30,000 full-time workers around the world, said, “In 2018, we’ve hired more engineers than ever before.” The company added, “We continue to see strong engagement and excitement within the engineering community at the prospect of joining our company.”

Niky Arora, 19, a student at Berkeley, said she was skeptical about working for Facebook, which invited her to a recruiting event recently. “I’ve heard a lot of employees who work there don’t even use it,” she said. Credit: Max Whittaker for The New York Times

The changing attitudes are happening beyond Facebook. Across Silicon Valley, tech recruiters said job applicants in general were asking more hard questions during interviews, wanting to know specifically what they would be asked to do at the company. Career coaches said they had tech employees reaching out to get tips on handling moral quandaries. The questions include “How do I avoid a project I disagree with?” and “How do I remind my bosses of the company mission statement?”

“Employees are wising up to the fact that you can have a mission statement on your website, but when you’re looking at how the company creates new products or makes decisions, the correlation between the two is not so tightly aligned,” said David Chie, the head of Palo Alto Staffing, a tech job placement service in Silicon Valley. “Everyone’s having this conversation.”

When engineers apply for jobs, they are also doing it differently.

“They do a lot more due diligence,” said Heather Johnston, Bay Area district president for the tech job staffing agency Robert Half. “Before, candidates were like: ‘Oh, I don’t want to do team interviews. I want a one-and-done.’” Now, she added, job candidates “want to meet the team.”

“They’re not just going to blindly take a company because of the name anymore,” she said.

Yet while many of the big tech companies have been hit by a change in public perception, Facebook seems uniquely tarred among young workers.

“I’ve had a couple of clients recently say they’re not as enthusiastic about Facebook because they’re frustrated with what they see happening politically or socially,” said Paul Freiberger, president of Shimmering Careers, a career counseling group based in San Mateo, Calif. “It’s privacy and political news, and concern that it’s going to be hard to correct these things from inside.”

Chad Herst, a leadership and career coach based in San Francisco since 2008, said that now, for the first time, he had clients who wanted to avoid working for big social media companies like Facebook or Twitter.

“They’re concerned about where democracy is going, that social media polarizes us, and they don’t want to be building it,” Mr. Herst said. “People really have been thinking about the mission of the company and what the companies are trying to achieve a little more.”

He said one client, a midlevel executive at Facebook, wanted advice on how to shift her group’s work to encourage users to connect offline as well. But she found resistance internally to her efforts.

“She was trying to figure out: ‘How do I politic this? How do I language this?’” Mr. Herst said. “And I was telling her to bring up some of Mark Zuckerberg’s past statements about connecting people.”

On the recent evening at the University of California, Berkeley, around 2,200 engineering students from around the country gathered for Cal Hacks 5.0 — a competition to build the best apps. The event spanned a weekend, so teenage competitors dragged pillows around with them. The hosts handed out 2,000 burritos as students registered.

It was also a hiring event. Recruiters from Facebook and Alphabet set up booths (free sunglasses from Facebook; $200 in credit to the Google Cloud platform from Alphabet).

In the auditorium, the head of Y Combinator, a start-up incubator and investment firm, gave opening remarks, recommending that young people avoid jobs in big tech.

“You get to program your life on a totally different scale,” said Michael Seibel, who leads Y Combinator. “The worst thing that can happen to you is you get a job at Google.” He called those jobs “$100,000-a-year welfare” — meaning, he said, that workers can get tethered to the paycheck and avoid taking risks.

The event then segued to a word from the sponsor, Microsoft. Justin Garrett, a Microsoft recruiter who on his LinkedIn profile calls himself a senior technical evangelist, stepped onstage, laughing a little.

“So, Michael’s a tough guy to follow, especially when you work for one of those big companies,” Mr. Garrett said. “He called it welfare. I like to call it tremendous opportunity.”

Then students flooded into the stadium, which was filled with long tables of computers where they would stay and compete. In the middle of the scrum, three friends joked around. Caleb Thomas, 21, was gently made fun of because he had accepted an internship at Facebook.

“Come on, guys,” Mr. Thomas said.

“These are the realities of how the business works,” said Samuel Resendez, 20, a computer science student at the University of Southern California.

It turned out Mr. Resendez had interned at Facebook in the summer. Olivia Brown, 20, head of Stanford’s Computer Science and Social Good club and an iOS intern at Mozilla, called him out on it. “But you still worked at Facebook, too,” she said.

“Well, at least I signed before Cambridge Analytica,” Mr. Resendez said, a little bashful about the data privacy and election manipulation scandal that rocked the company this year. “Ninety-five percent of what Facebook is doing is delivering memes.”

Ms. Brown said a lot of students criticize Facebook and talk about how they would not work there, but ultimately join. “Everyone cares about ethics in tech before they get a contract,” she said.

Ms. Brown said she thought that could change soon, though, as the social stigma of working for Facebook began outweighing the financial benefits.

“Defense companies have had this reputation for a long time,” she said. “Social networks are just getting that.”

Source: https://www.nytimes.com/2018/11/15/technology/jobs-facebook-computer-science-students.html

resting and vesting — showing up to Facebook and barely working to collect a $450 million payday

The WhatsApp cofounder Jan Koum. Reuters

  • Back in April, the WhatsApp cofounder Jan Koum announced plans to leave Facebook.
  • But he’s still showing up to the office once a month so he can continue to collect $450 million in Facebook stock he’s contractually due from when Facebook bought his company.
  • It’s a high-dollar example of “rest and vest,” in which big tech companies pay senior employees who don’t do much work.
  • Koum has already sold over $7 billion in Facebook stock.

The WhatsApp cofounder Jan Koum said in April that he planned to leave Facebook, which bought his company for $19 billion in 2014. He’s already sold $7.1 billion worth of Facebook shares.

But he’s still showing up to the office, The Wall Street Journal reports, to collect one last payday: $450 million in stock.

Koum is resting and vesting, in Silicon Valley lingo, a term for wealthy entrepreneurs and engineers with one foot out the door at big tech companies who are allowed to remain officially employed until they can collect stock and options in quarterly or annual increments.

Usually, stock awards after a merger are distributed on a four-year vesting schedule — if you last all four years, you get your entire stock grant. Koum’s last vesting date is in November. He showed up at Facebook’s offices in mid-July, fulfilling a requirement of his employment contract, according to The Wall Street Journal.

“Resting and vesting” is an open secret in Silicon Valley, Business Insider has reported. At some companies, the employees are called “coasters.” The HBO show “Silicon Valley” even spoofed it in an episode in which engineers hang out on a roof and don’t do any work.

“I’ve actually had a number of people, including today at Google X … send me pictures of themselves on a roof, kicking back doing nothing, with the hashtag ‘unassigned’ or ‘rest and vest.’ It’s something that really happens, and apparently, somewhat often,” Josh Brener, the actor who plays the lucky character who got to rest and vest in HBO’s “Silicon Valley,” told Business Insider last year.

From Business Insider’s report on the phenomenon:

“Facebook, for instance, has a fairly hush bonus program called ‘discretionary equity,’ a former Facebook engineer who received it said.

“DE is when the company hands an engineer a massive, extra chunk of restricted stock units, worth tens to hundreds of thousands of dollars. It’s a thank-you for a job well done. It also helps keep the person from jumping ship because DE vests over time. These are bonus grants that are signed by top executives, sometimes even CEO Mark Zuckerberg.”

Koum’s payday isn’t related to discretionary equity; it’s instead a result of the over 20 million restricted shares of Facebook he received when he sold WhatsApp. He has one more vesting day in August and one in November, according to filings with the Securities and Exchange Commission.

Koum reportedly decided to leave Facebook in the middle of a spat over how to integrate advertising into WhatsApp. A WhatsApp representative declined to comment, but The Journal reports that Koum is still employed at the social-networking giant.

When Koum left, he wrote that he was taking time off to collect “rare air-cooled Porsches” and play ultimate Frisbee.

How many Porsches can one buy with $450 million?

 

Source: http://uk.businessinsider.com/whatsapp-founder-jan-koum-rest-and-vest-for-450-million-facebook-stock-2018-8?r=US&IR=T

June 2018 Tech News & Trends to Watch

1. Companies Worldwide Strive for GDPR Compliance

By now, everyone with an email address has seen a slew of emails announcing privacy policy updates. You have Europe’s GDPR legislation to thank for your overcrowded inbox. GDPR creates rules around how much data companies are allowed to collect, how they’re able to use that data, and how clear they have to be with consumers about it all.

Companies around the world are scrambling to get their business and its practices into compliance – a significant task for many of them. While technically, the deadline to get everything in order passed on May 25, for many companies the process will continue well into June and possibly beyond. Some companies are even shutting down in Europe for good, or for as long as it takes them to get in compliance.

Even with the deadline behind us, the GDPR continues to be a top story for the tech world and may remain so for some time to come.

 

2. Amazon Provides Facial Recognition Tech to Law Enforcement

Amazon can’t seem to go a whole month without showing up in a tech news roundup. This month it’s for a controversial story: selling use of Rekognition, their facial recognition software, to law enforcement agencies on the cheap.

Civil rights groups have called for the company to stop allowing law enforcement access to the tech out of concerns that increased government surveillance can pose a threat to vulnerable communities in the country. In spite of the public criticism, Amazon hasn’t backed off on providing the tech to authorities, at least as of this time.

 

3. Apple Looks Into Self-Driving Employee Shuttles

Of the many problems facing our world, the frustrating work commute is one that many of the brightest minds in tech deal with just like the rest of us. Which makes it a problem the biggest tech companies have a strong incentive to try to solve.

Apple is one of many companies that’s invested in developing self-driving cars as a possible solution, but while that goal is still (probably) years away, they’ve narrowed their focus to teaming up with VW to create self-driving shuttles just for their employees.  Even that project is moving slower than the company had hoped, but they’re aiming to have some shuttles ready by the end of the year.

 

4. Court Weighs in on President’s Tendency to Block Critics on Twitter

Three years ago no one would have imagined that Twitter would be a president’s go-to source for making announcements, but today it’s used to that effect more frequently than official press conferences or briefings.

In a court battle that may sound surreal to many of us, a judge just found that the president can no longer legally block other users on Twitter.  The court asserted that blocking users on a public forum like Twitter amounts to a violation of their First Amendment rights. The judgment does still allow for the president and other public officials to mute users they don’t agree with, though.

 

5. YouTube Launches Music Streaming Service

YouTube joined the ranks of Spotify, Pandora, and Amazon this past month with their own streaming music service. Consumers can use a free version of the service that includes ads, or can pay $9.99 for the ad-free version.


With so many similar services already on the market, people weren’t exactly clamoring for another music streaming option. But since YouTube is likely to remain the reigning source for videos, it doesn’t necessarily need to unseat Spotify to still be okay. And with access to Google’s extensive user data, it may be able to provide more useful recommendations than its main competitors in the space, which is one way the service could differentiate itself.

 

6. Facebook Institutes Political Ad Rules

Facebook hasn’t yet left behind the controversies of the last election. The company is still working to proactively respond to criticism of its role in the spread of political propaganda many believe influenced election results. One of the solutions they’re trying is a new set of rules for any political ads run on the platform.

Any campaign that intends to run Facebook ads is now required to verify its identity with a verification code that Facebook mails on a card to its address. While Facebook has been promoting these new rules for a few weeks to politicians active on the platform, some felt blindsided when they realized, right before their primaries no less, that they could no longer place ads without waiting 12 to 15 days for a verification code to arrive in the mail. Politicians in this position blame the company for making a change that could affect their chances in the upcoming election.

Even in their efforts to avoid swaying elections, Facebook has found themselves criticized for doing just that. They’re probably feeling at this point like they just can’t win.

 

7. Another Big Month for Tech IPOs

This year has seen one tech IPO after another and this month is no different. Chinese smartphone company Xiaomi has a particularly large IPO in the works. The company seeks to join the Hong Kong stock exchange on June 7 with an initial public offering that experts anticipate could reach $10 billion.

The online lending platform Greensky started trading on the New York Stock Exchange on May 23 and sold 38 million shares in its first day, 4 million more than expected. This month continues 2018’s trend of tech companies going public, largely to great success.

 

8. StumbleUpon Shuts Down

In the internet’s ongoing evolution, there will always be tech companies that win and those that fall by the wayside. StumbleUpon, a content discovery platform that had its heyday in the early aughts, is officially shutting down on June 30.

Since its 2002 launch, the service has helped over 40 million users “stumble upon” 60 billion new websites and pieces of content. The company behind StumbleUpon plans to create a new platform, called Mix, that serves a similar purpose and may be more useful to former StumbleUpon users.

 

9. Uber and Lyft Invest in Driver Benefits

In spite of their ongoing success, the popular ridesharing platforms Uber and Lyft have faced their share of criticism since they came onto the scene. One of the common complaints critics have made is that the companies don’t provide proper benefits to their drivers. And in fact, the companies have fought to keep drivers classified legally as contractors so they’re off the hook for covering the cost of employee taxes and benefits.

Recently both companies have taken steps to make driving for them a little more attractive. Uber has begun offering Partner Protection to its drivers in Europe, which includes health insurance, sick pay, and parental leave – though so far there is nothing similar in the U.S. For its part, Lyft is investing $100 million in building driver support centers where their drivers can stop to get discounted car maintenance, tax help, and customer support help in person from Lyft staff. It’s not the same as getting full employee benefits (in the U.S. at least), but it’s something.

Source: https://www.hostgator.com/blog/june-tech-trends-to-watch/

Let’s Get Rid of the “Nothing to Hide, Nothing to Fear” Mentality

With Zuckerberg testifying to the US Congress over Facebook’s data privacy and the implementation of GDPR fast approaching, the debate around data ownership has suddenly burst into the public psyche. Collecting user data to serve targeted advertising on a free platform is one thing; harvesting the social graphs of people interacting with apps and using that data to sway an election is somewhat worse.

Suffice to say that neither of the above compare to the indiscriminate collection of ordinary civilians’ data on behalf of governments every day.

In 2013, Edward Snowden blew the whistle on the systematic US spy program he helped to architect. Perhaps the largest revelation to come out of the trove of documents he released was the details of PRISM, an NSA program that collects internet communications data from US telecommunications companies like Microsoft, Yahoo, Google, Facebook and Apple. The data collected included audio and video chat logs, photographs, emails, documents and connection logs of anyone using the services of 9 leading US internet companies. PRISM benefited from changes to FISA that allowed warrantless domestic surveillance of any target without the need for probable cause. Bill Binney, a former US intelligence official, explains how, in instances where corporate control wasn’t achievable, the NSA enticed third-party countries to clandestinely tap internet communication lines on the internet backbone via the RAMPART-A program. What this means is that the NSA was able to assemble near-complete dossiers of all web activity carried out by anyone using the internet.

But this is just in the US, right? Policies like this surely wouldn’t be implemented in Europe.

Wrong, unfortunately.

GCHQ, the UK’s intelligence agency, allegedly collects considerably more metadata than the NSA. Under Tempora, GCHQ can intercept all internet communications from submarine fibre optic cables and store the information for 30 days at the Bude facility in Cornwall. This includes complete web histories and the contents of all emails and Facebook entries; given that more than 25% of all internet communications flow through these cables, the implications are astronomical. Elsewhere, JTRIG, a unit of GCHQ, has intercepted private Facebook pictures, changed the results of online polls and spoofed websites in real time. A lot of these techniques have been made possible by the 2016 Investigatory Powers Act, which Snowden describes as the most “extreme surveillance in the history of western democracy”.

But despite all this, the age-old refrain, “if you’ve got nothing to hide, you’ve got nothing to fear,” often rings out in debates over privacy.

Indeed, the idea is so pervasive that politicians often lean on the phrase to justify ever more draconian methods of surveillance. Yes, they draw upon the selfsame rhetoric of Joseph Goebbels, propaganda minister for the Nazi regime.

In drafting legislation for the Investigatory Powers Act, May said that such extremes were necessary to ensure “no area of cyberspace becomes a haven for those who seek to harm us, to plot, poison minds and peddle hatred under the radar”.

When levelled against the fear of terrorism and death, it’s easy to see how people passively accept ever greater levels of surveillance. Indeed, Naomi Klein writes extensively in The Shock Doctrine about how the fear of external threats can be used as a smokescreen to implement ever more invasive policy. But indiscriminate mass surveillance should never be blindly accepted; privacy should be, and always will be, a social norm, despite what Mark Zuckerberg said in 2010. Although I’m sure he may have a different answer now.

So you just read emails and look at cat memes online; why would you care about privacy?

In the same way we’re able to close our living room curtains and be alone and unmonitored, we should be able to explore our identities online unimpeded. It’s a well-rehearsed idea that nowadays we’re more honest to our web browsers than we are to each other, but what happens when you become cognisant that everything you do online is intercepted and catalogued? As with CCTV, when we know we’re being watched, we alter our behaviour in line with what’s expected.

As soon as this happens online, the liberating quality provided by the anonymity of the internet is lost. Your thinking aligns with the status quo and we lose the boundless ability of the internet to search and develop our identities. No progress can be made when everyone thinks the same way. Difference of opinion fuels innovation.

This draws obvious comparisons with Bentham’s Panopticon, a prison blueprint for enforcing control from within. The basic setup is as follows: there is a central guard tower surrounded by cells. In the cells are prisoners. The tower shines bright light so that the watchman can see each inmate silhouetted in their cell, but the prisoners cannot see the watchman. The prisoners must assume they could be observed at any point and therefore act accordingly. In literature, the common comparison is Orwell’s 1984, where omnipresent government surveillance enforces control and distorts reality. With revelations about surveillance states, the relevance of these metaphors is plain to see.

In reality, there’s actually a lot more at stake here.

With the Panopticon, certain individuals are watched; in 1984, everyone is watched. On the modern internet, every person, irrespective of the threat they pose, is not only watched but has their information stored and archived for analysis.

Kafka’s The Trial, in which a bureaucracy uses citizens’ information to make decisions about them but denies them the ability to participate in how their information is used, therefore seems a more apt comparison. The issue here is that corporations, and even more so states, have been allowed to comb our data and make decisions that affect us without our consent.

Maybe, as a member of a western democracy, you don’t think this matters. But what if you’re a member of a minority group in an oppressive regime? What if you’re arrested because a computer algorithm can’t separate humour from intent to harm?

On the other hand, maybe you trust the intentions of your government, but how much faith do you have in it to keep your data private? The recent hack of the SEC shows that even government systems aren’t safe from attackers. When a business database is breached, maybe your credit card details become public; when a government database that has aggregated millions of data points on every aspect of your online life is hacked, you’ve lost all control of your ability to selectively reveal yourself to the world. Just as Lyndon Johnson sought to control physical clouds, he who controls the modern cloud will rule the world.

Perhaps you think that even this doesn’t matter, that if it allows the government to protect us from those who intend to cause harm then it’s worth the loss of privacy. The trouble with indiscriminate surveillance is that with so much data you see everything but, paradoxically, still know nothing.

Intelligence is the strategic collection of pertinent facts; bulk data collection cannot therefore be intelligent. As Bill Binney puts it, “bulk data kills people”, because technicians are so overwhelmed that they can’t isolate what’s useful. Data collection as it is can only focus on retribution rather than reduction.

Granted, GDPR is a big step forward for individual consent, but will it stop corporations from handing over your data to the government? Depending on how cynical you are, you might think that GDPR is just a tool to clean up and create more reliable deterministic data anyway. The nothing-to-hide, nothing-to-fear mentality renders us passive supplicants in the removal of our civil liberties. We should be thinking about how we relate to one another and to our governments, and how much power we want to have in that relationship.

To paraphrase Edward Snowden, saying you don’t care about privacy because you’ve got nothing to hide is analogous to saying you don’t care about freedom of speech because you have nothing to say.

Source: http://behindthebrowser.space/index.php/2018/04/22/nothing-to-fear-nothing-to-hide/

Forget Facebook


Cambridge Analytica may have used Facebook’s data to influence your political opinions. But why does the least-liked tech company, Facebook, have all this data about its users in the first place?

Let’s put aside Instagram, WhatsApp and other Facebook products for a minute. Facebook has built the world’s biggest social network. But that’s not what they sell. You’ve probably heard the internet saying “if a product is free, it means that you are the product.”

And it’s particularly true in this case because Facebook is the world’s second biggest advertising company, behind Google. During the last quarter of 2017, Facebook reported $12.97 billion in revenue, including $12.78 billion from ads.

That’s 98.5 percent of Facebook’s revenue coming from ads.

Ads aren’t necessarily a bad thing. But Facebook has reached ad saturation in the newsfeed. So the company has two options — creating new products and ad formats, or optimizing those sponsored posts.


This isn’t a zero-sum game — Facebook has been doing both at the same time. That’s why you’re seeing more ads on Instagram and Messenger. And that’s also why ads on Facebook seem more relevant than ever.

If Facebook can show you relevant ads and you end up clicking more often on those ads, then advertisers will pay Facebook more money.

So Facebook has been collecting as much personal data about you as possible — it’s all about showing you the best ad. The company knows your interests, what you buy, where you go and who you’re sleeping with.

You can’t hide from Facebook

Facebook’s terms and conditions are a giant lie. They are purposely misleading, too long and too broad. So you can’t just read the company’s terms of service and understand what it knows about you.

That’s why some people have been downloading their Facebook data. You can do it too, it’s quite easy. Just head over to your Facebook settings and click the tiny link that says “Download a copy of your Facebook data.”

In that archive file, you’ll find your photos, your posts, your events, etc. But if you keep digging, you’ll also find your private messages on Messenger (by default, nothing is encrypted).

And if you keep digging a bit more, chances are you’ll also find your entire address book and even metadata about your SMS messages and phone calls.
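If you want a quick sense of just how much is in that archive, a short script like the sketch below can walk the extracted folder and summarize each category by file count and size. The folder layout of Facebook’s export has changed over time, so the category names it prints will vary; treat the output as a rough inventory.

```python
# Sketch: inventory a downloaded-and-extracted Facebook data archive.
# Usage: python summarize_archive.py ~/Downloads/facebook-yourname
import sys
from pathlib import Path

def summarize_archive(root: str) -> None:
    root_path = Path(root)
    # Each top-level folder in the export is a category (posts, photos, messages, ...)
    for category in sorted(p for p in root_path.iterdir() if p.is_dir()):
        files = [f for f in category.rglob("*") if f.is_file()]
        total_bytes = sum(f.stat().st_size for f in files)
        print(f"{category.name:<30} {len(files):>6} files  {total_bytes / 1_048_576:8.1f} MB")

if __name__ == "__main__":
    summarize_archive(sys.argv[1] if len(sys.argv) > 1 else ".")
```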

All of this is by design and you agreed to it. Facebook has unified terms of service and shares user data across all its apps and services (except WhatsApp data in Europe, for now). So if you follow a clothing brand on Instagram, you could see an ad from this brand on Facebook.com.

Messaging apps are privacy traps

But Facebook has also been using this trick quite a lot with Messenger. You might not remember, but the on-boarding experience on Messenger is really aggressive.

On iOS, the app shows you a fake permission popup to access your address book that says “Ok” or “Learn More”. The company is using a fake popup because you can’t ask for permission twice.

There’s a blinking arrow below the OK button.

If you click on “Learn More”, you get a giant blue button that says “Turn On”. Everything about this screen is misleading and Messenger tries to manipulate your emotions.

“Messenger only works when you have people to talk to,” it says. Nobody wants to be lonely, that’s why Facebook implies that turning on this option will give you friends.

Even worse, it says “if you skip this step, you’ll need to add each contact one-by-one to message them.” This is simply a lie as you can automatically talk to your Facebook friends using Messenger without adding them one-by-one.


If you tap on “Not Now”, Messenger will show you a fake notification every now and then to push you to enable contact syncing. If you tap on yes and disable it later, Facebook still keeps all your contacts on its servers.

On Android, you can let Messenger manage your SMS messages. Of course, you guessed it: Facebook uploads all the metadata. Facebook knows who you’re texting, when, and how often.

Even if you disable it later, Facebook will keep this data for later reference.

But Facebook doesn’t stop there. The company knows a lot more about you than what you can find in your downloaded archive. The company asks you to share your location with your friends. The company tracks your web history on nearly every website on earth using embedded JavaScript.
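
To make that last part concrete, here is a deliberately simplified, hypothetical sketch of what the receiving end of an embedded tracker looks like. Any page that embeds an invisible image or script pointing at the tracker’s server hands over a cookie (your identity) and a Referer header (the page you were reading). This is not Facebook’s code; the endpoint and cookie names are invented.

```python
# Toy tracking endpoint (NOT Facebook's code): every page that embeds
# something like <img src="https://tracker.example/pixel"> sends this
# server a cookie identifying the visitor and a Referer header naming
# the page that was visited.
from http.server import BaseHTTPRequestHandler, HTTPServer

class PixelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        visitor = self.headers.get("Cookie", "new visitor")
        page = self.headers.get("Referer", "unknown page")
        print(f"{visitor!r} just viewed {page!r}")   # builds the browsing profile
        self.send_response(204)                      # a real pixel returns a 1x1 GIF
        self.send_header("Set-Cookie", "uid=demo-visitor-123")
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), PixelHandler).serve_forever()
```

Multiply that by every site that embeds a Like button or conversion pixel, and you get a browsing history tied to a single identifier.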

But my favorite thing is probably peer-to-peer payments. In some countries, you can pay back your friends using Messenger. It’s free! You just have to add your card to the app.

It turns out that Facebook also buys data about your offline purchases. The next time you pay for a burrito with your credit card, Facebook will learn about this transaction and match this credit card number with the one you added in Messenger.

In other words, Messenger is a great Trojan horse designed to learn everything about you.

And the next time an app asks you to share your address book, there’s a 99-percent chance that this app is going to mine your address book to get new users, spam your friends, improve ad targeting and sell email addresses to marketing companies.

I could say the same thing about all the other permission popups on your phone. Be careful when you install an app from the Play Store or open an app for the first time on iOS. It’s easy to tap “enable” when a feature won’t work without it; it’s much harder to find out, after the fact, just how much Facebook has learned about you.

GDPR to the rescue

There’s one last hope. And that hope is GDPR. I encourage you to read TechCrunch writer Natasha Lomas’ excellent explanation of GDPR to understand what the European regulation is all about.

Many of the misleading things that are currently happening at Facebook will have to change. Companies won’t be able to coerce people into opting in, the way Messenger does today. Data collection will have to be minimized to what a feature actually needs. And Facebook will have to explain to its users why it needs all this data.

If Facebook doesn’t comply, the company can be fined up to 4 percent of its global annual turnover. But that doesn’t stop you from actively reclaiming your online privacy right now.
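
For perspective on that 4 percent figure, a crude back-of-the-envelope calculation using the Q4 2017 revenue quoted earlier in this piece (annualizing a single quarter is rough, but it gives the order of magnitude):

```python
# Rough estimate of the maximum GDPR fine, annualizing the Q4 2017
# revenue figure cited above. Crude, but shows the order of magnitude.
quarterly_revenue = 12.97e9              # USD, Q4 2017
annual_turnover = 4 * quarterly_revenue  # naive annualization
max_fine = 0.04 * annual_turnover        # GDPR cap: 4% of global turnover
print(f"~${max_fine / 1e9:.1f} billion") # roughly $2.1 billion
```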

You can’t be invisible on the internet, but you have to be conscious about what’s happening behind your back. Every time a company asks you to tap OK, think about what’s behind this popup. You can’t say that nobody told you.

Source: Techcrunch.com

WhatsApp spies on your encrypted messages

Exclusive: Privacy campaigners criticise WhatsApp vulnerability as a ‘huge threat to freedom of speech’ and warn it could be exploited by government agencies

Research shows that WhatsApp can read messages due to the way the company has implemented its end-to-end encryption protocol. Photograph: Ritchie B Tongo/EPA

A security backdoor that can be used to allow Facebook and others to intercept and read encrypted messages has been found within Facebook’s WhatsApp messaging service.

Facebook claims that no one can intercept WhatsApp messages, not even the company and its staff, ensuring privacy for its billion-plus users. But new research shows that the company could in fact read messages due to the way WhatsApp has implemented its end-to-end encryption protocol.

Privacy campaigners said the vulnerability is a “huge threat to freedom of speech” and warned it can be used by government agencies to snoop on users who believe their messages to be secure. WhatsApp has made privacy and security a primary selling point, and has become a go-to communications tool of activists, dissidents and diplomats.

WhatsApp’s end-to-end encryption relies on the generation of unique security keys, using the acclaimed Signal protocol developed by Open Whisper Systems, which are traded and verified between users to guarantee communications are secure and cannot be intercepted by a middleman. However, WhatsApp has the ability to force the generation of new encryption keys for offline users, unbeknown to the sender and recipient of the messages, and to make the sender re-encrypt any messages that have not been marked as delivered with the new keys and send them again.

The recipient is not made aware of this change in encryption, while the sender is only notified if they have opted-in to encryption warnings in settings, and only after the messages have been resent. This re-encryption and rebroadcasting effectively allows WhatsApp to intercept and read users’ messages.

The security backdoor was discovered by Tobias Boelter, a cryptography and security researcher at the University of California, Berkeley. He told the Guardian: “If WhatsApp is asked by a government agency to disclose its messaging records, it can effectively grant access due to the change in keys.”

The backdoor is not inherent to the Signal protocol. Open Whisper Systems’ messaging app, Signal, the app used and recommended by whistleblower Edward Snowden, does not suffer from the same vulnerability. If a recipient changes the security key while offline, for instance, a sent message will fail to be delivered and the sender will be notified of the change in security keys without automatically resending the message.

WhatsApp’s implementation automatically resends an undelivered message with a new key without warning the user in advance or giving them the ability to prevent it.
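
Here is a heavily simplified sketch, in plain Python, of the behavioural difference described above. It is not the real protocol code; the function names and the encrypt placeholder are invented for illustration only.

```python
# Illustrative only; not Signal's or WhatsApp's actual code.

def encrypt(message, key):
    # Placeholder standing in for real end-to-end encryption.
    return f"enc({message!r}, key={key!r})"

def signal_style(undelivered, new_key, notify_sender):
    """Signal: delivery fails, the sender is warned about the key change,
    and nothing is re-encrypted until the sender decides what to do."""
    notify_sender("Recipient's security key changed; messages were NOT resent.")
    return []

def whatsapp_style(undelivered, new_key, notify_sender, warnings_opted_in=False):
    """WhatsApp, as described by Boelter: undelivered messages are
    re-encrypted with the new key and resent automatically; the sender
    only hears about it afterwards, and only if the opt-in warning is on."""
    resent = [encrypt(m, new_key) for m in undelivered]
    if warnings_opted_in:
        notify_sender("Security code changed (shown only after resending).")
    return resent
```

The difference is small in code but large in practice: whoever controls the new key in the second function receives readable copies of every pending message.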

Boelter reported the backdoor vulnerability to Facebook in April 2016, but was told that Facebook was aware of the issue, that it was “expected behaviour” and wasn’t being actively worked on. The Guardian has verified the backdoor still exists.

The WhatsApp vulnerability calls into question the privacy of messages sent across the service used around the world, including by people living in oppressive regimes. Photograph: Marcelo Sayão/EPA

Steffen Tor Jensen, head of information security and digital counter-surveillance at the European-Bahraini Organisation for Human Rights, verified Boelter’s findings. He said: “WhatsApp can effectively continue flipping the security keys when devices are offline and re-sending the message, without letting users know of the change till after it has been made, providing an extremely insecure platform.”

Boelter said: “[Some] might say that this vulnerability could only be abused to snoop on ‘single’ targeted messages, not entire conversations. This is not true if you consider that the WhatsApp server can just forward messages without sending the ‘message was received by recipient’ notification (or the double tick), which users might not notice. Using the retransmission vulnerability, the WhatsApp server can then later get a transcript of the whole conversation, not just a single message.”
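
Boelter’s point about whole conversations can be sketched under the same assumptions (hypothetical names, not real protocol code): a server that never confirms delivery keeps the backlog “pending”, and a later key change makes an auto-resending client re-encrypt and resend all of it.

```python
# Hypothetical sketch of the attack Boelter describes; names invented.

class InterceptingServer:
    def __init__(self):
        self.pending = []  # messages forwarded without a delivery receipt

    def relay(self, ciphertext):
        # Forward to the recipient but never send the "delivered" tick,
        # so the sender's client keeps treating the message as pending.
        self.pending.append(ciphertext)

    def harvest(self, sender_client):
        # Announce a new key for the supposedly offline recipient. A client
        # that auto-resends re-encrypts the whole backlog with a key the
        # server controls: a transcript of the conversation, not one message.
        return sender_client.resend_pending(new_key="server-controlled-key")
```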

The vulnerability calls into question the privacy of messages sent across the service, which is used around the world, including by people living in oppressive regimes.

Professor Kirstie Ball, co-director and founder of the Centre for Research into Information, Surveillance and Privacy, called the existence of a backdoor within WhatsApp’s encryption “a gold mine for security agencies” and “a huge betrayal of user trust”. She added: “It is a huge threat to freedom of speech, for it to be able to look at what you’re saying if it wants to. Consumers will say, I’ve got nothing to hide, but you don’t know what information is looked for and what connections are being made.”

In the UK, the recently passed Investigatory Powers Act allows the government to intercept bulk data of users held by private companies, without suspicion of criminal activity, similar to the activity of the US National Security Agency uncovered by the Snowden revelations. The government also has the power to force companies to “maintain technical capabilities” that allow data collection through hacking and interception, and requires companies to remove “electronic protection” from data. Intentional or not, WhatsApp’s backdoor to the end-to-end encryption could be used in such a way to facilitate government interception.

Jim Killock, executive director of Open Rights Group, said: “If companies claim to offer end-to-end encryption, they should come clean if it is found to be compromised – whether through deliberately installed backdoors or security flaws. In the UK, the Investigatory Powers Act means that technical capability notices could be used to compel companies to introduce flaws – which could leave people’s data vulnerable.”

A WhatsApp spokesperson told the Guardian: “Over 1 billion people use WhatsApp today because it is simple, fast, reliable and secure. At WhatsApp, we’ve always believed that people’s conversations should be secure and private. Last year, we gave all our users a better level of security by making every message, photo, video, file and call end-to-end encrypted by default. As we introduce features like end-to-end encryption, we focus on keeping the product simple and take into consideration how it’s used every day around the world.

“In WhatsApp’s implementation of the Signal protocol, we have a “Show Security Notifications” setting (option under Settings > Account > Security) that notifies you when a contact’s security code has changed. We know the most common reasons this happens are because someone has switched phones or reinstalled WhatsApp. This is because in many parts of the world, people frequently change devices and Sim cards. In these situations, we want to make sure people’s messages are delivered, not lost in transit.”

Asked to comment specifically on whether Facebook/WhatsApp had accessed users’ messages, and whether it had done so at the request of government agencies or other third parties, the company directed the Guardian to its site that details aggregate data on government requests by country.

Concerns over the privacy of WhatsApp users have been repeatedly highlighted since Facebook acquired the company for $22bn in 2014. In August 2016, Facebook announced a change to the privacy policy governing WhatsApp that allowed the social network to merge data from WhatsApp users and Facebook, including phone numbers and app usage, for advertising and development purposes.

Facebook halted the use of the shared user data for advertising purposes in November, after pressure from the pan-European data protection agency group, the Article 29 Working Party, in October. The European commission then filed charges against Facebook for providing “misleading” information in the run-up to the social network’s acquisition of the messaging service WhatsApp, following its data-sharing change.

https://www.theguardian.com/technology/2017/jan/13/whatsapp-backdoor-allows-snooping-on-encrypted-messages