The following is a rough transcript which has not been revised by The Jim Rutt Show or by Steven Levy. Please check with us before using any quotations from this transcript. Thank you.
Jim: Today’s guest is journalist and author Steven Levy.
Steven: Hi, Jim.
Jim: Steven, great to have you here, somebody whose career I have been following since long, long ago. Steven has been following the personal computer and online worlds since damn near the beginning. The first book of his which I read was Hackers: Heroes of the Computer Revolution from 1984 when hackers was still mostly a positive term. Indeed that book, as far as I can tell, was the catalyst for the legendary Hackers Conference of 1984, quite a who’s who for the early days of computing and networks. He also wrote In the Plex: How Google Thinks, Works, and Shapes our Lives, which I also read, and it was very prescient to my mind about the culture and power of Google.
Jim: I made my own first visit to the plex in early 2014, and it was stunningly like how Steven had drawn it. That feeling of wow, okay, it's like I've been here before, came from that book. In addition to his influential books, Steven is a longtime top technology journalist. He's currently Wired's editor-at-large and has appeared in Rolling Stone, Harper's Magazine, Macworld, the New York Times Magazine, Esquire, The New Yorker, and many more.
Jim: Today, we're mostly going to talk about Steven's new book, Facebook. As always, links are available to Steven's book and to the organizations and articles that we reference in this discussion.
Jim: Let's start with the unprecedented level and duration of the access you had to Facebook and to Zuck and his management team. Could you tell us a little bit about how it came to be, and how the relationship grew and changed over what period of time?
Steven: Sure, so I wanted to write this book beginning, I think, in 2015. Actually, I could pinpoint the date. I think it was August 27th, 2015 when Zuckerberg posted that a billion people were online in the previous 24 hours on Facebook, and that had never happened before. You get a billion people for the World Cup. That’s people just watching something. They’re not members of anything. They can’t talk to each other. This wasn’t a spike, this was a baseline. Facebook was only going to grow.
Steven: I’d been covering Facebook for a number of years, beginning when it was just the college network. It struck me that Zuckerberg’s very ambitious plans were on the way to being fulfilled, and I wanted to document that.
Steven: The way I do that, you mentioned the Google book, is to dive in there and learn everything I can. If I were doing a book without cooperation from Facebook, I'd talk to a lot of people outside the company. I'd try to talk to people inside the company, away from the campus, and people who have opinions about it. So I did a lot of the stuff that you do when a place doesn't cooperate, but I talked them into cooperating with me on the record, officially. In exchange, they didn't get anything from me. I didn't show them the book until after it was printed and all the trees were dead, and it was too late to do anything about it. So that was just a pure gain for me.
Steven: I told them that someone has to do this, someone has to document this historical ascent of this company. Even if it isn’t me, it should be someone, but it should be me. They knew who I was from my track record, and I’d been covering Facebook, and they knew me to be trustworthy. They knew going in there that there would probably be stuff in there that they didn’t like. They didn’t anticipate that a few months after I started writing the book, Facebook would be thrust into a trajectory that put it in crisis and exposed a lot of their misdeeds, which of course I documented in detail. But I kept my implicit promise to be fair and delivered a book which I think tells the story of Facebook.
Jim: Wow. As far as I know, nobody else has ever had this level of access. I’ve read some other books about Facebook and articles, and they don’t have the sense of I was there like this one does.
Steven: Yeah. Particularly when you try to get to know an enigmatic character like Zuckerberg. He doesn't trust people very easily, so it really takes a long time for him to be candid with someone. Remarkably, the best two interviews I ever did with him were the last two I did for the book, when I was already writing it. This was in 2019, when I was able to ask him questions that would normally make him walk out of the room. But he knew, because I knew the company so well, that he had to grapple with them, and he had to give me the answers as much for himself as for me.
Jim: Very good. We will kind of weave through this to the degree that I can. Your insights into the nature of Zuckerberg. He’s obviously a world historical person, and I suspect you have as much insight on him as anybody who isn’t his family.
Jim: Let's start with something you dig into early in the book: Zuckerberg's denial of the impact that Facebook may or may not have had on the 2016 election through the flow of misinformation. In fact, he called it a crazy idea at the time. In some sense, as the book unfolds, that seems to be a pivotal moment, when Zuckerberg's view, or at least his public pronouncements about the nature of Facebook, became disconnected from the broader public perception of the impact and perhaps the duties of Facebook. Since then, perceptions have clearly changed. As you point out: investigations on multiple fronts, charges of breach of fiduciary duties with respect to privacy, huge fines, et cetera. Did this strike you as the pivotal moment in the unfolding of Facebook, or at least the later days of Facebook?
Steven: The election was the pivotal moment. Facebook didn’t change in one day, but the election exposed and made it difficult to ignore the things about Facebook which were worrisome. I think if Donald Trump wasn’t elected president, this reckoning might not have happened, or it certainly wouldn’t have happened at this time.
Steven: As I get into in the book, there were plenty of events where people called out Facebook for privacy violations, for being cavalier about the content on the network, all sorts of things. But it managed to skate by and not really pay the price when things went wrong. It figured it could just apologize for what it did wrong and go along its merry way.
Steven: But after the election, people took a closer look at what happened on Facebook and what was happening on Facebook and became less forgiving about it, because they realized what an impact it had. Now, that particular statement by Zuckerberg came at a conference two days after the election. I was in the room when he said it. It really didn’t level the room when he said, “It’s crazy to think that we affected the election.” People were still digesting it.
Steven: But what I only learned later was, the day before that, there was a big meeting at Facebook, and people were in tears. They were asking themselves, “Did we have anything to do with that?” So in a way, he wasn’t dealing totally candidly when he brushed it off, because he knew from the reaction of his own employees that there was a concern about this at that point. That same shock that the public had after Trump got elected, and they said, “Well, what about a lot of stuff that happened on Facebook? Did that have anything to do with it?” That was a question that was being asked in Menlo Park and the other Facebook offices by Facebook employees.
Jim: Indeed. It's interesting; that highlighted for me what I took as an implicit theme of your book, which I don't think you ever quite made explicit: that Facebook started out in an era that was congruent with the hacker culture, the spirit of doing something outrageous but useful and thumbing your nose at the powers that be. We'll talk a little bit later about how Zuckerberg got sideways with the administration at Harvard, and how Facebook very slowly climbed a ramp in that hacker spirit. If we think back, you and I are old-timers; we go back to the earliest days of the online world. I worked at The Source, which was the very first consumer online service, from 1980 to 1982.
Steven: Yeah, I was on The Source.
Jim: Yeah, I was TCA 080, that's how early I was. If we recall the evolution of the considered responsibility of these platforms: in the '80s, we had services like CompuServe. The Source, I quit after two years. Reader's Digest had bought it and stifled it and just didn't understand what they had. I up and quit and became an entrepreneur, and quite a successful one, using the technologies to deliver information to professional audiences.
Jim: But the consumer online world took off at a fairly good pace. CompuServe by the mid-'80s had millions of users. At the time, the ruling law was a case called Cubby versus CompuServe, a court decision which basically let CompuServe entirely off the hook for what happened there, so long as it did not do any moderating, which is quite interesting. That was the consensus both of the legal framework and of what we might call the hacker sensibility, which is, "Hey, these platforms are to allow people to get together and talk." The biggest thing on CompuServe were things called SIGs, special interest groups, which were a combination of a forum, a software repository, and some other little features. I think they eventually had chat built into them, and then multi-user chat. So again, CompuServe was really about interaction. They saw it that way early on. But the ethos of that time was: it's not their problem what gets said on the platform.
Jim: Then later, in 1996, after a lot of lobbying and conversation on Capitol Hill, and I still remember a lot of that. The [inaudible 00:10:22] organization, I was I think member number seven of the EFF, had a role in this as well. The famous Section 230 of the Communications Decency Act basically gave platforms like AOL, which at that time was still relatively big, and various other players, essentially immunity from what the people on their platform said, even if they did do moderation.
Steven: Yeah. That was the key. I worked for Newsweek for 12 years as chief technology writer, and I had actually fairly recently gotten to Newsweek then. I interviewed Al Gore the night it passed; I was writing about it for Newsweek, and we talked about it. That basically gave the platforms the right to do some moderation, as you point out. Before, it was binary: either you were hands off and you weren't responsible, or if you messed with it, they could sue you and say, "Well, hey, you were moderating, so you are responsible." Now platforms were able to say, "Hey, we can take out things like hate speech or threats or whatever to make it a safer environment for people," and still not be responsible for vetting it the way an editor at Wired would vet a story to make sure we don't get sued, because we are responsible for our content. If we libel someone in an article in Wired, we have to answer for it. Not if you post something on AOL, or later Facebook. So Facebook was a child of the 1996 Communications Decency Act.
Jim: Indeed. I guess my point is, that was the ethos of that time.
Steven: Yeah. But Mark Zuckerberg had no consciousness of that at all when he started Facebook as a college network. I actually get into the moment where that started to become a factor. It wasn't the engineers that Zuckerberg hired, or Zuckerberg himself, who became concerned about this. It was the people who answered the mail, the support people. These were people who had maybe dropped out of college or just recently graduated, very early employees of Facebook who became multi-millionaires just because they were lucky enough to walk up the stairs to Facebook's office in 2005 or whatever.
Steven: They got these complaints, people saying, "Hey, here's someone who's saying bad things about gay people. Here's someone who's posting a naked picture of [inaudible 00:12:54] porn," or whatever. They realized that Facebook had to be involved in this, and they sat around and just made some informal rules. It was almost like an ad hoc agreement. When you signed on as a new employee to do support, you'd ask the person next to you for advice: "Hey, what should I do about this piece of content? Someone's complaining about it." And they'd say, "Well, is the person wearing pants? No pants? Take it out."
Steven: And from doing informal rules, they eventually evolved into what’s now a 37-page set of community standards. Then there’s a whole larger body of work, which gets into more detail. But that’s given to tens of thousands of content moderators throughout the world that try to police what is the world’s most active hotbed of discussion. And [inaudible 00:13:50] too, and content which is hate content, content which is harmful, content where terrorists try to put their work on there. So that all came from that, from that informal discussion there.
Jim: Yep, and you’ve talked about that. I thought that was actually quite interesting how it was essentially a shared, editable text document that people just added into in an ad hoc fashion. Kind of interesting.
Jim: But let's go back a little bit to the history of this. Again, there were millions of people on CompuServe and AOL, and even The Well, which didn't have millions but had over 10,000 at its peak. You and I have both been members of The Well for a long time. There didn't seem to be so much concern about so-called bad things. I mean, there were some bad things on CompuServe way back in the day, and CompuServe's view was, "Oh well, we don't care. Not our job."
Jim: And at The Source, where I worked, so I was more aware of how we did it, again, it was hands off with respect to content. But because it was owned by Reader's Digest, oddly, they were extraordinarily persnickety about George Carlin's obscene words, and so they had a strict rule: can't say fuck, right? In fact, I got in a big argument and nearly quit when the chairman of The Source, a Reader's Digest executive, demanded that I take down from user publishing, one of my earlier products, an essay written by one of our users, who was getting paid for it, sort of a precursor of blogs: a very humorous riff on all the various ways fuck could be used. Reader's Digest wouldn't have cared if there was a neo-Nazi group on The Source, but they had classic, old-school, fusty editorial control over the actual words that could be used.
Jim: Why does it make a difference today that we should suppress hate speech, say, when it didn’t in, say, 1982?
Steven: This is the benefit of going back and looking over the history of the company, how it evolves. The reason why that wasn’t such an alarming thing in The Source or CompuServe or whatever, is that there was no equivalent of going viral on those things. Facebook built in the tools to allow certain kind of posts just to percolate, and this is a horrible comparison for our time right now, like a virus. One of their big product initiatives was actually named Pandemic at Facebook. It’s the name of one of the chapters of my book that I wrote way before this happened.
Steven: Facebook thought that was great, because Facebook was really obsessed with growth, and anything that got people onto the platform, drew them in, and had them spend more time there was great for growth and retention. So things started going viral on Facebook, which happened because they would distribute posts, ranking them higher in your newsfeed if more people were engaging with them, especially people you knew. That allowed things to reach huge audiences of millions and millions of people. Then businesses cropped up to take advantage of that. The Arab Spring took advantage of that, and Facebook thought that was terrific, that's fantastic, and never really seriously considered the consequences of destructive content being distributed widely and going viral on Facebook, and why it was more likely to go viral than content which was upstanding and well-researched.
Steven: So this was a consequence that at first they didn’t think of. Then it was called to their attention that they stuck their heads in the sand and ignored, because it didn’t really fit the way they wanted the company to grow.
Jim: Yeah, actually I think that's interesting; and in fact, I might even go a step further and say they didn't really stick their heads in the sand. Quite the opposite. Because they were so metrics-driven, and the key metric for a long time was engagement, they actually had a positive incentive, not just a head in the sand, to tweak their algorithms in that direction.
Steven: Yeah. They had to ignore the idea that … You're absolutely right. That was [inaudible 00:17:59] they're going. They had to ignore the fact that they were feeding what wound up becoming a very efficient engine of destructive content, which could erode democracy, for instance. It could lead people to riot, and cause fatal riots, in countries where Facebook didn't have anyone who spoke the native language.
Jim: Yeah. Clearly, their algorithm was not optimized for badness, but what it was optimized for was high cognitive impact. Right?
Steven: Yeah, sensational content, and that's a term Zuckerberg actually used, is more popular when they measure this than non-sensational content. We know this; look at what you put on the cover of a tabloid newspaper. But if that dominates what you do, and no one is checking whether the things circulated widely are even truthful, that's a destructive force on society, and it can help elect candidates who tell lies about their opponents.
Jim: Yep. But again, I'm going to play a little bit of a contrarian here just for fun, see how you respond. In the previous world, prior to the networked world, we had scurrilous pieces of shit like the Weekly World News, the most outrageous, and then things like the National Enquirer, which had readerships in the millions and published blatant lies, ridiculous stuff. "Liz Taylor has a baby with an alien," right? On and on and on. Millions of people were reading scurrilous shit in these publications, and many of us who were sort of hardcore civil libertarians said, "Oh well, that's the cost of free speech." So there's a part of me from that old hacker culture. You remember what hackers were like. You were one of the central figures in catalyzing the hacker movement. You remember the early days of online, when the idea of censorship, other than, again, personal attacks, doxing, et cetera, would have been rejected by the ethos of online, and the argument was that the answer to bad speech was good speech, right?
Jim: Are we really sure that this felt need to censor is actually net good?
Steven: Well, here's the difference between what happened in the print world with the Weekly World News and what happens on Facebook, and it's a design issue. It's not necessarily a censorship issue; it's how Facebook handles it. Sometimes as a teenager I'd take the subway into downtown Philadelphia, and we'd go to a newsstand and buy the Weekly World News, and we'd read about the aliens and the gruesome stuff, people with six arms or whatever. It was all the Weekly World News. We knew what it was. The Weekly World News wasn't mixed in with The New York Times and other publications.
Steven: You go into your newsfeed on Facebook, and an article from the Weekly World News universe gets mixed head to head with an article from The New York Times or Wired or other things, and sometimes it's intentionally masked to look like that. They'll make up a publication like the Denver Guardian. There's no Denver Guardian; the address of the publication was a parking lot in Denver. But you won't be able to tell the difference between the Denver Guardian, which is a made-up publication, and the Denver Post, which is the newspaper in Denver. So that's the difference. You don't get the signals you got when you put down the 15 cents we used to spend to buy the Weekly World News, knowing we were going to get a romp that maybe wasn't the most rigorous journalism. We knew that.
Steven: But on Facebook, it’s hard to tell. People take advantage of that to circulate content which is intended to mislead. Then the next step is to target it to people who are most likely to misinterpret it or be influenced by it because of their particular biases.
Jim: Yeah. In fact, I don't recall if you mentioned it in the book. I do know from my own readings on the topic that a goodly amount of the most clever disinformation in the 2016 election was actually done for profit by Eastern European and Balkan actors, you know, the famous Macedonian teenagers, right? They discovered that you could come out with the most outrageous shit about the 2016 election, highly inflammatory cognitive button-pushing, get lots of reads, and sell some ads. Right?
Steven: Facebook knew that. When they looked into it, that's what they found. There was this one particular town in Macedonia where a lot of people were driving very fancy cars, because they got rich by taking a far-right-wing blog item, totally made up, "The Pope endorses Trump" or "Hillary killed an FBI agent," making up a phony publication that it supposedly came from, and circulating it. People clicked on it, there were ads, and those people got paid from the ads. And as it turns out, they discovered that content which leaned toward the right, which made you dislike Hillary or not want to vote or whatever, was more profitable.
Steven: So there was more of that stuff on there. And then of course there was the Russian interference, which they have their own reasons for circulating content which favor Trump. But you’re right. By numbers, most of it was for profit.
Jim: Yep. And it was essentially an ecosystem. Again, it reminds me a lot of a famous village and county in Mexico where suddenly people were driving fancy brand-new pickup trucks. They weren't into Mercedes then; the roads were too bad. And it turned out that this whole village was involved in building artifacts for World of Warcraft. So when an ecosystem appears, somebody will figure out how to exploit it.
Steven: Yeah, yeah. No, definitely. What's interesting is that in the last six weeks of the election, it really became apparent, and people started complaining about it. Barack Obama, at a rally for Clinton, actually called out Facebook for this. And I describe a meeting Facebook had: Sheryl Sandberg was on the phone, and there were the policy people. Taking the lead was the head of Facebook's policy office in Washington DC, who came from the Bush administration and was best friends with Kavanaugh. People in that office told me that he saw his job as protecting the Republican side. And he said, "We're not going to do anything about it. We're not going to take this down, because why offend the right?"
Jim: Of course it is difficult, right? How do you do it and yet not become a force for the left?
Steven: Yeah. The complaint was, you're right, the complaint was, "Oh well, we don't want to tilt the playing field." But the fact was, the playing field was already tilted. By taking out that toxic content, you actually would be leveling the playing field. It was tilted not by accident, but because of the way Facebook was designed: it was vulnerable to being exploited in that sense. You know, Mark Zuckerberg didn't set out to say, "Hey, I want bad content to thrive." But he built it to generate interest and attention and keep people on there, in a way where that was a consequence. And a lot of people called him out for not taking responsibility for fixing that, when it became clear by the end of that election cycle that it was causing damage.
Jim: And there's another aspect, going back to electoral advertising, which makes all online advertising different, but particularly Facebook, which is micro-targeting, right? Historically, if I ran an ad, and let's say I made a grossly racist appeal, I would get a lot of comeback as a candidate for running a grossly racist ad, right? I remember George H.W. Bush got a lot of blowback for the Willie Horton ads. He eventually won, but he took a lot of grief.
Jim: The problem with micro-targeting is that I can send those racist ads just to racists, who aren't likely to complain. And I think the affordances that Facebook provides to all marketers, but particularly political marketers, fundamentally change the game. Well, micro-targeting alone doesn't fundamentally change the game, but you combine it with the cost advantage. You could do micro-targeting with direct mail, but it was very expensive. You can do micro-targeting on Facebook, and it's really cheap, and it allows you to send different messages to different people and not be held publicly accountable for your advertising.
Steven: Yeah. As you say, it doesn't change the game, it advances it. But it advances it so far that it does change the game. If you do the same thing at a much greater speed and a lower cost, it actually becomes a different thing, because you're able to use it in a way that affects the situation much more than if you did it at a lower level. It's like saying the computers we hold in our hands now are different devices than the things we used in the early 1980s.
Steven: So I think micro-targeting becomes a very big issue, because Facebook enables it to happen, and as you say, these political marketers can identify people who are susceptible to messages. Facebook's model is to deliver advertisements to people who are likely to respond to them. And Facebook argues that you should welcome these advertisements because they are relevant to you; they're going to give you buying opportunities for things you actually want to buy. If you're a fan of an obscure music group and there's some t-shirt you never would have known about with this group's logo on it, it'll find you and say, "Hey, here's an article of clothing you'd love to have that you didn't know existed." And I've had that happen to me on Facebook and bought the t-shirt.
Steven: But where it's different is when Facebook understands enough about you to know where your weakness is, and then sells your contact to people, allows people to contact you to exploit your weakness. That's something different. People aren't so happy when they hear they're being manipulated, you know? People are able to know their weaknesses and then exploit them. That's something of a different character. And we're having a debate now about whether that's kosher to do in a political campaign. Some people think that Facebook, or anyone else, shouldn't be allowed to do that degree of micro-targeting in a campaign.
Jim: Yeah, that's an interesting question. And as far as I know, Facebook is holding the line that they're going to allow it for the coming election. And I can tell you from some of my own sources that the Republicans, particularly the Trump campaign, have invested vast sums in running the most sophisticated social-media-based election campaign ever, and have raised a gigantic war chest. In fact, part of their pitch to donors is specifically this. So it's going to be part of the 2020 experience.
Steven: Yeah. You know, in 2016 they actually did run the best campaign by far, and the people at Facebook looked on in awe at how well the Trump campaign used Facebook. And Facebook also helped them. They embedded some of their employees in the campaign, just like they do for any big advertiser, to help them use the platform better. They offered the same to the Clinton campaign, and the Clinton campaign said, "No, don't need it. Go away. You're not so important that we want to do that." But the Trump campaign took the help, even though in a way they didn't need it quite as much, because they really understood Facebook and were able to do things.
Steven: On some days they ran 175,000 different ads to different people, because they were able to tailor them to exactly what they thought those targets wanted. "Hey, here's a person who's interested in gun rights. Let's get a gun ad out to them. Here's a person who's anti-abortion. Let's do that. Oh, here's an African-American person. They're probably not going to vote for Trump. Let's send them something to get them sick of the whole system so maybe they won't vote." So they were able to run all these different ads to people and be very effective come election day.
Jim: Yep, and they're going to do it even more this time, I can tell you that. I will say, one thing that has changed: I do a little bit of promoting of my podcast on Facebook, and I was getting a fair number of my episodes blocked by the advertising clearance engine at Facebook, because some of my episodes, maybe 15% of them, are sort of political, right? They now require you to register as a political advertiser if you're going to run political ads, and you have to put a check on the ad if you believe it falls in the political space.
Jim: And so I actually went through the process of registering as a political advertiser on Facebook. And when I have an episode that is explicitly political, I'll check the box. So unlike in 2016, Facebook knows what is a political ad and what is not. In 2016 they didn't know or care. So at least in theory, and frankly, I don't know whether they're doing this or not, but they should, it would be really helpful to, shall we say, the hygiene of political discourse if Facebook published every ad that was run with the political checkbox, along with the micro-targeting filter that was applied to it, so that public interest people… Obviously individual citizens aren't going to do it; it would be too much data.
Jim: But public interest groups, or even the opponent, could go through this and say, "Huh, Biden is appealing to black nationalists with an obviously racist meme," or, "Trump is appealing to neo-Nazis, but doing it in a surreptitious way that only neo-Nazis see." So that one reform alone could be quite helpful in letting the immune system of discourse, in journalism and frankly even competition, identify abuse of micro-targeting.
Steven: Well, actually they do that. When you check that box, that stuff goes into an archive that in theory can be searched. There's a question about how good the tools to search it are, and what kind of access research groups have to that archive. But that's the theory, that anything political will be archived. When they first did that, media outlets were outraged when they would run articles about political things and Facebook would say, "Well, that's political," and they'd say, "Wait a minute, we're not political, we're news." And I think, Jim, it's interesting. Let's see what would have happened if you had pushed back and said, "I'm a news source, I'm not a political source."
Steven: Because there was a big fight within Facebook. The people who work with journalists at Facebook, in the Facebook Journalism Project, were saying, "Hey, we really can't do that. We're going to get a lot of crap from journalists and publishers." And they did. Facebook did pull back a little from that. But it is difficult to determine what's political and what's not when it's not a campaign actually taking out the ad. It could be a podcast from someone who leans left or right. That's okay; there are publications like that. Or it could be a publication like the New York Times or the Wall Street Journal, or even Breitbart, that says, "This is an editorial we're running." So it gets into murky ground.
Jim: Yeah, it does. And frankly I had that thought, but at the tiny level I advertise, clearly I’m just being filtered by a machine, and it wasn’t worth my time to go argue with the goddamn board, right, and try to deal with Facebook. It’s impossible, there’s no way to get ahold of anybody. It’s just a fucking nightmare. So I said, “All right, I’ll go through the process and get registered as an advertiser.” But you’re right, probably I could have fought the fight, but it wasn’t worth the time.
Jim: Let’s change directions here a little bit. This has been extraordinarily interesting, but I’d like to focus back a little bit on Zuckerberg the person: how he evolved as a person and in his character, and how Facebook evolved from earlier things that he did, and kind of came up a slope a little bit at a time.
Jim: One of the things that struck me when I read the book, and then went and researched it again to create my topics for the show, was, “Damn. Zuckerberg was born in 1984. He’s a baby.” Right? I’d already left The Source and started my first two companies by 1984. I guess he’s the king of the millennials, which I think in itself is interesting and something us geezers need to keep in mind when we think about Zuckerberg: that he is, if not quite an internet native, certainly an online native. How much of that came through in his personality, that he was an online native?
Steven: Quite a lot, right? 1984, you mentioned. Wow. That’s when my first book came out, and that hackers conference, which was indeed generated by my book, that was my book party that Kevin Kelly and Stewart Brand and the Whole Earth people put on after they read the book. So he grew up as an AOL Instant Messenger kid, and he was always building projects and thinking about what to do. He built some projects on top of that. He built a product called Synapse, an AI music player that was based on Winamp, the music player which AOL bought.
Steven: And that was very much his frame of mind, and then all the stuff on the internet and the web. He grew up using web-based tools, and that’s why he was able to build Facebook so quickly: he understood, in a way that maybe an older person wouldn’t have, that you could build an application really quickly, that it could be really powerful, and that it could be updated really quickly using some of the brand-new development tools which had come out along with the popularization of the web in the late ’90s and early 2000s.
Steven: And that really framed the way he thought about things, to the degree that when the mobile revolution came six or seven years later, around 2010, 2011, Zuckerberg was caught short. He was more of a web person, and his generation was a little behind in mobile, and Facebook was caught short and had to really regear itself in order to build itself into a mobile company. And that was in part because Zuckerberg’s mentality was the web.
Jim: Interesting. As soon as I read that in your book and remembered the history of it, that part of the expenditure of a ridiculously large sum of money for WhatsApp was about closing that gap, it reminded me a lot of Gates, Bill Gates, right? Mr. PC. He was amazingly blind to the internet, right?
Jim: And they famously had to call an all hands meeting and refocus the company on the internet, but they did it years late. Very, very similar, though unlike Microsoft, Zuckerberg made the change in time. I was amazed watching their percentage go from web-based to app-based; it was like a rocket ship, very, very rapid. Now another thing that you mention, I believe about a dozen times in the book, is that Zuck was a great gamer of PC-based games, with Civilization in particular being one of his favorites, but also Risk and a bunch of others.
Jim: And it resonated with me, because I’ve been a gamer since I was 10 years old, back when we played Avalon Hill war games on tabletops, and I have gone through all the evolutions. In fact, I recently created a phone-based game, which is out for beta testing right now. So, you mention it a dozen times; you must think that Zuck’s interest and saturation in games, particularly Civilization, is an important insight into his personality?
Steven: Right. I think it’s telling, the kinds of games he gravitated to, though I’m sure he’s not a stranger to shoot ’em ups. He really loved the games where you built a society and ran a society. And he plays board games too, and Risk was one of his favorites well into adulthood. And as you know, the point of that game is to take over the world.
Steven: And when he first started the Facebook, which was what Facebook was called when he launched it at Harvard, within a couple of weeks he was already gearing up to move it to other campuses. So the whole world of universities became his Risk board, and he went from one to another. Some of them already had similar systems working, but he would take them over just like you take over a country in Risk, where maybe some pieces are already on there from one of the opposing players. It was a good window into his mind.
Jim: Yeah. Actually, I noticed that when I was doing my content prep for this call: you pointed out that the first college that he went after, after Harvard, was Columbia, which seemed counterintuitive because it already had a similar service. But a game player might well say, “Hey, I want to attack that entrance way into Australia in Risk,” rather than go grab some cheap territory in Mongolia. Two things: One, I want to crush an early opponent before they can get bigger. That’s game thinking. And second, in the meta game of business, you strengthen yourself by competition, testing yourself against others and learning faster than the other guy.
Jim: So I would say Zuck probably did internalize a lot of very useful lessons from gaming. I know I did. I mean, I’ve had people ask me what my influences were in my business career, and I’d say, “War gaming was certainly one of the top three or four.” I was impressed by how often you brought that back and wove it into the storyline.
Jim: Now another question that really jumped out at me, particularly as I was reviewing my notes: Zuckerberg took a consciously elitist educational approach that seemed kind of contrary to his own interests. He demanded that he go to Exeter and then Harvard; he always wanted to go to Harvard. Rather than what would have seemed more natural: Stanford, MIT, Carnegie Mellon, University of Michigan, University of Texas, one of the universities better known for its computer science department. Did you get any insights or motivational sense of why this kid, who was clearly a nerd and a geek, would sort of forcefully pound the table for the Exeter-Harvard route rather than the public high school, MIT route, for instance?
Steven: Right. Well, Exeter: he had heard they have a great computer program there. He went in his freshman and sophomore years of high school to the public high school in Westchester County, which wasn’t a terrible high school, but he exhausted all the opportunities he had for learning about computers there. His mother wanted him to go to a closer private school, Horace Mann, where he could commute; she didn’t want him to leave the house. His older sister was actually going off to Harvard, and his mother didn’t want to lose two kids the same year. She wanted to have him around for his high school years.
Steven: And he said, “I’m going to Exeter.” And she said, “Why don’t you at least interview with Horace Mann, and maybe you’ll like it.” And he said, “I’ll interview with them, but I’m going to Exeter.” And he went to Exeter. Now the Harvard thing… I think maybe because his sister went there, he knew people. He also was interested in some non-computer things. He took a lot of psychology courses, and he was interested in the classics and Latin. His hero was Augustus Caesar, another conqueror.
Steven: So I think that’s the best I can get to why he wanted to go to Harvard. He had a Harvard pennant in his room at Exeter, someone told me, which was interesting. But you’re right, you could make an argument that he might’ve been happier at MIT or Stanford; he was always building things. And Harvard wasn’t really friendly to people who were entrepreneurial, nowhere close to Stanford or MIT.
Steven: And I’ve talked to other people there who had an entrepreneurial bent, and they felt that the school didn’t encourage them. And when Zuckerberg got into a spat with some other people, in terms of their companies, and this has been documented not a hundred percent accurately in that movie, Harvard really didn’t want to deal with that. That wasn’t in their DNA.
Jim: That came up again and again; we’ll get to that in a little bit. But I’m going to take a little sidebar here, because you mentioned it. I did not have any idea till I read your book that Zuckerberg’s hero was Caesar Augustus, or Octavian as he was known prior to his ascent. And that really struck me. I said, “There’s something about Zuckerberg that reminds me of Octavian, right? He even looks sort of like some of the younger statues of him. I wonder if he is consciously grooming himself to be the Octavian, or the Augustus Caesar.”
Jim: And for people who don’t know the transition between the republic and the empire: Octavian was this very unlikely character. When Julius Caesar was assassinated at the Senate by Brutus and his friends, as laid out somewhat accurately, I suppose, in the Shakespeare play Julius Caesar, it had been publicized but not well known that Caesar had recently made Octavian, his 19-year-old grandnephew, his heir.
Jim: And all the power brokers of the world thought this little punk could be pushed aside. But guess what? This little punk was smarter and more ruthless than the rest of them, but quiet and low-key. And he ended up killing them all off over time and becoming the first emperor of the Roman world. For someone to choose Octavian, Augustus Caesar, as their hero is a very interesting choice.
Steven: Yeah, yeah. To the point where he went, I think it was on his honeymoon, to Rome, and he dragged his wife to all these sites where Octavian went, and museums and exhibits about him. And she complained that there were three people on her honeymoon: she and Zuckerberg, and Augustus Caesar.
Jim: That’s amazing. And I think that’s something worth considering, because again, in a historical context, Octavian, or Augustus Caesar, could be thought of as comparable to a Napoleon or a Hitler or a Stalin, right? Only more successful than any of them. So, I mean, it’s a very odd choice for a hero.
Steven: Exactly. He wasn’t a warm fuzzy guy.
Jim: Yeah, not at all. I mean, he was a stone cold killer, including his own children, his wife. Well, I don’t think he killed them, he exiled them until they died, basically. So anyway, a real cold and tough guy who conquered the world. Basically took over the biggest empire in world history at that point.
Jim: So the next point, and you alluded to it: from the time he got to Harvard, even in late high school, Zuckerberg was a relentless small-scale web developer, doing paid work for other people, and an entrepreneur trying all kinds of stuff in a rather non-Harvard-ish fashion. And that did put him a little bit sideways with Harvard. And as you point out, MIT or Stanford wouldn’t have cared. I went to MIT in the early ’70s, and the culture there was, “Man, you had to do something pretty bad on a computer to get into trouble.” And the first time he came to the attention of the authorities, I think, was when he created something called Facemash, essentially a juvenile takeoff on Hot or Not. And it had a lot of the hacker-ish aspects: scraping photos off the official house facebooks, not concerned with privacy or permissions, et cetera. But it was within the spirit of the hacker ethos of that time, I’d say.
Steven: Yeah it was, but Harvard didn’t have much of a sense of humor about it. I think it was a combination of things: he violated their security rules by working his way into the computers in the houses, these residential houses that Harvard organizes students into, in order to scrape the databases to get the pictures. But also it was a very politically incorrect thing to do. The women’s groups at Harvard complained that they were being objectified, which they were. And so there was a lot of pressure on Harvard not to ignore it.
Steven: And of course, the rules he violated weren’t free speech rules, but rules about not breaking into computers and taking the information. And the lesson he learned from that was, “Hey, don’t do it again. If I’m going to build my site, a community site, I’m not going to steal the information from Harvard, I’m going to make people bring their own. That way there’s no privacy issue, because the people have provided it themselves, about themselves. I got them to do it for me.”
Jim: In fact, you mention of the earliest days of the Facebook, and this is a direct quote from your book, that the Facebook, the precursor to Facebook, offered more privacy protection than any other social network of its time. So he did learn the lesson, even if it might’ve been cynically. The other thing I thought was interesting was you talked about the hearing that he had before the deans, et cetera, and he was given a slap on the wrist. And the actual charge was something like improper social behavior. An odd and Harvard-esque kind of charge. Zuckerberg and his social skills. Now, you mention, again half a dozen times at least, the Zuckerberg trance. Apparently quite off-putting; it could go on for minutes. And you quote one of the people who knew him really well saying he falls into it and becomes the Eye of Sauron from Lord of the Rings. What are you willing to say about Zuckerberg’s social skills and social aspect?
Steven: Right. Well, obviously he’s evolved quite a lot from the 19-year-old who was socially challenged in the early years of Facebook. As a leader, he would get very nervous addressing even his relatively small team at all hands meetings. But he has this habit: sometimes you’d ask him a question, and he would just stare at you. That’s what happened the first time I met him, in 2006. And it’s very unnerving, let me tell you. And you wonder whether there’s something cognitive going on there, but it turns out he’s just processing things. And he does it less now, but sometimes when you ask him a question that he didn’t expect, he doesn’t have an answer ready. He’ll give you that look for a little while, but nothing like the minutes-long pause that makes you go insane that he used to have when he was a younger person.
Jim: Was it really minutes long or did it just seem like it?
Steven: Maybe it seemed like it. Sometimes I think you would be approaching that. I mean, really way beyond what you’re used to in terms of a one-to-one interaction. Lots of people described how they went insane when that happened. People like Don Graham, who was the CEO of the Washington Post; actually he was my boss at the time. And Roger McNamee, it happened to him. A lot of people. I interviewed Zuckerberg once at the end of 2006 at a conference, and I asked him a question, and here he is on stage, and he didn’t answer the question for a minute. So people in the audience were getting nervous and nudging each other. “Hey, what’s going on here? What’s going on?” So it really is something you don’t forget.
Jim: That’s interesting. I’m not quite sure what to make of it. Like you, I have never experienced that with somebody, where I ask them a question and they, let’s say, just sit there for 60 seconds and think about it before they answer. I suppose at one level, it could be a mark of respect, that he takes your question seriously enough to think about it for 60 seconds. But on the other hand, it seems to miss the normal cues of social dialogue.
Steven: Yeah, it’s like he didn’t want to play that game, in a way. I mean, he was willing to take whatever happened, because it was the way he wanted to behave. He wasn’t going to bow down to the niceties of social interaction. He was okay with himself being that way until he really had the responsibilities of a CEO and became an adult. And you don’t see that with him so much now. I once had that treatment from Bill Gates. In 1996, I believe, I was doing a story, the first story about the Browser Wars, and I went and visited him in his office and gave him a long, complicated opening question about the Browser Wars. And he gave me that silent treatment for a long time. And there was this dead space in the room, and I just finally asked another question.
Jim: He never did answer?
Steven: Well, he did. He did. I think he was more conscious about doing it. It was more of a tactic of saying, “Fuck you, I’m not going to answer this now.” And then he did later; he remembered it.
Jim: At some level that seems a little lacking in social skills on Zuckerberg’s part at that time, when he was at Harvard. But on the other hand, something that struck me as showing great social intelligence, if in a Machiavellian way, was the way he played the Harvard Crimson. Again and again, he got the Harvard Crimson to do a fair amount of the work for him.
Steven: Yeah, yeah. Covering Zuckerberg for the Crimson was like the way a publication covers Silicon Valley. He was the Silicon Valley of Harvard, single-handedly. Obviously, he came to their attention after that Facemash issue, but they would write about his various projects, and when they wanted someone to comment on something about the web and products, they would go to him. And even after he left Harvard, obviously they covered the Facebook really, really closely, and they went out and visited him in Palo Alto.
Jim: Masterful public relations for a 17- or 18-year-old kid. I was impressed. I was never very good at public relations, and he was way better than I ever was, I’ll say that, when he was a pup. Also at Harvard, again, it was the subject of that not entirely accurate movie. What was it called? The Social Network, maybe?
Steven: Yeah, The Social Network. Yeah.
Jim: Yeah. You delve to some degree into the story of the Winklevoss twins and Greenspan. Was Zuckerberg over the line, ripping them off? Or was it the hacker spirit of sharing ideas and seeing who comes up with something? What’s your take on that, if you don’t mind?
Steven: Yeah, I didn’t want to dwell on that too much, but I had to look at it really fresh and do my own research. And there’s a couple points there. One is that the Winklevoss twins were not going to create a product that took the world by storm. They were pretty clueless about how they were doing things. They’d been talking about the thing for a year. It wasn’t an original idea. Friendster was already out. Lots of campuses had their equivalent of a Facebook. So it wasn’t a brilliant idea. There was something even on campus; you mentioned a guy named Greenspan, he had his own product. But what Zuckerberg did do, which was deceptive, was he said that he would help the Winklevoss twins and the other guy code up their program, ConnectU is what they were calling it, originally Harvard Connection.
Steven: It’s sort of telling that the key person for these people was just someone they would hire to do a job; they didn’t really have the skills to do that crucial stuff themselves, and the execution really was going to be everything in terms of success. And for two months, he said he was working on it, and he really didn’t work on it. He dragged his feet on it, and he actually shared with some other people on instant messaging that that’s what he was doing, that he was slowing them down. But the two months’ difference wouldn’t have made much difference to ConnectU anyway. They had someone working on it before who didn’t do anything. They had someone working on it afterwards who didn’t do anything.
Steven: So as it turned out, Zuckerberg messing them up for two months was the best thing that ever happened to them, because they wound up getting millions and millions of dollars in a settlement, and of course, they griped that they should have gotten more. But I think it was a total bonanza for them. And it’s just absurd to say, as they keep saying, and their biographer keeps saying, because he has another book out about them, that they would have succeeded had it not been for Mark Zuckerberg. That’s ridiculous. I do give them some credit for being on the Bitcoin thing early.
Jim: Yep. Yeah, that seems like a reasonable take. I mean, it does strike me, having read your book, that Zuckerberg was a little more duplicitous than one might like. However, the probability of the Winklevoss twins and friends actually having created Facebook seems unlikely, when you look at all the odd luck and right-place, right-time that Facebook had with exactly the right product idea.
Steven: Yeah. There’s an extraordinary number of people who have been made super rich simply by being around or involved with Mark Zuckerberg, who spend a lot of their time complaining about it. That’s what I found.
Jim: Yep. No, that’s interesting. My next question was going to be about the people who were around him very tightly early, more tightly than anybody, and what’s their take on Zuck? Hughes and Moskovitz? Did you get a chance to talk to those guys?
Steven: Yeah, Dustin Moskovitz, who was probably the closest lieutenant in the early days of Facebook, and he turned out to be really a great executive, a CTO for Facebook in the early days. Since he left, he has never been on the record as trashing Facebook. There are probably clues that he’s not thrilled with the direction that Facebook took, but I don’t know, because actually, I didn’t talk to Dustin. And I think it’s telling that he didn’t talk to me, maybe because he didn’t want to have to express what he thought of Facebook, or maybe he just didn’t want to talk to me. But maybe he feels, “This guy made me one of the richest people in the world. It’d be ridiculous for me to trash him.” Other people didn’t have that problem.
Steven: Chris Hughes walks away with hundreds of millions of dollars, and has no problem with that, for basically doing nothing extraordinary at all. What he brought to Facebook: he basically answered the phone for press stuff and did some other things which weren’t extraordinary, but he was lucky enough to be involved as a founder. And now he’s going around saying that Facebook should be broken up. I don’t know. I mean, you’re allowed to express your opinion, but it doesn’t seem like a great karma thing to do. He’s not giving back the money. That’s the thing. He doesn’t say, “You know what? Here’s that five hundred million dollars, I’m going to give it all to charity. I’m going to live like I would have lived otherwise. I’m going to give up my nice house in Greenwich Village, my nice house in upstate New York, and all the other money I have, because it’s dirty money. It’s blood money.” He doesn’t say that. And so I think that it doesn’t seem kind to me. I don’t know.
Jim: Yep. Although, I did notice that there weren’t too many people that stuck with Zuck or that he stuck with for very long. I mean, he went through cleaning house.
Steven: Yeah. There are some. He went through a house-cleaning after Yahoo tried to buy Facebook for a billion dollars and Zuckerberg didn’t want to do that. Zuckerberg kept Facebook, as we know, but he made a point of making sure that a lot of people around him who were telling him, “Sell, sell, sell,” were gone after that.
Jim: Yeah. That’s the Octavian in him. “You’re either with me or you’re not.” And I have to say, that was a tremendously huge Octavian move, to turn down a billion dollars when it wasn’t at all clear that your business was worth a billion. But he saw it, and I think he saw that if he could stand his ground and just relentlessly focus, and, “Oh by the way, have a night of the long knives and take out these people that didn’t agree,” he had a fair chance to build something much more significant. I mean, that’s a really big, kind of world-historical move. And how old was he at that time? 23 maybe, if that?
Steven: Yeah, no, no, it was like 22.
Jim: Yeah. That’s amazing. Got to give him some Octavian points there for that move. I sure as shit wouldn’t have; my own entrepreneurial career was quite the opposite. I built them to sell, three to five years, put the money in the bank, on to the next one. It’s a very different game.
Steven: Yeah. What’s interesting is he took those lessons and the way he felt and used them to get founders of other promising applications, I’m talking about WhatsApp and Instagram, founders that didn’t want to sell their operations, and he got them to sell to him. So these are people who felt the same way he did, but he understood how to get under their skin and make them make the concession that he refused to make.
Jim: On the other hand, he just overpaid so crazily. He paid $19 billion, a significant fraction of the capital value of Facebook, for WhatsApp, and it still hasn’t made any money, right? Or very little.
Steven: Well, I’ll tell you though, if you put WhatsApp up for sale for $20 billion now, there’d be a lot of takers. And if WhatsApp went public, you could argue you could get a hundred billion dollars. It’s one of the most successful social applications in the world. So even at that level, it’s worth a lot. And it’s worth a lot more to Facebook, because they don’t have to compete against it.
Jim: Exactly. That’s why they actually did it, right? To basically co-opt competition. And then Oculus, the other case: $2 billion for something very early. Impressive, but not clear what the market was; it really hasn’t turned into a hell of a lot. He did it by brute force, and I would say the essentially anti-competitive nature of the WhatsApp deal made a lot of sense. And I can’t fault the founders of WhatsApp at all for taking it; when you’re a 40-person company, being paid $20 billion is ridiculous. So he just overwhelmed them. Nobody offered Zuckerberg a payout quite like that. He probably would have taken it. Maybe not.
Steven: But he got Instagram for a billion dollars, which was the same as Yahoo’s offer for Facebook. So.
Jim: Well, that’s a good example. That’s more equivalent. He found founders who weren’t as Octavian as he was, essentially.
Steven: Yeah. Yeah. And the big thing that he was going to concede to them was independence. That they could still run it like they owned it, which they did for a while, until he decided that “Those days were over. I’m in charge now, and Instagram now exists to serve Facebook.”
Jim: Well, what you would expect eventually, right? That’s the nature of selling your company.
Steven: Yeah, exactly. So the founders of both WhatsApp and Instagram had to come to terms with the fact that, as unhappy as they were, the deal was sealed when they signed the contract.
Jim: That’s why whenever I sold my companies, I negotiated my exit. I had no desire to continue to run the companies for longer than necessary. The reality is, if somebody else owns it, you’ve lost the ability to do what you want to do. Let’s go back a little bit to the early Zuckerberg, which I found again very interesting, and again very much in the spirit of the 1984 hacker ethos: the chapter you called Casa Facebook, where the whole bunch of them were living in a house. Some of that I think was in that movie also. And I loved the character who just keeps showing up in the more interesting places, Sean Parker, and his insight that it was the fact that Facebook was based on real names only, unlike the other social media platforms, that made it so important. How formative do you think Casa Facebook and Sean Parker and that epoch were in making Zuckerberg who he is?
Steven: Well, I think what happened in that period, this was a few months after the Facebook started, was the team moved to Silicon Valley, ostensibly for the summer. And they ran into Sean Parker, who had a lot of experience in the Valley, even though he was only a few years older than them. He had been at Napster, and he started a company called Plaxo, which he got screwed out of by the VC funders. And then they got in touch, through Parker, with people like Reid Hoffman and Mark Pincus. So immediately they were talking to the most important people in Silicon Valley who understood social networking, and they understood the ecosystem, how to get in touch with funders. And very quickly, by the end of that summer, they had funding from Peter Thiel, millions of dollars, and Hoffman and Pincus put their own money in.
Steven: So that was important. But just as important was Zuckerberg being exposed to that growth mentality of Silicon Valley. Zuckerberg, as we talked about earlier, was very big on expansion and growth. But he really saw how that could be possible by getting in touch with the people who were at the cutting edge of growth hacking in the Valley. And he began to think about what Facebook could be beyond a campus network. I actually got hold of his secret notebook where he was writing his visions for Facebook in 2006 and how to change it from a campus network to something everyone in the world would have. So I think those years and those connections were very formative for him.
Jim: Yeah, I loved his notebook and your references to it. Do you actually have his notebook?
Steven: I don’t have the physical notebook, but I have some… He destroyed it, but I have copies of the pages, which he doesn’t have anymore. And you know what? I showed it to him eventually and he couldn’t believe it. He said, “Wow, I’m looking at this. I don’t have it. I’m sorry I got rid of it.”
Jim: It’s interesting, because I actually have a notebook for one of my ventures. It’s one of my prized possessions. It’s one of those little French paper books with the flexible covers. What do you call them? I forget. But anyway, a beautiful little notebook, and it’s absolutely full of stuff, and I still go back and look at it and go, “Holy moly.” If I had the notebook for Facebook from 2006, that would be one of my prized possessions. Did he ever ask you for a copy?
Steven: Actually, I should send it to him. I promised I’d send it to him after the book came out. I think I will do that.
Jim: Yeah. If I were him, I would find that to be a prized possession. Let’s go back to talking about the insight that Sean Parker had about real names. I was a little late to the party on Facebook. I think I joined in 2009. Previously I had skipped Friendster, but I had a little investment in my MySpace account, and what have you. And that was my reaction when I took a look at Facebook in 2009. I went, “Wow, this could be a much better ecosystem, because these guys have figured out how to have a reasonably strong real-name ID,” not the crazed MySpace thing with all the anonymity that was ugly and too easily spammable, et cetera.
Jim: I still believe to this day that the thing that let Facebook win at around that time, 2008-2009, was the choice of real-name identity, which was really different from what most platforms did. Look at Reddit, for instance, from a similar timeframe: Reddit has always been rigorously anonymous if you want to be, Twitter the same. But real names make a big difference. You and I were both on The Well, and The Well was famous from the very beginning: no matter how big a celebrity you are, you shall use your real name. That didn’t allow The Well to conquer the world, but it did make for a very much better ecosystem than systems built around anonymity. How central do you think real names are to the Zuckerberg vision?
Steven: Well, I think it's important. I think one reason there have been a lot of problems now is that at the scale Facebook operates, it's tough to police, to make sure that everyone is who they say they are. Facebook admits that 5% of the accounts are bogus accounts, and they report eye-popping numbers of attempts to create fake accounts, billions every quarter. Of course, a lot of those are basically just bots trying to open one account after another, but even 5% can cause a lot of problems. I think it is important to know that if your identity is who you are, you're going to be generally more careful about what you post, and you're going to be answerable for it when you break the rules.
Steven: Facebook, I think, could be tougher about enforcing that, but I think they don't want their numbers to go down if they're too tough. They don't want anyone turned away as a false positive for fakery. So that 5%, which is probably the minimum, is a big problem for them. Because it's a better community for all if everyone knows who everyone is, right? Just like if people walked around and you couldn't tell who they were, they might be less polite. It's like a small town: if you know who everyone is, there's a big incentive to behave better, because it's your reputation.
Jim: Absolutely. And I do find it interesting that despite the essentially immense financial resources that Facebook has, it has not tried to upgrade the quality of its real-name identity. And in fact, this I found very curious: they've, as I understand it, closed off their program of the equivalent of Twitter's blue check, because there is an argument that you don't want to close the front door with too rigorous a screening. And I think Zuckerberg has said elsewhere that he's particularly concerned about the third world, where people don't necessarily have firm documentary identity. In the United States, it's easy enough to have people send a picture of their driver's license or other government ID. But in Uganda, that's not going to work so well. Still, you could have offered upgraded ID.
Steven: Yeah. Also, to be fair, Facebook ran into a lot of problems with people who had good reason not to use their real identities. People who might have had a sexual identity that they didn't want to share with everyone. So that was a real dilemma for them. And they decided, since Facebook was so important to people, they wouldn't deny people access to it. So in that case they had a genuine dilemma.
Jim: Yeah. It seems to me it'd be really good if they would offer two or three levels of augmented identity, right? Maybe charge for it, charge 10 bucks to go through the know-your-customer level.
Steven: Or you could then make your own setting to say, “Hey, I only want to interact with people who have been vetted,” like the Trusted Traveler program or something like that.
Jim: Yeah. I mean it seems like a natural thing to do, but probably their computer simulations show that it would….
Steven: Yeah. Wouldn’t help grow, I’ll tell you that.
Jim: Exactly. And that’s all it’s about. Which is my next topic. The next strikingly influential individual that Zuck hooked up with was Peter Thiel. And you tell the story that one of Thiel’s guys, I think it was, or a friend of Thiel’s or something, focused on the unbelievably amazing and rapidly growing engagement numbers of Facebook, even in those early days. And that’s what hooked Thiel to be the first significant investor. And as we talked about earlier, worshiping engagement in some sense became the dark hole into which Facebook has fallen or did fall. Any thoughts on Zuckerberg and Thiel?
Steven: Well, Thiel was the first big investor and he has been a board member for all of Facebook's history, and I think is a trusted advisor to Zuckerberg. And he has long advocated that companies do best when they have a monopoly, when they corner the market. So that thinking, if it didn't influence Zuckerberg, certainly syncs with the way Facebook has behaved over the years.
Jim: Yup. I will say, I know Thiel just a little bit, and I would say he’s another really ruthless dude.
Steven: Yeah. Interesting guy. Interesting guy. I mean, back in those early days of Facebook, Thiel was one of those people who had a PR person who would call you all the time and say, "Peter's available to comment on this. Peter's available to comment on that." Now he won't talk to journalists, but back then he was eager to get his name out.
Jim: One of the things that you call out, and of course it's famous about Facebook, is move fast and break things. But you also dig into the fact that it's not just an attitude of move fast and break things, but also that they were one of the early companies that heavily invested in what we now call dev ops, where you don't take months to bring out a feature. It takes hours to push a feature into production. If it doesn't work, you pull it back. Right? Where did that ethos come from? Is that Zuckerberg himself? Is it Moskovitz? Some kind of groupthink that realized the strategic importance of dev ops and very rapid pushing of features out?
Steven: Yeah. We touched on that earlier. I mean, that was a huge advantage for Facebook in the early days. They understood this from Zuckerberg's use of the tools, and he taught those tools to Moskovitz, who really wasn't much of a coder when he started. He just really wanted to become part of it and he learned it very quickly. People didn't realize this was something you could do. People grew up in the PC age, right? With the Microsoft way of looking at things, where you would do an upgrade every year or every couple of years, and you could do incremental dot upgrades every few months. They didn't realize it was possible to do upgrades several times a day. You could push stuff out. And the advantage is that if something didn't work, if there was a bug, you could just have a new version ready that would refresh the next time people opened the browser. Not being bound by the previous paradigm was a great advantage that Facebook took, and it enabled them to move much quicker than everyone else.
Jim: Yep. And I will say my own business theory when I was an entrepreneur was I ripped off Hunter Thompson. Faster and faster until the thrill of speed exceeds the fear of death. Right? And it’s a huge competitive advantage, and they took it to the next level. Of course, Google had also moved to that, as we know.
Steven: I think it's close, but when Google started, Google was not in that paradigm. Google used to update its search engine about every month in the early days. And they didn't have to speed that up until they got the AOL deal in 2002, I think, because AOL demanded that they update it more and more frequently.
Jim: Yeah. And then of course nowadays, nobody sees the same Google, right? They have a hundred different tests going on simultaneously. Very impressive. But you're right, Facebook probably got there, with brute force and as a deep philosophical foundation, before anybody else. And that's been very important. On the other hand, it's an attitude that isn't necessarily very complementary to this heightened sense of social responsibility that we talked about at the beginning. And you know that's going to be a real tension in a company whose DNA is move fast and break things. I think they've actually changed that motto to something else, now being a responsible adult, or maybe that's not the greatest way to put it. Any sense of to what degree that tension between the original ethos and this new post-2016 awareness of social responsibility is playing out?
Steven: Yeah. Well, move fast and break things originally was meant to describe what we were just talking about, the idea that you could crash the code. You could take the thing down, but that's okay, because in an hour we'll have a new version. And it was a badge of honor among new engineers at Facebook to actually take the system down. And that became a metaphor for moving fast in a product sense as opposed to a code sense. So you could have a product that broke privacy, and then fix it later. And you could break [me and more 01:17:47] and fix it later. And that's where they got into trouble. And now, just like they changed their motto, they're trying to say, "Well, we now understand that probably it's a good idea to think about the consequences of the products we put out before we put them out. So we're now looking to be proactive about what impact our products can have on society."
Jim: That gives me a good transition point to the next interesting topic, which is the relationship between Sheryl Sandberg and Zuck. In some ways, maybe a precursor was the fairly prescient realization by Pierre Omidyar that he needed a lot of help, and he brought in Meg Whitman at a very early stage, although I do think the partnership there tilted more toward Meg than Pierre over time. But to some degree Zuck, as you point out, also said, "Here's a big circle drawn around the shit I don't want to deal with. And Sheryl, you deal with it."
Jim: And I’ll confess my own business career, I did the same, though not at the level Zuck did. I would always hire a very strong CFO, and I joke with them, “Hey, your job is to be the king of ash and trash. Anything I don’t want to deal with you get.” But I did not go anywhere near as far as Zuck did in pushing things over to Sheryl, which he did. Could you talk a little bit about how, to the degree you were able to ascertain, how that relationship started, how it’s matured, and where it’s at today?
Steven: Yeah. I talked with them both considerably about this, about the decision. And then he brought in Sandberg. She was a fantastic executive at Google and just a brilliant business person. So she did have a company-wide role, as Sheryl reminded me when we talked, in helping move the culture of Facebook a little off the dorm-room culture, being friendlier to women in particular, and being a little more socially conscious in terms of a place to work.
Steven: But they actually divided up the company, and just as you said, there were things that Zuckerberg wasn't so interested in. He was interested in engineering and product, and that was where he spent his time. The rest of it, sales, lobbying in Washington, HR, content moderation, went over to the Sheryl side. That was her world that she would run, and there was a disincentive for her to bring problems in her world to him. She felt that she should be dealing with them herself. But there are some problems that are so important that the CEO really needs to deal with them, and the company doesn't really take them seriously until they see the signal that, wait a minute, this is a CEO-level issue. And some things, like what happened in the 2016 election, were in Sheryl's world and were not dealt with at a CEO level, and they really should have been, because it turns out these are the things that had a terrible impact on the company over the next few years.
Jim: Yeah. As it also turns out, of course, when you do that division, you miss the overlap, which is that, as we talked about earlier, if you build a product, which was Zuckerberg's domain, that uses machine learning to identify cognitively button-pushing content, that's going to have policy implications, right? So you can't quite firewall those two things apart, and perhaps that conversation didn't happen because they firewalled the two divisions too strongly.
Steven: Right, right. I mean, it wasn't like there was no overlap, but the AI people reported to Zuckerberg, and they gave the tools to the data people who worked for Sandberg. And it would be sold from Sandberg's organization, so there was this disconnect between them. And another thing that wound up in Sheryl's world was security. The chief security officer reported to the general counsel, who reported to the policy person, who reported to Sheryl Sandberg. Well, wait a minute, your chief security officer, the person with a C in his title, how many steps away is he from the COO, Sheryl? And then a step even farther away, a big gap between him and the CEO. And I was shocked to find that the chief security officer never had a one-on-one with Mark Zuckerberg. Can you imagine that?
Jim: I remember reading that in your book. I was utterly shocked. And I was CEO of Network Solutions, and then later division president for Verisign’s digital certificate division. Security was absolutely at the top of my list. Right? And while we didn’t call them chief security officers in those days, the C titles were nowhere near as promiscuous as they are these days.
Steven: Yeah. Zuckerberg cut back on C titles in recent years. He didn’t want any other C’s.
Jim: Well, I think that’s a damn good thing, but I would certainly have a direct report who would keep me utterly apprised of security because security is the life and death of a data-driven network company. And Zuckerberg should have understood that. Where do you sense the Zuckerberg-Sandberg relationship is these days? Has anything changed? Have they learned lessons? How have they evolved in response to this much different, more adversarial public environment?
Steven: Well, I think there were tensions as these things played out over the past few years, but they're sort of bound together at this point. I've had some super interesting interviews with Sandberg in the latter stages of my book. Sandberg really prepares meticulously for every interaction that she has. That's the kind of person she is. And I wanted to get past the prep documents and really get down to a level of candor which she doesn't often show to outsiders. So I said, "I need two hours with you, Sheryl," which is unheard of in Sheryl's scheduling, but she finally did give me those two hours. And in the second hour, we had an extraordinary exchange where her frustrations poured out about what had happened over the past few years.
Steven: And it was clear to me that she is a believer in Facebook. She's a very exacting person. She can be tough on her subordinates, but she's toughest on herself, and she's in great pain that she didn't perform in every aspect as well as she could have during this crisis, or leading up to the crisis. She feels she should have been more on top of things to prevent stuff from happening. So this was painful for her. And it is part of this very human story of Facebook. And that's really what I try to do in this book, tell that story without demonizing people, but showing them as the flawed human beings they are, who can do something extraordinary like building this company that now has 3 billion people using its products, a huge percentage of the world's population. Yet while it did a lot of good, it caused a lot of harm.
Steven: And they're grappling with that, and they fomented a lot of ill will. So how did that happen? It's sort of a tragic story in a sense, and I hope people will read this as the story it is, and a very important story, because Facebook is so important to us. And Instagram and Facebook and WhatsApp, they're all important. Oculus might be important one day. So it's a behind-the-scenes, rich story of [paining 01:25:47] heights and paying a price for ambition. Maybe too much ambition.
Jim: Yeah, indeed. I would strongly recommend this book to my listeners because as Steven says, it’s a story. It draws you in. It’s full of human conflict, human foibles. The profiles of Sheryl Sandberg, someone I probably knew a lot less about than Zuckerberg, were very illuminating. So a double thumbs up on Steven’s book, Facebook. One last thing here before we go. What do you see as next in the evolution of Facebook? I mean, you’ve got your head into this more deeply than anybody. You have 40 years experience in the evolution of the technosphere. What do you think is next for Facebook as a company?
Steven: Well, as I started talking about this book, I had a book tour cut short by this pandemic, and it's been very interesting to see the impact of that on Facebook. Weirdly, I mean, I don't want to say… Obviously it's a terrible thing to benefit from something that's caused death and financial hardship for so many people, but people have turned to Facebook in a way they haven't in recent years. Some people who very self-consciously deleted Facebook because they felt it was toxic and no good for the world are now coming back, because it turns out to be one of the more effective tools for being in touch with people.
Steven: And what I've been pressing Zuckerberg to tell us is whether this is going to change the way he sees Facebook. Maybe even before this he was encouraging this modality where it would be used more to keep in touch with people, for more benign uses. And Facebook itself has been more aggressive about fact-checking content, policing content, and actually pushing content out to people about this virus. So it's maybe an opportunity for Facebook, just like World War II was an opportunity for the U.S. to get out of the Depression. Maybe this crisis is a way for Facebook to get past its reputational tailspin, which nothing it had done before this had any effect on.
Jim: Interesting. Let's hope so. I've never been a huge basher of Facebook, even though over the years my engagement with public or open Facebook has declined. I do find Facebook very useful in its groups, right? And during this COVID-19 pandemic, some of the groups I've been a member of have done the best sense-making on what's really going on and what to expect of any place I've seen. Better than the professional media. And I know Zuckerberg, and you mentioned it in the book, does have great hopes for groups as a way to move Facebook to the next level. It will be interesting to see if COVID-19 provides a catalyst for that.
Steven: Yeah, yeah. Mark Zuckerberg would be very happy to hear you say that, because as you've mentioned, he made a special focus the year after the election on these "meaningful groups," as he calls them, the groups people join that give them valuable information or speak to an important part of their identity. At that point, like a hundred million people, which is a blip in Facebook terms, were in those groups, and he wanted a billion people to be in those groups. Maybe that number is rising considerably now.
Jim: Yeah, I would say 90%, literally 90%, of my usage of Facebook is in a handful, maybe a dozen, groups that I belong to. And the quality of the discourse in a well-moderated group is just remarkably better than it is on open Facebook. And finally, Facebook has been investing in tools for group admins, which makes it a lot better. I'd like to see them do more of that, but for many years the toolkit was pretty static. Well, Steven, I want to thank you for an incredibly insightful discussion here about your book, and again, I'd like to encourage my listeners who are interested in Facebook to check it out. A book well worth reading. Time well spent.
Steven: Yeah, thanks. And buy it from your independent bookstore. They’re really hurting.
Jim: Indeed. Thank you very much, and we’re going to wrap it up here.
Production services and audio editing by Jared Janes Consulting. Music by Tom Mahler at modernspacemusic.com.