Facebook had played a profound role in the election and was not being honest about it.
What Cambridge Analytica showed was that Facebook's business strategy was recklessly endangering
the privacy of their users. As a business strategy. These problems are built into the
product. And I don't know what's going to come out of this.
Brian Price here for Real Vision in Menlo Park, California, where today I'm going
to be sitting down with an investing legend. Roger McNamee is here to discuss what's
been happening with Facebook and his vision for the future. Roger, thanks for joining
us today. So I want you to do something for me that nobody else can, and that is connect
the dots between The Grateful Dead, two strokes, Bono, and Mark Zuckerberg, in a career.
I would say that I have been in many ways the luckiest person alive in the sense that I
went into a career where timing is everything and started as a Wall Street analyst. On the
first day of the bull market of 1982 they asked me to cover technology, so effectively
I had a 35-year tailwind. And you can explain everything that happened in my career based on
the pure dumb luck of that combination of timing and being handed technology. And then
it just happens that I'm the kind of person who has a couple of passions on the side.
Music being the primary one. I've been a professional musician my whole adult life and miraculously
in doing so wound up getting to meet my heroes, The Grateful Dead. And after Jerry
Garcia died, they found me somehow and reached out and said, can you help us with keeping
our business together? And so I spent three years consulting with them on their dot-net,
which was their direct fan site and it was a pretty cool experience because, having gone
to a couple hundred shows I knew the scene really, really well but it's really different
when you're in the audience from when you're talking to the people in the boardroom. And
that was how I met Bono. A woman named Sheryl Sandberg, who was working as chief of staff
to the Secretary of the Treasury, was working with Bono in 1999 to forgive the debt of emerging
countries that were never going to be able to pay it back. And Bono was curious about this
guy who's doing stuff for the Grateful Dead and said, I want to get a chance to get
involved with that. And so Sheryl goes, you won't believe this, but my brother-in-law works
for that guy. I know exactly who he is. And so she introduced me to Bono. And while I
was working on the Grateful Dead project I went to Dublin to meet with U2. And when I
came back I had two strokes. When I got off the airplane in San Francisco, one stroke, and
then a few hours later another one. And I didn't know it was a stroke. I didn't do any of the
right things and miraculously it didn't kill me. I mean it was literally a miracle. And
then it took a long time; I had open heart surgery to get rid of the cause of the stroke
which was a birth defect in my heart. And when I came back Steve Jobs gave me a chance
to buy 18 percent of Apple in our Silver Lake fund, and to go on the board. And my partners
-- I didn't realize that my partners had decided while I was gone that they liked splitting
the money three ways. And so when I came back they were looking
to get rid of me, so they said no to a chance to buy 18 percent of Apple for cash. And one
thing led to another and I was working on a project with Bono also for Silver Lake and
it was obvious they didn't want me around so I quit. And I called Bono to say, hey I
quit. And he goes, well screw them, we'll start our own firm. And I go, nobody's going to let you
start an investment firm. I mean, I know your management company. There's just no way. And
he goes, no, we're really going to do this. And so that's how Elevation happened. Tell
me about your first meeting with Mark Zuckerberg. Imagine, if you will, that I've been in the
business 25 years at that point which is more than a career as a tech investor. There had
been crashes along the way that wiped out most of the people I knew from the early part
of my career. So I had more experience as a public market investor than anybody who
was doing it in tech and was right up there with the senior-most venture capitalists. One
of the things that I did was make myself available to young entrepreneurs. It's a great way to
get to know them when they're not raising money. They've got a problem. They're looking
for somebody who's got a perspective but without any conflict. I get a phone call from a guy
named Chris Kelly who is the chief privacy officer of Facebook and I barely knew Chris,
we'd met but we didn't know each other well. He calls and says, 'Roger. My boss has an
existential problem. And he needs somebody like you to help him think it through. Would
you be willing to take a meeting with him today?' Sure. I mean keep in mind this is
2006, March of 2006, the company is barely two years old. Mark is 22 years old. I'm 49.
At Elevation we had a conference room set up like a living room. Basically a giant video
game console and a huge flat-panel thing, and you know we were at the intersection of technology
and media. So we had a meeting. Mark comes to my office, looking just like Mark Zuckerberg.
He's got the courier bag, he's got the T-shirt. We say hello. We sit down and I'm closer to
him in that setup than you are to me. And I said Mark before we start I've got to tell
you a few things because once you tell me what's going on, you'll assume that anything
I say after that is influenced by whatever you told me. So I want to say a couple things.
He says, go for it. I go, 'If it hasn't already happened, either Microsoft or Yahoo
is going to offer a billion dollars to buy Facebook and everyone you know from your parents
to the board of directors, the management team, the employees, are going to tell you
to take the money. 'They'll tell you, Mark, you're gonna have 650 million bucks; you
can change the world.' Your lead venture capitalist Jim Breyer's gonna say, 'I'll
back your next company. It will be even better than Facebook.' I said, Mark, I've been
watching this space a lot longer than Facebook has been in existence. And I think you have
done two things that are going to make all the difference. You have real identity and
you give the users the ability to control their privacy settings. So I think that combination
is going to make this product way more attractive to adults than to kids. So I think you haven't
even gotten to where your real market is and it will be very attractive to advertisers
because adults have all the money. And I said, the truth is you may have another idea as
good as Facebook but you'll never get the timing perfect twice. No one ever has. Lots
of entrepreneurs have great ideas, but things like Facebook happen because you have the
perfect idea at the perfect moment in time. And that'll never happen again. Whereas
I think Facebook is going to be bigger than Google is now. Now that whole thing took me
about two minutes to say. There then ensued the most painful silence of my professional
career. And it went on nearly five minutes. And at the three minute mark I was ready to
howl. I mean I was white knuckled in my seat. Trust me. You have no idea how long five minutes
is until there is somebody sitting in front of you who's pantomiming all these thinker
poses. And not saying a word. He's clearly trying to decide whether he trusts me or not. At
the five minute mark he finally relaxes and it's like you can see a thought bubble over
his head going, 'OK. I'm going to trust him.' And he goes, 'You're not going to
believe this.' I go, 'Dude, I'm so happy you said something, just try me.' He goes,
'In my bag, I've got an offer to buy the company for a billion dollars, and literally
everything you said is true.' And I said, well, look, this is your company, what do you want
to do? He goes, well, I don't want to disappoint anybody. And I go, I get that. But if it were
just your choice, what would you do? He goes, well, I'd like to play out the
hand, I'd like to see how it'll go. And I go, OK. Would you like my help to figure out
how to do that? He goes, yes. So we literally reviewed the company's voting rights. And it turned
out he had a golden vote. He had a situation where literally it didn't matter what everybody
else thought, if he thought something, that was the answer. And I said, here's the thing
Mark, remember that when these people invested in your company, when they joined you as an
employee, they were signing up for your vision. And if your vision isn't done, if you still
think that this game is worth playing, you can sit down with them and look them in the
eye and go, listen, I don't think this is the right time to sell. And, when you prove that
you were correct, they are going to be really happy that they didn't sell out. Because Microsoft
and Yahoo would kill this company. There's no way they would see the vision
through the way you would. Now, he left my office after that. I'll bet the whole thing
was half an hour max. And he went home and killed the deal that afternoon. And about
a month later I got a call from him. Another thing had come up. As you can imagine, with
the whole team wanting to sell the company, he had to make a few changes to get people
who were aligned with the vision. And so I helped them deal with all that and then the
Winklevoss brothers thing came up and I helped them go through crisis management. And then
the following year an opportunity came along. One of his early employees had a personal
change and needed to sell his options. He didn't have stock, he just had options. So
it required a really clever and really trusting buyer; somebody had to organize it. And he said,
would you like a chance to invest? And he said, here's the deal, I'll give you a choice:
you can go on the board, or you can invest, but you can't do both. You know I'm a little
sour with my board because they tried to sell my company. I go, dude, I'm an investor. I've
got to invest. So I took the investment opportunity. And then shortly after that Sheryl Sandberg
called me up and goes, I need to come talk to you. When Sheryl came out of Washington
in early 2000, she came and hung out in our office for about a month. And she had a copy
of The Tipping Point by Malcolm Gladwell. And she goes, have you read this book and
I said, no I haven't read that book. She says, because this book might as well be written
about you. I go, really? And she goes, well, I want to work here. I'm going, wow. That's
really cool because, I mean, we could use somebody really, really capable here. I go, wow. You
want to work here. That's awesome. So we spent the next week or two talking about investing.
And my partner finally pulls me aside and goes, Roger, this is insane, this woman can
change the world. If she works here she'll never get a chance to do that. She should
be working for Google. Now keep in mind our office was inside Kleiner Perkins in those
days, and so it was four doors down to John Doerr's office, a simple handoff, and the next
thing you know Sheryl goes to work at Google. So when she calls me up it's not like this
is the first career conversation I've had, but she goes, I've been given a chance to
be the president of the Washington Post. I'm going, are you nuts? I mean, you're at Google,
you're killing those guys. And the dumbest thing you could do would be to go from the winner
to the loser. I'm going, Washington Post, I have enormous emotional attachment to the Washington
Post, but realistically how are you going to save the newspaper? If you want to do something
like that, you've got to talk to Zuckerberg and maybe go to Facebook. Because he needs somebody
to create the business. She's like, well, he's 23, I don't know if I can work for
a 23-year-old. I'm going, he's not your normal 23-year-old. I think it's worth the conversation. So
I call Mark and I go, so Mark I think I got the person for you. And he goes really, who?
Sheryl Sandberg. He goes, yeah, but she's at Google. I'm going, Mark, give me a closer
proxy for what you're doing. He goes, yeah, you're right. And the thing about Mark, and
I knew this, was his mom's a doctor. Really strong personality. He's got nothing but sisters.
I was convinced he could work with a woman who was successful, which a lot of Silicon
Valley people can't do. Anyway, it only took a couple of months. You know, they get together,
get to know each other. It turned out there was a good chemistry there. And it's not like
they're both run of the mill people. I mean Mark, you know he's in some ways a classic
successful Silicon Valley entrepreneur. But on the sort of extreme side. And Sheryl is,
her self-control is simply off the charts. I mean, if you watch her do an interview it's
like you're watching a master give a master class in staying on message. But both of them
were tremendously ambitious and they both wanted to change the world profoundly, and I thought
together they could do it. So once Sheryl came on board, the company was shifting from
what I would characterize as its startup mode, into an operational mode. And I'm not an operator
and so it was obvious to me, not to Mark, but it was obvious to me at the time that
my days as a mentor were going to come to an end pretty quickly. But they had one thing
left for me to work out which was mobile. And because we had made this big investment
in Palm, the Pilot guys, to make the first web phone, the Palm Pre, I knew a lot
about what was going on in mobile and I was convinced everything on desktops was going
to wind up on smartphones. Smartphones, you can think about them as phones, but in
reality you were mostly going to do Internet stuff on them. And that was not a well-understood
concept in 2009 or 2010, as we were having the conversation. But Mark was really into
it and unlike some of the earlier topics, this was one where there was no obvious right
answer. So we have a lot of back and forth on it, which I learned from and I'm quite
confident he learned from too. And then, you know, sometime towards the end of 2009 I just
go, dude, I think you've outgrown me. I think you're all set. You know, I'm just going
to fade into the background, but I'll be cheering for you and I'll be here if you ever need me,
but I don't think you're going to need me. Which meant I missed the creation of the business model.
And by the time 2016 came around, it meant I didn't understand the mechanics for how
the business of Facebook worked, how they used the techniques of propaganda, and the
techniques of casino gambling, on a smartphone, to create levels of psychological addiction
that are analogous to a gambling addiction or a video game addiction. But
with one really, really important difference, which is that because of the way the product
worked, they had the ability, or their advertisers had the ability, to manipulate what people
think. And it took me a while to figure that out. But when I did, it was… It was really
disturbing. I mean it was like, oh my god. This thing that was about sharing family photos
and birthdays and pictures of kittens is suddenly now a tool that bad actors could
use to harm innocent people, in a lot of different ways. You described it as your baby at one
point in time. Yeah. So here's the problem. I started my career as a public market investor and
as a public market investor, even in tech, where you're working interactively with management
teams, my entire brand was built on being an above-average analyst of products.
Everybody else worked on spreadsheets and trying to forecast earnings and what I realized
was in an industry as dynamic as personal computers in the 80s, that if the product
was hot, the estimate was always too low and if it wasn't hot, the estimate was always
too high. So what you had to do was figure out, was the product going to be hot or not
hot. And so I became really good at that. And it turns out that because no other investors
were doing that, I got to have a special relationship with a lot of people, many of
whom are famous now. People like Bill Gates and Steve Jobs, but many of whom were just
immensely successful but less well known. And over time as I did more venture capital
my relationships to companies got deeper and deeper, and my impact got greater. But it's
hard to top Facebook. I mean the combination of absolute success and the fact that, it
would have been acquired by Yahoo before any of this happened, and who knows what would
have happened to the business without Sheryl. Those two things meant that my fingerprints
are on it. So I felt like this truly was my baby. And so imagine, scroll forward to January,
February of 2016. My wife and I are on vacation. I'm on Facebook. I love Facebook. I used it
every single day. I'm as addicted as anybody. And I love the birthday things. I love sharing
photographs and looking at other people's stuff, and I'm in a band, and it's the way
we communicate with the fans of the band. So I'm on there looking around and it's the
beginning of the New Hampshire primary, 2016. And all of a sudden I see these memes, photographs
with text on them, coming from groups ostensibly associated with the Bernie Sanders campaign,
but deeply misogynistic, in a way that no campaign would be. And that wouldn't have been
surprising, except they were spreading so virally that I realized somebody was spending
money. Now, who would spend money to spread deeply misogynistic memes? That was a head
scratcher. And so I just made a mental note of it. I mean I discovered later, of course, that
it was almost certainly the Russians. Fast forward a month. Sometime in March of 2016
there's a news report that Facebook had ejected a firm that was using its application
programming interface to harvest data about people who expressed interest in Black Lives Matter.
They were then selling it to police departments. I mean, truly evil. Now Facebook threw them
off the site, but not until the damage had been done. These people's lives had been irreparably
changed by, you know, by their actions. And I go, whoa. I mean, that's… Unlike the Bernie
bro thing, you could see who had done this and you could see that they had just used
the Facebook tools created for advertisers to do it. Fast forward to June. Brexit. The
British are voting on whether or not to leave the European Union. The final polls say that
they're going to remain, and Remain's going to win by four points. That night out come
the election returns and Leave has won by four points. So, an eight-point swing. And, in
the post mortems there was a lot of talk about the role Facebook played. And what was interesting
was nobody was blaming Facebook but if you were in my position looking at this thing
you're going, whoa. Leave had a really, really inflammatory campaign. They're basically
saying those evil immigrants are going to destroy your culture, take away your jobs,
and they're ruining the country and all the crime is blamed on them. And then they were
offering this pie-in-the-sky thing of, A, we're going to save billions of dollars or
billions of pounds on exiting the EU, and take all that money and pour it into the national
health system. So effectively they were saying to everybody, you can vote because of some racially
motivated animus, but you can feel good about it because you're going to save the national
health system. Meanwhile the Remain side has no emotion at all. They're basically going,
we have the sweetest deal on Earth. We get all the benefits of EU membership and we get
to keep our own currency. That's a great deal, don't screw it up. It should have won in a walk.
I mean, stay the course is the British way. And yet the thing swings
eight points. And I'm thinking to myself, is Facebook giving an advantage to inflammatory
political campaigns over neutral ones? That was the hypothesis that Brexit brought you
to. And again I don't have any notion of a Russian connection at this point. Within two
months all of a sudden there's a lot of news about Russia right? You know we all heard
about the DNC hacks. You know, John Podesta's e-mails and all that stuff. Wikileaks. You
learn about Manafort and his whole relationship and all of a sudden you go, whoa, that's creepy.
And then in August there is this news report that Housing and Urban Development has cited
Facebook for having advertising tools that enable people who own real estate to discriminate,
in violation of the Fair Housing Act. Now I've got four data points, unrelated, all
pointing to the same thing. Bad actors using Facebook's standard ad tools to harm innocent
people. I reach out to Recode, the tech blog. I reach out to Kara Swisher and Walt Mossberg
and I go, guys, I'm seeing this stuff. What are you seeing? Dead silence. No reaction.
Zero reaction. They don't even respond. I do it again maybe three weeks later. So now
we're probably early September. I don't hear back right away but then Walt sends me a note
and goes you know, you might be onto something. Kara's not interested in covering the story,
we're not going to cover it. But you should write an op-ed for us. Take your time, write
something. Let's start a conversation about this. So I set to work writing an op-ed and
I don't think I'm in any rush, because I don't think it's going to affect the outcome of
the election. And I'm really worried that, because I don't think it can affect the outcome of
the election, if the election goes down and Clinton wins, they're going to dismiss my concerns
because, hey, it didn't affect the election. So I focus on a balanced piece with all the
different things that I'd seen: the Black Lives Matter stuff, the Brexit
stuff, the Housing and Urban Development stuff. But it's hard to write. Why? I mean I'm trying
to write an op-ed, right, and I'm trying to stick to the facts and not exaggerate anything
and yet make this tight case. I finish it on the 30th of October and my wife points
out -- it was such a brilliant insight -- hey, let's send it to Mark and Sheryl. I mean,
they're your friends, you love this company, your goal is to help them not cause trouble,
and all that was true. And so I sent it to Mark and Sheryl. And they get right back to
me. I mean, within hours. And both sent very thoughtful replies, but saying basically the same
thing. So what they said was, Roger, we really appreciate your reaching out. We believe the things
that you saw are isolated, not systemic, and we have taken steps to ensure that none of
them can happen again. And they referred explicitly to the Black Lives Matter thing, where obviously
they had ejected the people who did it. And they said, you know, we take you seriously.
You know, you've been a friend of ours for a long time so we're going to have one of
our senior people work really closely with you, to figure out if there's something we
should be investigating. And they turn me over to Dan Rose. Now Dan, I think, is the second
longest serving executive at Facebook, and he's somebody I knew really well, respected
a lot, and liked very much. And Dan gives me the same basic shtick the next day. But
with one important added note: he goes, Roger, you know, we're a platform. We're not a media
company. So as such we're not responsible for what third parties do on the platform.
And we go back and forth, roughly once a day, right up until the election. Then the election
happens. And I'm apoplectic at this point. I go, OK guys, I'm sorry. You have played a
role here. We don't know exactly what the role is, but the platform has been used. It's
been used in Brexit, it's been used in the U.S. election. And Dan's going, you understand
we're a platform, not a media company -- I'm going, dude, you've got 1.7, 1.8 billion members
at that time. If they decide that you're responsible for destabilizing democracy, it won't matter
what the US law says. Your trust will be destroyed. And I'm begging them, basically, look, you want
to do this like Johnson and Johnson, when some dude tampered with Tylenol and poisoned
a few people, I think in Chicago. They didn't sit around and debate it. They literally took
every bottle off of every shelf from every retail location everywhere. And they kept
it off until they created tamperproof packaging. They basically said, we didn't put that poison
in the bottle. But these are our customers and we're going to take care of them. We're
going to act as though this is entirely our responsibility. And they did it instantly.
And I said, guys, nobody is going to blame you for what happened here if you get right
on top of it and, with complete sincerity, commit yourself to helping the government
figure this thing out. This goes on for weeks. I mean, almost three months basically, although
with less and less frequency, because Dan's not moving at all. I mean he's listening carefully
and he's being incredibly patient with me, but not budging. You can imagine that my
attempts to convince Dan got pretty emotional. Because I didn't know exactly how Facebook
had been used to affect the election but based on what had happened in Brexit, I had no doubt.
And in particular because one of the things that came out from the election was that there
were a really large number of people who'd voted for Obama who had not voted for Clinton.
And it occurred to me that Facebook, because it essentially is about inflammatory content,
it's about outrage cycles, is the perfect tool for voter suppression. And so I'm sitting there,
and I think to myself… I mean, Trump won because of really spectacularly well-executed voter
suppression. And Facebook played a role. So I wouldn't let go. I keep in touch with
Dan and he goes, how about if you just send me more examples. And I think I got up to
maybe 15 or 16 different examples of situations where they had contributed to bad actors harming
innocent people. And finally in February of 2017 I realized their position was not moving.
I mean, if I hadn't been so concerned about the thing, I would have known on the first
day it wasn't moving, because I know the people and… Philosophically they view criticism
and regulation as forms of friction, to be blown past as opposed to things to listen
to and actually dig into. Because again they're in too big a rush, and friction is the enemy
when you're in a rush. And so the story goes on and on, and you know it didn't change until
late December when Chamath Palihapitiya, who had been their head of growth, came out and
did this confessional presentation at Stanford talking about how much he regretted the harm
that they had created. And it was big coming from him, because he'd run growth, he'd run
the algorithm. So you know, him saying that was really different than us or Sean Parker
or any of the other people who had expressed doubts, because he'd hired all the people
in their growth group. If those people decided this was unacceptable, it was going to cause
a revolt. They came down on Chamath like a ton of bricks, and I think that was their last
window where they could have done the Tylenol, Johnson and Johnson thing. It was a
year post-election; you know, you're getting pretty long in the tooth for going, we didn't
know. But what they did instead, they basically said, we're going to treat this like the Alamo.
We're going to quash dissent and we're going to deny the whole thing. And we realize, oh
my god, they're going to blow this. They're actually going to do the thing I warned them
about, which is to say they're going to harm their brand! Not just democracy, but they're
going to actually harm their business. What are we supposed to do now? And I had written
this really long essay for Washington Monthly. It was designed to help policy makers in Washington
understand the issues and then have a prescription. Essentially, things like the General Data
Protection Regulation coming out in Europe, that was about privacy. And all of it was scheduled
to come out on January 2nd. And what happened was on January 1st, Mark Zuckerberg puts
out a New Year's resolution in which he says, we're going to spend the year fixing Facebook.
My thing comes out the next day and for all intents and purposes, it was a 7,500-word rebuttal.
I'd written it two months earlier. But for all intents and purposes it worked like a
rebuttal. And the result was all of a sudden everybody wanted our opinion again. And we're
starting to get tens of millions of unduplicated reach on television, on multiple networks
multiple times a day, but all basically talking about this problem that Facebook had played
a profound role in the election, and was not being honest about it, and that in fact, the
problem wasn't based on a hack, it was based on the Russians using the product exactly
as it was meant to be used, except for a nefarious purpose. And that resonated with some really
interesting people. I mean I'm sitting there and somebody forwards me a tweet from Tim
Berners-Lee, the man who created the World Wide Web. He'd found this article which
was aimed at the audience inside the Beltway. Somehow it had reached him in Europe, and
he shared it with everybody on his list, which was huge, I mean everybody follows Tim Berners-Lee.
So the next big thing that happened was the Cambridge Analytica bombshell. The Observer
and The Guardian in the UK, and The New York Times, get this whistleblower named
Christopher Wylie, who had been the original engineer at Cambridge Analytica, and basically
came out with the full report that Cambridge Analytica had found a researcher who had been
working with Facebook already and had a trusted relationship with them, and persuaded him to
do another study, an academic study. Except the data was all going to go to Cambridge Analytica
and they were going to build an election business around it. And what we now know is that they
harvested 87 million user profiles using a tool that Facebook had had in the market
since 2010. And the reason this story blew up the way it did, I think, was that when the
tool went into the market in 2010, people protested right away, and they protested right away
because some of the people who used it were game developers with huge audiences. I don't
know whether CityVille, which was a successor to FarmVille, used it or not, but it was designed
for people like that. CityVille had 61 million users 50 days after it started. And, well,
the simple math was that at that number everyone in America would have known half a dozen people
who were playing CityVille, which means they would have been harvested half a dozen different
times, just from that one thing. There were nine million applications on the Facebook
platform when they went public in 2012. If 1 percent of those applications harvested
the friends lists, that would have been ninety thousand applications harvesting. So that
was pretty creepy but the real problem was that Facebook signed a consent decree with
the Federal Trade Commission in 2011 that said, you can't do that. You have to have
informed consent. It must be explicitly called out. People must have an opportunity to know
in advance if their stuff is going to be used and they must have the ability to stop it
before it's shared. Facebook basically had a choice at that moment. They could have gotten
rid of this tool, which was designed, essentially, to make Facebook more viral
and to make applications into high-use applications. Games would increase minutes of use per day
per user. And so this was designed to increase all those metrics for Facebook. So it was
part of their plan. And they wanted lots of people to use it. I don't know, maybe
it was 10 percent, in which case it was 900,000 apps that used it; whatever it was, it was
a huge number, some of which were gigantic. And it turns out that the reverse was true as well:
if you used a product like Facebook on an Android phone, Facebook would simply download
all the metadata from your Android phone into their account. So what Cambridge Analytica
showed was that Facebook's business strategy was recklessly endangering the privacy of
their users. As a business strategy. They had signed a consent decree and did neither --
eliminating the tool nor informing the users. And Sandy Parakilas, who was part of our team,
had been the manager of user privacy for the Facebook platform. The very platform on which
the Cambridge Analytica product ran. He'd had that job from the consent decree until
shortly after the IPO. And he left. And part of his frustration was that Facebook paid lip
service to the consent decree. They didn't actually do the things necessary to enforce it.
So when that news came out, and it played out over three days, it was like we had all
of Watergate crammed into three days… it basically added tremendous color to why people
like us were concerned about Facebook. And it took you right to the edge without people
completely understanding another really important issue, which is that Facebook had offered
to embed employees in both presidential campaigns. The Clinton campaign turned it down. Trump
took it. Now Trump was known to be working with Cambridge Analytica. Cambridge Analytica
is incredibly promotional. They blast this to everybody. Stephen Bannon, Trump's adviser,
had been part of starting Cambridge Analytica so this is like not a secret. But here's the
thing. When Facebook did the deal that allowed Cambridge Analytica to harvest all those user
profiles, that was three years after the consent decree. It should not have happened. And Facebook
claims that they didn't realize it was Cambridge Analytica until December of 2015, when the
Guardian published a story about it, at which point Facebook goes and says to both the researcher
Aleksandr Kogan and to Cambridge Analytica, you have to destroy this and you have to certify
you destroyed it. But they didn't send anybody in to check. And then roughly six months later
they embed three employees in the Trump campaign, working in a war room in the campaign's San
Antonio data office, side by side with Cambridge Analytica people, on this gigantic
data set that was obviously the same one that had been misappropriated by Cambridge
Analytica two years earlier. And here's the thing: The top management of Facebook knew
they had employees embedded in the campaign. Everybody knew that Cambridge Analytica was
working for Trump and there wasn't enough time between December and June to recreate
that data set. With all this information, how would you characterize Facebook's thinking
when it comes to individual users? I think it's really simple. You know, the line about
advertising is when the product is free, the user isn't the customer, the user is the product.
In Facebook's case, though, it's more like the user is the fuel. There's something
almost parasitic about it, because the psychological manipulation that takes place doesn't apply
to everybody on the platform. But if you think about the United States alone, there've
always been people, a meaningful percentage of the population that believes things that
are demonstrably not true. You know, flat earth, chemtrails, whatever, stuff that you can just
feel is obviously not true. And I don't know what the normal number was, 7, 10 percent,
something like that. But today if you look at it, between things on the left like chemtrails
and anti-vax and things on the right like climate change denial, it's probably a third
of the population. And, Facebook has played a huge role in taking that number from whatever
it used to be to whatever it is now. And it has become literally the perfect tool for
spreading disinformation and making people not only believe it, but identify with it.
Like you know, it's their identity. With all this in mind, how would you describe how attitudes
have shifted in Silicon Valley from the time when you were there and in the prime of your
career, to now? Silicon Valley had a philosophy that began in the early 2000s. This libertarian
ideal that said, we're going to disrupt things and that's okay because we're not responsible
for the consequences of what we do. This libertarian model was sort of situational,
but it basically said, you know, you're really smart, you're well-educated, your intentions
are good, whatever you do is fine. So when I started my career, Silicon Valley was
still focused on the needs of government. We were in the era where defense spending
was the largest category of technology spending, followed by mainframe computers. And in that era, that's
the era of the white plastic pocket protector. You know, the guys with the short sleeve white
shirts and a tie. And you know if you watch Apollo 13, people in Silicon Valley looked
like that. Then the personal computer industry begins. It's an era where, from an engineering point
of view, there's not enough of anything: not enough processing power, not enough memory,
not enough storage, not enough bandwidth to do what you want to do. So Silicon Valley
made tools and the tools required a manual half a foot thick in order to use them properly.
But it was very respectful in the sense that we were trying to make the world a better
place with these better tools. And I think that lasted past the millennium. It was an
era that valued experience almost above everything else, because if you're dealing with scarcity
you don't want to have to make the same mistakes over again. Somewhere around 2002, 2003 suddenly
we flipped. And there was enough of everything. And Silicon Valley had a chance to rethink
the whole proposition of what computers were going to do. And as a community what we settled
on was we were going to go for infinite scale. We were going to go for products that were
global in a completely different way. They were consumer products that were global and
which meant billions of users. In that model, there were two or three things you needed
to make that realistic. The first thing you needed was this notion
that was embodied in the slogan of Facebook: move fast and break things. This idea that
you were going to have a vision, you were going to pursue it relentlessly, and
you were going to run over whatever obstacles came your way, and you wanted to avoid friction
at all costs. And the second thing you needed was to absolve yourself of responsibility
for the consequences of what you did. And that's where the libertarian values came in.
And the Valley bought into it pretty deeply. And the notion was, if you move fast and break
things you're going to hurt some people, and you've got to be OK with that. Right? And
not everybody was OK with that. The old-timers were looking at it, going, really? But
the young crowd, because keep in mind the other thing that happened here was when there
was too much of everything you didn't need experience anymore. So Mark Zuckerberg can
literally hire all his friends from Harvard, with no experience, no sense of history, nobody'd
ever read a novel. So they would have been unconcerned or unaffected by the values that
had preceded them. So the folks who were left out of all of that would look at it and go,
that doesn't look right. And the new guys would go, you guys are old. What do you know?
The world began with the Internet. And people got so rich, so quickly, that it became
self-reinforcing. And it validated itself. In fact the whole world decided it was OK. I mean
they looked at these things, and went, wow. Zero to a billion people in 10 years. It's
like, what's wrong with that? I mean it's puppy photos, it's birthdays, all that. I
mean, and the incredible thing was that Facebook, Google, Twitter, Amazon, what they really
delivered was dramatic advances in convenience. The products were incredibly easy to use,
incredibly convenient. They were free, obviously. I mean Amazon would sell you stuff, but you
know there was an awful lot of stuff that was free. I mean their proposition was compelling.
But nobody talked about the possibility of there being downsides. That there would be
a dark side to all this stuff. And so, much as with food in the 40s and 50s when we adopted
things like TV dinners and convenience food, at a time when people had a lot going on and
big families… Nobody said, hey, this is going to lead to an epidemic of obesity. We didn't
know that all that sugar, salt, and fat was going to cause problems downstream. And
the same thing happened with these tech platforms. It just happened in 10 years instead of 30
or 40. And the Valley has been incredibly slow to accept that there's a problem with
that. In fact I'll be really interested when I go to TED this year as to whether people
are open and happy about what we're doing, or whether they are, in fact, you know, unhappy
because we're raining on their parade. Roger, now that we've seen the dark side, how do
we fix it? The worst part of the experience that I've had on this whole thing was coming
to grips with… these problems are built into the products. Facebook in particular,
but also Twitter, also YouTube, which are the other platforms involved in the
election manipulation. And then if you look at things affecting kids, whether it's YouTube
Kids or Instagram or Snapchat or Facebook Messenger for kids, same problem. The design
of the products allows for manipulation by outsiders, and it leads to addiction and unhappiness
for the user, even if there's no external manipulation. Well, hang on. That's not easy
to fix. And you really can't fix it without the cooperation of the people inside the company.
So when I originally reached out in October of 2016 that was with the hope that they would
investigate it, not realizing that this was something that was so inherent in the product,
that they were going to have to change the business model in order to fix the problem.
We've now gone 16, 17 months since then, and they have shown no sign of a willingness to
actually make the changes necessary to eliminate the risks. So now we're in a
how-do-we-mitigate-the-damage mode. And there are a few areas where we have to
do work. The first is we have to look at data privacy and the consumer right to some kind
of ownership of their own data. In Europe they have a new law called the General Data
Protection Regulation that is going into effect May 25th. And that is essentially a consumer's
bill of rights for data. And you know it's not perfect but it's really good, and it's
a great blueprint because it applies to all EU citizens no matter where they are in the
world. So everybody is going to have to support it all around the world. So my advice to both
Facebook and to Google has been explicitly, embrace GDPR. Embrace the European model globally
and do it like a religious mantra. These businesses are based on trust. And the only way to regain
the trust is for people to be materially less worried about the integrity of these companies
than they are today. So that would be step one. Step two is related to the election stuff,
and there are a couple of parts to that. The first is that there are things these guys
could do voluntarily they have not done. Facebook has refused to cooperate with the authorities
relative to the analysis of what happened in 2016. There were 126 million Facebook users
affected -- directly touched by the Russian interference -- and 20 million on Instagram. The
obvious thing to do is to provide all of the data related to those accounts, each time
they were touched, all of the things that they saw, to the investigators, in a way that's
searchable and analyzable. Nobody's asking Facebook to expose their algorithms. Just
give us the output. And their argument, which is complete nonsense, is, oh if we do that
then we have to give stuff to authoritarians and bad countries. And I'm going, wait a minute.
Facebook has this notion of community standards that are individual to every market. And in
authoritarian regimes the authoritarian controls those community standards. If you go to Myanmar
the community standard is repression of the Rohingya minority, to the point of it being
characterized as a genocide. And Facebook is the tool they use to make that acceptable.
It's the tool that the government in the Philippines uses to make death squads acceptable. I mean,
you can't tell me you can't help save democracy in the United States because
you're worried about those guys doing something different. What they're doing already is so
horrible. It's hard to imagine it getting worse. So that data is really important. Second
thing is they have to follow through on Senator Richard Blumenthal's request that they reach
out to every one of those 126 million people on Facebook, and 20 million on Instagram,
reach out to them personally. With a really detailed message that says, in 2016, the Russians
interfered in the U.S. presidential election. They manipulated Facebook. We did not catch
it. Here is every time you were exposed to the stuff. You need to understand all of this
is disinformation from a hostile foreign power designed to undermine our democracy. We as
Americans have to stand up against them. And the punch line should be, the effort in 2016
was about suppressing the vote. If we want to minimize the damage in the future, the
best thing to do is to have everybody vote. And Facebook is the best one to deliver that message,
because everybody looks and says, well, that person was affected and that person was affected,
but I wasn't affected. And that's nonsense. Only 137 million people voted in our election.
And 146 million people were affected, between Facebook and Instagram. That's more than the
number of people who voted. And they aren't random. They were targeted. They were a combination
of people who were known to be pro-Trump, who got the positive message, and then the communities
that they thought had the highest probability of being persuaded not to vote. The vast majority
of people were in that second group. So there are people of color, there are Bernie Sanders
people who might be Jill Stein curious... You know, all these different things,
really intensely targeted. And getting those people to recognize that they were manipulated,
Facebook's in a unique position to pull that off. And they haven't done it. So those
two things they could do on their own. They don't need any help. Then, you have to look
at what they can do to protect future elections. And they're making baby steps in that direction.
With disclosure on the ads and things like that, but most of this was done inside filter
bubbles. Filter bubbles are what you get when you have a product, as you do with Facebook,
where each person has their own channel that's built around what they think they like; it's
really what Facebook wants them to like. And, Facebook surrounds you with people who believe
the same things you do, encourages you to join groups of like-minded people and they
do that because that clustering is good for the advertising. There's a side effect, that
when you're surrounded by people who agree with you, your positions become more rigid and
more extreme. And that's good for Facebook because you become more emotional. That's
how we got from say, 7 to 10 percent of people believing things that are demonstrably not
true to a third. Facebook's played a huge role in that. And so if you want to protect
elections, you have to find some way to pierce those filter bubbles. And you say, well one
way you could do it would be to have more mainstream news in people's feeds. But what
did Facebook do in January? They took mainstream news out of people's feeds. Had they done
that in 2015, it would have magnified the Russian interference. So that's not a good idea. So
my point is Facebook hasn't yet done even one thing that's going to help. And we need
the cooperation. And then the last piece which is really profoundly important relates to
how data security works broadly. And we can never put the genie back in the bottle. And
we have to assume that everyone in the United States has had their data harvested at least
once and that that data is somewhere out on the internet, nobody knows who has it nobody
knows where it is, and you can't get it back. I don't know what Facebook does about that.
They've done a bunch of things recently where they've reduced the number of places that
applications can get user data on their site. You think to yourself, well, that sounds like
progress. The problem with that is that the things they're doing now, are all things they
should have done in 2011 when they signed the consent decree. These were all things
that the consent decree actually said you must do. So what they're
basically admitting is, we ignored the consent decree for seven years. I don't think we're
supposed to give them credit for this. For showing up seven years late to a party that
they were required to attend. I think this is really hard. And there is an op-ed written
by Tim Wu, a professor at Columbia, in which he says the most important thing to do is
to replace Facebook. Now I don't know how you're going to do that. But the one thing
I know is that in 1956, AT&T, the telephone company, signed a consent decree to end an antitrust
case, in which they agreed to two things. The first was they agreed not to enter any
new markets, which basically meant they weren't going to enter the computer industry. And
then the second thing they did was they agreed to freely license their entire patent portfolio
at no cost. What's really interesting about that is that by not entering new markets they
allowed the mainframe computer, minicomputer, and PC industries to happen
independently. But the patent portfolio is where the secret was. Because in that patent
portfolio was something called the transistor. Silicon Valley as we know it today was created
by the AT&T consent decree and the most remarkable thing about it was that AT&T continued to
prosper. And yet we created all of Silicon Valley. The semiconductor industry, the computer
industry, the software industry, the Internet industry, data networking industry, cellular…
all of those things were created as a direct result of that AT&T consent decree. And it's
all about creating the opportunity for competition. And that's what I would like to see happen.
I would like to see solutions that… I don't see any benefit to punishing Facebook and
Google and Twitter or others. I do think they should be restricted in what they can do.
But I think for the most part the most useful thing we can do is create real competitors
who have different business ideas, and different business plans. And we should reward people
who do things that serve the public interest. You know, Silicon Valley has spent the last
40 years getting rid of jobs. Why? I mean, that's sociopathic. We're at this point
now where we need to create good jobs for people, particularly good jobs for people
coming out of industries that are dying. There's no reason Silicon Valley can't do that with
a proper set of incentives. And you know, the Internet platforms have done the opposite.
And we need to create incentives for a new generation to come along and do the right
thing, and that's what I hope will happen. You understand the past. You see where we
are right now. And you have a vision for the future. So I guess my question is if Mark
called you tomorrow, and said I need you, will you join my board? Oh I would do that
in a heartbeat, but that's never going to happen. I mean, that is roughly the equivalent
of, like, if you had wings, would you fly. I mean I'm an enormous fan of Facebook. I'm
an enormous fan of Mark and Sheryl. What about the stock? I think they're killing the brand.
I turned it over to a manager to manage my position, because of what I was doing. And
the manager sold a chunk of my position quite recently. Facebook is still by far my largest
investment position. So I'm still very deeply attached to it. From a personal economic perspective.
And I'm still deeply emotionally tied. And, while it hasn't appeared to them as though
I've been trying to help, that's been my goal throughout this entire thing. I mean, somebody
asked me in an interview not too long ago, this is after the Mueller indictment, which
was basically a superset of the list of hypotheses we'd given Warner eight months earlier. And…
they said, wow you must be feeling really great. And I go, what planet are you coming
from. It's the worst thing I've ever been involved in. I mean. I was so proud of this
company. And it never occurred to me it would ever do any harm. Just never occurred to me
and maybe I was naïve. And I'll accept that. But I took up this challenge. First the challenge
of getting a conversation going, and now the larger challenge of how do we fix it. With
the goal that we could fix it. And with a clear sense that, because of my biography
and because of my understanding of the product and knowing the people, that I might have
an important role to play here. But it hasn't gone the way I hoped. I mean, I hoped that
they would take my original memo and use that as a basis for doing the right thing. I hoped
that inquiries from Congress would cause them to do the right thing. I hoped that having
former employees like Chamath Palihapitiya and Sean Parker and Justin Rosenstein, talking
about how important it was to change, that that would cause them to do the right thing.
None of those things have worked. And they're still not working now. Mark's testified
in Washington D.C. and, you know, it all sounds good but when you strip it out and actually
look at what's going on for the most part, they are doing things that they wanted to
do anyway and crediting this crisis for creating the motivation. And when they're actually
doing something, like on personal privacy where they're being responsive to the consent
decree, they're doing it seven years later than they should have. I would love to
help them get this right. But there is no sign of that. And, I mean, I
think the truth is, now, after all that's gone on, there'll be a better messenger
than me. You know, I've been forced to take up this mantle of being a critic, which is
not where I normally belong. I'm an analyst, right? And in this particular story I'm
Jimmy Stewart in the Alfred Hitchcock movie. I saw something that I wasn't supposed to
see. And pulled on the thread, and all of a sudden found myself in the midst of something
that was bigger than I was capable of handling. And all of that has made me very unpopular,
not just with people at Facebook. And I don't know what's going to come out of this. I don't
know that democracy in the United States is going to survive its brush with Facebook.
You know, you look at what's going on in Washington right now and it's hard to be confident that
we're going to have a happy ending. You know with trade wars, you got real wars being threatened,
you got all kinds of stuff going on and you got all these people with really important
jobs who seem to think that the purpose of being in Washington is to enrich themselves.
As an investor you go… basically we're maximizing uncertainty, which is the investor's enemy.
And we're doing it the old fashioned way with corrupt behavior. And, whether I like it or
not, Facebook was one of the tools that these people used to produce this outcome. We have
the Russians, the Trump campaign. Others presumably. And there's no easy fix. And no way to put
the genie back in the bottle. So, you know, I knock on wood. There
are literally thousands of people who are domain experts on each individual part of
this problem. And many of them have really great ideas. And what I've been hoping to
do, what we've been trying to do, is to shine a light on them. We started something called
the Center for Humane Technology. And we're working on a thing called the Ledger of Harms,
and the Ledger of Harms is essentially a catalog of all of the failure modes of internet technology
and smartphones, from addiction to election interference to killing off startups,
in great detail, with links to all of the best-known work on the subject. That's phase
one; we're going to release that, I hope, in the next month. Once that's out we're going
to share it with all the researchers and ask them to connect their work to it. And the
goal there is to shine a light on all the great work going on in the field, that right
now nobody can see because it's taking place too close to the action, and there's no way
to get it to policymakers, there's no way to get it to the tech companies. And so the
hope with the Ledger of Harms is to shine light on that, and then as people start to
come up with remedies connect those into it too. And that way anybody who wants to learn
about this can go to whatever level of depth they want to go, to understand what the problems
are. What is known about them, who's doing the best work, what solutions they've come
up with. Because it's not going to be us. My role in this whole thing is to run into
the room with my hair on fire and go, hey, my hair is on fire. And that's worked out
pretty well. But you know that's one trick. And my pony doesn't have a second trick. And
so, my hope is that I'll be obsolete in this whole exercise pretty quickly and we can hand
it off to people who really know what they're doing. Because the Jimmy Stewart character
is supposed to go off happily into the sunset at the end of the movie. I want to make sure
that happens.