Mark Zuckerberg on Facebook’s hardest year, and what comes next
The Ezra Klein Show

Full episode transcript -

0:00

This episode is brought to you by Progressive. What would you do with an extra $800? Buy a plane ticket? Pay down your student loans? Treat yourself to those shoes you've been eyeing? With Progressive, you could find out. Drivers who switch and save save an average of $796 on car insurance. Get your quote online at Progressive.com and see how much you could be saving. National average annual car insurance savings by new customers surveyed who saved with Progressive in 2019. Is your business as secure as it could be? It's not if you're still running Windows 7. Support ended for Windows 7 on January 14th. Get modern Windows 10 Pro devices with Intel Core and Intel Core vPro processors and the latest version of Microsoft Office. You can boost productivity, cut management time and cost, and make your business more secure. Upgrade now and be great. Visit Microsoft's partner CDW at CDW.com/begreat. Make the shift.

1:06

Because we didn't invest enough. I think we will dig through this hole, but it'll take a few years. I wish I could solve all these issues in three months or six months, but I just think the reality is that solving some of these questions is just gonna take a longer period of time.

1:33

Welcome to The Ezra Klein Show on the Vox Media Podcast Network. My guest today you may remember from such social media platforms as Facebook.com. My guest is Mark Zuckerberg, who has been in the news, along with his platform, quite a bit lately. I have been thinking in the past couple of months about something he wrote in, I think, February of 2017, and it's really this remarkable document that I recommend you go look up if you have some time. It's a manifesto about what Facebook can and should be for the future of the world. This was a moment when people were actually talking about Zuckerberg as a 2020 presidential candidate. He was on this somewhat peculiar nationwide tour where he was meeting with farmers in the Midwest, and, I mean, it had a lot of political dimensions to it. But he came out with this manifesto, where he offered up a really profoundly ambitious vision for Facebook, a vision for Facebook that did nothing less than situate Facebook within a broader architecture of human social evolution. Talking about humankind, he wrote:

Today we're close to taking our next step. Our greatest opportunities are now global, like spreading prosperity and freedom, promoting peace and understanding, lifting people out of poverty, and accelerating science. Our greatest challenges also need global responses, like ending terrorism, fighting climate change, and preventing pandemics. Progress now requires humanity coming together not just as cities or nations, but as a global community. And Facebook, he said, could be, quote, the social infrastructure to create that global community. That was a grand vision. It was a vision that was supranational.

It was bigger than running any one country; it was running the thing that would be the mediating platform between all the countries, the thing that would allow humankind to expand its cooperation to a level it had never reached before. One thing we've seen over the intervening time, and in some ways something Zuckerberg was beginning to respond to in that manifesto, is that, well, possibly, if you create that social infrastructure, if you make it easier for nations and ideologies and ethnic groups to collide with each other, to communicate both within and about each other, then rather than learning how to cooperate we will fall apart. We will divide. We'll learn how to fight. There have been a lot of individual news stories recently: Cambridge Analytica, Russian bots. We talk about those, but I didn't want to spend our main time on them. I wanted to talk about what Facebook has become, with two billion users.

It is something that is bigger than any one government, and I say this in the podcast: when it fails, it fails with consequences like those of a government failing. So how do you govern something, how do you manage something, how do you create accountability in something that is so vast in scale? That is a private corporation, run by a CEO who has control over the voting stock, that can literally change the world, change elections, change the future, at least in the near term, of humanity. But it is, again, a private corporation. It is not a government. It is not run like a government, or like a multinational institution like the U.N.

There's a lot of trust that needs to be there, that needs to be built, and Facebook, if nothing else, is suffering a crisis of trust right now. So I'm grateful to Zuckerberg for coming on the program. We talk about this and how he's thinking about building that governance structure, what he thinks about Tim Cook's shot at him the other week, that Tim Cook would just never be in this position because his business model is a more virtuous business model. Zuckerberg has very strong words about that. We talk about what problems Facebook is facing and what it will take to solve them. I don't want to preview the whole conversation, but suffice to say, I think it is worth hearing the scale of the problems Zuckerberg is thinking about, the scale of his ambition in solving them, and just where he is right now, where he thinks Facebook is, and the plans he has for it. As always, you can email me and give me show feedback and ideas at ezrakleinshow@vox.com.

Without further ado, here is Mark Zuckerberg, CEO of Facebook. So I want to begin with something you said recently in an interview, which is that Facebook is more like a government than a traditional company. Can you expand on that point a bit?

5:54

Sure. So one of the things that we have to do is, basically, people share a whole lot of content, and then sometimes there are disputes between people, right, around whether that content is acceptable, whether it's hate speech or valid political speech, whether it's an organization that is deemed to be a bad or hateful or terrorist organization, or one that's expressing a reasonable point of view. And I think, more than a lot of other companies, we're in a position where sometimes we have to adjudicate those kinds of disputes between different members of our community. And in order to do that, one of the things we've had to do is build up a whole set of policies and governance around how that works. But I think it's actually one of the most interesting philosophical questions that we face. Now, with a community of more than two billion people all around the world, in every different country, where there are wildly different social and cultural norms, one of the things that I think we're gonna need to work on a lot going forward is that it's just not clear to me that we, sitting in an office here in California, are best placed to always determine what the policy should be for people all around the world. And I've been working on and thinking through, you know, how can you set up a more democratic or community-oriented process that reflects the values of people around the world? And that's one of the things that I just really think we need to get right, because I'm just not sure that the current state is a great one.

7:30

I'd love to hear, even knowing it's nascent, where your thinking is on that, because one of the ways in which your comment that it's more like a government struck me is as recognizing that when Facebook gets it wrong, the consequences are on the scale of when a government gets it wrong: elections could lose legitimacy in a country, or ethnic violence could break out. And it made me wonder: has Facebook just become too big and too vast and too consequential for normal corporate governance structures, and also normal private-company incentives? I mean, we have very few things that run just like a private company where, if something goes awry, it can have the effects Facebook can. Has it made you question any of that?

8:10

Well, I think we're continually thinking through this. You know, as the Internet gets to broader scale and some of these services reach a bigger scale than anything has before, we're constantly confronted with new challenges, and I try to judge our success not by whether there are no problems that come up, but by, when an issue comes up, can we deal with it responsibly and make sure that we can address it, so that those kinds of issues don't come up again in the future. You know, you mentioned our governance. One of the things that I feel really lucky that we have is this company structure where, at the end of the day, it's a controlled company. We're not at the whims of short-term shareholders. We can really design these products and decisions with what is going to be in the best interest of the community over time. I think that ends up being just really important, and it has been at many important moments in the company's history.

9:03

I think that's interesting, because it is one of the ways you all are different, and I can imagine reading it both ways. On the one hand, you're more insulated from the short-term pressures of the market. On the other hand, you have a lot more just personal power, given your control over the voting stock. There's no quadrennial election for CEO of Facebook, and that's a normal way that at least democracies run accountability. Do you think that makes you, in some cases, less accountable? I mean, not that you would ever make the wrong decision, but if for some reason you did, if things began to go awry, would that be more dangerous for Facebook, given your centrality to it and the scale of the company?

9:42

I certainly think that's a fair question. My goal here is to create a governance structure around the content and the community that reflects more what people in the community want than necessarily what short-term-oriented shareholders might want. And if we do that well, then I think that could really break ground on governance for this kind of an Internet community. But certainly, if we don't do it well, then I think we'll fail to handle a lot of the issues that are coming up. So, you mentioned you want to get a sense of how I'm thinking about some of these things going forward. Here are a few of the principles. One is just transparency. Right now, I don't think we are transparent enough around the prevalence of different issues on the platform. You know, you hear a lot of anecdotes about issues. Journalists do a good job surfacing them, whether it's fake news or other kinds of problematic content.

But, you know, we haven't done a good job of publishing and being transparent about the prevalence of those kinds of issues, and the work that we're doing, and the trends of how we're driving those things down over time. So that's one important measure of accountability and governance. A second is some sort of independent appeal process. Right now, if you post something on Facebook and someone reports it, and our community operations and review team looks at it and decides that it needs to get taken down, there's no real way to appeal that. I think in any kind of good functioning democratic system, there needs to be a way to appeal, and I think we can build that internally as a first step. But over the long term, what I'd really like to get to is an independent appeal as well. So, you know,

maybe, you know, you have it where some folks at Facebook make the first decision based on the community standards that are outlined, and then people can get a second opinion. But then you can imagine even some sort of structure, almost like a Supreme Court or appeals board, made up of independent folks who don't work for Facebook, who ultimately get to make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world. But, you know, I think we're breaking new ground here. We need to be thoughtful, and I do think that the current structure we have gives us the ability to experiment with things like this. But it's a big open question whether we're going to get to the right place and do that quickly.

12:09

I'm really interested to hear you begin talking about this idea of independent governance structures, because one way that governments do this is they have institutions that are at cross purposes with each other, that have different kinds of legitimacy. And I wonder how much that influences your thinking on transparency, too. When I hear you say that you need to do a better job talking about how many people are affected by something, or what you're doing internally to actually solve it: one of the themes of the past year that I think has been damaging for Facebook is that, you know, initially the answer is very, very few people saw fake news, or very, very few people saw anything from Russia-related bots, and then slowly it comes out that, actually, it was more, millions, maybe hundreds of millions. And it wasn't the transparency that was the problem there; it was how to know you could trust what was coming out. And I wonder if part of transparency for you all has to be creating modes of information that are independent in their own ways.

13:05

Yeah, I think that's a good point, and I certainly think what you're saying is a fair criticism. You know, it's tough to be transparent when we don't first have a full understanding of where the state of some of the systems is. So certainly, you know, in 2016 we were behind on having an understanding and operational excellence on preventing things like misinformation and Russian interference, and you can bet that that's just a huge focus for us now going forward. You know, right now in the company, I think we have about 14,000 people working on security and community operations and review, just to make sure that we can really nail down some of those issues that we had in 2016. And 2018, I think, is gonna be a big year for us and for elections around the world. There's not only the really important midterms here in the U.S., but there are presidential elections in India and Brazil and Mexico and Pakistan and Hungary and a number of other countries,

and I think it's fair to say that we were not as on top of that as we should have been up front. But one of the things that I think we've needed to do is go develop the playbook so that we can both be transparent about what we're seeing and be much more effective about eliminating some of these risks. So, to talk about elections for a moment: you know, after the 2016 U.S. elections, a number of months later there were the French elections, and for that we spent a bunch of time developing new AI tools to find the kind of fake accounts spreading misinformation, and we took down, I think, more than 30,000 accounts. And I think the reports out of France were that people felt like that was a much cleaner election on social media. A few months later, there were the German elections,

and there we augmented the playbook again to work directly with the election commission in Germany. The idea is that, as an Internet company, we'll have the ability to see some of the content and some of the issues that might be happening in an election, but if you work with the government in a country, they'll actually have a fuller understanding of what's going on, what all the issues are that we would need to focus on. And again, by working with the German government, we were able to focus on a few specific issues, and I think there, again, people felt a lot better about how that election went on social media. And then, fast-forward to last year, 2017, in the special election in Alabama, we deployed a number of new tools that we'd developed to find fake accounts that were trying to spread false news. We got them off before a lot of the discussion around the election, and again, I think we felt a lot better about the result there. So I think we felt good about the pace at which our tools to detect this are improving compared to the adversaries that we're up against.

15:57

Let me ask you about your tools to punish it, though, because the upside of being able to move a national election using Facebook is very high. Because, you know, look at what happens if you get caught. If you're Russia and you are executing a massive bot operation, a sophisticated one, to try to move the U.S. election: well, if you get caught hacking into our election systems, which they also did try to do, and Hillary Clinton wins, the consequences of that could be really severe. Sanctions could be tremendous, and you could even imagine something like that escalating up into armed conflict at a certain level. If you do this on Facebook,

you know, maybe you get caught and they shut down your bots. But one thing that you don't have, in not being a government, is really the ability to punish. If Cambridge Analytica messes with everybody's privacy, you can't throw them in jail, the way that if you're a doctor and you commit recurring violations of HIPAA, you face very severe legal consequences. So I wonder, one question I have is: do you have the capacity to do not just detection but sanction? Is there a way to increase the cost of using your platform for these kinds of efforts?

17:08

So, yes, there are a number of things that we do, but as you say, since we're not a government, there are fewer penalties that we can impose on countries that might be trying to act in this way. Although, you know, having a tool to get information out about what an actor is doing in your country is relevant, and having the ability to block bad actors is an important one that we don't take lightly. And in terms of when we use it, it might make sense to go through the three big categories of fake news, and I can walk through how we're basically approaching this. There's a group of people who are like spammers. These are the type of people who, in pre-social-media days, would have been sending you Viagra emails, and the basic playbook that you want to run on that is to just make it non-economical for them to do that. So the first stuff that we did, once we realized that this was an issue: you know, a number of them ran Facebook ads on their web pages, so we immediately said to anyone who was even remotely sketchy, no way are you gonna be able to use our tools to monetize.

Okay, so the amount of money that they made went down, and that made it so that some of the effort slowed down. Then, you know, they're trying to pump this content into Facebook with the hope that people will click on it and see ads and make them money; they're often not ideological. As our systems get better at detecting this, we show the content less, which drives the economic value for them down, and eventually you just get to a point where they go and do something else. There are always gonna be spammers in the world, but by making it so that they can make less money, they're gonna go do other stuff. And I think we're getting closer and closer to that line and driving more people away. So that's the first category,

of spammers and people with an economic motive. The second category is state actors, right? So that's basically the Russian interference effort, and that is basically a security problem, right? So with that, you never fully solve it, but you strengthen your defenses. They're not doing it for money, so you make it harder and harder. You get rid of the fake accounts and the tools that they have for using this. You know, we can't do this all by ourselves. We try to work with local governments everywhere, who have more tools to punish them and have more insight into what is going on across their country, so they can tell us what to focus on. And that one, I feel like we're making good progress on, too.

Then there's the third category. So once you get past economic spammers and state actors, the third one, which is the most nuanced, is basically real media outlets who are probably saying what they think is true but just have varying levels of accuracy or trustworthiness in what they're saying. And that, I think, is actually the most challenging portion of the issue to deal with, because there, I think, there are quite large free speech issues. You get into, you know, folks saying stuff that may be wrong, but they mean it, right? They think that they're speaking their truth, and do you really want to shut them down for doing that? So we've been probably the most careful on that piece. But recently, this year, we rolled out a number of changes to News Feed that try to boost broadly trusted news sources in the ranking. So what that basically means is we've surveyed people across the whole community and asked them whether they trust different news sources. Take the Wall Street Journal or the New York Times:

even if not everyone reads them, the people who don't read them typically still think that they're good, trustworthy journalism, whereas if you get down to blogs that may be more on the fringe, they'll have their strong supporters, but people who don't necessarily read them often don't trust them as much. And by applying that kind of a lens to this, since we know that people in our community want broadly trusted content, we're helping to surface more of the things that are building common ground in our society and maybe pushing out a little bit of the stuff that is less trustworthy, even though we're gonna continue to be very sensitive to not suppressing people's ability to say what they believe.

21:14

So one thing I hear when you give me that taxonomy is: okay, you have the first run of fake news, which is a business problem. There's a business of doing fake news, and the way you can staunch that is to take away the money. Then you have a technical problem, which is around state actors running massive disinformation campaigns. This third group, which is the conceptual problem, I do think is really interesting. And one thing I hear when I hear that, because I'm somebody who came up as a blogger and had a lot of love for the idea of the open Internet and the way the gates were falling down, is that this also creates a huge return to incumbency. If you're the New York Times and you've been around for a long time and you're well known, people trust you. And, you know, Vox has now been around long enough that I'm not doing any special pleading here. But if you're somebody who wants to begin a media organization two months from now, if Facebook is the way people get their news, and the way Facebook ranks its News Feed is by news people already trust, it's gonna be a lot harder for new organizations to break through.

22:16

I think that's an important point, and we've spent a lot of time thinking about it, because one of the great things about the Internet and the services that we're trying to build is that you're giving everyone a voice. That's so deep in our mission that we care about this; we definitely think about it in all the changes that we're making. I think it's important to keep in mind that, of all the strategies that I just laid out, they're made up of many different actions which each have relatively subtle effects, right? So the broadly trusted shift that I just mentioned, it changes how much something might be seen by, I don't know, call it in the range of maybe 20 percent or less, right? But it's not going to make it so that you can't share what you think, or so that, if someone wants to have access to your content, they're not gonna get at it.

23:08

What we're really trying to do is make it so that the content that people see is actually really meaningful to them. And, you know, one of the things that I think we often get criticized for, incorrectly in this case, is people say, hey, you're just ranking the system based on, you know, what people like and click on. And it turns out that's actually not true, and we moved past that many years back, because there was this issue of clickbait on the Internet, where there were a bunch of publications that would push content into Facebook, and essentially people would click on it because it had sensational titles, but they would not feel good about having read that content; they'd feel like, hey, this wasn't actually what the headline said it was gonna be, this was a waste of time.

23:54

That was one of the first times that those basic metrics of clicks, likes, and comments on the content really stopped working to help us show the most meaningful content. So the way that this actually works today, broadly, is we have panels of hundreds or thousands of people who come in, and we show them all the content that their friends and the pages they follow have shared, and we ask them to rank it, basically to say what were the most meaningful things that you wish were at the top of feed. And then we try to design algorithms that just map to what people are actually telling us is meaningful to them: not what they click on, not what is gonna make us the most revenue, but what people actually find meaningful and valuable. So we're making shifts like the broadly trusted shift to help build common ground, and the reason why we're doing that is because it actually maps to what people are telling us they want at a deep level; that turns out to be the right tactic to get there in this case. But I don't think, you know, as a blogger, as a publisher, you need to worry at all that the people who want to see your content are not gonna be able to see it. That's our whole job: to make it so that people can connect with the things that they want to.

25:04

Hey, I'm Sean Rameswaram, host of Today, Explained from Vox, and we are celebrating episode 500. 500 episodes. We've been asking one question every day: what should the news sound like? Maybe it should sound trippy: "I realized for the first time that these trees were my parents." Maybe it should sound like a song. And sometimes it just sounds like two people trying to cope: "I'm a current senior at Douglas." "I'm a 2002 graduate of Columbine." "What if we had done more? You know, what if we had been as vocal as you guys are being?" The news is different every day, right?

Well, so is our show, and at 500 episodes, it's the best it's ever been. So come kick it with us: Today, Explained, from the Vox Media Podcast Network. Hi, I'm Arielle Duhaime-Ross, and I'm the host of a podcast called Reset. Reset is a podcast about the impact of technology, how humans have shaped it, and how technology reflects our values and biases. Reset is also a podcast about how much good tech has accomplished and how much better it could be if we listened to each other. Recently, we've covered stories about how artificial intelligence is being used to track coronavirus. We've also covered how to solve the problem of child predators on gaming sites, and why our night sky might soon be filled with thousands of new satellites. Subscribe to Reset for free on Apple Podcasts, Spotify, or in your favorite podcast app. Later, nerds.

One of the things that has been coming up a lot in the conversation is whether the business model of roughly monetizing user attention is what is letting in a lot of problems. Tim Cook, the CEO of Apple, gave an interview the other day, and he was asked what he would do if he were in Mark Zuckerberg's shoes. He said, I would not be in Mark Zuckerberg's shoes. Apple sells products to users; it doesn't sell users to advertisers, and so it's a sounder business model that doesn't open itself to these problems. Do you think part of the problem here is the business model, where attention ends up dominating above all else, and so anything that can engage does have at least some powerful value within the ecosystem?

27:34

You know, I find that argument, that if you're not paying then somehow we can't care about you, to be extremely glib, right, and not at all aligned with the truth. You know, the reality here is that if you want to build a service that helps connect everyone in the world, then there are a lot of people who can't afford to pay, and therefore, as with a lot of media, having an advertising-supported model is the only rational model that can support building the service to reach people. That doesn't mean that we're not primarily focused on serving people. I think, probably to the dissatisfaction of our sales team here, I make all of our decisions based on what's going to matter to our community and improve the experience, and focus much less, very little, on the advertising side of the business. But I mean, look, if you want to build a service which is not just serving rich people, then you need to have something that people can afford.

You know, I thought Jeff Bezos had an excellent saying on this, in one of his Kindle launches a number of years back. He said, there are companies that work hard to charge you more, and there are companies that work hard to charge you less. And at Facebook, we are squarely in the camp of the companies that work hard to charge you less and provide a free service that everyone can use. I don't think at all that that means we don't care about people. I think, to the contrary, it's important that we don't all get Stockholm syndrome and let the companies that work hard to charge you more convince you that they actually care more about you, because that sounds ridiculous to me.

29:21

So I want to say, before I ask this next question, that at some point during this podcast, this podcast is gonna break and I'm gonna read an ad for something that is going to be probably totally bizarre within the context of this conversation. So I am also in an advertising model, and I have a lot of sympathy for the advertising model. But I also think the advertising model can blind us. I think it creates incentives that we do operate under and that we do justify backwards towards. And one of the questions I do ask is whether diversifying the model doesn't make sense. If I understand it, and I might not, the WhatsApp model, which is also part of Facebook, is subscription, right? People pay about a dollar a year, or some amount a month, and it's a small amount.

30:06

Well, we actually got rid of that.

30:08

Well, see, there you go. But the point I want to make is that you don't need to only serve rich people to diversify away from it just being about attention. And when it is about attention, when it is about advertising, we do, over time, need to show growth to Wall Street, and I do think you guys do, even if you have an unusual voting share structure, and that does pull you towards getting more and more and more of people's attention over time. I did an interview with Tristan Harris, who has been a critic of Facebook and other platforms in Silicon Valley. We were talking about the way in which you had said that some of the changes you're making have brought down a little bit the amount of time people are spending on the platform, and he said, you know,

it's great, but he couldn't do that by 50 percent. Wall Street would freak out. His board would freak out. There are costs to this model, and I do wonder how you think about at least protecting yourself against some of them dominating in the long run.

31:02

Well, I think our responsibility here is to make sure that the time that people spend on Facebook is time well spent, where we don't have teams whose primary goal is to make it so that people spend more time. The way that I design the goals for the teams is that you try to build the best experience that you can, and what we find is that when people have a better experience on Facebook and Instagram and WhatsApp than whatever the alternative is that they could be doing, whether that's watching TV or something else, then I think naturally, over time, if they're finding that experience valuable, they'll spend more time there. But I don't think it's really right to assume that people spending time on a service is bad. At the same time, I also think maximizing the time that people spend is not really the goal either. You know, in the last year we've done a lot of research into what drives well-being for people, right: which uses of social networks are positive and are correlated with happiness and long-term measures of health and all the measures of well-being that you'd expect, and which areas are not as positive. And the thing that we've found is that you can kind of break out Facebook and social media use into two categories.

One is where people are connecting and building relationships, even if it's subtle, right? Even if it's, you know, I post a photo and someone I haven't talked to in a while comments. It may not be a super deep interaction, but, you know, that person is reminding me that they care about me, and we're having an interaction, and even if I hadn't talked to that person in a while, it's nice, right, to kind of remind each other that you care. The other part of the use is basically content consumption, right? That's watching videos, reading news, passively consuming content in a way where you're not actually interacting with anyone or building a relationship while you're doing it. And what we find is that the things that are about interacting with people and building relationships end up being correlated with all of the measures of long-term well-being that you'd expect, whereas the things that are primarily just about content consumption,

even if they're informative or entertaining, and people say they like them, are not as correlated with the long-term measures of well-being. So this is another shift that we've made in News Feed and in our systems this year: to prioritize showing more content from your friends and family first, so that you'll be more likely to have interactions that are meaningful to you, and more of the time people are spending is building those relationships. That change actually took time spent down a little bit, right? That was part of what I was talking about on that earnings call. And, you know, we don't make these changes all at once; I think of that as an early version of this, and there's more there. But over the long term, even if time spent

goes down, if people are spending more of their time on Facebook actually building relationships with people who they care about, that's gonna build a stronger community and a stronger business over the long term, regardless of what Wall Street thinks about it in the near term. So, you know, that's what our incentive is: to make sure that we build the best service and that it's good for people, and we're trying to do this for the community over the long term, not the next quarter.

34:30

Let me ask you another question about the advertising model, and this one is a tricky question to ask because it bears very directly on my industry. But something I've seen in the coverage in the past couple of days has been a perception at Facebook that a lot of the critical coverage from the media comes from journalists angry that Facebook is pretty well decimating the advertising market that journalism depends on. The Dow Jones publisher Will Lewis said that the diversion of advertising is killing news and that it has to stop. Is he right or wrong? And given that so much of the advertising on Facebook is around news that journalism organizations are paying to publish, what responsibility do you feel you have to the people creating real news, for their business model to work, given that what their business model creates does in fact lead to value not just for the world but for Facebook itself?

35:20

So I do think a big responsibility that we have is to help support high-quality journalism, and that's not just the big traditional institutions. A big part of what I actually think about when I'm thinking about high-quality journalism is local news, right? And I think there are almost two different strategies in terms of how you address that. For the larger institutions, and maybe even some of the smaller ones as well, subscriptions, I think, are really a key point on this. A lot has changed with the Internet. It's not just social media; it's that now everyone has a voice, and there's just a lot more competition. You know, if you were the New York Times before the Internet, then you were by far the biggest game in town in New York, right?

And advertisers and readers who wanted to understand what was going on needed to get your content. Now the opportunity is broader, right? You can reach more people, but there's also just way more competition. That is a challenge. But what I think a lot of these business models are moving towards is a higher percentage of subscriptions, where the people who are getting the most value from you are contributing a disproportionate amount to the revenue. And there are certainly a lot of things that we can do on Facebook to help these news organizations drive subscriptions, and that's certainly been a lot of the work that we've done and will continue doing. For local news, I think some of the solutions might be a little bit different, but it's easy to lose track of how important this is. There's been a lot of conversation about civic engagement changing, and I think people can lose sight of how closely tied that can be to local news. If we're in a town with a strong local newspaper, people are much more informed and are much more likely to be civically active. So this ends up being important

not just for informing people, but for having a well-functioning democracy. And that's one of the things where, on Facebook, we've taken steps to show more local news to people who live in those areas. That was a big shift that we made this year. But also working with them specifically, creating funds to support them, and working on both subscriptions and ads there should hopefully create a more thriving ecosystem.

37:29

Well, I want to go here from the local to the very global ambitions. I've been thinking a lot, in preparing this interview, about the 2017 manifesto you wrote, where you said that you wanted Facebook to help humankind take its next step. And you said, quote, that progress now requires humanity coming together not just as cities or nations, but as a global community, and then you said that Facebook could be the social infrastructure for that. In retrospect, I think a key question here has become whether creating infrastructure where all the tensions of countries and ethnicities and regions and ideologies can more easily collide with each other will actually help us become that global community, or will further tear us apart. Has your thinking on that changed at all over the past year, year and a half?

38:17

Sure. I mean, I think since I wrote that, we've certainly learned more about how to do this. But the big thing that I was thinking about when I wrote that was how the world coming closer together is not a given. For most of Facebook's existence, back in 2004 when I got started, you know, if you had told me that people weren't going to keep connecting more and that there wouldn't be more global cooperation, I mean, I kind of had taken it as a given that the world would move in that direction. I think over the last few years, the political reality has been that a lot of people are feeling left behind by globalization and different issues, and there's been a big rise of isolationism and nationalism that I think threatens some of the global cooperation that will be required to solve some of the bigger issues, like maintaining peace, addressing climate change, eventually collaborating a lot on accelerating science and curing diseases and eliminating poverty. So I take it that this is a huge part of our mission, because I think a lot of these problems require people coming together and having a global understanding. One of the things that I found heartening is, if you ask millennials

what they identify with the most, it's not another nationality or even their ethnicity; the plurality identify as a citizen of the world. And I love that. That's strong, and I think it reflects the values of where we need to go in order to solve some of these bigger questions. So now the question is, how do you do that? And it's clear that just helping people connect by itself isn't always positive. When you give people a tool, it's more positive than negative; clearly, there are a lot of good things that happen, but then there's also abuse, and there are bad things that happen. And a much bigger part of the focus for me now is making sure that, as we're connecting people, we are helping to build bonds and bring people closer together, rather than just focusing on the mechanics of the connection and the infrastructure, as you say. But I think there are a number of different pieces that you need to do here.

Civic society basically starts bottom-up, right? You need to have well-functioning groups and communities; we're very focused on that. You need a well-informed citizenry, so we're very focused on making sure that the quality of journalism is there, and that everyone has a voice, and that people can get access to the content that they need. That, I think, ends up being really important. Civic engagement, both being involved in elections and, increasingly, working to eliminate interference and different nation-states trying to interfere in each other's elections, ends up being really important. And then I think part of what we need to do is work on some of the new types of governance questions that we started out this conversation with, because there hasn't been a community like this that has spanned so many different countries. That's an open question, but I think someone will need to work it out. So those are some of the things that I'm focused on. But, you know, right now a lot of people aren't as focused on connecting the world or bringing countries closer together as maybe they were a few years back, and I still view that as an important part of our vision for where the world should go. We do what we can to stay committed to that and to keep it as positive of a thing as possible, and hopefully we can help the world move in that direction.

41:42

Hi, this is Matthew Yglesias, co-host of The Weeds, a podcast for people who love diving into the details behind the policies and ideas that help drive our lives. Every Tuesday and Friday, I'm joined by Jane Coaston, Ezra Klein, and a variety of other leading Vox voices and policy experts from around the country to dig into the weeds on important issues. Recently we've covered the Democratic debates, we had a deep conversation about the sudden rise in anti-Semitism, and we welcomed Sarah Kliff from The New York Times back to finally figure out how we can achieve universal health care. If you're the kind of person who likes to dive deep, or you just want to keep up with the current political landscape, this show is for you. Subscribe to The Weeds for free right now in your favorite podcast app to get new episodes automatically, from Vox and the Vox Media Podcast Network. Hi, I'm Kara Swisher, host of Recode Decode. I just did an interview with Carol Leonnig and Phil Rucker, the authors of a new book about Donald Trump called A Very Stable Genius.

We talked about a lot of things, pretty much Donald Trump and how he's changed the presidency, and their incredible investigative reporting of what's happening there in real time. Of course, since they've written the book, twenty-one hundred more things have happened, and so we're hoping we'll get a second one. But you can hear the full interview right now on my podcast, Recode Decode with Kara Swisher. Find it on Apple Podcasts, Spotify, or wherever you listen to podcasts. One of the scary stories I've read about Facebook over the past year is that it had become a real source of anti-Rohingya propaganda in Myanmar, and this had become, accidentally, part of, you know, an ethnic cleansing. One of the things said during that story was by Phil Robertson, who's the deputy director of Human Rights Watch in Asia,

and the point he was making was that Facebook is dominant for news and information in Myanmar, but Myanmar is not an incredibly important market for Facebook. It doesn't get the attention, when something goes wrong there, that we give to things that go wrong in America. I doubt you have a proportionate amount of staff in Myanmar to what you have in America. And he said that you guys end up being like an absentee landlord in Southeast Asia. Is Facebook too big to effectively manage its global scale in some of these other countries, the ones we don't always talk about in this conversation?

43:59

So one of the things that I think we need to get better at as we grow is becoming a more global company. We have offices all over the world, in a lot of different places; we're already quite global. But just based on the fact that our headquarters is here in California, and, you know, the vast majority of our community is not even in the U.S., I think it is just a constant challenge for us to make sure that we're putting due attention on all of the people in different parts of the community around the world. The Myanmar issues have, I think, gotten a lot of focus inside the company, and they're real issues, and we take this really seriously. I mean, I remember one Saturday morning I got a phone call, and we had detected that people were trying to spread sensational messages, through Facebook Messenger in this case, to each side of the conflict, basically telling the Muslims, hey,

there's about to be an uprising of the Buddhists, so make sure that you are armed and go to this place, and then the same thing on the other side. So that's the kind of thing where I think it is clear that people were trying to use our tools in order to incite real harm. Now, in that case, our systems detected that this was going on. We stopped those messages from going through, and hopefully we were able to prevent any kind of real-world harm there. But, I mean, this is certainly something that we're paying a lot of attention to. It's a real issue, and we want to make sure that all of the tools that we're bringing to bear on eliminating hate speech and incitement to violence, and on basically protecting the integrity of civil discussions, are being applied in places like Myanmar as well as in places like the U.S. that do get a disproportionate amount of the attention.

45:48

I think if you go back a couple of years in the technology world and in technology rhetoric, a lot of the slogans people had, that we all read optimistically, have come to take on darker connotations. The idea that anything is possible: the "anything" has become wider. Or the idea that you want to make the world more open and connected: I think it's become more obvious that an open and connected world could be a better world, and it could be a worse world. So when you think about a 20-year time frame, what will you be looking for to see if Facebook succeeded, if it made the world a better place?

46:27

Well, I don't think it's gonna take 20 years. You know, I think the basic point that you're getting at is that we're really idealistic, right? And when we started, you know, I think we thought about how good it would be if people could connect, if everyone had a voice. I mean, these are values that I think are broadly shared, and frankly, I just think we didn't spend enough time investing in, or thinking through, some of the downside uses of the tools. For the first 10 years of the company, people were getting these new tools and the ability to connect with people and share new kinds of content and new experiences, and everyone was just focused on the positive. And then I think for the last couple of years, as having a lot of these tools has become the new normal,

people are now appropriately focused on some of the risks and downsides as well. And I think we were too slow in investing enough in that. It's not like we did nothing. I mean, at the beginning of last year, I think we might have had 10,000 people working on security, so that's a lot. But by the end of this year, we're going to have 20,000 people working on security and on resolving a lot of these issues. Yeah, I think it's just a case where, because we didn't invest enough, I think we will dig through this hole, but it will take a few years. I wish I could solve all these issues in three months or six months, but I just think the reality is that solving some of these questions is just gonna take a longer period of time.

Now, the good news there is that we really started investing more, you know, at least a year ago. So if it's gonna be a three-year process, then, you know, I think we're about a year in already, and hopefully by the end of this year we'll have really started to turn the corner on some of these issues. But it's gonna be a long-term thing, and like any security issue, you never fully solve it; you just make it harder for people to do bad things. But getting back to your question, in terms of how, over the long term, we judge what the impact of this is: you know,

I think human nature is generally positive. I'm an optimist in that way. But there's no doubt that our responsibility is to amplify the good parts of what people can do when they connect, and to mitigate and prevent the bad things that people might do to try to abuse each other. And over the long term, I think that's the big question: how have we enabled people to come together in new ways, whether that's creating new jobs, creating new businesses, spreading new ideas, promoting a more open discourse, allowing good ideas to spread through society more quickly than they might have otherwise? And on the other side, did we do a good job of preventing the abuse, of making it so that governments aren't interfering in each other's civic elections and processes like that? Are we eliminating,

or at least dramatically reducing, things like hate speech that are offensive, that mean that when people come into contact, even if they're connecting, it's actually having a divisive effect? And sitting here, even though we're in the middle of a lot of issues, and I certainly think we could have done a better job so far, I'm optimistic that we're going to address a lot of those challenges, that we'll get through this, and that when you look back five years from now, 10 years from now, people will look at the net effects of being able to connect online and have a voice and share what matters to them as just a massively positive thing in the world.

49:55

Thank you, Mark Zuckerberg, for being on the program. Thank you. Thanks to my engineer, Griffin Tanner, and to my producer, Julian Weinberger. The Ezra Klein Show is a Vox Media Podcast production, and we will be back next week.
