“Move fast and break things” wasn’t a roadmap. It was a warning—and we ignored it. Kim and Jason discuss what happens when companies prioritize speed over responsibility, scale over safety, and engagement over everything else. From internal debates that never happened to public harm no one wants to own, they talk about how the systems built to “disrupt” ended up eroding trust, and why it doesn’t have to stay this way. Want to build a culture that actually encourages learning, debate, and accountability? Start here.
Listen to the episode:
Episode at a Glance: Move Fast & Break Things
Radical Candor Podcast Checklist: Move Fast & Break Things
- Tip number one: Consider the cost of failure. Don’t skip the debate phase. And build a team culture that supports both speed and learning.
- Tip number two: Moving fast and breaking things isn’t inherently good or bad. It’s about applying a thoughtful approach in the right context with the right process. So make sure that you slow down enough to debate something and confirm that you’re still on the right track.
- Tip number three: Always focus on learning. The whole point of moving fast and potentially breaking things is to learn fast. It’s not to create a land grab and establish a monopoly that then destroys democracy.
Radical Candor Podcast Resources: Move Fast & Break Things
- How To Get Shit Done | Radical Candor Podcast 4 | 2
- Leaders Can Move Fast And Fix Things
- CEO Of $4.2 Billion Tech Giant Says Defying Silicon Valley’s ‘Move Fast And Break Things’ Mantra Was Essential To Growing His Business | Fortune
- Amy Edmondson—The Science Of Failing Well | Radical Candor Podcast 5 | 18
- The Measurement Problem—Development Versus Management | Radical Candor Podcast 3 | 7
- Move Fast, Break Things — And Be Accountable For It
The TLDR Radical Candor Podcast Transcript: Move Fast & Break Things
[00:00:00] Kim Scott: Hello everybody. Welcome to the Radical Candor podcast. I’m Kim Scott.
[00:00:07] Jason Rosoff: And I’m Jason Rosoff. And today we’re gonna be discussing a philosophy that’s become almost mythical in tech circles and that we’re watching play out in real time in the US government. This philosophy is often referred to as the move fast and break things, uh, approach to work, and when this approach is effective and when it might lead us astray. So with that, let’s get going.
[00:00:31] Kim Scott: I have to say, Jason, that one of my favorite photos, and we will put it in the show notes, uh, comes from a friend of mine who used to work at Facebook, which is the company that coined “move fast and break things.” The new sign said, slow down and fix your shit. Uh, so move fast and break things is, in my book, sort of obnoxiously aggressive. However, I will say, at Google we called it launch and iterate. And that seems to me to be a better way to express the good part of this, which is that if you’re so afraid of making a mistake, you can’t innovate, you can’t fix things. It’s sort of the ethos behind whoops-a-daisy, uh, which is publicly saying, oh, I messed that one up.
[00:01:28] And, uh, I’m gonna do better next time. I mean, another way to say the good part of this is something a friend of mine has at the bottom of every email: make new mistakes. Uh, so I think it needs to be okay to make a mistake, because you can’t innovate if it’s not okay to make a mistake. And it even needs to be okay to admit mistakes in situations where it’s really not okay to make mistakes, like hospitals. This is kind of what’s behind Amy Edmondson and her book, The Fearless Organization. A lot of the research she did behind psychological safety is that if you can’t admit a mistake, if you can’t share a mistake, then you’re doomed to make it over and over and over again.
[00:02:19] Jason Rosoff: That’s right. Yeah.
[00:02:20] Kim Scott: And, and paradoxically, the organizations that were, the hospitals where the most mistakes were reported were also the safest hospitals, which was not what she expected. So that’s my blink response when I hear move fast and break things, especially in the context of, um, of firing lots of people, it, it feels evil to me.
[00:02:44] Jason Rosoff: Yeah, I, I, there, there was a very concrete example of this, which was, and there’s an excerpt, which we’ll try to find for the show notes of one of the cabinet meetings where, uh, where Elon Musk is talking about making mistakes,
[00:03:00] Kim Scott: And first of all, what’s he doing in a cabinet meeting? He’s an unelected bureaucrat. Like that’s, that the fact that he was in the room is an example of, uh, moving fast and breaking things when they should be slowing down and fixing their shit.
[00:03:15] Jason Rosoff: Yes. Um, but he said, jokingly, we’re gonna make mistakes. Like for example, we cut all the funding to Ebola research.
[00:03:27] Kim Scott: Oh god.
[00:03:27] Jason Rosoff: And then we realized, whoops, we probably should have kept the funding for Ebola research. So we turned that funding back on. And in the moment, that clip makes it seem sort of reasonable, but it doesn’t capture the whole picture. Because part of what’s happening is that the uncertainty of grant money is unwinding programs. Uh, so it’s not that a short-term absence of funding has no effect. In fact, in some cases they were so dependent on the next distribution of money that they literally had to stop an experiment midstream. And so if there were live samples of something, for example, they couldn’t afford to keep those frozen, so those samples went bad. There’s a real effect.
[00:04:24] Kim Scott: There’s a real cost to that mistake,
[00:04:27] Jason Rosoff: Correct. And so, on the one hand, credit where credit’s due, realizing that was a terribly stupid decision and it needed to be undone. And on the other hand, one of the things that I would like to add to your list of things to look out for when you’re considering a sort of ship and iterate is, what is the cost? What are the side effects of negative outcomes?
[00:04:56] Kim Scott: Yes.
[00:04:56] Jason Rosoff: Uh, because if the side effects are significant or potentially very costly, um, then the, it’s much more prudent to slow down and, and make sure you’re making the right decision.
[00:05:08] Kim Scott: Yeah, exactly. I mean, launch and iterate was about search results. Nobody, nobody, you know, uh, it, it did not apply to a nuclear power plant, for example.
[00:05:20] Jason Rosoff: Right.
[00:05:20] Kim Scott: Uh, like you would not wanna launch and iterate at a nuclear power plant. You wanna make damn sure that you’re not gonna blow up a major metropolitan area, or any area, if you launch. You gotta test and test and test. And even when the consequences of failure are not that dire, take Apple: Apple was not a launch and iterate kind of culture ’cause they were making hardware.
[00:05:53] You couldn’t just push a fix out. And once you have sold this phone to someone, you know, you can’t really fix it very easily, very cheaply anyway. It’s not just a matter of pushing a software patch out. Uh, and so they tested it and tested it. Apple was much more of a, you know, measure a hundred times cut once kind of culture. Which is not to say that it wasn’t innovative. Apple’s obviously incredibly innovative, so you don’t even have to launch and iterate to be innovative. And you certainly don’t have to move fast and break things. Um, and I think, well, go ahead, you were gonna say something.
[00:06:38] Jason Rosoff: No, I think, on the sort of cost of failure calculation, I think you’re right. Like there are some times in human history where in fact we took great risks with people’s lives because the cost of failure was even higher. So I think it’s important to put everything on a spectrum. Like when we were testing aircraft, for example. There were all these people who put their lives at risk to test this aircraft. So even though they were testing to try to get it right, the cost of failure was still high. Test pilots died, uh, in testing these aircraft.
[00:07:27] Um, but you know, the work that they did wound up putting us in a position to be able to turn the tide of World War II. So the potential for success was great, but the thing that they did really well was they were very clearheaded about the cost of failure. Meaning they understood that they were putting people’s lives at risk. And they said, we’re measuring that against the good that we think this can do. And it’s not that when people do that, they always get it right. But I think that’s a pretty big difference from the sort of, har har, you know, we undid Ebola funding. You know what I’m saying? Like the joke,
[00:08:08] Kim Scott: Yeah. That’s not funny. Yeah.
[00:08:09] Jason Rosoff: Right. Exactly.
[00:08:10] Kim Scott: Because he’s not gonna bear that cost. Musk is not gonna bear that cost. And part of the problem with what’s happening is that, you know, part of the American experiment was checks and balances, right?
[00:08:25] Jason Rosoff: Yeah.
[00:08:25] Kim Scott: And so we were putting checks and balances in place. Our governmental system put checks and balances in place so that the president didn’t have too much power, the Congress didn’t have too much power, the judiciary didn’t have too much power. They could check each other’s power. And unfortunately, those checks and balances are being undone before our very eyes. And then it becomes impossible to hold people accountable. Part of the cost of failure means that if you fail, there should be some accountability for failure. Um, and in this case, there’s not. That’s the problem of having an unelected bureaucrat in these cabinet meetings.
[00:09:14] Jason Rosoff: One of, I, I think,
[00:09:16] Kim Scott: One of, one of many.
[00:09:18] Jason Rosoff: I think the other thing that’s on my mind is just like we, we’ve, uh, just to like bring back a topic we’ve talked about many times, which is one of the issues with, with move fast and break things, even if you’re aware of the potential cost, the potential negative, uh, uh, impact of failure, is that it’s often quite hard to measure the full cost, uh, of failure. And so if, if your attitude is like, there isn’t really, there doesn’t need to be accountability for failure, you know, if you, you have this like sort of careless attitude, uh, toward it
[00:09:57] Kim Scott: Like Careless People.
[00:09:59] Jason Rosoff: Yes. You wind up being much more susceptible to the measurement problem, which is that it is very hard to measure the things that really matter when you’re doing these calculations. And so it does take slowing down to really consider what the externalities of what we’re doing actually are, for you to realize, oh, it may not be so simple as just turning the funding back on. Like we may be setting ourselves back if we do this in this way.
[00:10:30] Kim Scott: I think there’s also, like size matters. And I think a lot of the launch and iterate and even the, the move fast and break things kind of culture came from a world in which these companies were small, right?
[00:10:44] Jason Rosoff: Right, right, right.
[00:10:44] Kim Scott: When they started saying launch and iterate, it didn’t really matter if Facebook made a mistake in the early days because Facebook didn’t really matter in the early days. And now, as we have seen, it’s having a huge impact on the psychological wellbeing of all of us. Polarization, uh, you know, in Myanmar, genocide was planned on Facebook. And it really does matter. Like, mistakes really have real world, terrible human consequences.
[00:11:21] Jason Rosoff: Right.
[00:11:21] Kim Scott: And I’ve been doing a fair amount of retrospection about my career in tech and what has gone wrong in recent years. And I think, in fact, I may even have to go back to the first job I had outta business school, which was at the FCC in 1996. And this was when the Telecom Act came out. Section 230 is a section of the Telecom Act of 1996, and it explicitly lets social media platforms, lets tech platforms, off the hook for, um, any accountability for the content on their platforms and for content moderation. And in 1996, I mean, even then it was sort of questionable writing this blank check to this new industry. But I understand what we were thinking at the time. We were thinking it doesn’t make sense to regulate companies that don’t even exist yet.
[00:12:27] I mean, in 1996, Google hadn’t even been founded, and I think Zuckerberg was in like third grade or something. Um, and so these platforms didn’t exist at the time. And so there was some sense to it, but in retrospect I wish that we hadn’t written this blank check for all time. Because now this check is coming due and we can’t afford to pay it. And so at this point, it is time to hold these companies accountable for the harm that the content on these platforms does. And I think also you raised one of my favorite topics of all time, the measurement problem. Part of the issue is that if you are just measuring engagement, and I shouldn’t say that’s all they measure, but engagement is very important to Facebook’s business.
[00:13:32] ‘Cause the more engaged users are, the more likely people are to contribute content to Facebook and to Instagram and all these other platforms. Uh, and people tend to engage more with negative content, with negative emotions, which is why these platforms tend to, um, spew more FOMO, fear of missing out. More deeply enraging content that polarizes us all. More content about eating disorders, all of these kinds of things, because we pay attention to these things. And so they’re just kind of tracking what they can measure. It’s very hard to measure the value of a well-functioning society, right? And believe me, it doesn’t factor clearly into their metrics, even though in the end, if society dissolves, it’ll kill the goose that lays the golden eggs. So I don’t, you know, it seems like you would factor it in.
[00:14:40] Jason Rosoff: You know, so I, I mean, I spent many of the formative years of my career in Silicon Valley working at a nonprofit, so it’s on the other side.
[00:14:47] Kim Scott: At Khan Academy?
[00:14:49] Jason Rosoff: At, at Khan Academy. Yeah.
[00:14:50] Kim Scott: Yeah.
[00:14:50] Jason Rosoff: Uh, and, and I will say like, you know, we thought a lot about these externalities, like, you know, what does it mean to give this away for,
[00:14:59] Kim Scott: Sorry, can I interrupt? Can you explain to people who don’t know what an externality is?
[00:15:03] Jason Rosoff: Oh, sure. Yeah. A side effect is maybe a simpler way to say it, like the side effects of what we were doing. Um, so Khan Academy was providing free, sort of world class educational content. Um, and we were mostly fine with that because most of the other people who were providing content were, uh, educational publishers. And we didn’t think the educational publishers were doing a great job of providing great content for teachers and students to use.
[00:15:34] Uh, but you know, we were, we were conscious of the fact that, uh, there may be some side effects, right? So like, uh, like for example, you know, as much as we might not like to think about it this, this way, a textbook, uh, you can use for many years. And many students can use a textbook, uh, and you don’t need a computer and you don’t need internet access to act, to use a textbook. Um, but like we were conscious of the fact that if we were successful, uh, and we undermined this other way of getting access to stuff like it, it had the potential of creating, of exacerbating inequities in the system.
[00:16:16] Kim Scott: Yeah.
[00:16:16] Jason Rosoff: Um, even though it would be done for all the best reasons, um, and there wasn’t a profit motive, so we were giving, we were giving it away for free, but there are these potential side effects of what we were doing. Uh, and, and we thought about that a lot. And what that caused us to do was to like build partnerships with people who were bringing free or very inexpensive internet access, free or very inexpensive computing to schools, uh, in the US and around the world. Like we thought about like how do we make sure that if we, if we’re successful, that we don’t wind up just sort of, you know, giving really great access to stuff to rich kids who already had all the things that they already needed. Um, uh, and I, and I think that,
[00:17:02] Kim Scott: How much time do you think Facebook spends worrying about that particular issue?
[00:17:06] Jason Rosoff: I mean, the,
[00:17:08] Kim Scott: I’m sure there are people, well, let me pause. I’m sure there are people at Facebook who care deeply, deeply about this. Uh, like I do not mean to dismiss everyone who works at Facebook. I have friends who work at Facebook or Meta. Uh, so I certainly don’t mean to cast aspersions on all these people. But if you’re making that argument and meanwhile these other metrics are making all the money, the argument about this thing that you care passionately about is likely to get lost in the noise.
[00:17:40] Jason Rosoff: Yes. I was gonna say the exact same thing. Like, I think, in part, there’s a benefit to the fact that Khan Academy relied on donations. That limited our scope and our ability to grow. Um, and that meant it was easier to keep people aligned around the mission, and it was easier to bubble up or center these conversations around making sure that we were actually achieving the mission and not just reinforcing inequities that were already there. Um, and so I do imagine that there are some people at Facebook who are every day having a discussion about how they can help to fix some of the problems that, um, you know, social media as a medium has created and all this other stuff. But, um,
[00:18:27] Kim Scott: As there are at Google, like there’s great program, Grow with Google, where, where, where Google is trying to, to offer, uh, content to people to learn skills that will help them get jobs.
[00:18:39] Jason Rosoff: Right. Right. It exists, it exists everywhere, but it’s sort of not at the, the core of the machinery.
[00:18:46] Kim Scott: Yeah.
[00:18:46] Jason Rosoff: It exists on the periphery. Um, as these organizations grow, uh, and, and I think that the, back to the measurement problem, sort of what’s measured is managed, right? So like if, if short-term profits are measured, if engagement is measured, like those are the things that actually wind up being managed against. And I, I think you’re totally right. That the long-term play is a bad one, which is like if it actually undermines society and like the health of, and people become so disgusted with it that they, you know, they leave these platforms, then they have no users left, so they can’t make any, any more money. Um, but again, I think it’s a consideration at the periphery as opposed to like at the core.
[00:19:30] Kim Scott: Yeah. Well, I mean, I think part of the problem, to be fair, is not only the metrics that drive Facebook’s business, but also the market. I mean, if the market rewards quarterly earnings, it’s really hard to worry about the downstream impact of your product. Uh, although it’s not impossible. Like, again, I don’t mean for this to be a Facebook-bashing, Google-promoting podcast, but,
[00:20:01] Jason Rosoff: Right.
[00:20:01] Kim Scott: Like in the S-1 letter, when I took the job at Google in 2004, and you can feel free to push back and tell me that I had drunk the Kool-Aid, uh, you know, Larry and Sergey said, we are not a normal company. We don’t intend to become one, and we’re gonna invest a lot of money in things that are not gonna make a short-term return for shareholders. And if you’re not comfortable with that, don’t buy our stock. Uh, you know, and we’re gonna continue to reward and treat our people well. And if you’re not happy with that, don’t buy the stock. And that to me was really important. So let’s talk about the importance of debate. Like another problem with move fast and break things is that there’s no time for discussion. And in order to really innovate, you need to create time and space for discussion.
[00:20:55] Jason Rosoff: Yeah. Everything we were just talking about is, this is exactly what was on my mind. Like there are people who want to have these debates, who have really good arguments for why, you know, Khan Academy
[00:21:06] Kim Scott: Should or shouldn’t do this.
[00:21:07] Jason Rosoff: Should or shouldn’t do something. And to your point, I don’t think it’s just the attitude; to some degree it’s also size that makes a difference here. Like, the larger the organization gets, the harder it is to have debates with the people who really matter. Whose arguments are going to move the needle, maybe, is a better way to say it. It’s not about the people who matter, but about the arguments that they’re able to make. And so as a result, and I think this was some of what was in Careless People, you wind up with these sort of tiny echo chambers inside the company where there’s reinforcement of bad behavior, um, in part because they’re not listening to other people, you know what I’m saying? They’re not inviting the disagreement. They’re not inviting the debate.
[00:22:00] Kim Scott: And for folks who are not familiar with Careless People, this is a, a book. You wanna talk about Careless People for a second?
[00:22:07] Jason Rosoff: Oh no, you go ahead. You got it.
[00:22:08] Kim Scott: Yeah. It is a book, sort of a memoir, written by a former Facebook employee. And she really describes in great detail some of the problems with the way the systems worked and, again, the negative externalities, the negative side effects that all of us are bearing the cost of, that are created by the way that Facebook’s system works. Uh, and Meta sued the author and prevented her from talking about her book. So we’re trying to talk about her book more.
[00:22:58] By the way, it also prompted me to read Frances Haugen’s book, The Power of One, which I hadn’t read before. It is also a really great explanation of how these systems work and what we could do to make them not create these terrible negative side effects for society. Uh, and it raises a lot of questions, like why Facebook isn’t already doing these things. And I think part of the answer is that there’s no public debate about how their algorithms should work. And there’s no sense that there should be a public debate about that.
[00:23:38] Jason Rosoff: And, and I, and I think even to some extent there’s very limited internal debate.
[00:23:42] Kim Scott: Debate, yes. Yes.
[00:23:43] Jason Rosoff: And so, if you are a free market capitalist believer, in theory the idea is that people can have an opinion about this and they can vote with their money, right? They can basically say, I’m not gonna give my money. But that to some extent is undone by the business models that they’ve created, which is essentially monetizing us. They don’t charge us to access the content. They charge advertisers, uh, essentially to support the platforms. Uh, and so it’s very hard. Like right before we got on the podcast, we were talking about, you know, should we continue to post on Facebook? Should we continue to contribute content to Facebook and Instagram?
[00:24:35] Um, and part of what’s interesting about that debate is that it’s hard to effect a real protest when the consequences are fairly delayed for a company like Facebook. Like it would take a massive movement of people away from the platform to start to really have a noticeable negative impact on their revenue. And that’s different, right? Like, you know, when Tylenol had to do a recall, this was back in the early eighties, people stopped buying Tylenol. Like the money dried up, it went away fairly immediately. Uh, so there was actually a market response. There was a consequence.
[00:25:28] Kim Scott: Yeah. Although, if everybody, or let’s say even 10% of the people who contribute, I mean, most Facebook users are readers, not writers.
[00:25:38] Jason Rosoff: That’s right. Yeah.
[00:25:39] Kim Scott: We’re writers on this platform, we’re giving away our content to it, and if only 10% of the people who actually post to Facebook quit posting, uh, that would create a huge and very quick problem. In fact, this is the thing. I had two ah-ha’s when I read these two books, Careless People and The Power of One. One ah-ha was that, and Frances Haugen explains this, part of the reason why Facebook started sort of pushing more polarizing content on their platform is that it got more engagement.
[00:26:24] People who were contributing content were slowing down. They weren’t contributing as much content ’cause they weren’t getting so many reactions from people, likes and whatnot. And they found that when they promoted more extreme content in their algorithm, it got more engagement, and then people started contributing, and it was really a vicious cycle. ‘Cause then even if you didn’t believe these extreme things, or if you were writing headlines, for example, you started writing these clickbait headlines. And so that was one real problem. The other thing, and maybe this is just my own stupidity that I didn’t see it, is that I’ve always sort of wondered, like, why do they allow all these political ads?
[00:27:07] Like, it’s not that much money. Why don’t they just disallow them? And what I realized was they allow them because now they are kingmakers. Now they can help you get elected or unelected. And so no official dares regulate Facebook, because Facebook can prevent them from getting elected. So this is a huge damper on democratic debate about the ways that we should regulate content on Facebook slash Instagram slash Meta. Uh, and I don’t know why I didn’t realize that before, but I’m like, oh, of course. Yeah. So that I think is important to think about.
[00:28:04] Jason Rosoff: Yeah. So there are multiple layers at which we needed debate, and we didn’t have it. Like, internally, I’m sure there were people who were like, we should not be doing this. You know, the engagement thing is good in the sense that we’re getting more people to write, but it’s bad in the sense that the content is worse and people feel worse about it. Like I’m sure there were people making that argument inside of Facebook
[00:28:27] Kim Scott: Yes.
[00:28:27] Jason Rosoff: As they were deciding to, to
[00:28:29] Kim Scott: Yes, of course, of course.
[00:28:30] Jason Rosoff: To do this. And, and then there was like, to your point, the public debate where like, like the public having an opinion about whether or not this is, uh, or, or the more broader market, having an opinion about whether or not what they’re doing is good or should continue, that doesn’t really happen or doesn’t feel like it, it should happen. Um, and then there’s the, uh, there’s also like the government level debate.
[00:28:52] Kim Scott: Yes. Yeah. The public, the democracy level debate. And it seems increasingly, it feels to me anyway, like there’s a move to just do away with the debate: I don’t wanna have the debate at all, so how about we just harm democracy. And that obviously is very worrisome. Uh, but I wanna go back. There was a key moment in my career where somebody raised the importance of having public debate about decisions. This was when I was working at Google. I was managing AdSense, the AdSense online team. And in addition to sort of trying to grow the business, I was also in charge of policy enforcement and creating policy. And that meant that at the same time that I was trying to grow revenue, I was also in charge of terminating AdSense publishers who violated Google’s policies.
[00:30:02] Jason Rosoff: Got it.
[00:30:03] Kim Scott: And, this goes back to the measurement problem, I don’t think it would’ve mattered, in fact, I can tell you for sure, how much money AdSense made did not specifically impact my compensation. Uh, and that may seem sort of nutty for someone who was in charge of sales and operations. But there was an understanding at Google that if we measured things that narrowly, we were gonna get the wrong kinds of behaviors. And so I was equally as passionate about taking down the bad sites, uh, or the sites that were violating policy, I shouldn’t call them bad sites.
[00:30:55] As I was about growing revenue, and I was really excited about growing revenue, believe me. But I really believed that, you know, you weed your garden, and if you allow your garden to get overrun by weeds, that’s not good for your garden long term. And I don’t understand why that doesn’t happen more at Facebook. Because it’s possible to create a system where people care about both of these things. Um, so I thought I was doing a great job. That’s the TLDR there. But then I was invited to speak at this class called Liberation Technology. It was a class at Stanford taught by Josh Cohen, who is an old friend of mine, a person I like a lot.
[00:31:48] I was describing to him content moderation challenges, and this big debate we had had at Google. This was actually around Blogger, which I also managed for a while, the policy enforcement. And somebody had written something calling for genocide, basically kill all the X people. And I shouldn’t even say X, kill all the A, B, C people. And I wanted to take it down. I believed that that kind of content had no place on our, uh, in the AdSense network. And I was just gonna pull it down, but I didn’t have unilateral decision making authority. This was something that had to be discussed more broadly.
[00:32:38] So I was in this big meeting at one point. You know, Eric Schmidt sort of agreed with me. He said, you know, if we had a dance hall, uh, and not Blogger, you know, we would not rent it out to the KKK. Like, why would we allow this kind of content? And, and yet both Eric and I got overruled by kind of the, the free speech crowd. And in retrospect, I think I was right and I wish I had fought harder. But these are hard questions because their point of view was that you’re better off knowing who believes these things and who’s saying it than not knowing, and, you know, forcing this kind of, uh, stuff to go underground.
[00:33:23] And so in the end, I think, if memory serves, which it often doesn’t the older I get, what happened was we left the site up, but we put a content warning on it saying something like, this is bad, although I’m sure that’s not what it said. And so I was talking about this in the class and thinking, you know, that we had a pretty good debate process. And I’ll never forget, Josh looked at me and he said, you are making those decisions. And at first I was kind of insulted, you know? I’m like, but what’s wrong? Of course. Like, why am I not qualified?
[00:34:04] Jason Rosoff: What’s wrong with me? Yeah.
[00:34:05] Kim Scott: And he was like, you were totally unqualified to make this decision. And, you know, once I got over feeling kind of offended, because this was the most interesting part of my job, hands down, you know, I really did care about it. But Josh said, and now I think he’s right, that there needs to be some democratic oversight of these kinds of decisions, because they have such a huge impact on our whole society. Um, so this is a big one: Josh was right, I was wrong. But at the time I was like, oh, Josh, you don’t understand. The government could never be involved in these kinds of decisions. You know, like, we had a policy, no porn sites, and some clever person took a picture of himself in front of a toaster as though it were a picture of the toaster, but of course the toaster was very shiny and he was naked.
[00:34:59] Jason Rosoff: Yeah.
[00:35:02] Kim Scott: Like, it’s complicated to manage all the content. And there was another moment, this is all a long segue, but they’re funny stories, where there was an ad for bestiality that kept showing up on this parenting magazine that was an AdSense publisher.
[00:35:29] And obviously they were very upset about this bestiality ad. And I called up the ad content moderation team, which was in another country. But anyway, they were reviewing all the ads, and I was saying, why are these ads even showing up anywhere? Like, we have a policy against porn ads. And, um, this person claimed that bestiality didn’t count as porn. I was like, gosh, that argument was not on my bingo card today. So anyway, I told these stories, and they all got a big laugh, and nobody really stopped to think about, should we have democratic oversight over some of these decisions. And I now believe we should. Josh Cohen was right.
[00:36:18] Jason Rosoff: I’m not arguing against it, but I’ll just say, I think it’s so much harder in practice to achieve the goal of oversight. Here, just to give your team credit for a second: I think your team probably thought about these things very seriously and maybe even put in, like, days or hours of time.
[00:36:39] Kim Scott: Oh. And it went all the way. I mean, you know, the CEO of the company was willing to spend his time on this. Like Google took this very seriously.
[00:36:46] Jason Rosoff: Right. And I guess, in my experience, I don’t know exactly how you encourage that kind of debate, and at what level, and who gets involved outside of these companies. I’m open to the idea, but I’m just recognizing the practical challenge of the fact that everybody else who would be contributing to that conversation would either need to be paid by Google or would be doing it on a volunteer basis, you know what I’m saying? Like, they’d be volunteering their time to do this thing. And I think that’s part of the reason why the public debate hasn’t been vigorous about this stuff, because it’s like, how do you make the time to really understand what’s going on? And instead of public debate, what we get is hot takes.
[00:37:29] Kim Scott: Yes.
[00:37:29] Jason Rosoff: You know what I’m saying?
[00:37:29] Kim Scott: Yes, exactly.
[00:37:31] Jason Rosoff: What we get is like one person saying, you know, something which is missing a whole bunch of context. But it’s sort of punchy. And so as a result, a bunch of people are like, yeah, I agree with that person. And then the other person sort of fires back and their thing also misses a bunch of context. And as a result, we’re not really having a debate. We’re just sort of like throwing barbs at each other.
[00:37:49] Kim Scott: Yes. And the debate certainly should not happen on social media. Uh, for sure. It needs to happen in a different, in, in a different way.
[00:37:59] Jason Rosoff: Um, and I guess what I would say is, it seems to me what you’re describing is sort of the third pillar of avoiding the pitfalls of move fast and break things, which is having the right kind of culture. Now, the systems weren’t perfect. There probably should have been some external, uh, involvement. But I do think this idea that we’re going to have a public debate, and we’re not gonna give unilateral decision making power to you, the person who’s technically in charge of this. We’re gonna force a public debate on this, and there’s gonna be a record of that debate. Like, we’re actually gonna record this thinking for posterity. Those are important cultural rituals, uh, that I think don’t exist in a lot of organizations.
[00:38:44] Kim Scott: Yeah.
[00:38:44] Jason Rosoff: Um, and going back to your original point, it’s one of the reasons why, without those types of rituals, without ritualizing debate, for example, and saying this is an important part of how we make decisions as an organization, without removing unilateral decision making power, for example, and setting that as a cultural touchstone, I think it’s very easy for, you know, the necessity of the moment to overtake good thinking. Like, things feel urgent, and as a result, um, people don’t slow down to have the debate.
[00:39:23] And I think, more than that, what you were saying at the very top of the podcast was that without creating those cultural norms, it also discourages people from talking about the mistakes that they make. Uh, and when you combine those things together, when you have no culture of debate and no discussion of mistakes, it becomes very easy to see how you could go very deep down a rabbit hole of bad things happening. And people sort of looking around, like, whose job is it to put the brakes on this?
[00:39:56] Kim Scott: Yeah. Yeah.
[00:39:56] Jason Rosoff: Like, what is the way that we respond, collectively, to these bad decisions or bad behavior that we’re observing? And I think a lot of cultures would benefit from, uh, you know, the removal of unilateral decision making power and a push toward a debate on important decisions that’s as public as you can make it.
[00:40:22] Kim Scott: Yeah. There’s, there is a conference room, uh, at, uh, Meta right next to Zuckerberg’s office that says, good news only or only good news. Like that’s a disaster. That’s an example of, uh, the wrong kind of culture. Like, don’t tell me what’s wrong. You know, like you’ve gotta have that culture, uh, where, where leaders are soliciting feedback and are eager to hear the bad news and that contrary point of view, uh, not the good news only kind of culture.
[00:40:55] Uh, and I think it’s really important that companies be willing, and this is something that tech companies have not traditionally been willing to do, to be held accountable by the public, by the government. Like, there is a reason why we have all these NDAs and, um, you know, agreements that don’t allow you to sue, these forced arbitration agreements. That is an example of a culture that is trying to avoid being held accountable by our government, by the systems that we have in place to hold wealthy, big companies accountable.
[00:41:37] Jason Rosoff: Alright, well, let’s try to summarize our guidance here for people who want to be able to move fast, but don’t want to break things in an irreparably bad way.
[00:41:49] Kim Scott: Yeah, it’s okay, whoops-a-daisy is one thing, like destroying democracy is another. I’ll just, I’ll put it that way. Alright, so tip number one, consider the cost of failure. Don’t skip the debate phase. And build a team culture that supports both speed and learning.
[00:42:10] Jason Rosoff: Yeah. Tip number two, moving fast and breaking things isn’t inherently good or bad. It’s about applying a thoughtful approach in the right context with the right process. So making sure that you slow down enough to debate something, uh, to make sure that you’re still on the right track.
[00:42:30] Kim Scott: And tip number three, always focus on learning. The whole point of moving fast and potentially breaking things is to learn fast. Uh, it’s not to create a land grab and, uh, and establish a monopoly that, uh, that then destroys democracy.
[00:42:52] Jason Rosoff: And with that, we invite you to head over to RadicalCandor.com/podcast to see the show notes for this episode. Remember, praise in public and criticize in private. If you like what you hear, please rate, review, and follow us on, uh, whatever platform you like to listen to your podcasts. If you have feedback, please email it to us at podcast@RadicalCandor.
[00:43:14] Kim Scott: By the way, if you have feedback, we’re gonna invite some public debate: do you think that we should stop posting on Instagram and Facebook? Let us know your thoughts. We are eager for them. Thank you.
[00:43:27] Amy Sandler: The Radical Candor Podcast is based on the book Radical Candor: Be a Kick-Ass Boss Without Losing Your Humanity by Kim Scott. Episodes are written and produced by Brandi Neal, with script editing by me, Amy Sandler. The show features Radical Candor co-founders Kim Scott and Jason Rosoff, and is hosted by me still, Amy Sandler. Nick Carissimi is our audio engineer. The Radical Candor podcasting music was composed by Cliff Goldmacher. Follow us on LinkedIn, Radical Candor the company, and visit us at RadicalCandor.com.
Radical Candor Podcast Listeners Get 10% Off The Feedback Loop
You’ll get an hour of hilarious content about a team whose feedback fails are costing them business; improv-inspired exercises to teach everyone the skills they need to work better together, and after-episode action plans you can put into practice immediately.
We’re offering Radical Candor podcast listeners 10% off the self-paced e-course. Follow this link and enter the promo code FEEDBACK at checkout.
Watch the Radical Candor Videobook
We’re excited to announce that Radical Candor is now available as an hour-long videobook that you can stream at LIT Videobooks. Get yours to stream now >>
Episodes are written and produced by Brandi Neal with script editing by Amy Sandler. The show features Radical Candor co-founders Kim Scott and Jason Rosoff and is hosted by Amy Sandler. Nick Carissimi is our audio engineer.
The Radical Candor Podcast theme music was composed by Cliff Goldmacher. Order his book: The Reason For The Rhymes: Mastering the Seven Essential Skills of Innovation by Learning to Write Songs.