ALISON BEARD: Welcome to the HBR IdeaCast from Harvard Business Review. I’m Alison Beard.
In the past decade, the big five tech companies (Facebook, Amazon, Apple, Microsoft, and Google/Alphabet) have extended their reach and revenues in amazing ways. They’ve brought us lots of useful products and services, and they dominate various segments of their industry. They’re also great businesses. In 2020, they collectively earned income of nearly $200 billion.
But these tech giants and their leaders are also facing a lot of criticism for the negative impact they have on society, for the misinformation and vitriol spread online, for invading our privacy, for quashing competition, and for avoiding taxes in a way that allows them to pile up cash while a lot of the people whose personal data they profit from are struggling.
Is it possible to keep the good that big tech and all the smaller companies in the industry have created while getting rid of the bad?
Our guest today has some ideas. Mehran Sahami is a professor at Stanford and a former Google employee. Along with his Stanford colleagues Rob Reich and Jeremy Weinstein, he’s the author of System Error: Where Big Tech Went Wrong and How We Can Reboot. Mehran, thanks so much for speaking with me.
MEHRAN SAHAMI: Thanks for having me. It’s a pleasure to be here.
ALISON BEARD: So the first question is pretty obvious, where exactly did big tech go wrong? Facebook used to be a connector of people, now it’s a killer of democracy. Google was a search engine, now it’s a privacy invader. Amazon, a shopping platform that’s busting unions and small businesses. So how did we get here?
MEHRAN SAHAMI: That’s a great question, and part of the way we have to think about how we got here is the mindset of the people who’ve created these products. Now, if you think of the technology mindset, it’s oftentimes around quantitative metrics and around optimization. We like to refer to it as the optimization mindset. The idea is to set particular metrics for your business that you then try to optimize at scale. And so if you think about something like Facebook, what they want to do is create connection, but how do they actually measure that? What they have is a proxy for connection, something like how often people engage on the site, how much time they spend there, how many pieces of content they click on. But clicking on something isn’t really connection. It’s a proxy for it. And if you take that at scale and you try to optimize it, what happens is you actually get externalities that are not what you wanted.
So you promote pieces of content that people are more likely to click on. Those might be pieces of content that are actually more titillating or more click-baity than, say, truthful content. And so, as a result, there might be a greater amplification of misinformation than truthful information because what it’s doing is maximizing that metric. And you can see this across a number of sites. So, for example, for YouTube, they might want to maximize the amount of time we spend watching videos because they equate the fact that we’re spending our time watching those videos with the fact that we’re happy. But, in fact, you can see the flaw there. Just because we’re watching videos doesn’t mean we’re happy and it ignores other values we might care about. And when we maximize one value like screen time because we’re equating it for happiness, we’re giving up other values that we might actually care about and that are important to society.
ALISON BEARD: And so the issue is that these companies are made up of engineers who are taught to optimize and be efficient. I would argue that people in the financial industry are taught the same thing. They then become the executives leading these companies, the VCs funding these companies, and so there’s no one waving the flag for other kinds of values?
MEHRAN SAHAMI: Well, they are well-meaning people by and large. I don’t think they have negative intent, at least the vast majority of them. But the problem is that most things in life involve a value trade-off, and when you’re optimizing and you’re picking one of those values, the other ones are getting left behind. And that’s part of the issue: how do you actually take a broader look at some of these criteria that they’re optimizing, while also recognizing that the criteria by themselves are just a poor proxy for what we actually care about?
ALISON BEARD: I would argue that lots of industries and companies have this problem. They’re making value trade-offs. They’re doing both good and bad things for society. So why is big tech different? Why are we so focused on big tech?
MEHRAN SAHAMI: Because at this point in time we’re seeing the externalities from big tech on display in full force. So we’re seeing the notion of connection turning into rampant misinformation online. We’re also seeing the platforms take the market power that they have and turn it into political power so that they can continue to maintain the same free regulatory structure that they’ve been under for the past 30 years. And so what we’ve lost in that process are guardrails that bring back the values we might care about as a society as opposed to the values that might be important to the company.
ALISON BEARD: Yeah. So Azeem Azhar, who hosts an HBR podcast and also has a new book out on some of these issues, he argues that our institutions just haven’t been able to keep up with the exponential growth of these companies. So in a way, governments and we the consumers have let all of it happen. Is that fair?
MEHRAN SAHAMI: I think it’s fair from the standpoint of first thinking that the government has basically given big tech a pass. The regulatory structure in the 1990s, through things like the Communications Decency Act, was set up to give companies pretty broad rein in terms of the way they did business in the United States. You can see that pretty clearly if you contrast, for example, the United States and the European Union on the kinds of protections they have around data. Right now one of the choices that we give to people in the United States is in the free market we say, “If you don’t like these applications, well, you can just disengage.” There’s the Delete Facebook movement. You don’t have to use these apps. So what’s wrong with that? You just have this choice. And the analogy I liken that to is consider driving on the road.
The CDC estimates that more than a million people are killed annually on roads worldwide. So should our choice just be whether you drive or you don’t? If that’s the only choice we have, then we’ve lost a lot of values, because you can see that there’s real value in being able to drive despite the fact that it’s dangerous. In the same sense, there’s real value to using these tech platforms despite the fact that they may take our data, they may try to get us to engage more. So what did we do in the case of roads? We didn’t just tell people, “Drive carefully. Good luck.” We created a whole set of safety regulations. There’s stoplights, there’s lanes on roads, there’s speed limits, and so there’s a whole system that makes driving safer while at the same time we still count on individuals to drive safely.
That’s the kind of regulation we would call for with big tech: certain guardrails that prevent certain kinds of practices, like having free rein over someone’s data. If we get the values we care about, we can get a safer information superhighway while at the same time promoting innovation.
ALISON BEARD: Yeah. But you also advocate in the book for self-reform. What are some of the ways that the industry can course-correct itself?
MEHRAN SAHAMI: So we look at four main areas. One of them is algorithmic decision-making where we think about the fact that more and more…