Andreas Ekström: Can We Solve For Bias In Tech?

Feb 15, 2019

Part 3 of the TED Radio Hour episode Bias and Perception.

About Andreas Ekström's TED Talk

We think of search engines as unbiased sources of information. But they're not—and they can be manipulated. Andreas Ekström asks: who should bear the burden of addressing bias in search engines?

About Andreas Ekström

Andreas Ekström is a staff writer at Sydsvenskan, a daily morning paper in Malmö, Sweden. Through his writing and research, he focuses on online media and digital equality.

Ekström is the author of several books, including the bestselling Swedish book The Google Code.

Ekström is also a columnist and a commentator, and he often lectures and leads seminars on the digital revolution.

Copyright 2019 NPR. To see more, visit https://www.npr.org.

GUY RAZ, HOST:

On the show today, ideas about the biases we carry and the ways we try to address them. And like many other problems, one way to get around our biases could be with technology because a lot of us assume that technology is always objective.

ANDREAS EKSTROM: Well, we don't think that a machine has an opinion, and we forget that there's a programmer behind every machine that has told it how to prioritize. So that means that, you know, if you want to really run all the way with that ball, you can say that there's not a single search query that you can make that is unbiased because it's all an effect of what a real person has decided that the algorithm should do.

RAZ: This is Andreas Ekstrom.

EKSTROM: Yeah, I am a Swedish reporter and author turned speaker-educator. I try to understand a little bit about what's happening with society through the digital revolution.

RAZ: Is it possible for anything or anyone or any search result to be totally unbiased, to be completely objective? I mean, is objectivity even possible?

EKSTROM: Yes, for undisputed scientific facts. What is the capital of Sweden? It's Stockholm. There - that's undisputed. You can Google that, and I don't think you'll find a single website that will tell you differently. So yes, there are, but they're very singular, very isolated facts like that. You want to try to Google an answer to the question, why is there an armed conflict between Israel and Palestine? That's not a great thing to Google because that takes a lot of knowledge and historical context to even begin to understand. And sometimes we tend to mix these things up. What we're trying now, large scale, really, is we're trying to see, can we replace human judgment with an algorithm? We can gather the facts, sure. But can we gather knowledge the way we gather facts? Absolutely not. That's two completely different things.

RAZ: Andreas Ekstrom picks up this idea from the TED stage.

(SOUNDBITE OF TEDx TALK)

EKSTROM: And to get to knowledge, you have to bring 10 or 20 or a hundred facts to the table and acknowledge them and say, yes, these are all true. But because of who I am - young or old, or black or white, or gay or straight - I will value them differently. And I will say, yes, this is true, but this is more important to me than that. And this is where it becomes interesting because this is where we become human. This is when we start to argue, to form society. And to really get somewhere, we need to filter all our facts here - through friends and neighbors and parents and children and coworkers and newspapers and magazines - to finally be grounded in real knowledge, which is something that a search engine is a poor help to achieve.

RAZ: When we come back, Andreas Ekstrom explains how Google search results can be manipulated. On the show today, ideas about bias and perception. I'm Guy Raz, and you're listening to the TED Radio Hour from NPR.

(SOUNDBITE OF MUSIC)

RAZ: It's the TED Radio Hour from NPR. I'm Guy Raz. And on the show today, ideas about bias and perception. And before the break, we were hearing from the writer Andreas Ekstrom about the inherent bias in search engines and how they can't always tell the difference between what's true and what's popular.

EKSTROM: A lot of the algorithm is based upon popularity. So if an answer to a question is really popular, then Google tends to think that it's also correct and relevant. And why? Well, because those people are super active and link to each other and update often, and all those things, you know, are credibility in the Google universe.
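
A rough way to picture what Ekstrom is describing is a toy ranker that treats inbound links and update frequency as proxies for credibility. This is only an illustrative sketch under assumed weights and made-up pages, not Google's actual algorithm:

```python
from dataclasses import dataclass

@dataclass
class Page:
    url: str
    inbound_links: int       # how many other pages link here
    days_since_update: int   # how recently the page was updated

def popularity_score(page: Page) -> float:
    """Toy 'credibility' score: link-driven, with a freshness boost.

    Illustrative only -- real search ranking uses many more signals.
    """
    link_signal = page.inbound_links
    freshness = 1.0 / (1 + page.days_since_update)
    return link_signal * (1.0 + freshness)

pages = [
    Page("https://example.org/accurate-but-obscure", inbound_links=3, days_since_update=400),
    Page("https://example.org/popular-but-wrong", inbound_links=900, days_since_update=1),
]

# The heavily linked, frequently updated page wins, regardless of which
# answer is actually correct -- nothing in the score measures truth.
for p in sorted(pages, key=popularity_score, reverse=True):
    print(p.url, round(popularity_score(p), 1))
```

The point of the toy is the one Ekstrom makes on air: the score rewards activity and connectedness, not accuracy.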

RAZ: Andreas Ekstrom picks up his idea from the TED stage with a Google search.

(SOUNDBITE OF ARCHIVED RECORDING)

EKSTROM: We'll start with Michelle Obama, First Lady of the United States. And we'll click for pictures, say - perfect search result, more or less. It's just her in the picture, not even the president. How does this work? They look at two things more than anything.

First, what does it say in the caption? What does it say under the picture on each website? Does it say Michelle Obama under the picture? Pretty good indication it's actually her on there. Second, Google looks at the picture file, the name of the file as such uploaded to the website. Again, is it called michelleobama.jpeg? Pretty good indication it's not Clint Eastwood in the picture. So you got those two, and you get a search result like this - almost.
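
As a rough illustration of the two signals Ekstrom describes - the caption text and the file name - here is a toy matcher. It is a sketch under assumed, simplified rules with invented data, not how Google Images actually works:

```python
import re

def normalize(text: str) -> str:
    # Lowercase and strip non-letters so "michelleobama.jpeg" can match "Michelle Obama".
    return re.sub(r"[^a-z]", "", text.lower())

def image_matches_query(query: str, caption: str, filename: str) -> int:
    """Toy relevance score: +1 if the caption mentions the query, +1 if the file name does."""
    q = normalize(query)
    score = 0
    if q in normalize(caption):
        score += 1
    if q in normalize(filename):
        score += 1
    return score

candidates = [
    ("Michelle Obama at the White House", "michelleobama.jpeg"),
    ("Clint Eastwood on set", "eastwood.jpg"),
]

for caption, filename in candidates:
    print(filename, image_matches_query("Michelle Obama", caption, filename))
```

Because both signals come from whoever publishes the page, anyone willing to write the right caption and file name can feed the matcher whatever they want - which is exactly the manipulation Ekstrom describes next.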

Now, in 2009, Michelle Obama was the victim of a racist campaign where people set out to insult her through her search results. There was a picture distributed widely over the Internet where her face was distorted to look like a monkey. And that picture was published all over, and people published it very, very purposefully to get it up here in the search result. They made sure to write Michelle Obama in the caption, and they made sure to upload the picture as michelleobama.jpeg or the like.

So you get why - to manipulate the search result. And it worked too. So when you picture-Googled for Michelle Obama in 2009, that distorted monkey picture showed up among the first results. Now, the results are self-cleansing. And that's sort of the beauty of it because Google measures relevance every hour, every day. However, Google didn't settle for that this time. They just thought that's racist, and it's a bad search result. And we're going to go back and clean that up manually. We are going to write some code and fix it, which they did.

And I don't think that anyone in this room thinks that that was a bad idea. Me neither. But then, a couple of years go by. And the world's most Googled Anders, Anders Behring Breivik, did what he did. This is July 22nd, 2011, and a terrible day in Norwegian history. This man, a terrorist, blew up a couple of government buildings, walking distance from where we are right now in Oslo, Norway. And he traveled out to the island of Utøya and shot and killed a group of kids. Almost 80 people died that day.

And a lot of people would describe this act of terror as two steps, that he did two things - he blew up the buildings, and he shot those kids. It's not true. It was three steps. He blew up those buildings, he shot those kids, and he sat down and waited for the world to Google him. And if there was somebody who immediately understood this, it was a Swedish web developer and search engine optimization expert in Stockholm named Nikke Lindqvist. He told everybody, if there's something that this guy wants right now, it's to control the image of himself.

Let's see if we can distort that. Let's see if we in the civilized world can protest against what he did through insulting him in his search results. And how? He told all of his readers the following. Go out there on the Internet, find pictures of dog poop on sidewalks, publish them in your feeds, on your websites, on your blogs. Make sure to write the terrorist's name in the caption. Make sure to name the picture file breivik.jpeg. Let's teach Google that that's the face of a terrorist. And it worked.

Strangely enough, Google didn't intervene this time. They did not step in and manually clean those search results up. So the million-dollar question - is there anything different between these two happenings here? Is there anything different between what happened to Michelle Obama and what happened to Anders Behring Breivik? Of course not. It's the exact same thing, yet Google intervened in one case and not in the other.

RAZ: In this example of Anders Breivik and Michelle Obama, I think most of us would say, yeah, that was the right thing to do, right? You don't want...

EKSTROM: Sure.

RAZ: ...One, and you do want the other thing to happen. So - but I guess what you're arguing is that, yes, in this case, it is a good outcome. But what if it's a - more of a gray area, right? Like, what happens then?

EKSTROM: Yeah. And this is - and I use the example just because it's so easy to just - we can agree, you know, that a mass murderer is not somebody that we need to look after a whole lot when it comes to his search results, right? We don't have to care so much about that because he - maybe he has consumed those rights, if you will.

However, just make it a little more difficult. Let's just, you know, make it about two regular people who are fighting for the same political office, or let's just make it whatever we want to make it. Immediately, you get to a point where you have to say, well, Google, you did manually actually change this. That means that you have an opinion. You have a bias. And that makes you editors, you know? So let's just take an easy example. If you Google the Holocaust, you don't immediately see the worst cases of people saying that it never happened, right? So they have actually manually gone into their search results just to make sure that people who said the Holocaust was a hoax, they don't get that top-ranking space.

And everybody, you know, would say that, OK, well, that was a good decision, right? Because that's a hoax, and those people are bad people, and of course we should do that. I agree with that. But then that also means that, you know, human judgment has just taken place.

Where do we draw the line? Where - what else, Google? You know, there are other bad things out there. What else is it that you shouldn't be linking to? What else is there? And the moment Google accepts that they have that responsibility, oh, congratulations, you're the editors of the world.

RAZ: I think I understand your point here. But, I mean, what do you expect Google or anyone in that position to do? To not intervene?

EKSTROM: I'd like to - for the battle to be not necessarily to fight bias because I think maybe that's impossible. There are just some experiences that are so profound and so - they're shaping us so strongly that I think that we can probably never be completely neutral and free from them. And you know what? I'm not even sure that I would want to be that person. I'd like to - I don't mind carrying a set of values with me. I think maybe that's a part of being human.

But how about making sure that we're already aware that we have them and then be able to talk about them and identify when they come into play and really mess with our judgment? Because it - you know, sometimes that happens. That would be probably a better starting point or even a better end goal. Let's just agree that this is something that we all have and carry. Let's make sure that it doesn't influence us in an unhealthy way.

RAZ: That's Andreas Ekstrom. He's a journalist and author of several books in Swedish, including "The Google Code." You can see his full talk at ted.com. Transcript provided by NPR, Copyright NPR.