Video

Day of Learning 2013 - Binna Kandola: Diffusing Bias

Binna Kandola delivers a talk as part of the Day of Learning “Reimagining Self and Other.”
Last Updated: February 11, 2014

At a Glance

Video

Language

English — US

Subject

  • Social Studies
  • Culture & Identity

Day of Learning 2013 - Binna Kandola: Diffusing Bias

[AUDIENCE APPLAUDING]

We've been talking about identities. Names are actually an important part of our identities. My name is Binna Kandola. I'm a psychologist, and a few years ago I was doing some research that involved issuing questionnaires to a group of students.

One of the tools of a psychologist is that you test and then you retest. So I went to the college and tested the students. And a few months later, I came back to test the same students again. The principal, whom I had met before, was introducing me to the two lecturers whose session I was about to interrupt. And he said, this is Janice, and this is James. And I'd like to introduce you to Mr. Tandoori.

And their mouths dropped open. And I thought, I'd better put him straight here. I said, actually, my name's not Tandoori. My name's not Tandoori, it's Kandola. It's chicken Kandola.

[AUDIENCE LAUGHS]

We can all make mistakes. Well, I'm a psychologist, and I work with organizations in the diversity area. One of the things I noticed a few years ago was that our clients were coming to us and saying that they'd been working in the diversity field for 10 or 15 years and had made progress, but that if they had thought about it 10 or 15 years before, they would have expected to make more progress than they actually had.

And this was universal. It didn't matter about the sector, public or private; it was manufacturing, it was finance, it was everywhere. Clearly something was going on here. Something was getting in the way. And I thought, this is worth investigating a bit further.

So I did a big literature review, and I did some studies. And it quickly became apparent that there is a problem. And the problem is us. We are the problem. And the problem is we're all biased, without exception. There are 7 billion people on the planet, and all 7 billion have bias of some shape or form.

The world is not divided up into those people who have bias and those who don't. It is divided up, though, into those people who recognize they have bias and those people who think they have none. And ironically-- and the work on unconscious bias is full of ironies-- one of the ironies is that those people who believe they have no bias probably are the most biased because there's no reflection going on. If I believed I had no bias, why on earth would I ever need to reflect on my behavior, review my decisions, or change anything about myself? Because I'm perfectly content in what I'm doing.

So what I want to do in this session is actually just introduce you to how the biases we're talking about apply in organizations, and some of the things that we can do, perhaps, to minimize the effects of some of the biases that we have.

So there are two lines there. There are two orange lines. There is a small one in the middle between the three blocks, and there is a thicker one at the top. Which one looks longer to you? Top one. Yeah. The top one looks longer.

It doesn't help that it looks like it gets smaller as it goes down, but they are in fact the same size. They're the same size. The illusion works. And I know they're the same size because my assistant did it for me. She said, is this what you want? I said, no, they're meant to be the same size. [AUDIENCE LAUGHS] I actually measured them, so I can assure you they are the same size.

And the converging lines mean that we automatically turn that two-dimensional image into a three-dimensional one. It's automatic; we can't help it. If something is further away and it looks the same size as something that's closer, it must, by definition, be bigger, right? So we are interpreting the whole time.

We did a study three years ago. We asked people to take part in a psychology experiment. They walked into the room. At the bottom end of the room there was a counter with somebody standing behind the counter with a sign above them saying "experiment."

They walk over to the counter, where they are handed a consent form. They complete the consent form and hand it back to the person behind the counter, who says, oh, I need to staple this, ducks down underneath the counter, staples the form, comes back up, hands it back, and says, can you go to the room over there, please?

What they didn't realize was that this was the beginning and the end of our experiment. Right? That was the beginning and the end of our experiment. [AUDIENCE LAUGHS] The person who went down underneath the counter was not the same person who came up. [AUDIENCE LAUGHS] And in the room over there, people were asked, what did you notice? People noticed the counter, noticed the form, the stapling, the sign that said experiment. But something like 80% of the time, people did not notice that it was a different person.

It doesn't sound credible, but it is true: they didn't notice. And it wasn't as if we used identical twins; they were two different people. We had them wearing similar clothes to begin with, but in the end we had them wearing very different clothes. But there were two changes we could never make without the other person noticing: we could never change their gender or their color. We notice some differences more than others. It's automatic. It's like that. We just notice.

Noticing isn't necessarily bad, but as we've heard, there are associations, or stereotypes, attached to different groups. And the differences we notice more than others are color, gender, age, and physical disability. Clearly that's related to the visual cues that we're picking up on. But we notice difference, and those differences carry associations, which are related to stereotypes.

Who is that? Yeah. It's Barack Obama, your president, upside down. It kind of is, it kind of isn't. To paraphrase some British comedians, he's got all the right features, just not necessarily in the right order.

His mouth and his eyes were, in fact, the right way up the first time, when the face was upside down. Now that the face is the right way up, they are upside down. There are two things going on here. One, we're very quick decision makers. Once we've made a decision about something, we stop scanning, right? We've made our minds up, we stop scanning, so we're not processing anymore. And secondly, it's about experience and expectations. We've never experienced him looking like this, and so we don't expect it. We create, in effect, a self-fulfilling prophecy.

So in interviews, for example, we know that untrained interviewers will make their minds up about a candidate within the first two minutes. And clearly they're basing it on some physical characteristics, maybe color, gender, age, disability, the points I was making a moment ago.

But it may be related to other things. It may be the grip when you shake hands: how firm is that handshake? We like people who give us a firm handshake and look us straight in the eye.

I was in Canada earlier this year, and I was told that in some First Nations groups it's considered disrespectful to shake somebody hard by the hand. So we're making these very quick, superficial judgments about people, which may not be accurate.

The second thing is about experience and expectations. Women can't park a car: can't parallel park, can't reverse park. Lack of spatial reasoning means you can't do it. All right? So I was in my local supermarket on Saturday, and somebody is taking 12 goes to reverse park into a bay. And I'm thinking, why don't you just give it to your husband? And then you drive past and you see it is a man behind the wheel, and you think, what sort of man are you? [AUDIENCE LAUGHS]

Now, a piece of research on this topic of parking was published last year in Britain. Two groups of people were asked to park a car. The women in the first group are told: we know you can't park, we know you can't park, lack of spatial reasoning and all that, so just do the best you can. It's only cones, it's only cones. You'll never damage the car. And if you flatten a cone, we'll replace it with another one; we've got loads more available. All right? Do the best you can.

The women in the second group are told, all this stuff you've heard about lack of spatial reasoning and women can't park a car, it's all a load of rubbish. You can park a car just as well as any man. So why don't you prove everybody wrong?

So you've got "prove everybody wrong" versus "do the best you can." And the women in the second group not only performed as well as the men, they actually outperformed the men, whereas the women in the first group significantly underperformed. So we can create our own self-fulfilling prophecies. By our own expectations, we can affect other people's behavior and see what we're expecting to see.

In this particular image, there's one predominant thing that most people see. Sometimes people see more than one thing; sometimes they don't see the most dominant thing and see other things instead. What do you see? A dog? Yeah. There's a dog; a lot of people see a dog. I was with a group recently, and nearly everybody in the room could see a dog. One of their colleagues couldn't see a dog, and another colleague helped him out and said, of course you can see the dog, it's below the whale.

[AUDIENCE LAUGHS]

Strictly speaking, there's no outline of anything there. There are two things going on. One, we don't like randomness; we're always trying to make sense of the world around us. And secondly, I told you there was something there to be seen. All right? I created a motivation, a goal in you, to start searching for something. And it's an example of what's known as priming. Priming covers the ways in which we can be influenced without realizing that we have been influenced.

A French psychologist had two groups of interviewers about to interview the same candidates. One group is told: go and meet your candidate, bring them back to your office, and then, when you're finished, escort them from the premises. The other group, meeting the same people, remember, is told: go and meet your elderly candidate, bring them back to your office, and when you've finished, escort them from the premises.

What they found was that the people who thought they were going to be meeting an elderly candidate actually walked more slowly to meet them. Their behavior had changed, even before they'd met the people. There was clearly a set of associations being made: elderly, aged, infirm, slow, I'd better slow down. And whether those associations were conscious or unconscious, they were clearly being made and were actually having an impact on their behavior. So priming is another way.

So we've got two sources of bias. We notice difference, and we notice some differences more than others. And secondly, there's the way that we interpret the world around us. Now, there's a football team, a soccer team, that I support called Aston Villa. All right? You may not have heard of them. But we did win the European championship in 1982. [AUDIENCE LAUGHS]

Our local rivals are a team called Birmingham City. They're two miles away, literally next door to one another. Intellectually, I know that the fans must be the same. It's the same catchment area, the same city, basically the same people. We must be the same. Intellectually, I know that. Emotionally, I know we are better than them. And not only that, my children know it, too. [AUDIENCE LAUGHS] I think it's one of the proudest achievements I've had as a parent.

But anyway-- we form groups. We're very social animals. So the third area of bias is about the way we form groups. We form in-groups and out-groups. The groups that we're part of, they're our in-groups, and then, by definition, the other groups are our out-groups. And there are some very interesting things that happen.

We actually view our in-group differently from the out-group. First, we see people in the in-group as individuals, and we accept difference. Secondly, we think about them differently: we are more likely to remember the positive contributions from in-group members. And thirdly, we behave differently towards in-group members: we are more likely to make sacrifices for them and more likely to be helpful towards them.

And we view out-groups, consequently, in a very different way. Out-groups are viewed as being homogeneous. They are all the same. We minimize difference. The French, dot, dot, dot. You complete that sentence in your heads.

It doesn't matter whether you finish that sentence positively or negatively, the French will have been treated as a homogeneous group of people. They are all the same. We will remember the negative things they've done, we will forget their contributions, and generally speaking, we won't be so helpful towards them.

This can have an impact on societies. It can also have an impact in teams. Within a team, you can get people creating in-groups and out-groups, and it may be that we value the contributions of some people more than others. We don't listen to people. We miss out on the talent that's available to us.

So what can we do? Well, one of the things we can do is turn the mirror on ourselves. Instead of thinking that bias is somebody else's problem, which is what we tend to do, we recognize that it's actually a problem for me. One of the things we can do to increase our self-awareness is to take a test of unconscious bias. One of these tests is on the Harvard website. One of the academics who developed it is actually here at Harvard, but there are two other academics who developed it as well.

It's called the Implicit Association Test. If you Google IAT, it will take you to the test. It's easier to do than to describe. It's a reaction-time test: basically, stimuli come up on the screen and you react as quickly as you can. Right?

And I did this for the first time 10 years ago. In the United Kingdom, I'm described as an Asian person. So people from the Indian subcontinent are described as Asian. And I did the first test. What I gravitated towards, for kind of obvious reasons, was Asian and white faces, and good and bad words. Asian, white faces, good and bad words, they come up on screen. You react as quickly as you can.

Just to go over my background again, I am a psychologist, it is a science, very rigorous, highly methodical, very analytical, highly statistical. And also I work in the diversity field, so I'm not judging people, not stereotyping them, not making assumptions. So essentially, the fairest person in Britain.

[AUDIENCE LAUGHS]

I took this test. And it gave me a result at the end. It said that I had a bias associating Asian people with good. All right? This never surprises anybody, but it really shocked me. So given my background, education, experience, and training, I did exactly what you'd expect me to do in the circumstances. I went, OK, best of three. [AUDIENCE LAUGHS]

So I did it again. Got the same result. Best of five. Got the same result. I did it three times in a row and I got the same result three times. And I walked away from my desk thinking, what a load of rubbish. What a load of rubbish. I'm clearly the fairest person in Britain. And that test did not validate it, so the test has to be wrong.

The next day, we were making a DVD. I was in a studio in North London. Seven actors. I didn't know them. We broke for lunch. There's a table with sandwiches on it, and chairs scattered around the room. I picked up a sandwich and sat down next to one of the actors. And as soon as I sat down, I realized I'd parked myself next to the only other Asian person in the room. Right? Now, my act was unconscious. At no point did I think, aha, Asian person, I must go and sit next to him. My act was unconscious, but it was also not random. Unconscious, but not random. I may not have been aware of it, but something impelled me to go in one direction rather than another. And just that self-awareness is actually an important first step in tackling things to do with bias.

And the second thing, ridiculous as it may seem, is actually just to tell ourselves not to do it. Just tell ourselves to stop: I am not going to do it. It takes a conscious effort. The unconscious processes are obviously whirring away, but through a conscious effort we can make ourselves stop.

I am not going to stereotype. One of the things that we've done is to set fairness as a goal when you're making decisions. Set fairness as a goal. And we've found that that can reduce the level of unconscious bias exhibited by a group of people.

And you can take it a step further and give yourself an instruction: when I'm doing these interviews, then I will try my hardest to be as objective as possible. Or: when I'm doing interviews, then I will not stereotype. Those when-then statements we have found to be very powerful, and we have found that you can reduce unconscious bias through those mechanisms. Very simple things. There are lots of other things you can do; I'm just concentrating on some of the more straightforward ones. So we can turn the mirror on ourselves, tell ourselves to stop, instruct ourselves to be fair, and set fairness as the goal.

And the final point, essentially, is about everybody taking responsibility for this. There is a particular responsibility for leaders to role model the behavior. If a leader can talk about topics like bias and role model the behavior that they're expecting of other people, it has a huge impact. But we can also challenge one another. I know it can be difficult in some circumstances, so maybe don't challenge on your own; get some allies with you and challenge collectively.

But this challenging and questioning, asking whether we are being fair here, matters. Challenge doesn't have to be unpleasant; it may be just telling a story or asking a question. It doesn't have to be aggressive. This process of challenging is important because it makes us rethink decisions.

I did some work with an accounting firm a couple of years ago. Six partners were about to make promotion decisions the next day, and I met with them and told them this stuff about bias. I subsequently found out that there was one candidate who met all the criteria whom five out of the six partners were going to reject. He met all the criteria, and five out of six were going to reject him. And the sixth one said, well, why are we turning him down? He meets all the criteria. They said, because he's too big. He's actually too big; he needs to lose weight.

And he said, so you're saying that if he had come in five or six stones lighter, we would have appointed him? And they went, yeah. It's for his own good. So if he loses the weight and comes back next year, we'll appoint him. He said, but what about all that training we did on bias yesterday? They said, yeah, well, that's about women and minorities, isn't it? It's not about big people.

But they changed their minds. He stuck to his guns, and obviously, being in a peer group makes that easier. He stuck to his guns, and they actually did appoint the candidate. So challenge.

So there are three areas of bias that operate in organizations. One is that we notice difference, we notice some differences more than others, and there are associations and stereotypes attached to those particular groups. Secondly, there's the way we interpret the world around us: priming, quick decision making, experience and expectations. And thirdly, there's the way that we socialize in groups, and the different set of expectations we have for in-groups and out-groups.

And there are three things that we can do. There are lots of other things, but these are three we could do immediately. One is that we can turn the mirror on ourselves: instead of blaming other people, reflect on our own behavior and increase our own self-awareness. That can make a difference. Secondly, we can tell ourselves not to do it, consciously trying not to display bias in the decisions we're making. And thirdly, we can challenge, in an appropriate manner, the people around us, and get decisions reviewed. Leaders in particular have a greater role here in terms of being role models to other people in the organization.

So finally, I just want to leave you with the thoughts of a British broadcaster and comedian, Jeremy Hardy. He did a program about prejudice, looking at various phobias: Islamophobia, xenophobia, homophobia, amongst others. And he made the point that these fears and these hatreds are genuinely felt. But all phobias are genuinely felt. There are, in fact, people who have a fear of buttons. Believe it or not, there are people with a fear of buttons. And he points out that, of course, this is a genuine fear that they have, and we shouldn't revile them for having this particular phobia, but we should gently point out to them that whilst they may have this fear of buttons, the fault actually lies with them and not with the buttons. Thank you.

[AUDIENCE APPLAUDING]


How to Cite This Video

Facing History & Ourselves, “Day of Learning 2013 - Binna Kandola: Diffusing Bias,” video, last updated February 11, 2014.
