Jennifer Eberhardt. (Nana Kofi Photo)

At many companies, you've heard the talk about unconscious bias and have likely attended trainings, but are these efforts making a real difference in the way we see and treat each other?

In Biased: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do, Dr. Jennifer Eberhardt reveals the latest research and data, much of it her own, on racial bias and how it impacts us all every day. A Stanford social psychologist and MacArthur “Genius” award recipient, she is one of the world’s leading experts on racial bias.

Eberhardt works with several major police departments, helping them examine and address racial bias within their ranks. It’s vital work when people are harassed, beaten, or even killed during what should be a routine stop.

As a resident of Silicon Valley, Eberhardt is also in a prime position to observe how tech companies deal with bias, whether in their platforms, their products, or their hiring processes.

What she’s found is that we all have the potential for bias, as it’s hardwired into our brains thanks to societal stereotypes. What we can do with this science is retrain our brains and better identify when we’re being driven by bias — not logic — so we can slow down and make better, fact-based, humane decisions.

“The big takeaway from the book is that you don’t have to be a white-robed racist to have bias, and we want to help people to understand that, and understand that bias is not a trait that a person has that makes them immoral,” Eberhardt says. “Think of it more like a state of mind, something that’s triggered in certain situations and not others, and that can give us a lot of power over it. I’m hopeful about that. In the workplace and the tech space, too.”

Eberhardt spoke with GeekWire about her book and her work with tech companies.

Let’s talk about bias in hiring, especially in tech. What can hiring managers do to identify even small signs of bias in the process?

There are lots of things we have at our disposal to manage the potential for bias, and one of those things is just slowing down. When resumes come in, some data suggests the people hiring look at each one for about six seconds on average. Slowing people down is a good thing. You want to slow people down so they don’t fall back on automatic associations and act without thinking things through.

The other thing is to have people focus on using objective standards rather than subjective standards to evaluate others. With objective standards, there’s a score, a number, or a percentage of sales, a hard figure that’s the same metric across the board. With subjective standards, things can get murky, like whether the person is a team player, which is hard to evaluate consistently. You have to be wary there.

The other huge thing is accountability: having metrics in place so you can keep track of how you’re doing on things like racial or gender disparities, and whether the steps you take are actually moving them. You can look back and see if it is affecting your bottom line. That is huge. You want to track how you’re doing, develop metrics to track it, and look at the numbers.

You have worked with companies like Nextdoor and Airbnb to help them identify and rid their systems of bias and racial profiling. Can you talk about how they changed their online tools to mitigate bias?

Nextdoor’s whole purpose is to provide a platform where neighbors can build stronger, happier communities. That’s their mission. They’d gotten some good results with that. They’ve also had issues of racial profiling come up. That’s how they ended up contacting me.

They heard from users that racial profiling was sometimes an issue. Neighbors would use the crime and safety tab to report mostly black men as suspicious simply for being in the neighborhood, not because of any actual suspicious behavior.

They not only reached out to me, but co-founder Sarah Leary and her team began to really pore through academic articles on racial bias to try to understand not only how it emerges, but how to stop it. From that and those discussions, they realized that slowing people down was a huge way to address bias. Rather than allowing people to post right away on the crime and safety tab, they decided to slow people down before they post and make them focus on the behavior of the person rather than a racial category. Users were also asked to give a full description, instead of just putting “black guy here.” You have to describe that person in more detail and not sweep them into some broad category and expose them.

After they did this, slowing people down and making them take extra steps, they crunched the data and saw it had curbed racial profiling on the site by 75 percent. That’s a huge drop. They took the model of “if you see something, say something” and modified it to tell the user, “If you see something suspicious, say something specific.”
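Nextdoor hasn’t published the exact mechanics of that change, but the idea Eberhardt describes (add friction before posting and require a description of behavior rather than a broad category) can be sketched in a few lines. The field names, thresholds, and messages below are hypothetical, for illustration only:

```typescript
// Hypothetical sketch of a "slow down and say something specific" gate for a
// crime-and-safety post. Not Nextdoor's actual implementation; fields and
// thresholds are assumptions.

interface SuspiciousActivityReport {
  observedBehavior: string; // what the person was actually doing
  clothingDetails: string;  // specific identifying details, not a broad category
  location: string;
  timeObserved: string;
}

function validateReport(report: SuspiciousActivityReport): string[] {
  const errors: string[] = [];

  // Require a concrete description of behavior, not just someone's presence.
  if (report.observedBehavior.trim().split(/\s+/).length < 5) {
    errors.push("Describe the specific behavior you observed, not just that someone was there.");
  }

  // Require identifying details beyond a racial or other broad category.
  if (report.clothingDetails.trim().length === 0) {
    errors.push("Add identifying details such as clothing.");
  }

  if (report.location.trim().length === 0 || report.timeObserved.trim().length === 0) {
    errors.push("Include where and when you saw the activity.");
  }

  return errors;
}

// The extra step itself is the friction: the post only goes through after the
// user has filled in each field, instead of firing off a one-line report.
function submitReport(report: SuspiciousActivityReport): boolean {
  const errors = validateReport(report);
  if (errors.length > 0) {
    errors.forEach((message) => console.log(message));
    return false; // user is sent back to revise before posting
  }
  console.log("Report posted to the crime and safety section.");
  return true;
}

// Example: a vague report is held back until behavior is described.
submitReport({
  observedBehavior: "Someone walking by",
  clothingDetails: "",
  location: "Elm St",
  timeObserved: "9 p.m.",
});
```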

Slowing down before you fire off a post: there are some valuable lessons for Facebook and Twitter there.

Exactly. The whole goal of tech is to reduce friction. These companies are trying to create products that are so intuitive to use that you don’t have to stop and think. You can use them without complication, you can use them quickly, and it doesn’t require a lot of thinking on the part of the user. But those are the types of conditions that allow bias to thrive. That’s the problem.

Affirmative action is often a point of debate, in schools, in hiring, in life. Why should schools and companies seek out people who usually wouldn’t be in their application pool?

It takes deliberate effort. It’s easier to go to schools that you know are going to give you good candidates, so that’s what people do. If you’re in a position where you’re trying to make a call on something, it’s a lot of work and investigation to look into other schools you’ve never heard of. It’s easier to hire people who are like you, who are familiar to you, who you understand. It doesn’t take as much work or effort to get on the same page.

People feel there’s a risk involved in hiring people outside of who’s familiar. There’s even neuroscience research on this at Stanford. [The researcher] looks at what happens in the brain when you’re evaluating people in your in-group vs. your out-group.

He finds that when you evaluate in-group members, it’s easy to process positive information and harder to process negative. It’s just the opposite when looking at people outside your group: it’s easier to process the negative, and that’s what you’re aware of. It’s harder to process the positives, and it takes more work to get there. This is the built-in bias that can lead us to prefer our own.

You work with a lot of police departments on bias. How is the technology of body cameras and surveillance impacting police work?

The last I read is that about 95 percent of departments across the country have cameras or are contemplating getting them. It’s a huge and quick transformation. It seems like, at least from the correlation data, that when departments get cameras, you see a drop in citizen complaints and a drop in the use of force, so we know that.

The next thing you want to do is look to see if there is a causal relationship. You can have all kinds of reforms at once, and maybe there are some others that are causing the drop, and there have been a number of these randomized controlled studies that are carefully trying to look at the causal role cameras are playing. Those seem to be mixed.

I think the jury is still out, but at least studies are happening now in a number of places, including New York City. Both individual police departments and academics are looking at the data. Anthony Braga, a criminologist at Northeastern, is doing a rigorous study with the Las Vegas police, and he’s heading up the effort with the NYPD. I know that his previous work has shown that cameras lead to a drop in complaints and in the use of force.

In your work, especially with California police departments, are you seeing a changed protocol with training police officers and differences in interactions when they pull people over?

I’ve been looking at what impact it’s having on interactions, even when no complaint is lodged and no use of force occurs. I look at the footage itself to see how interactions unfold over time, especially looking at the race of the driver that officers pull over for routine traffic stops. I look at the extent to which those stops go smoothly or become fraught, and at what language officers are using.

I just did a study looking at officers’ language and how respectful that language was and whether the race of the driver made a difference. I provided the department with those results and that was folded into scenario-based training for the officers, using actual language from real stops in those trainings.

I also had the opportunity to discuss how to handle these interactions in a way that’s respectful and so forth. Now I’m interested in looking at whether that training had an effect, comparing pre- and post-training footage to see if it made a difference, which is huge. A lot of police trainings are not rigorously evaluated; some trainings are evaluated by simply asking officers whether they liked the training on implicit bias or whether it met their expectations. You really want to know if it changed the day-to-day interactions on the street in a good way. That’s what we’ll be able to do with the new study we have underway with the Oakland, Calif., police department.

Biased: Uncovering the Hidden Prejudice That Shapes What We See, Think, and Do by Jennifer Eberhardt is out March 26.
