Racial Bias
The deaths of African-Americans at the hands of the police in Ferguson, Mo., in Cleveland and
on Staten Island have reignited a debate about race. Some argue that these events are isolated and
that racism is a thing of the past. Others contend that they are merely the tip of the iceberg,
highlighting that skin color still has a huge effect on how people are treated.
Arguments about race are often heated and anecdotal. As a social scientist, I naturally turn to
empirical research for answers. As it turns out, an impressive body of research spanning decades
addresses just these issues — and leads to some uncomfortable conclusions that cast this
debate in a different light.
The central challenge of such research is isolating the effect of race from other factors. For
example, we know African-Americans earn less income, on average, than whites. Maybe that is
evidence that employers discriminate against them. But maybe not. We also know African-
Americans tend to be stuck in neighborhoods with worse schools, and perhaps that — and not
race directly — explains the wage gap. If so, perhaps policy should focus on place rather than
race, as some argue.
But we can isolate the effect of race to some degree. A study I conducted in 2003 with Marianne
Bertrand, an economist at the University of Chicago, illustrates how. We mailed thousands of
résumés to employers with job openings and measured which ones were selected for callbacks
for interviews. But before sending them, we randomly used stereotypically African-American
names (such as “Jamal”) on some and stereotypically white names (like “Brendan”) on others.
The same résumé was roughly 50 percent more likely to result in a callback for an interview if it
had a “white” name. Because the résumés were statistically identical, any differences in
outcomes could be attributed only to the factor we manipulated: the names.
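The logic of such an audit study can be sketched in a few lines of code. The simulation below is purely illustrative — the sample size and callback rates are made-up stand-ins, not the study's data — but it shows how random assignment of names lets a gap in callback rates be read as the effect of the name alone, and how a relative gap of roughly 50 percent would be computed.

# Illustrative sketch of the audit-study logic: identical résumés, randomly
# assigned "white" or "black" names, callbacks compared by group.
# All numbers here are hypothetical, not the study's actual data.
import random
from math import sqrt

random.seed(0)

N = 5000                                             # hypothetical number of résumés mailed
callback_rates = {"white": 0.097, "black": 0.064}    # illustrative rates only

# Randomization: each résumé gets a name group by coin flip, so the two
# groups are statistically identical apart from the name on the résumé.
data = []
for _ in range(N):
    group = random.choice(["white", "black"])
    called_back = random.random() < callback_rates[group]
    data.append((group, called_back))

def rate(group):
    rows = [cb for g, cb in data if g == group]
    return sum(rows) / len(rows), len(rows)

p_white, n_white = rate("white")
p_black, n_black = rate("black")

# Because assignment was random, the gap can be attributed to the name alone.
print(f"white-name callback rate: {p_white:.3f}")
print(f"black-name callback rate: {p_black:.3f}")
print(f"relative gap: {p_white / p_black:.2f}x")

# Simple two-proportion z-test for the difference in callback rates.
p_pool = (p_white * n_white + p_black * n_black) / (n_white + n_black)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_white + 1 / n_black))
print(f"z statistic: {(p_white - p_black) / se:.2f}")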
Other studies have also examined race and employment. In a 2009 study, Devah Pager, Bruce
Western and Bart Bonikowski, all now sociologists at Harvard, sent actual people to apply for
low-wage jobs. They were given identical résumés and similar interview training. Their sobering
finding was that African-American applicants with no criminal record were offered jobs at a rate
as low as that of white applicants who had criminal records.
These kinds of methods have been used in a variety of research, especially in the last 20 years.
Here are just some of the general findings:
■ When doctors were shown patient histories and asked to make judgments about heart disease,
they were much less likely to recommend cardiac catheterization (a helpful procedure) to black
patients — even when their medical files were statistically identical to those of white patients.
■ When whites and blacks were sent to bargain for a used car, blacks were offered initial prices
roughly $700 higher, and they received far smaller concessions.
■ Several studies found that sending emails with stereotypically black names in response to
apartment-rental ads on Craigslist elicited fewer responses than sending ones with white names.
A regularly repeated study by the federal Department of Housing and Urban Development sent
African-Americans and whites to look at apartments and found that African-Americans were
shown fewer apartments to rent and houses for sale.
■ White state legislators were found to be less likely to respond to constituents with African-
American names. This was true of legislators in both political parties.
■ Emails sent to faculty members at universities, asking to talk about research opportunities,
were more likely to get a reply if a stereotypically white name was used.
■ Even eBay auctions were not immune. When iPods were auctioned on eBay, researchers
randomly varied the skin color on the hand holding the iPod. A white hand holding the iPod
received 21 percent more offers than a black hand.
The criminal justice system — the focus of current debates — is harder to examine this way. One
study, though, found a clever method. The pools of people from which jurors are chosen are
effectively random. Analyzing this natural experiment revealed that an all-white jury was 16
percentage points more likely to convict a black defendant than a white one, but when a jury had
one black member, it convicted both at the same rate.
I could go on, but hopefully the sheer breadth of these findings impresses you, as it did me.
There are some counter-examples: Data show that some places, like elite colleges, most likely do
favor minority applicants. But this evidence underscores that a helping hand in one area does not
preclude harmful shoves in many other areas, including ignored résumés, unhelpful faculty
members and reluctant landlords.
But this widespread discrimination is not necessarily a sign of widespread conscious prejudice.
When our own résumé study came out, many human-resources managers told us they were
stunned. They prized creating diversity in their companies, yet here was evidence that they were
doing anything but. How was that possible?
To use the language of the psychologist Daniel Kahneman, we think both fast and slow. When
deciding what iPod to buy or which résumé to pursue, we weigh a few factors deliberately
(“slow”). But for hundreds of other factors, we must rely on intuitive judgment — and we weigh
these unconsciously (“fast”).
Even if, in our slow thinking, we work to avoid discrimination, it can easily creep into our fast
thinking. Our snap judgments rely on all the associations we have — from fictional television
shows to news reports. They use stereotypes, both the accurate and the inaccurate, both those we
would want to use and ones we find repulsive.
We can’t articulate why one seller’s iPod photograph looks better; dozens of factors shape this
snap judgment — and we might often be distraught to realize what some of them are. If we could
make a slower, more deliberate judgment, we would use some of these factors (such as the quality of the
photo), but ignore others (such as the color of the hand holding the iPod). But many factors
escape our consciousness.
This kind of discrimination — crisply articulated in a 1995 article by the psychologists Mahzarin
Banaji of Harvard and Anthony Greenwald of the University of Washington — has been studied
by dozens of researchers who have documented implicit bias outside of our awareness.
The key to “fast thinking” discrimination is that we all share it. Good intentions do not guarantee
immunity. One study published in 2007 asked subjects in a video-game simulation to shoot only at
people who were holding a gun. (Some were criminals; some were innocent bystanders.)
African-Americans were shot at a higher rate, even those who were not holding guns.
Ugly pockets of conscious bigotry remain in this country, but most discrimination is more
insidious. The urge to find and call out the bigot is powerful, and doing so is satisfying. But it is
also a way to let ourselves off the hook. Rather than point fingers outward, we should look
inward — and examine how, despite best intentions, we discriminate in ways big and small.