Bias in Hiring Isn't Going Away—Here's How to Combat It

Updated: December 13, 2024

By: Ji-A Min

4 MIN

New research on "resume whitening" confirms bias at the resume screening stage is still a major problem.

Researchers found that up to 40 percent of minority job seekers try to avoid being stereotyped by "whitening" their resumes, for example by using an Anglicized version of their name or removing experience with race-related organizations. And unfortunately, the results suggest these job seekers are justified in being cautious.

Among Black applicants, 10 percent received callbacks for job interviews when they used their Black-sounding names and included involvement with Black organizations, compared to 26 percent when they used a "whitened" version of their names and removed extracurriculars such as the Black Students' Association.

Among Asian applicants, 12 percent received callbacks when they used their Asian-sounding names and included their experience with Asian organizations, compared to 21 percent when they used "whitened" resumes, for example, "Luke Zhang" instead of "Lei Zhang."

It's no surprise that a recent Deloitte survey reports 68 percent of companies are exploring technology to reduce bias in recruiting.

Overcoming bias in the screening stage requires not just the right technology but an overall strategy as well. Here are three strategic steps you can take to reduce bias during candidate screening.

Start by identifying whether bias exists in your candidate screening: compare the demographics of the qualified candidates who apply to your company with the demographics of the people who get hired. Ideally, the two groups should look similar.

Companies covered by the Equal Employment Opportunity Commission (EEOC) or the Office of Federal Contract Compliance Programs (OFCCP), generally those with 100 or more employees, or government contractors with 50 or more employees and at least $50,000 in contracts, are required to collect data on the demographics of both their applicants and their employees.

To monitor for adverse impact, many companies include a voluntary survey asking applicants about their ethnicity, race and gender during the application process. In addition, covered companies are required to submit an annual report on the race and gender of their employees.
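
As a rough illustration, the comparison itself doesn't need to be complicated: compute the rate at which each group advances past screening and compare it against the highest group's rate, using the EEOC's four-fifths (80 percent) rule of thumb as a benchmark. The sketch below uses made-up group names and counts.

```python
# Rough sketch: flag possible adverse impact at the screening stage using
# the EEOC's four-fifths (80 percent) rule of thumb.
# Group names and counts are illustrative, not real data.

applicants = {"Group A": 400, "Group B": 150, "Group C": 100}
advanced = {"Group A": 120, "Group B": 30, "Group C": 18}

# Selection rate = candidates advanced past screening / candidates who applied
rates = {group: advanced[group] / applicants[group] for group in applicants}
highest = max(rates.values())

for group, rate in rates.items():
    ratio = rate / highest
    status = "possible adverse impact" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.1%}, ratio to highest {ratio:.2f} -> {status}")
```

A ratio below 0.8 isn't proof of bias, but it is the conventional signal that the gap deserves a closer look.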

If you find a discrepancy in the demographics of your applicants compared to new hires, the next step is to figure out why.

The best way to assess whether bias exists is to examine why candidates get disqualified at the screening stage. There may be reasons unrelated to bias for disqualifying minority candidates, such as their work authorization status.

If, however, minority candidates have qualifications similar to those of non-minority candidates and are still being rejected at a higher rate, that's a problem. The key here is good data collection: if minority candidates are being rejected disproportionately, vague justifications such as "not a good fit" likely won't cut it with the EEOC or OFCCP.
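
If your applicant tracking system can export rejection reasons, even a quick cross-tabulation shows how often each group is rejected for a concrete, documented reason versus a vague one. The field names and records below are hypothetical.

```python
# Rough sketch: cross-tabulate screening rejections by recorded reason so that
# vague justifications like "not a good fit" stand out for each group.
# The records below are illustrative, not real data.
from collections import Counter, defaultdict

rejections = [
    {"group": "Group A", "reason": "missing work authorization"},
    {"group": "Group B", "reason": "not a good fit"},
    {"group": "Group B", "reason": "not a good fit"},
    {"group": "Group A", "reason": "insufficient experience"},
    {"group": "Group B", "reason": "insufficient experience"},
]

reasons_by_group = defaultdict(Counter)
for record in rejections:
    reasons_by_group[record["group"]][record["reason"]] += 1

for group, counts in reasons_by_group.items():
    total = sum(counts.values())
    vague = counts["not a good fit"]
    print(f"{group}: {total} rejections, {vague} recorded only as 'not a good fit'")
```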

If, after analyzing your numbers, you still can't explain why minority candidates are being disproportionately rejected at the screening stage, you may have an unconscious bias problem on your hands. At this point, you should consider a technology solution.

An unconscious bias is a mental shortcut we use to process information and make decisions quickly. These biases are automatic, outside of our awareness, and pervasive: Wikipedia lists more than 180 biases that affect our decision making, memory and social interactions.

Activities like resume screening that require processing large amounts of data very quickly and making decisions about people are especially susceptible to unconscious bias.

These limitations of the human brain are exactly why AI technology is now being applied to recruiting. AI can reduce unconscious bias by pattern matching between the resumes of existing employees and the resumes of candidates, rather than relying on untested proxies such as the school someone graduated from.
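
As a toy illustration only, and not any vendor's actual algorithm, "pattern matching" can be as simple as scoring how much a candidate's resume text overlaps with the resumes of people already succeeding in the role. The resume snippets below are invented.

```python
# Toy illustration (not a production screening algorithm): score a candidate
# by word overlap with the resumes of current employees in the role.

def tokens(text: str) -> set:
    """Lowercase a resume and split it into a set of words."""
    return set(text.lower().split())

def overlap_score(candidate_resume: str, employee_resumes: list) -> float:
    """Average Jaccard similarity between the candidate and current employees."""
    candidate_tokens = tokens(candidate_resume)
    scores = []
    for employee_resume in employee_resumes:
        employee_tokens = tokens(employee_resume)
        shared = candidate_tokens & employee_tokens
        combined = candidate_tokens | employee_tokens
        scores.append(len(shared) / len(combined))
    return sum(scores) / len(scores)

employees = [
    "built data pipelines in python and sql for weekly reporting",
    "managed client projects and analyzed sales data in sql",
]
candidate = "python and sql experience analyzing data and building reports"
print(f"match score: {overlap_score(candidate, employees):.2f}")
```

Real systems are far more sophisticated, but the principle is the same: the comparison is anchored to demonstrated skills and experience rather than to proxies like school name.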

To further prevent unconscious bias at the screening stage, AI can be programmed to ignore demographic-related information such as a candidate's name, address, or college or university.

Some organizations, such as Deloitte UK and the federal government of Canada, are taking the extra step of "blinding" resumes by removing these details before passing them to hiring managers for interview selection.
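
Here is a minimal sketch of what that blinding step might look like, assuming resumes have already been parsed into structured fields (the field names here are illustrative):

```python
# Minimal sketch: redact demographic-linked fields from a parsed resume
# before it is scored or shown to a reviewer. Field names are illustrative.

FIELDS_TO_REDACT = {"name", "address", "school"}

def blind_resume(parsed_resume: dict) -> dict:
    """Return a copy of the resume with demographic-linked fields removed."""
    return {key: value for key, value in parsed_resume.items()
            if key not in FIELDS_TO_REDACT}

candidate = {
    "name": "Lei Zhang",
    "address": "123 Example St",
    "school": "Example University",
    "skills": ["python", "sql", "project management"],
    "years_experience": 5,
}

print(blind_resume(candidate))
# {'skills': ['python', 'sql', 'project management'], 'years_experience': 5}
```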

These technologies and practices are fairly new, but based on the research, their potential is significant: in the resume whitening study above, "whitened" resumes received roughly 75 to 160 percent more callbacks than unaltered ones. Tools and practices that remove that penalty at the screening stage represent the next big breakthrough in reducing bias and increasing diversity in the workplace.
