How the right technology can circumvent unconscious hiring bias

Unconscious hiring bias works silently to undermine hiring initiatives. Here’s how to use AI and technology to fight back.

Recruiters and hiring managers seek candidates who offer the skills and experience to perform outstanding work in a given role. Although those managers will work hard to focus on the factors that are relevant to a given position, their own biases can work against them — often, without them even realizing it.

Most recruiters and hiring managers don’t consciously employ their biases when they hire. Many even make a conscious effort to spot their own biases and avoid them. It’s the unconscious nature of certain biases that makes them particularly pernicious: They can act to the detriment of hiring even when hiring managers believe they’re working to eliminate them.

Unconscious bias has a well-documented effect on a range of activities, including hiring. Choosing the right tech tools, however, can help hiring managers outsmart their brain’s own hidden agenda.

Unconscious bias and how it works

According to Dr. J. Renee Navarro, vice chancellor of diversity and outreach at the University of California San Francisco, “Unconscious bias refers to the attitudes or stereotypes that affect our understanding, action or decisions in an unconscious manner.” We aren’t aware we carry these pre-formed ideas about others; instead, those ideas work behind the scenes to influence our behavior in subtle but potentially devastating ways.

Unconscious biases form over time. As our brains are exposed to various pieces of information, they sort that information and seek patterns. A study in the Journal of Cognitive Neuroscience by Hugo J. Spiers and fellow researchers found that the brain’s anterior temporal lobe plays a key role in tracking and sorting information — and that a person’s impression of other races, genders and ethnic groups is built on the information the brain receives.

The study indicates that media portrayals as well as personal experience play a profound role in how our unconscious biases develop.

Studies show the impact of unconscious bias

Because unconscious bias operates outside our everyday awareness, people in positions of authority often don’t realize it affects their behavior. For example, a study of 6,500 university professors by researchers Katherine L. Milkman, Modupe Akinola and Dolly Chugh found that professors were more likely to respond to emails from students with white-sounding names, even when those names made up just a fraction of the total emails professors received.

Why fight hiring bias?

Aside from ethical and compliance reasons, fighting bias in order to diversify hiring has concrete benefits for companies. One study by the nonprofit organization Catalyst found that companies with more women in executive positions had a 34 percent higher shareholder return and a 26 percent higher return on invested capital than companies with less diverse leadership.

“We controlled for industry and company differences and the conclusion was still the same,” says Ilene H. Lang, former president and CEO of Catalyst. “Top-performing companies have a higher representation of women on their leadership teams.”

A study by Vivian Hunt and fellow researchers at McKinsey & Company found a similar effect when leadership contained more members of racial and ethnic minorities. The study found that companies with more of these minorities in leadership were 33 percent more likely to outperform competitors on EBIT margin.

Tech tools to combat unconscious hiring bias

A study by Martin Wood and fellow researchers at NatCen, carried out for the UK’s Department for Work and Pensions, found that job applicants with a white-sounding name were 74 percent more likely to receive a response than applicants with a name that sounded as if it came from an ethnic minority.

Fortunately, technology can help cut down on the effect of these hidden biases when it’s implemented in a thoughtful way.

Automatic identifier screening

The NatCen study pinpointed one method for combating unconscious bias: using a technology platform that automatically anonymizes applications.

The researchers found that applicants’ names were less likely to affect the results when candidates applied through the employer’s own application system than when they submitted resumes. The researchers surmised that the system made it easier to strip names from applications, which forced hiring managers to focus on each candidate’s skills and experience.
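As a rough illustration of how such a platform might strip identifiers before a reviewer ever sees an application, here is a minimal Python sketch. The field names and the dictionary-based record format are assumptions for illustration, not any particular vendor’s API.

```python
import uuid

# Fields that could reveal protected characteristics (illustrative list).
IDENTIFYING_FIELDS = {"name", "email", "phone", "photo_url", "date_of_birth"}

def anonymize_application(application: dict) -> dict:
    """Return a reviewer-facing copy of an application with identifying fields removed."""
    screened = {k: v for k, v in application.items() if k not in IDENTIFYING_FIELDS}
    screened["candidate_id"] = uuid.uuid4().hex  # anonymous handle for follow-up
    return screened

raw_application = {
    "name": "Jordan Smith",
    "email": "jordan@example.com",
    "skills": ["Python", "SQL"],
    "years_experience": 6,
}
print(anonymize_application(raw_application))
# Reviewers see only skills, experience and an anonymous ID.
```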

Predictive analysis

Technology that employs predictive analysis can also help combat bias by substituting data-based information for human hunches when it comes to choosing people who are likely to perform well in a given role, Lovepreet Dhaliwal writes at The Undercover Recruiter. It does so by analyzing data such as employee performance reviews, resumes and turnover to determine which skills and abilities are most common in each role’s or department’s best performers.

By using predictive analysis, recruiters and hiring managers gather concrete data on which to base their decisions. They no longer have to rely on assumptions, which are frequently affected by unconscious bias.
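A minimal sketch of that idea in Python: given some employee records, count which skills appear most often among a role’s top performers. The record format, the field names and the 4.0 rating threshold are illustrative assumptions, not a description of any specific analytics product.

```python
from collections import Counter

# Hypothetical historical employee data pulled from reviews and resumes.
employee_records = [
    {"role": "analyst", "skills": ["SQL", "Python"], "performance": 4.8},
    {"role": "analyst", "skills": ["SQL", "Excel"], "performance": 4.5},
    {"role": "analyst", "skills": ["Excel"], "performance": 3.1},
]

def top_performer_skills(records, role, threshold=4.0):
    """Count skills among employees in `role` rated at or above `threshold`."""
    counts = Counter()
    for record in records:
        if record["role"] == role and record["performance"] >= threshold:
            counts.update(record["skills"])
    return counts.most_common()

print(top_performer_skills(employee_records, "analyst"))
# e.g. [('SQL', 2), ('Python', 1), ('Excel', 1)]
```

In practice the same pattern scales up to thousands of records and richer models, but the principle is the one described above: decisions are anchored in observed performance data rather than a hiring manager’s hunch.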

AI-based interviewing

AI-enabled chatbots are making it easier to collect initial information from candidates during screening interviews without allowing unconscious bias to contaminate that information collection process.

For instance, a chatbot can ask questions that focus on certain skills, collect the answers, and analyze them or pass them on to a human recruiter. The humans involved in the hiring process see answers to questions that are relevant to job performance, but they don’t see irrelevant information such as a candidate’s gender, ethnicity or apparent age.
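As a rough sketch of that flow, the Python below asks a fixed set of skill-focused questions, collects the answers, and hands the recruiter only an anonymous candidate ID plus the responses. The questions, field names and hand-off format are assumptions for illustration, not a real chatbot product’s API.

```python
# Skill-focused screening questions (illustrative).
SCREENING_QUESTIONS = [
    "How many years have you worked with Python?",
    "Describe a data pipeline you have built and maintained.",
]

def run_screening(ask):
    """`ask` is any callable that poses a question and returns the candidate's answer."""
    return [{"question": q, "answer": ask(q)} for q in SCREENING_QUESTIONS]

def recruiter_summary(responses, candidate_id):
    # The recruiter sees only an anonymous ID and the skill-related answers,
    # not the candidate's name, photo or other demographic cues.
    return {"candidate_id": candidate_id, "responses": responses}

if __name__ == "__main__":
    answers = run_screening(input)  # replace `input` with a chat widget in practice
    print(recruiter_summary(answers, "c-1042"))
```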

Smarter assessments

Companies have used skills assessments for decades in order to see candidates’ skills in action. Combining artificial intelligence with these assessments can improve the results and also help companies eliminate bias.

By using smart assessments, companies can focus on what each candidate does best, says Dean Takahashi at VentureBeat. They can also help collect the data necessary for hiring managers and recruiters to spot patterns in their top performers.

Using technology to fight hiring bias

Buzz about artificial intelligence, machine learning, predictive analytics and similar tools has filled the business world in recent years. These technologies aren’t a magical, one-size-fits-all solution to every hiring problem, though. Instead, companies must consider how those tools will be incorporated into their specific hiring processes, internal cultures and end goals.

Seek options that multitask

A technology platform that helps reduce or eliminate hiring bias can improve candidate quality and diversity. When used correctly, it can also help identify patterns that lead to a better candidate fit and thus reduce turnover, says Natalie Pierce, co-chair of the robotics, AI and automation industry group at Littler Mendelson.

Because artificial intelligence and machine learning are excellent at recognizing patterns, they can be used to generate analyses of hiring data, as well. Human resources staff, recruiters and hiring managers can use the data to understand what types of hires do best in certain positions and to tailor their job descriptions and employment branding accordingly.

Use what you’ve built

Hiring more qualified and diverse candidates is the first step to building stronger teams, but it isn’t the only essential step, Laura Berger writes at Forbes. “The rest of the answer lies in facilitating inclusiveness whereby everyone is valued and group differences are embraced,” she says.

“The result is empowered employees who openly share their diverse perspectives: a win for the company.”

Images by: HONGQI ZHANG/©123RF.com, Mikko Lemola/©123RF.com, Dmitriy Shironosov/©123RF.com
