Is automation the answer to fighting gender discrimination against women? Some big tech companies seem to think so. But after Amazon’s experimental hiring A.I. learned to favor men over women, the question is whether automation will stop discrimination or make it worse.
In this blog post, I will discuss recent reports that tech giant Amazon abandoned its experimental hiring tool because it had learned to discriminate against women. I will explain how Title VII protects against gender discrimination in hiring, and discuss what that could mean in a world where initial applicant screening is done through automation.
Some of the country’s largest employers have been turning to computers to help them speed up the hiring process. Goldman Sachs created a resume analysis tool to match candidates with different divisions within the company. Microsoft offers employers an algorithm to rank candidates based on job postings uploaded to LinkedIn, the company’s online professional network. Others use automated systems to screen unqualified individuals out of the applicant pool. A 2017 CareerBuilder report suggested that as many as 55% of U.S. human resource managers planned to incorporate artificial intelligence (A.I.) into their hiring processes within the next five years.
For some employers, A.I. helps promote diversity in hiring. Automated recruiting networks at HireVue, for example, helped companies look beyond the Ivy League to find highly qualified candidates from less sought-after schools. But others see the technology as only one step in a larger process. John Jersin, vice president of LinkedIn Talent Solutions, says,
“I certainly would not trust any AI system today to make a hiring decision on its own. . . . The technology is just not ready yet.”
Those reservations seemed justified after Reuters reported that Amazon’s experimental hiring A.I. had been discontinued because, among other things, it had learned to discriminate against women. Amazon began working on the project in 2014. Its machine-learning specialists were trying to mechanize recruitment searches by creating a computer program that could identify top talent. The tool would scan each applicant’s resume and rate it from one to five stars.
But within a year, Amazon realized it had a problem: the system had taught itself to prefer male candidates. Like most machine-learning systems, the program was designed to observe patterns in the successful resumes submitted to the company over time. Those resumes came with biases of their own.
Amazon’s workforce is 60% male. Across the industry, in technical roles like software development, male employees outnumber female employees 3 to 1. Because the tech industry has a gender disparity problem, so did the pool of successful resumes the program learned from.
Over time, the program learned that the skills needed to do the job -- like the ability to write code -- appeared in almost every resume. Instead, the technology found distinctions in the way applicants described themselves. It came to favor language more common on male applicants’ resumes, such as “executed” and “captured.” It also reportedly penalized resumes that included the word “women’s” (as in “women’s chess club captain”) and downgraded graduates of two all-women’s colleges.
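The failure mode described above can be illustrated with a toy sketch. The data and scoring method below are entirely hypothetical -- this is not Amazon’s actual model -- but they show how a scorer trained on a historically male-skewed pool of “successful” resumes ends up rating words like “women’s” poorly, simply because they rarely appeared among past hires.

```python
# Toy illustration (hypothetical data, NOT Amazon's system): score new
# resumes by how often their words appeared in past successful resumes.
from collections import Counter

# Hypothetical historical pool of hired candidates, skewed 3:1 male.
past_hires = [
    "executed project captured market wrote code",
    "executed migration wrote code led team",
    "captured requirements executed rollout wrote code",
    "women's chess club captain wrote code",
]

# Learn word frequencies from the historical pool.
word_counts = Counter(w for resume in past_hires for w in resume.split())

def score(resume: str) -> float:
    # Average historical frequency of the resume's words. Words that
    # rarely appeared among past hires drag the score down, even if
    # they say nothing about ability to do the job.
    words = resume.split()
    return sum(word_counts[w] for w in words) / len(words)

print(score("executed strategy wrote code"))            # higher score
print(score("women's chess club captain wrote code"))   # lower score
```

Nothing in this sketch mentions gender explicitly, yet the learned weights reproduce the skew of the training pool -- which is exactly why simply removing a “gender” field from the data does not make such a system neutral.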
Favoring, or downgrading, an applicant based on gender is illegal under Title VII of the federal Civil Rights Act. The law prohibits employers from making hiring decisions based on a person’s sex or gender (including how well he or she conforms to gender stereotypes). When discrimination against women becomes part of the program making hiring decisions, a Title VII violation seems likely.
The law doesn’t require a potential employee suing for gender discrimination to prove that the person making the hiring decision intended to discriminate. Some cases have been won simply on a showing of “disparate impact” -- that a facially neutral hiring practice disproportionately screened out one gender. Even if the machine-learning team didn’t mean for their hiring A.I. to discriminate against women, the effect of screening out female applicants could be enough for a court to find the company violated Title VII.
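One common screen for disparate impact is the EEOC’s “four-fifths rule”: if one group’s selection rate is less than 80% of the most-selected group’s rate, the practice may be treated as evidence of adverse impact. The applicant and selection numbers below are hypothetical, used only to show the arithmetic.

```python
# Illustrative four-fifths (80%) rule arithmetic with hypothetical numbers.
def selection_rate(selected: int, applicants: int) -> float:
    return selected / applicants

men_rate = selection_rate(selected=60, applicants=100)    # 0.60
women_rate = selection_rate(selected=30, applicants=100)  # 0.30

# Compare the disadvantaged group's rate to the highest group's rate.
impact_ratio = women_rate / men_rate

print(f"impact ratio: {impact_ratio:.2f}")
# A ratio below 0.80 suggests adverse impact under the four-fifths rule.
flagged = impact_ratio < 0.80
```

The rule is a rough screen rather than a legal conclusion, but it shows how a plaintiff can point to outcomes alone -- no proof of intent required -- to support a disparate-impact claim.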
Fortunately, Amazon recognized the problems with an A.I. that weighs gender in its hiring protocol. At first, the team edited the program to neutralize those particular forms of bias. But advocates warned there was no way to know whether the program was developing other discriminatory ways of sorting candidates.
Ultimately, Amazon disbanded the project in 2017. The company said the program was never used by recruiters to evaluate candidates, and that it had other problems that kept it from going to market. Given the trends in hiring and automation, however, it may be only a matter of time before Amazon, Google, or another technology company releases a hiring A.I. that automates applicant screening and could hard-code gender discrimination into future hiring decisions.
Gender discrimination against women in hiring happens all across the country, in nearly every industry. Whether it is because of automation or old-fashioned biases, you have a right to be considered based on your qualifications, not your sex. If you believe you have been passed over for employment due to your gender, the employment discrimination attorneys at Eisenberg & Baum, LLP, can help. We will review your situation and help you plan a strategy that preserves all of your claims and protects your rights. Contact Eisenberg & Baum, LLP, today to talk to an employment discrimination attorney.