Last week, the Equal Employment Opportunity Commission (EEOC) released a "technical assistance document" that it characterized as a "new resource" relating to the use of automated decision-making tools in employment decisions, such as hiring and promotion. Although we are seeing frequent announcements from federal agencies focusing on artificial intelligence ("AI") and machine learning systems used in key industry sectors, this release from the EEOC does not provide much new information.

The release simply confirms that "disparate impact" or "adverse impact" might exist if the use of automated decision-making tools in hiring and promotion decisions results in people from certain protected classes being disproportionately not selected for opportunities, even if there is no discriminatory intent. Notably, the release utilizes a broad definition of the term "algorithmic decision-making," including "automatic resume-screening software, hiring software, chatbot software for hiring and workflow, video interviewing software, analytics software, employee monitoring software, and worker management software." Many of these tools do not involve AI or machine learning at all. As a result, employers using a range of different technology solutions may be left to wonder whether a given solution is covered by the EEOC's technical assistance document.

The release also confirms that employers typically will be responsible for any discriminatory results produced by software they license from third parties. Thus, employers should carefully consider whether and how technology solutions they acquire will increase or decrease the risk of discriminatory results.

Disparate Impact Discrimination and the Four-Fifths Rule

Disparate impact discrimination occurs when a seemingly neutral employment policy or practice has a disproportionately negative effect on individuals in a particular protected class. For example, minimum height requirements and physical strength tests are facially neutral but may have a disparate impact on women; an ad that seeks applicants with no more than two years' experience is facially neutral but may have a disparate impact on older candidates. Similarly, automated decision-making software may not explicitly consider an applicant's race, gender, or other protected characteristics, yet for facially neutral reasons it may disproportionately exclude people of a certain race, gender, or other protected class.

The EEOC release mentions the "four-fifths rule" as a quick test to determine whether a selection procedure has a disparate impact on a protected class. The rule is that a selection rate for any protected class that is less than four-fifths (80%) of the selection rate for the group with the highest selection rate generally constitutes evidence of disparate impact. To calculate a potential disparate impact under the four-fifths rule, one must take the following steps (a worked sketch follows this list):

  1. Calculate the rate of selection for each group by dividing the number of persons selected from a group by the number of applicants from that group;
  2. Observe which group has the highest selection rate;
  3. Calculate the impact ratios by comparing the selection rate for each group with that of the highest group, i.e., divide the selection rate for a group by the selection rate for the highest group; and
  4. Observe whether the selection rate for any group is substantially less (i.e., usually less than four-fifths or 80%) than the selection rate for the highest group, which indicates there may be discrimination.
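To make the arithmetic concrete, the short Python sketch below walks through these four steps using hypothetical applicant and selection counts; the group labels and numbers are illustrative assumptions, not figures drawn from the EEOC release.

```python
# Illustrative four-fifths rule calculation using hypothetical data.
# The group labels and counts below are assumptions for demonstration only.

applicants = {"Group A": 200, "Group B": 100}  # applicants per group
selected = {"Group A": 80, "Group B": 24}      # persons selected per group

# Step 1: selection rate for each group (selected / applicants)
rates = {group: selected[group] / applicants[group] for group in applicants}

# Step 2: identify the highest selection rate
highest_rate = max(rates.values())

# Steps 3 and 4: impact ratio for each group; flag any ratio below 0.8 (four-fifths)
for group, rate in rates.items():
    impact_ratio = rate / highest_rate
    status = "potential disparate impact" if impact_ratio < 0.8 else "within four-fifths"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {impact_ratio:.2f} ({status})")
```

In this hypothetical, Group B's selection rate (24%) is only 60% of Group A's rate (40%), falling below the four-fifths (80%) threshold and therefore indicating potential disparate impact that would warrant closer review.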

Even if a selection process violates the four-fifths rule, the analysis does not end there. If the employer can show that the selection process is job-related and consistent with business necessity (e.g., it is necessary to the safe and efficient performance of the job), the burden would shift back to the employee or EEOC to show that there is a less discriminatory alternative available.

Takeaways

The EEOC will continue to struggle to evaluate automated decision-making systems unless and until it identifies problematic software.

The EEOC rarely conducts audits or investigations into hiring practices unless it receives a charge alleging discrimination. The less visible an automated decision-making process is to job applicants or the public, the less likely it is to be the subject of a charge and thus to come under scrutiny by the EEOC.

For disparate impact claims, the EEOC historically has focused on "trends" in hiring tests and criteria once it appears that employers are using a problematic practice. If the EEOC discovers that certain hiring or promotion software caused a problem for one employer, it may seek to investigate other employers' use of the software.

Unlike the EEOC, the Office of Federal Contract Compliance Programs (OFCCP), which enforces the nondiscrimination and affirmative action obligations of federal contractors, requires federal contractors and subcontractors that meet certain thresholds to document and analyze their selection rates as part of any affirmative action program, and OFCCP regularly conducts audits of companies' data.

Employers should continue to analyze their use of automated decision-making tools and understand applicable recordkeeping requirements.

Consistent with our prior advisories, employers who use or are considering using automated decision-making tools should carefully evaluate the system for accuracy and bias. The EEOC "encourages employers to conduct self-analyses on an ongoing basis" to evaluate discriminatory impacts and to consider less discriminatory means of automated decision-making. Notably, the EEOC states that "[f]ailure to adopt a less discriminatory algorithm that was considered during the development process [] may give rise to liability." This seems to suggest that employers have a duty to conduct diligence on "less discriminatory" algorithms during the selection process. Conducting these analyses with the assistance and advice of counsel offers some protection from disclosure and potential liability.

Employers that license automated decision-making systems should also review their contracts and diligence processes, including securing a clear understanding of how the AI developer tested and validated its tools for accuracy and bias and how liability may be allocated should legal issues involving alleged discrimination arise.

Employers may also have specific recordkeeping or auditing requirements, for example, if they are hiring for positions in New York City. The EEOC release does not discuss recordkeeping requirements, but Title VII requires employers with 100 or more employees to maintain hiring records reflecting the demographics of their job applicants and those hired, and to conduct annual adverse impact analyses for demographic groups that make up "at least 2 percent of the labor force in the relevant labor area or 2 percent of the applicable workforce." The employer can use either the four-fifths rule or another statistical method to determine any adverse impact. If the employer detects an adverse impact and continues to use the selection process, it must conduct and retain records of validity studies that demonstrate the legitimate basis for using the selection process.