What is an Automated Employment Decision Tool? And What If I’m Using One?

The meteoric rise of AI in recruitment has brought with it both previously unimaginable opportunities and existential challenges. And nothing in the realm of AI-assisted hiring straddles the line of possible risk and reward quite like the automated employment decision tool (AEDT).

AEDTs offer significant efficiencies and the potential to reduce human bias, but their direct impact on hiring decisions necessitates ethical use and has given rise to regulatory considerations. Understanding the differences between AEDTs and other AI recruitment tools is crucial for employers aiming to stay compliant with legal standards while fostering a fair, swift, and effective hiring process.

What is an Automated Employment Decision Tool (AEDT)?

Automated employment decision tools are AI-powered technologies that “substantially assist or replace discretionary decision making” in hiring or promotions. The automated employment decision tool umbrella covers tools that evaluate resumes, rate candidates, assess interviews, and more. Common examples include screening software that filters applications based on keywords and interview platforms that analyze candidate responses and behaviors. 

By automating parts of the decision-making process, automated employment decision tools aim to increase efficiency and reduce human bias – which may sound familiar if you’ve explored almost any AI solution for hiring (guilty!). But AEDTs differ from other AI products used in recruitment in one key way. 

Other AI tools support the recruitment process without passing judgment on individual candidates, while AEDTs are designed to directly influence candidate selection. Understanding this stark contrast is critical to leveraging AI effectively yet responsibly in your hiring process while complying with legislation like New York City’s Local Law 144 – and surely much more to come.

Not all AI tools are AEDTs

AI technology encompasses a wide range of offerings designed to assist various aspects of talent acquisition. Beyond AEDTs, other AI applications have shown promise to streamline workflows, unlock efficiencies, improve candidate experience, and drive hiring equity. The list is growing by the day, but here are some examples to get you started.

Of course, at the rate of innovation we’re experiencing today, we can expect lines to become increasingly blurred and labels less definitive. For instance, while a predictive analytics tool that estimates how long it will take to fill an open req is not an AEDT, a predictive analytics tool that projects future candidate success in a role most certainly is. (And if an analytics tool does both, then its classification fluctuates based on how it’s utilized by a particular employer.)

The bottom line? Automated employment decision tools are determined less by what they do and more by how and why they’re used in a hiring process. If you’re outsourcing any evaluation of job applicants’ qualifications to AI – whether it takes the form of stack ranking resumes, scoring recorded interview responses, or matching the results of personality assessments to those of high performers in your organization – you’re leveraging at least one AEDT.

And that may be okay. But you must be aware of the implications, both legal and ethical.

Differences in the regulation of AI recruitment tools

Automated employment decision tools are governed differently in the US from other AI recruitment tools because they directly affect individual hiring decisions and, at a larger scale, fairness, bias, and discrimination across hiring.

Unlike the myriad solutions that play supporting roles in recruiting workflows, AEDTs directly influence which candidates are shortlisted, interviewed, or hired. This direct impact on employment outcomes has drawn the interest of governing bodies, resulting in AEDTs being subject to stricter regulations.

Though only a handful of US jurisdictions today explicitly regulate AI in hiring, employers nationally still must comply with federal Equal Employment Opportunity (EEO) laws that make it illegal to discriminate against a job applicant or an employee. 

If your organization is found to be putting protected groups at an unfair disadvantage in the hiring process because of biases inherent in an automated employment decision tool, you’re at risk of lawsuits and fines. In August 2023, for example, the EEOC settled an age discrimination lawsuit alleging that the e-learning company iTutorGroup’s hiring software was programmed to automatically reject female applicants aged 55 or older and male applicants aged 60 or older; the company agreed to pay $365,000 to resolve the case.

No matter where in the country you do business, and whether the AEDTs in your talent acquisition efforts are proprietary or third-party (you remain liable for vendor tools as a customer), here are some ways you can better comply with both AI-specific and EEO regulations. Embracing compliance goes beyond avoiding legal pitfalls: it fosters a diverse, inclusive workplace, enhances your employer brand, and powers a fairer, more efficient hiring process.

  • Conduct regular audits and bias assessments of your AEDTs to ensure they are not perpetuating or introducing biases. Identify and correct any biases or discriminatory patterns.
  • Provide transparency regarding how decisions are made, including explaining the criteria and algorithms used. Disclose your use of AEDTs in a “clear and conspicuous manner” to job seekers in your job descriptions, and give them the opportunity to request an alternative selection process.
  • Work with AI developers and vendors that prioritize diversity, equity, inclusion, and belonging (DEIB) on their teams – especially given the homogeneity of the tech sector, where most AI is built. Diverse engineering teams bring a variety of perspectives and experiences to the table, better enabling them to mitigate biases in their models. 
  • Contractually require AI suppliers to comply with all pertinent employment and non-discrimination laws.
  • Ensure that hiring team members are adequately trained to use AEDTs correctly and ethically.
  • Stay abreast of anti-discrimination laws as they are passed, both nationally and in the jurisdictions where you recruit and hire employees. Since our regulatory environment is evolving rapidly, collaborate cross-functionally with your HR, legal, data, and/or compliance teams to monitor new requirements and ensure that your AI systems comply with changing frameworks.
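To make the first bullet above concrete: bias audits under NYC’s Local Law 144 center on selection rates and impact ratios, and the EEOC’s long-standing “four-fifths rule” treats an impact ratio below 0.8 as a signal worth investigating. Below is a minimal, illustrative sketch of that arithmetic. The function name, category labels, and numbers are all hypothetical; a real audit must be performed by an independent auditor against your actual applicant data and intersectional categories.

```python
# Illustrative impact-ratio calculation of the kind used in bias audits.
# All data and names here are hypothetical placeholders.

def impact_ratios(outcomes):
    """outcomes maps each demographic category to (selected, total_applicants).

    Returns, per category, its selection rate and its impact ratio:
    the category's selection rate divided by the highest category's rate.
    """
    rates = {cat: sel / total for cat, (sel, total) in outcomes.items()}
    top_rate = max(rates.values())
    return {cat: (rate, rate / top_rate) for cat, rate in rates.items()}

# Hypothetical screening outcomes: (candidates advanced, candidates screened)
audit = impact_ratios({
    "group_a": (60, 100),  # 60% selection rate
    "group_b": (30, 100),  # 30% selection rate
})

for cat, (rate, ratio) in audit.items():
    # Four-fifths rule of thumb: ratios under 0.8 merit closer review.
    flag = "review" if ratio < 0.8 else "ok"
    print(f"{cat}: selection rate {rate:.2f}, impact ratio {ratio:.2f} ({flag})")
```

In this toy example, group_b’s impact ratio is 0.30 / 0.60 = 0.50, well under the 0.8 threshold, which is exactly the kind of pattern an audit should surface and your team should then investigate and correct.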

Remember that while non-AEDT AI isn’t regulated nearly as strictly from an employment law perspective, there are still privacy law considerations. Any tool – chatbot, scheduling assistant – that collects and processes candidate data must comply with data protection laws such as the California Consumer Privacy Act (CCPA) and the EU’s General Data Protection Regulation (GDPR), which mandate strict data privacy and security measures.

Make informed decisions about automated employment decision tools

Understanding the distinctions between AEDTs and other technology is critical for deploying any AI in your hiring process and talent strategy effectively and responsibly. Fortunately, dozens (if not hundreds) of AI-powered tools on the market today – from incremental optimizations like automated interview schedulers to analytics platforms that produce revelatory insights – have the potential to unlock new levels of hiring efficiency, fairness, and quality when used strategically.

But choosing the right AI to achieve your TA and business goals – and then properly onboarding and managing that AI over the long term – takes careful consideration. If you’re ready to take the next step in your AI journey, join us for an in-depth exploration of this transformative tech alongside Dominik A. Hahn, Global Head of Group Talent Acquisition at Allianz. Register here for this upcoming webinar, “Assistive Intelligence” (AI) in Recruiting.

Subscribe to stay in the know 💡

Sign up for the Datapeople newsletter to receive all the illuminating data, valuable insights, and actionable tips today's recruiting leaders can't afford to miss.