If you work in HR, you’ve heard the buzz about New York City’s Local Law 144 regarding “automated employment decision tools,” or AEDTs. Passed Dec. 10, 2021, the law will be enforced starting July 5.
The NYC law requires organizations using “automated employment decision tools,” as defined under the law, to notify NYC residents that such a tool is in use and to provide transparency about the tool. It also requires that these tools be audited annually for bias, specifically for disparate impact based on sex and race/ethnicity.
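Disparate impact in these audits is typically assessed with an impact ratio: a category’s selection (or scoring) rate divided by the rate of the most-selected category. Below is a minimal, hypothetical Python sketch of that selection-rate calculation; the category labels and sample data are illustrative only, and the exact methodology should follow the law’s final rules and your independent auditor’s guidance.

```python
# Hypothetical sketch of an impact-ratio calculation by category.
# Category labels and sample data are illustrative; confirm the exact
# methodology with the law's final rules and your independent auditor.
from collections import defaultdict

def impact_ratios(records):
    """records: iterable of (category, was_selected) pairs."""
    totals = defaultdict(int)
    selected = defaultdict(int)
    for category, was_selected in records:
        totals[category] += 1
        if was_selected:
            selected[category] += 1

    # Selection rate = share of applicants in a category who were selected.
    rates = {c: selected[c] / totals[c] for c in totals}
    highest = max(rates.values())

    # Impact ratio = a category's selection rate divided by the highest rate.
    return {c: rate / highest for c, rate in rates.items()}

# Illustrative example: two categories with different selection rates.
sample = ([("A", True)] * 40 + [("A", False)] * 60 +
          [("B", True)] * 25 + [("B", False)] * 75)
print(impact_ratios(sample))  # {'A': 1.0, 'B': 0.625}
```

In this toy example, category B is selected at 0.625 times the rate of category A; an auditor would report ratios like these for each sex and race/ethnicity category the tool evaluates.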
Organizations that fail to comply risk being fined. Currently, the penalties for noncompliance range from $500 for a first violation to $500-$1,500 daily for each subsequent violation. What could prove even more costly, though, is the possibility of class-action lawsuits brought under the law.
Whether or not you hire in NYC, now that the law’s enforcement date is almost here, it’s important to understand the law’s potential implications for you and your organization. This post is not legal advice (if you need legal advice, work with your legal counsel), but here are some things to know about the law.
Implications of NYC’s AI law
This law is significant as the first of its kind to specifically regulate HR software that uses automation or AI. In a recent New York Times article, the reporter called NYC “a modest pioneer in AI regulation,” and other cities and states exploring similar laws are likely to follow suit.
The NYC law defines automated employment decision tools (AEDTs) as “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision-making for making employment decisions that impact natural persons.”
The definition could apply to a wide range of HR technologies, including candidate-sourcing software, résumé-reviewer software, and software that ranks applicants or tracks employee performance.
At Eightfold’s U.S. Cultivate event, labor experts said that organizations should not fear the law but see it as an opportunity to use AI and automated tools to help mitigate bias in their hiring practices. In fact, they said that a movement toward regulating these tools isn’t something to be anxious about. Eightfold believes that the responsible use of AI in hiring will help HR and talent leaders comply with regulations and reduce bias.
Related content: Watch Eightfold AI’s Chief People Officer Darren Burton interview EEOC Commissioner Keith Sonderling at Cultivate ’23 as they discuss automated HR technologies and regulations. See the full session here.
How your organization can prepare for compliance now
Regardless of your organization’s location, now is the time to be proactive. Laws regarding AI systems and hiring are developing around the globe.
As highlighted by HR Executive, there are several things organizations hiring in NYC must do to ensure compliance, and that your organization may want to consider doing proactively.
- The employer or employment agency is responsible for ensuring that bias audits are conducted by an independent auditor, as defined by the law. The auditor cannot be employed by, or have certain financial interests in, the employer, employment agency, or the vendor of the tool.
- Audit summaries must be made publicly available.
- Candidates who are NYC residents must be notified in advance that an AEDT will be used. The law also requires other notices and disclosures to facilitate AI transparency and explainability.
- HR leaders should assume that these audits will be ongoing and performed on a regular cadence. The NYC law requires that AEDTs be audited annually.
How talent intelligence platforms can help you stay compliant while transforming HR
With new regulations like NYC’s, you may wonder whether AI is worth implementing for your HR team.
“AI actually makes HR processes more transparent because AI records what that algorithm is looking for,” said Keith Sonderling, Commissioner of the U.S. Equal Employment Opportunity Commission (EEOC) in our podcast. “We can also look at the data set itself and see if there was any discrimination.”
In fact, AI platforms for HR can help organizations transform their workforces for the better, increasing equity, retention, and performance while providing new opportunities for candidates and employees. Key considerations when using AI are transparency and explainability, along with non-discrimination, human oversight, and accuracy.
Craig Leen, former Director of the Office of Federal Contract Compliance Programs (OFCCP) at the U.S. Department of Labor and a member of our AI Ethics Council, said that AI is beneficial in combating bias because it can assess thousands of résumés and identify applicants who might have been overlooked due to human bias — whether that human bias is based on demographic information, education, experiences, or anything else.
“Good AI is more likely to identify people who have historically been overlooked or subject to unconscious bias,” Leen said.
Leen added that using AI responsibly requires prioritizing the right AI vendors. Vendors should be transparent about how their software works, and how they monitor and audit it — and provide those details upon request.
You can find more questions to ask vendors about their AI in our guide to deep-learning AI.
Related content: Watch our U.S. Cultivate ’23 session with former OFCCP Director Craig Leen for more on AI and compliance. See the full session here.
Our takeaway on new and existing laws that regulate AI in hiring and employment
At Eightfold, we take the responsible and ethical use of AI seriously, and we believe that, used correctly, AI can mitigate bias in hiring and other employment-related decisions.
It’s important to note that Eightfold does not consider itself an AEDT vendor. Out of an abundance of caution, we conduct audits as a service to help our customers comply.
To support our customers and educate others about what they should consider in an AI-powered talent intelligence platform, we explain our matching model and how we audit for compliance with Local Law 144. A third-party bias audit of the Eightfold matching model was conducted in May by an independent auditor, BABL AI. Summaries of our third-party audits are available to customers upon request.
Our software engineers have also recently drafted our white paper Responsible AI at Eightfold to further explain how our AI works.
As our General Counsel, Roy Wang, recently told The New York Times, “We believe we can meet the law and show what good A.I. looks like.”
Set yourself up for success
The NYC law requires that employers and employment agencies using AEDTs audit those tools for bias, but that doesn’t mean they shouldn’t consider AI-powered tools and platforms. A talent platform from a trusted vendor can actually help organizations comply with employment laws and reduce bias.
And while organizations outside NYC may feel anxious about this new law’s implications, they can set themselves up for success by working with a reputable vendor and by working closely with their own legal and compliance teams to review both new laws and existing laws that regulate the use of AI in employment.
“I’d say in five to 10 years if you’re not using artificial intelligence as a significant part of your hiring process that regulators are going to be concerned because they know that the AI is neutral,” Leen said at Cultivate.
“I wouldn’t run away from AI,” he continued. “I would embrace it. Now. Become familiar with it. Make sure your HR leadership is familiar with it, is trained on it, and is learning about AI.”
Eightfold uses AI techniques and other methods that mitigate biases against any individual, including with respect to gender and race. Please review our white paper Responsible AI at Eightfold or our webpage on matching models for more details. If you have questions regarding your application with an employer that uses Eightfold’s service, you may contact the employer directly or complete this form.