
EU AI Act explained

What does the EU AI Act mean for HR leaders?

This information is provided for general informational purposes only and does not constitute legal advice. This page may reference laws, standards, or industry practices, but the information is provided without any legal representation or warranty. If you have questions about how any law or regulation applies to your specific situation, you should consult your legal counsel.

The EU AI Act (Regulation (EU) 2024/1689) requires Eightfold to adhere to the compliance requirements in Chapter III. While most of the requirements apply to Eightfold as the developer of a high-risk AI system, Article 26 and the related Recitals 91–95 outline the responsibilities of deployers — Eightfold customers — for the safe deployment of our services.

Below is an overview of these requirements and how Eightfold supports its customers’ compliance with their obligations.

This page is part of Talent / Syntax, Eightfold’s reference guide to AI in talent.

ARTICLE 26

Obligations on deployers under Article 26.

Use Eightfold in accordance with the Instructions for Use

Customers should follow the Instructions for Use provided by Eightfold when deploying our AI system and should implement suitable safeguards to protect the privacy of applicants and employees, such as role-based access controls. Customers should follow release notices and product documentation and adhere to the maintenance, update, and configuration practices described in the Instructions for Use and during customer onboarding. (Article 26(1); Recital 91)

Assign competent human oversight

Customers should ensure that the HR professionals and hiring managers using the Eightfold platform understand its main features, such as the AI matching model, including the Match Score, job calibration, and how the Eightfold Intelligence Platform matches candidates and roles. Users should also understand how to interpret the models' output and how to exercise human oversight in line with the "Human Oversight" section in the Instructions for Use. Customers should maintain internal policies clarifying decision rights, internal escalation paths, and training requirements. (Article 26(2)–(3); Recital 91)

Ensure input data is relevant and sufficiently representative

Where deployers control input data, they must ensure the relevance and representativeness of such data for the intended purpose. Customer-controlled input data includes data provided via the customer's ATS/HRIS, job descriptions, and job calibrations. Customers, as the data controllers, should ensure they have processes in place to maintain the accuracy and relevance of this data, keep job calibrations up to date, and remove outdated or irrelevant requirements that may distort the matching. Match Score does not rely on "special category data" for scoring. (Article 26(4))

Monitoring

Eightfold monitors the platform's performance, model accuracy, and potential drift on an ongoing basis as required by Article 72 of the EU AI Act. If users observe irregular model behavior or other incidents suggesting that the platform might present a risk, or if a serious incident is identified, they should inform Eightfold. If a serious incident is confirmed, users should be prepared to suspend use of the platform until the issue has been resolved and, if applicable, notify the relevant market surveillance authority. (Article 26(5); Recital 91)

Logging

Deployers must keep logs generated by the high-risk AI system for at least six months, unless applicable law provides otherwise. Eightfold has implemented logging across activities within the security authorization boundary, including automated alerting for specified events. Customer users' actions on the Eightfold platform are logged in Eightfold for the customer's own recordkeeping and reporting. (Article 26(6))

Transparency

Employers must inform affected workers and, where applicable, their representatives before putting the AI system into service in the workplace. Customers should explain to workers, in plain language, the intended purpose of the AI system, the role of the AI, and the human oversight measures in place, and should maintain evidence of this notification as part of their deployment records. (Article 26(7); Recital 92)

Because the Eightfold platform is intended to assist customers in identifying how a candidate’s skills match job descriptions and requirements, and may be used to support customers in their hiring practices, customers should review their notices and disclosures regarding the use of AI. Customers should provide candidates and/or employees with a clear notice explaining how AI is used to assist decision-making and describe the intended purpose and/or main elements of the decisions supported. (Article 26(11); Recital 93)

Conduct a Data Protection Impact Assessment

Customers are required to conduct a Data Protection Impact Assessment (DPIA) regarding their use of the Eightfold platform. Eightfold makes a number of materials available — including Instructions for Use — through the Eightfold Trust Center (trust.eightfold.ai) to support customers' assessment requirements. (Article 26(9))

Cooperate with authorities

Deployers must cooperate with the competent authorities in actions related to compliance with the Regulation. This may include providing information on deployment, logs, training, monitoring, and mitigations upon request. (Article 26(12))


Compliance materials live in the Eightfold Trust Center.

Instructions for Use, security documentation, and deployment resources to support your Article 26 obligations.
