When it comes to compliance, AI should enhance equality

Talent leaders shouldn’t worry about AI harming equal employment opportunity. In fact, they should know that the right AI only enhances it.


Summary

Welcome to The New Talent Code, a podcast dedicated to empowering change agents in HR to push the envelope in their talent functions. Hosts Ligia and Jason bring you the best thought leaders in the talent space. They share stories about how they are designing the future workforce, transforming processes, and using cutting-edge technology to solve today’s pressing talent issues.

Today’s episode features Craig Leen, an attorney by trade and partner at K&L Gates. Before that, he served as Director of the Office of Federal Contract Compliance Programs (OFCCP) at the U.S. Department of Labor. He is a leading expert on workforce compliance, equal opportunity, and anti-discrimination.

To begin, Craig shares his passion and commitment to protecting companies against bias and representing the interests of people with disabilities. When he started advocating for his daughter, who has autism, he noticed a severe lack of resources. Working in local government, he gained the city commission’s support for a disability inclusion program, and the city adopted principles of inclusion.

Craig highlights a few situations flagged as non-compliant by the OFCCP that may surprise listeners. The OFCCP focuses on systemic discrimination in hiring, promotion, and compensation. Given that most companies will have at least one area of disparity, being able to explain the gap is essential.

Shifting gears, Craig explains how AI can help with this work of determining the skills a job actually requires. AI can sort through a broader applicant pool, helping federal contractors increase the diversity of that pool, and it enables organizations to audit selection trends in real time.

HR leaders should ensure their AI provider is serious about prioritizing equal employment opportunity and assessing the tool for bias. For example, when the company chooses candidates to interview, profile masking helps hiring managers and recruiters avoid making decisions based on race, ethnicity, gender, or other protected characteristics. The OFCCP also expects companies to audit their selection procedures yearly.

Finally, he summarizes what the new laws and regulations in different states mean for vendors and buyers. In closing, Craig reveals his other hidden passions.

Links:

  • Learn more about Craig Leen.
  • Eightfold AI is committed to high compliance standards, security, and accessibility for our systems. Learn more about Eightfold AI’s approach to governance.

Ligia:

Welcome to The New Talent Code, a podcast with practical insights, dedicated to empowering change agents in HR to push the envelope in their talent functions. We’re your hosts. I’m Ligia Zamora.

Jason:

And I’m Jason Cerrato. We’re bringing you the best thought leaders in the talent space to share stories about how they are designing the workforce of the future, transforming processes, rethinking old constructs, and leveraging cutting-edge technology to solve today’s pressing talent issues. It’s what we call the new talent code.

Ligia:

So if you’re looking for practical, actionable advice to get your workforce future-ready, you’ve come to the right place. Hello, everybody, and welcome to another episode of The New Talent Code. I’m Ligia, and I’m here with my co-host, Jason. Hi, Jason, how are you?

Jason:

Hey, Ligia. Doing great. We have another topic today that’s on the top of everyone’s mind. Looking forward to another great conversation.

Ligia:

I know, I know. So without further ado, let me introduce our exciting guest today, Craig Leen. Welcome to the show. Thank you for joining us.

Craig Leen:

Thank you. It’s a real pleasure to be here.

Ligia:

Yeah. Well, let me brag about you for a minute. Craig is an attorney by trade. He’s a partner at K&L Gates. But before that, he served as the director of the Office of Federal Contract Compliance Programs, which everybody knows as the OFCCP, at the U.S. Department of Labor. He reported directly to the Secretary of Labor, overseeing roughly 450 employees and a budget of over $105 million. That’s a lot more than I’ve ever overseen. In layman’s terms, he’s an expert on everything workforce compliance, equal opportunity, and anti-discrimination, basically how not to break the law and how to treat people fairly. So for our listeners out there who usually avoid conversations with lawyers, don’t worry, we promise Craig’s going to make you glad you listened in. Before we kick this off, Craig, tell our listeners a little bit about your passion and commitment to protecting companies against bias, but more importantly, to representing the interests of people with disabilities. How did you find this niche in law?

Craig Leen:

Certainly. Thank you so much for having me. I have been a lawyer a long time, although I have been a different type of lawyer at different points. I started more in corporate law as a corporate litigator, then I went into local government law, and then I became the OFCCP director. Now I do labor, employment, and workplace safety law, and I serve on a lot of boards and things like that. But my interest in disability, in particular disability inclusion and disability non-discrimination, came from a couple of places. One, when I was in middle school and high school, I used to work a lot with kids with disabilities in the special education program and also in the Special Olympics program. I really enjoyed doing that, and I felt like it was an underserved population and there needed to be more resources there. And I had a great time doing that.

Craig Leen:

So when I got into local government law and started to work in the civil rights field at Miami-Dade County, and then in Coral Gables, I wanted to do more for people with disabilities. At about the same time, my daughter Alex was diagnosed as having autism and an intellectual disability. Her autism is quite profound and affects every aspect of her life, and she needed a lot of support to be able to succeed and prosper and be as independent as possible. She still needs a lot of support; she’s 17 now. But one thing I saw when I started advocating for Alex, as a somewhat young attorney and a young parent, was that there were not a lot of resources there initially, particularly in the autism field, and there were a lot of misconceptions about autism and what it meant. So I became an advocate in the disability area, starting with my daughter, but then I saw what a challenge it was, how you had to fight for accommodations in almost every place you went.

Craig Leen:

And it occurred to me pretty early that there were a lot of inequities here. So I started focusing a lot on disability there. I got the support of the commission and we did a disability inclusion program, which was quite extensive, and the city adopted principles of inclusion that focused on going above and beyond the law in including people with disabilities, as a matter of customer service and as a matter of good government. That was one of the reasons I came to the attention of the Secretary of Labor and was brought up to be director of OFCCP. When I was at OFCCP, I saw something similar: the agency did not focus as much on disability as on other areas, the amount of resources it devoted to disability inclusion and disability non-discrimination was less, and there was a real yearning among the contractor community for more guidance and more focus on disability.

Craig Leen:

So, you know, that became a hallmark of my tenure there. I talked all the time about disability inclusion, wanting to make sure that companies view disability discrimination as seriously as they view race- or gender-based discrimination, that stereotypes and stigmas related to disability were being addressed by companies, and that they went above and beyond in accommodations. Then, when I left OFCCP and came over here to K&L Gates, I got on the advisory board for Eightfold, and I serve on a number of other boards, including a couple in the disability area: I’m on the board of RespectAbility and the advisory board for Disability:IN DC Metro. That became a significant focus of my post-OFCCP time: talking about disability, learning more about disability inclusion, and continuing to be an advocate for my daughter, Alex. So that’s where it came from.

Ligia:

Wow. We’re lucky to have you.

Jason:

Thank you for sharing your story. It’s always interesting to understand what drives someone to work in the space of compliance when so many people are afraid of it or try to keep it at arm’s length. So I’m excited to have the chance to speak with you today and help others understand a little bit of the rules of the road and how to move forward in this new frontier, addressing these new challenges with talent and some of these new tools. To get down to brass tacks: every company wants to be compliant, and they try their best to be compliant and inclusive of everyone regardless of race, gender, sexual orientation, or disability status. But based on your time at OFCCP, what are some of the things that are flagged as non-compliant that may be surprising for our listeners?

Craig Leen:

Well, I think one thing that may be surprising is that the OFCCP, although focused on federal contractors and subcontractors, is almost completely focused on systemic discrimination: looking at the systems that are in place at companies, usually larger companies, and essentially auditing their hiring, promotion, and compensation practices, looking for disparities based principally on race, ethnicity, and gender, but also looking at a number of other protected classes as well, including disability. That was one focus I had at OFCCP. One thing that people might not realize is that when OFCCP initially gets all the data from the company about compensation, selection, and hiring, tied to the different employees, their job titles, job groups, their race and gender, and looks at it, it finds indicators. An indicator is where there’s a disparity that typically either satisfies the four-fifths rule, which is a way of evaluating the practical significance of a disparity.

Craig Leen:

And then, in a statistical way, they’re looking for gaps or disparities based on race and gender of over two standard deviations. OFCCP ultimately only finds discrimination in about 2 to 5 percent of cases, but in terms of the initial indicators, it can be a lot higher: up to a quarter or a half of the reviews that are done have these initial indicators. That’s because when you’re looking at very large companies and you’re looking at disparities, those can sometimes happen by random chance if you have enough job titles or job groups. They can happen because of a specific qualification or skill that you’re focusing on that is not uniformly distributed throughout the population. So for example, if you look at physics PhDs, that’s not uniformly distributed among the population; a higher proportion of men have physics PhDs than women, whereas in another area, a higher proportion of women may have that PhD than men.
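For readers who want to see what these two indicator checks look like in practice, here is a minimal, illustrative sketch in Python. The numbers are made up and the functions are a simplification for illustration, not OFCCP’s actual tooling or methodology.

```python
# Illustrative sketch of the two "indicator" checks described above:
# the four-fifths (impact ratio) rule and the two-standard-deviations test.
from math import sqrt

def impact_ratio(sel_a, app_a, sel_b, app_b):
    """Four-fifths rule: ratio of the lower selection rate to the higher one.
    A ratio below 0.8 is the conventional flag for practical significance."""
    rate_a, rate_b = sel_a / app_a, sel_b / app_b
    low, high = sorted((rate_a, rate_b))
    return low / high

def standard_deviations(sel_a, app_a, sel_b, app_b):
    """Two-proportion z-statistic with a pooled rate; values beyond roughly
    two standard deviations are the statistical-significance flag."""
    p_a, p_b = sel_a / app_a, sel_b / app_b
    pooled = (sel_a + sel_b) / (app_a + app_b)
    se = sqrt(pooled * (1 - pooled) * (1 / app_a + 1 / app_b))
    return (p_a - p_b) / se

if __name__ == "__main__":
    # Hypothetical review: 60 of 400 women selected vs. 90 of 400 men.
    print(f"impact ratio: {impact_ratio(60, 400, 90, 400):.2f} (flag if < 0.80)")
    print(f"standard deviations: {standard_deviations(60, 400, 90, 400):.2f} (flag if |z| > 2)")
```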

Craig Leen:

And it’s not to say that men and women don’t have equal ability to get those PhDs; it’s just that in our population, there are different areas of study or focus or skills that have been developed that are not uniformly distributed. That could be in part because of historical discrimination, or it could be because of the choices of different groups. But the fact that that occurs means that you can have disparate impact for a lot of different reasons. I don’t think that people recognize that, and the issue is actually much more complex. Most companies will have at least some areas where there are disparities. And then the question is, well, why did that disparity occur? Was it disparate treatment? Is there evidence that there was specific intent to discriminate against a group based on a protected class basis, such as race or gender? Or is there no such intent, but there’s still this disparity?

Craig Leen:

Let’s say it’s a job where there is a lot of manual labor. You may have a lifting requirement of 100 or 150 pounds. That requirement may be necessary for the job; it may be job related and required to be able to do the job well. You’re allowed to have that lifting requirement then. And that may mean that part of the population of people with disabilities will not be able to qualify for the job, because they can’t satisfy a 150-pound lifting requirement. If you look at the general population, statistically, proportionally more men will be able to meet a 150-pound lifting requirement than women. So that’s probably going to cause a disparate impact as well, an adverse impact. Just because it causes an adverse impact doesn’t mean you don’t need the requirement. If you took the requirement away, you might have a lot of people who couldn’t perform the job, which is a

Ligia:

Problem, or who could get injured, which is another

Craig Leen:

Problem, or who could get injured. There are a lot of issues. But what OFCCP is interested in, and same with the EEOC in this area, is: okay, fine, you might need a lifting requirement, but does it need to be 150 pounds? Let’s actually have someone who knows this area look at this job and say, well, where do you need to lift 150 pounds? Maybe you really only need to lift 25. And even if you provide a margin of error, maybe it’s 30 or 40 pounds. Of course, if you have a 30- or 40-pound lifting requirement, then a lot more people with disabilities and a lot more women will be able to qualify for that job, and a lot more men as well; more people will be able to qualify for the job generally. So that will also tend to decrease these disparities based on, in this case, gender or disability status.

Craig Leen:

So that’s what disparate impact is all about: being able to explain a particular qualification or skill and, if it’s causing adverse impact, why it’s job related and consistent with business necessity. If you can do that, then all of those indicators go away and you end up having a closure of your audit with no finding. If you’re not able to explain why you have those disparities, which are common, which is what I’m trying to convey here, they’re quite common, then you could be in trouble. You need to be able to explain those disparities. That’s one big thing I would say people don’t fully understand about OFCCP. It’s not there just to find disparities and tell you, aha, we got you. No, OFCCP needs to demonstrate that there’s a discrimination case, which means disparate treatment, which is intent and animus, and which thankfully is decreasing in the number of incidents found in this country, and hopefully will continue to decrease to zero, because there are still findings of disparate treatment. But the area that seems to be growing is disparate impact, where there is no intent and it’s a neutral policy. That area is growing because you’re continuing to find these disparities, and they need to be explained.

 

Reevaluating the recruiting process 

Jason:

So a lot of organizations are looking at their requirements and trying to figure out what is a valid requirement and what is something that may have been added over time, that may not be as relevant, or that may cause some of those disparate impacts. As organizations go through and reevaluate their requirements and look at how the job is done today compared to how it was done in the past, what are some of the things that HR leaders can do to think long term and help mitigate and reevaluate this process?

Craig Leen:

Well, I’ll tell you, I think a focus on skills over qualifications is a big way to do that. I know that’s something Eightfold’s been very focused on: helping companies identify people with certain skills based on past experiences. Not saying, oh, only if you’ve had this one experience will we look at you, but recognizing that there are lots of different types of experiences that may lead you to develop a certain skill, and looking more for the skill. That increases opportunity, because maybe you didn’t go to an Ivy League school, or maybe you don’t have a Ph.D., or maybe you don’t have the certificate that’s really hard to get, but you have the skill because you had a job where you learned this particular computer programming language. That skill is what’s important.

Craig Leen:

And so if you have that skill, then the company should know about it, and that gives you a chance to compete for the job. Whereas in previous times, before you had AI that could help identify skills like this, you might say, you know what, we’d love to get people with these skills, but it’s hard for us to really figure it out. So we’re just going to require that you have this certification, or we’re just going to require that you have this MBA, or we’re only going to look at these five schools where we’ve had a good experience with the candidates we get.

 

AI improving opportunity in recruiting

Jason:

We conducted a survey earlier in the year to ask HR leaders how they are progressing in their use and adoption of AI and where they are using AI in their HR processes. That research showed us that organizations and HR are increasingly incorporating AI into their processes, but there are still a lot of questions, and there’s still a lot of cautiousness and concern about what AI does and what it actually achieves. How do you think AI can help uncover or unravel some of this work around identifying the right skills or maybe balancing out opportunity?

 

Diversified talent pool

Craig Leen:

Well, I think in two primary ways. First, the reason a company uses AI to identify talent, and uses a platform like Eightfold’s, is to identify a broader pool of talent. Using the power of artificial intelligence to look at thousands of applications for a particular skill tends to be a very positive thing for equal employment opportunity: the broader the applicant pool, the more likely you’ll get a diverse applicant pool. You tend to see a less diverse applicant pool because of affinity bias, for example, if you just focus on references from your current employees or from a very limited pool. You often see a bias in favor of a particular race or gender or ethnicity. Stereotypes often come in, where someone says, well, I think that person will be good for this job, but it may be based on some of this unconscious bias.

Craig Leen:

The more it’s humans picking who they think would be good for this job, where you don’t really have all the evidence, you haven’t interviewed the person, you haven’t looked at their entire CV, and things like that, that’s where unconscious bias and affinity bias and other forms of bias can creep in. It’s much better to use artificial intelligence to look at hundreds or thousands of applications and identify the people who have this particular skill based on the experience in their CV. Then, once you get those applicants, with Eightfold for example, you can mask their protected class basis and look solely at their skills and qualifications, which allows you to eliminate unconscious bias from the process. And there are a lot of studies out there showing that unconscious bias, even when you train for it, still exists.
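A rough sketch of the profile-masking idea described above, assuming a simple dictionary-based candidate record; the field names are hypothetical and do not reflect any particular vendor’s schema.

```python
# Minimal sketch of profile masking: strip self-identified protected-class
# fields from a candidate record before it is shown to reviewers.
PROTECTED_FIELDS = {"name", "gender", "race", "ethnicity", "age",
                    "disability_status", "veteran_status", "photo_url"}

def mask_profile(candidate: dict) -> dict:
    """Return a reviewer-facing view containing only skill and experience data."""
    return {k: v for k, v in candidate.items() if k not in PROTECTED_FIELDS}

if __name__ == "__main__":
    candidate = {
        "candidate_id": "C-1042",
        "name": "Jane Doe",
        "gender": "female",
        "skills": ["Python", "statistics"],
        "years_experience": 6,
        "match_score": 0.87,
    }
    # Only candidate_id, skills, years_experience, and match_score remain.
    print(mask_profile(candidate))
```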

Craig Leen:

So it’s useful to have AI helping you with that. And then, on the other side, in terms of equal employment opportunity, the nice thing is that AI can help in two ways. One, as I mentioned, if AI is able to look through lots of applications and identify people who may not have been looked at in the past, who are underrepresented or underutilized, such as people with disabilities, for example, or others, it can help increase the diversity of your applicant pool for federal contractors. So let’s say you do your affirmative action program and you recognize you’re underrepresented, let’s assume, as to Hispanics in a particular job group and women in another job group; you can use AI to identify more applicants among those groups. In terms of being an affirmative action tool, it’s extremely useful. And then in the non-discrimination area, the benefits of artificial intelligence are, as I mentioned, the profile masking and also the ability to audit what you’re doing in real time.

Craig Leen:

So if you’re seeing that a higher proportion of a particular race or gender is being picked, you can figure out why; you can determine why that is happening at this level. For example, let’s say you’re using a matching score, and women who have the highest score are not getting picked, or African-American applicants are not getting picked. It could be Asian applicants, it could be white applicants, whoever it may be. You can see that through the use of AI, and you can ask, why is this happening? Is it happening because of something that’s explainable, or is it tied to race, ethnicity, or gender? And if it is, then you can immediately address it, which is what’s really useful about artificial intelligence. So that’s why I think artificial intelligence is the wave of the future.
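The real-time audit described here can be as simple as comparing selection rates among the highest-scoring candidates in each group. Below is a hedged sketch with a hypothetical log format and an arbitrary score threshold; it is an illustration of the idea, not any vendor’s actual audit logic.

```python
# Sketch of a real-time check: among candidates above a match-score threshold,
# are selection rates comparable across groups?
from collections import defaultdict

def selection_rates_by_group(candidates, score_threshold=0.8):
    """candidates: iterable of dicts with 'group', 'match_score', 'selected'."""
    totals, selected = defaultdict(int), defaultdict(int)
    for c in candidates:
        if c["match_score"] >= score_threshold:
            totals[c["group"]] += 1
            selected[c["group"]] += c["selected"]  # True counts as 1
    return {g: selected[g] / totals[g] for g in totals}

if __name__ == "__main__":
    log = [
        {"group": "women", "match_score": 0.90, "selected": True},
        {"group": "women", "match_score": 0.85, "selected": False},
        {"group": "men",   "match_score": 0.90, "selected": True},
        {"group": "men",   "match_score": 0.88, "selected": True},
    ]
    # A large gap between groups here is the prompt to ask "why?"
    print(selection_rates_by_group(log))  # {'women': 0.5, 'men': 1.0}
```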

Ligia:

So Craig, if I’m an HR leader evaluating different HR technologies that have AI in them, at a high level, what are two or three things I should be aware of or looking for regarding that AI? Because I think there’s a lot of confusion about whether it’s increasing bias or actually addressing diversity.

Craig Leen:

Yeah. So, one, and probably the most important thing: when you’re looking at a particular AI program, you want to first make sure the provider is very serious about the equal employment opportunity components, because regulators are very focused on this area. For example, if you go on Eightfold’s website, right on the main page there’s a big focus on equal employment opportunity, and it talks about what Eightfold is doing to make sure there’s no bias and that EEO is a focus. I’m on the advisory board; Victoria Lipnic, the former chair of the EEOC, is on the advisory board. These are areas we’re looking at, because it’s not an afterthought. Two, you want to make sure, and you should ask about this, that they’re assessing the tool for bias, basic bias. So they’re running tests where they’re looking at, okay, if you have a hundred applicants and they’re all equally qualified, and you make these 50 men and these 50 women, do they get treated equally by the AI, or does it favor men or women?

Craig Leen:

And if you flip them, you make these 50 men now 50 women and these 50 women now 50 men, does that change anything? Because that’s extremely important for artificial intelligence, and that’s what the regulators are mostly looking at, even if they don’t articulate it that way. They’re looking at AI more generally, and you’re seeing a lot on AI coming from regulators, and they don’t always focus on it, but that’s the big issue, I think: is there bias in that way? Not, is there adverse impact. That’s a different question. Every selection procedure can cause adverse impact, and adverse impact is not necessarily illegal; it’s to be expected. In fact, sometimes if the AI acts perfectly correctly, you’ll still have adverse impact because of the particular qualification or skill you’re looking for. What’s important about AI is that there’s no bias occurring solely because of gender or race.
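One common way to run the “flip the groups” check described above is a counterfactual swap test: score each candidate, swap the protected attribute, and confirm the score does not move. The sketch below uses a stand-in scoring function, since in practice the model under test would be whatever platform is being evaluated.

```python
# Sketch of a counterfactual swap test for a candidate-scoring model.
def score_candidate(candidate: dict) -> float:
    # Placeholder model: scores only on skills and experience, so it should pass.
    return 0.1 * len(candidate["skills"]) + 0.05 * candidate["years_experience"]

def swap_test(candidates, attribute="gender", values=("male", "female"), tol=1e-9):
    """Return candidates whose score changes when the protected attribute is flipped."""
    flagged = []
    for c in candidates:
        original = score_candidate(c)
        flipped = values[1] if c[attribute] == values[0] else values[0]
        swapped = dict(c, **{attribute: flipped})
        if abs(score_candidate(swapped) - original) > tol:
            flagged.append(c)
    return flagged

if __name__ == "__main__":
    # 100 equally qualified hypothetical candidates, half labeled male, half female.
    pool = [{"gender": g, "skills": ["sql", "python"], "years_experience": 4}
            for g in ("male", "female")] * 50
    print("candidates whose score changed:", len(swap_test(pool)))  # expect 0
```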

 

Enhanced employment opportunity

Craig Leen:

And that’s something I know Eightfold’s very focused on and has been preparing for; in particular, the New York City ordinance is asking companies to post information about bias. So that’s something I would ask any AI provider right away: what are you doing about bias? Can you show us there’s no bias here, that if I give you two candidates, they’re going to be treated the same regardless of their gender or race?

The next thing I would look at is: fine, you want the platform to be neutral as to protected class, but what is the platform doing to enhance equal employment opportunity? One, you don’t want it to harm equal employment opportunity. That’s really important, because there is a history of aptitude tests, sometimes tied to artificial intelligence, having a bias, so you want to make sure that’s being addressed. But then you want to see, well, what can you do to enhance equal employment opportunity? For example, being able to look through hundreds or thousands of resumes to identify applicants who may be overlooked and who may be diverse candidates, making sure they get a full opportunity to be tested. And one way, as I mentioned, to do that is by focusing on skills based on past experiences. Not just saying, well, if you have this experience, you have the skill, but saying, okay, there are eight different experiences that may lead to the skill. Do you have one of those experiences? Do you have the skill? If you do, well, let’s take a look at you as part of our applicant flow, and not just exclude you based on a paper qualification we’ve picked that’s really just a proxy for the skill, but may be inaccurate and may actually cause a disparate impact. So that’s number two: you want artificial intelligence that’s powerful, that is able to look at lots of different applicants in that way.

And then three, this is important and it’s somewhat simple, but not a lot of AI does this yet, at least from what I’ve been seeing: you need the profile masking. When you get to the selection procedure, it’s extremely helpful for addressing unconscious bias. So the AI is not biased, you’ve shown that, and let’s say you’re using a score based on the skills the person has, matched to a particular job, so everyone is getting a full opportunity to get a higher score and it’s not tied to their race or gender. There’s still the part where humans have to select the person, and that’s where unconscious bias can come in. So it’s imperative, if the company wants it, and I generally would recommend this, that when they make that selection of who to interview, for example, or whatever it may be, who to select, that you’re able to have profile masking, so that they’re not making that decision in any way based on race or ethnicity or gender or disability status or LGBTQ+ status or any of those areas.

 

Audit recruiting processes with AI

Craig Leen:

If you get self-identification information on those grounds, you want to make sure that’s not coming into the decision. And then a bonus point I would raise: the ability to audit in real time is really important. OFCCP expects you to audit your selection procedures yearly, and anytime you have a particular selection procedure that’s causing adverse impact, they typically want you to show that it’s job related and consistent with business necessity, usually through validation. So you want a system that assists you in doing that, so that if you’re audited, you can immediately go to the system and explain a disparity that occurred. That disparity, if it’s based on a particular skill or qualification, is going to exist whether you use AI or not. Whether you use AI or not, if there’s a particular job requirement that is not uniformly distributed based on race and gender, it will lead to a disparity by definition, even if you’re doing everything right.
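As a final illustration, a yearly audit of the kind described here can start from simple selection counts per job group, flagging any group whose impact ratio falls below the four-fifths threshold so a job-relatedness justification can be documented. The data layout and the flat 0.8 cutoff below are assumptions made for the sake of the sketch.

```python
# Sketch of a yearly selection-procedure audit across job groups.
def audit_job_groups(records):
    """records: {job_group: {group_label: (selected, applicants), ...}, ...}"""
    report = {}
    for job_group, counts in records.items():
        rates = {g: sel / apps for g, (sel, apps) in counts.items()}
        ratio = min(rates.values()) / max(rates.values())
        report[job_group] = {
            "impact_ratio": round(ratio, 2),
            "needs_justification": ratio < 0.8,  # four-fifths threshold
        }
    return report

if __name__ == "__main__":
    data = {
        "Engineers": {"women": (12, 80), "men": (30, 120)},   # flagged (0.60)
        "Analysts":  {"women": (25, 100), "men": (24, 100)},  # not flagged (0.96)
    }
    for job_group, result in audit_job_groups(data).items():
        print(job_group, result)
```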

 

Better off with AI in recruitment

Craig Leen:

And then the point is you have to validate it, or you have to explain it, or show that it’s job related and consistent with business necessity. So in doing that, I would ask companies: do you want to have artificial intelligence there to help you or not? Because if you don’t, you’re still going to have to explain it, but now you won’t have the objective information, and a lot of it, that you would typically have with AI. So I think there’s this misnomer, or misunderstanding, out there that if you use AI and there’s an adverse impact, that’s a problem. Whether you use AI or not, there’s going to be adverse impact for certain jobs, certain qualifications and skills that are required. The question is, how do you justify it? If you don’t have anything to justify it, you may have OFCCP come in and say, that’s disparate treatment. They may say, you can’t explain this, we’re going to assume it’s disparate treatment, and the law allows that if it’s more than two standard deviations. Or they may say, okay, fine, it’s because of this qualification, but you can’t show me why that qualification is needed or how it’s impacting the applicant pool. Whereas with AI, you could show that: you could show how it impacts the applicant pool and the thought that went into picking that particular skill or qualification. You’re better off with the AI, in my opinion.

Jason:

Craig, you gave us a great list of three, and you even gave us a bonus fourth. One of the things that raises a lot of questions, and that people get concerned about, is the new laws coming out and popping up in different places. You mentioned New York, and I know a year or two ago there was a law in Illinois. Can you summarize what they mean for vendors and buyers, or how to read these new regulations?

 

Adverse impact with AI recruitment

Craig Leen:

Yeah. So you have the federal regulators, and then you have the state and local ones. The federal regulators have been pretty clear in their message here. What they’ve said is that you have to follow the Uniform Guidelines that have been in place since the seventies, so the traditional rules that apply to adverse impact apply to AI. That’s really what they’ve said. Now, is there going to be additional guidance or nuance? Probably. But at this point you’re not seeing any new laws or regulations at the federal level. Meanwhile, there are a lot of different types of artificial intelligence. Some are really focused on skills and qualifications. Others are more focused on what would traditionally have been a test: if you answer these questions, is this someone who’s a good candidate, a good person to interview? Sometimes it’s based on the way someone sounds or how they do a videoed interview, or it could be based on a variety of different factors. But the state and local laws are painting with a broad brush.

Craig Leen:

So I think that’s causing some degree of anxiety among companies. The other point I’d make about all the state and local laws, other than that it would be nice if there were one uniform law that came from the federal government, is this: okay, well, what if you just don’t do well with the AI? Let’s say you’re a good candidate, but because of a protected class basis you have, or because of something you can’t explain, the AI never gives you a good score or never passes you through. There’s this idea that it’s going to cause harm to certain people. And I think the less you can explain the outcome of a particular AI, the more that concern is valid. Having said that, I think the opposite is true, particularly for good AI that’s doing all the EEO legwork and that’s focused on being able to explain why someone is getting a better score based on the skills or qualifications they have. Because in the long run, AI is more likely to identify people who have been overlooked by humans, or who have been subject to unconscious bias or other biases in the workplace and not given a fair chance. The AI is more likely to identify them.

Craig Leen:

So yes, there’s this concern out there: what if I don’t do well with the AI, and I can’t explain why, and it’s not tied to my skills? What if you’re seeing AI harm a particular group, or being used to harm a particular group, almost like disparate treatment? That concern is worth having, and regulators should look at it, but the top artificial intelligence platforms, that’s not what they do. They’re doing the opposite of that. And that’s what you should look for as a consumer, as someone in this area, because that’s one thing the agencies are going to want to know: are you identifying more skilled workers? Is the AI platform you’re using helpful to your productivity? Are you identifying qualified people? So you’re going to want to show that. You’re going to want to show an AI that cares about equal employment opportunity, that has profile masking, and that recognizes the concern that a particular protected class may be harmed by the AI through no fault of their own.

Craig Leen:

And what is the company doing to address that? And then, is what’s driving the outcome job related and consistent with business necessity? In my view, as a former head of a regulatory agency, the more information, the better. The less explainable it is, that’s where regulators start to get more concerned. And that would be true whether you use AI or any sort of employment test, selection qualification, or procedure: they’re going to want to know why you chose it and why you need it. Assuming you can satisfy those two, you’re fine. And as I’ve said at both Cultivate conferences, I think in 10 years, maybe sooner, AI will be the standard of care. You’ll have to use AI. You can’t just rely on humans making decisions that are affected by unconscious bias. The way to address that is to have artificial intelligence tell you who has the best skills, and then the human element comes in. But at that point you’ve already identified those skills, and you also know who has the skills the most. So when humans then come in and make selections, you can test: are they picking the people with the best skills? If not, if it’s two standard deviations one way or the other, then you know you have a problem with your selection procedure, not with your AI.

Jason:

Well Craig, I think you’ve given our audience some key things to think about as well as some key questions to ask as they’re evaluating their vendor or making selections for new tools going forward.

Craig Leen:

But the underlying point I want to convey, if someone listening takes one point from this, is that in the long run, AI is the way to go. That’s what I think. In the long run, it decreases liability and increases equal employment opportunity. Don’t just take the view that AI is bad or AI is a concern.

 

Embracing AI in recruiting

Ligia:

What you’re saying is embrace AI, educate yourself and pretty much do your homework.

Craig Leen:

Yes, I’m definitely saying that. Hopefully we’re building a world where profile masking will not be necessary, but because of unconscious bias, it still is, in my view. And it gives everyone that fair chance to be selected.

Ligia:

We could go on for hours on this topic, but that’s all the time we have today. So we wanna end with a tradition we have on this podcast. The question is always around potential. So if you hadn’t gone into law and pursued being a lawyer, what other passions would you have pursued?

Craig Leen:

There are a couple of areas. One, I would have liked to have been a professional runner. I was a pretty fast runner in high school. I won the city championship in the 800 meters and I was good at cross country, but I always had tendonitis in my knees and it caused a lot of pain, so I was never really able to pursue that sort of professional running career. Which, by the way, even if I had, I would still want to be active in the disability area and the civil rights area, because it’s very close to my heart and something I believe in very strongly. That’s one. Another area, I would say: when I was a kid I used to want to be president, and I haven’t really pursued that, although, you know, I was the head of a federal agency, so I guess it’s not so far off.

Ligia:

You’re on your way. Yeah.

Craig Leen:

We’re close. I haven’t really pursued politics; I’ve been more an appointed official and a lawyer and things like that. And then, more generally, I would have loved to have been an artist or a musician, if I had more talent. I love music and art and things like that, and I feel like being able to express yourself that way, maybe through literature, as an author, I would have enjoyed that. And you know what? Maybe I’ll do that one day.

Ligia:

It’s never too late, Craig. All right. Well, listen, thank you so much again. We might have to have you back. And everyone else, thanks for joining us. That’s a wrap. Thanks for listening to The New Talent Code, a podcast produced by Eightfold AI. If you’d like to learn more about us, please visit us at eightfold.ai, and you can find us on all your favorite social media sites. We’d love to connect and continue the conversation.
