Successful AI transformation isn’t just about algorithms; it’s about people, alignment, and preparation. Despite massive investment, most organisations are stuck in the “messy middle,” caught between promising pilots and enterprise-wide success.
It’s time to rethink the roadmap. How can organisations escape this pilot purgatory and unlock the real productivity gains promised by AI?
While companies treat AI as an IT upgrade, they overlook the human silos and skills gaps that actually determine success. New data confirms that the missing link isn’t better algorithms, but a radical alignment between the CHRO and CIO.
In this session, we unpack insights from our global survey of 700 executives to reveal what AI leaders do differently.
We discuss why leadership alignment is the single biggest predictor of success and how a “people-first” strategy is the key to escaping the messy middle.
Rebecca Warren, Lou Celi, and Sharee McLaurin discussed the integration of AI in organizations, emphasizing the critical role of the CHRO. They highlighted a global survey of 700 executives, revealing that 92% of companies use AI, but only 21% scale it successfully. Key factors include radical alignment between CIOs and CHROs, shared governance, and a focus on business transformation rather than just technology. The conversation underscored the importance of addressing employee concerns, fostering a culture of experimentation, and ensuring AI enhances human skills like decision-making and interpersonal relationships.
Rebecca Warren 0:00 Hey, hi, and hello, everyone. Welcome to our first Talent Table of 2026. Super excited to have some amazing guests here. I want to apologize in advance; Rebecca Warren is fighting a cold here, so if I sound a little froggy, I apologize. I’m going to do my best not to share anything virtually or over the sound waves with you. So, welcome to the Talent Table, where we are super excited to talk about a whole lot of things—around data, around people, around culture, around connecting CIOs and CHROs. We’ve got a huge talk planned. So, what I would like you to think about first is looking around your screen. If you want to interact, feel free to click the widgets at the bottom of your screen. If you have any questions, feel free to pop them into the Q&A. Our goal would be to get to them live, but if we don’t, we may address them in a future episode, blog post, or on LinkedIn. If you want to do some additional reading, feel free to check out the resources section for the full report that we’re going to be talking about today, called “The New Role of the CHRO in AI Transformation.” And if you love the Talent Table, or if you’re new to the Talent Table, feel free to click the link for next month’s session so you get it on your calendar and get a chance to join us next month. We do this every month. Alright? So first, I’m going to have our speakers introduce themselves, then we’re going to get to the question of the month, and then we’re going to go ahead and kick it all off. So let’s start with our speakers. Sharee, do you want to go first? And then we’ll have Lou; you can introduce yourself.
Sharee McLaurin 1:51 I’ll do what Rebecca does. Hey, hi, hello. I am Sharee McLaurin, Vice President of People and Culture for Tang & Company, which is an occupational health and safety organization. We’ve been in business now close to 50 years, and we operate in about 42 states nationwide, really ensuring that America’s workforce returns home safe and healthy to their loved ones. And really, our mission is about caring for people and keeping them safe. So in my role, I really focus on all of the Human Resource aspects for the organization, really looking at how to elevate the people’s experience, and also embedding all of those cultural pieces in a more meaningful way. So when we think about performance, onboarding, recognition, leadership development, diversity, inclusion, well-being—you name it—my function kind of oversees all of that, and it’s been a really, really rewarding position so far. Prior to that, I spent most of my career in Human Resources in different capacities, but most of it has always been in the healthcare space, in larger healthcare systems. So really excited to add my perspective and learn from Rebecca and Lou today. Awesome.
Rebecca Warren 3:11 So glad you’re here. All right, Lou.
Lou Celi 3:17 Hello, everyone. Yeah, yeah, it’s great to be here. My name is Lou Celi. I am the CEO and Founder of ThoughtLab. To tell you a little bit about myself, I was formerly a Managing Director at The Economist Group. I was also the President and Board Director for Oxford Economics. And in 2014, I decided to set up my own company to focus on thought leadership, which I had done at my prior organizations, but I wanted to do it differently. This time, I wanted to focus strictly on technology, its impact on people, business, government, and the world, and to use the latest qualitative and quantitative tools to do that analysis. So I brought in a team of economists and thought leadership specialists that can do more than just surveys and interviews. We do full-fledged economic impact analysis, performance impact analysis, and leader/laggard analysis. All of those techniques were used in the report that we just concluded with Eightfold on AI and the CHRO, as you were referencing, Rebecca, and we’re going to be talking about it today. So I’m very excited about being here and kicking off, I guess, the first talk we’ve had about the study since it was released.
Rebecca Warren 4:54 Yes, so this is why I brought you two together. We want to make sure that we’re talking about the data and the information, and we also want to make sure that we’re keeping humanity in the process. So I think we’re going to have a great conversation, ticking and tying and putting things together. But before we do that, the question of the month. Are you all ready? Okay, here we go. I had a lot of fun coming up with this one. We’ve been spending a lot of time—and on the Talent Table, we talk a lot about what’s happening in the future, where are we going, what does it look like today—we’re regressing. We’re going a little bit back in time. My question of the month is: You’ve been dropped into 1952—very specific for a year—with no modern tools. Now, which daily responsibility do you think would break you first? I had to think about this one, and I ran through a couple of different answers. I finally landed on one. But what do you think? You’re in 1952, no modern tools. What’s going to knock you down when you think about daily responsibilities? I’m going to shoot it to either one of you. Who wants to start?
Lou Celi 6:09 Oh, well, I can start because I was almost around in 1952. I was almost one year old. So look, I thought back to when I… ’52 is hard to even think back to, okay? Yes, it’s hard to even imagine. You know, it wasn’t that far after World War II. But when I think about my first job, I have to say I can’t imagine working without a computer. I couldn’t do the research. We couldn’t be here today on this video. It’s just amazing.
Rebecca Warren 6:50 You’d have a completely different job, wouldn’t you?
Lou Celi 6:53 Yeah, and I started as a journalist, don’t forget. So I would be on a typewriter. I remember using a typewriter; that was really hard to make changes, to get things done, and research. I mean, keep in mind, this research that we did was online. Back then, you had to go to a library and write down a few things from the encyclopedia on a piece of paper. It was very hard to do this writing back in the early ’50s. It’d drive me crazy.
Rebecca Warren 7:27 Yes, knowing what you know, and then going back, right? If you don’t have it, you don’t miss it, but when you have it, that’s when it feels very painful. All right, Sharee, what do you think?
Sharee McLaurin 7:41 This one was so hard. I actually shared the question with my family just to see what their reactions would be. And it got out of hand. Everybody was so curious. Like, “Whoa. Am I a man? Am I a woman? Like, you know, am I white? Am I Black? Am I purple?”
Rebecca Warren 7:59 They wanted to know the rules and—oh.
Sharee McLaurin 8:02 Yes, you know, “Am I middle class? Am I upper class?” I’m like, “Guys, it’s just…”
Rebecca Warren 8:12 I love the thought that goes into it. I want to hang out with your family. Oh my gosh.
Sharee McLaurin 8:16 But I thought it was really cool. And I ended up kind of feeling like Lou. Like, if… and I just had a conversation about typewriters literally yesterday with some of my Gen Z teammates, and I was like, “I’m going to make you guys do your work on a typewriter.” And so they’re like—they literally said, “What is that? Like, where do you find those?” And I’m like, I have to find one because I remember it being a requirement. And I wasn’t around in 1952, but I absolutely had the experience of working from a typewriter, and that being the sole kind of, you know, equipment that we had available. And, like, really loading the paper… so I agree, not having, you know, computers and technology, that would really break me.
Rebecca Warren 9:10 Okay, and I went to something a little bit less technical. So my first thought was daily responsibilities. I use my dishwasher every single day, like every single day. So that was my first one. And then I’m like, “Yeah, but you know, I could wash dishes.” You know where I landed? Laundry. Laundry. I do not want… I looked it up. I had to look it up. And was like, what was that like? And it was like the hand scrubbing and then the roller and then the drying. And, no, no, I don’t want to have anything to do with that. So I think I would lose my mind with laundry because, you know what? That stuff’s going to add up fast, and I don’t want to take all day or two doing laundry. Oh my gosh. So, yeah, so for me, I forgot about the tech. Oh, like, I just need machines to do… like, yes, it is tech, right? Like I need machines to do my laundry.
Oh, boy, alright. Well, I love the thought that went into this. This is great. And Sharee, I’m not kidding. I want to show up for dinner one night with your family, because I think we’d have a great conversation.
Alright, so let’s continue the talk about technology and talk about what this looks like if we’re going to align leadership, people, and strategy. So when we especially… We’re thinking about AI, right? A lot of folks talk about AI like it’s a software update. You click install, you grab your coffee, and you come back to amazing productivity. Woo hoo. That’s not really how it works. And I think we all know that a lot of organizations get stuck in long, weird cycles that look like this: you run a pilot, then you run into a complication or an obstacle, things stall, the project goes on the back burner, you rename the project, switch out the team, try again, or you say, “Hey, we’re just going to scrap this whole thing because it’s too much work.” So we call that cycle “Pilot Purgatory.”
We want to understand why some companies are thriving while others are in this pilot purgatory. We’re going to work with the data in the report that Lou and I talked about earlier. So we’re going to be referencing a global survey of 700 executives across different industries. Now, this research identifies a specific group. We’re going to call them the “AI Leaders” who have successfully moved past this pilot purgatory, past being stuck. These folks are seeing massive gains. So what we’re going to talk about is what that secret sauce is. It’s not a better algorithm. It’s what we’re calling “Radical Alignment” between the CIO and the CHRO. So Lou, I do want to kick this off with you connecting us to the survey. So tell us a little bit about the survey, how it got started, how you decided what data you wanted to go after, and then I wanted to ask you a specific question about something in the survey. But talk to us a little bit about the survey and what you found.
Lou Celi 12:31 Yeah, so we conducted the survey in the third quarter of 2025, so the information is very fresh. And we wanted to understand the latest trends in the use of AI in organizations, and particularly the role of the CHRO. And when we went into that study, the role of the CHRO was not very prevalent in AI innovation. So we wanted to see if that was still the case, if things had moved on, and wanted to see what leadership in AI really looked like. So we did a survey of 700 executives around the world. They were from general management and technical management, so people like CEOs, CMOs, CHROs, but also CIOs and CTOs. They were in 10 different industries, from manufacturing and banking to technology and life sciences; even the government was involved as a survey respondent. It’s an international study, so we wanted to look at the trends from Europe and America as well as Asia.
And so what we did in addition is what you were just talking about before: we created a leadership framework. So we were able to look at which companies were furthest along in AI transformation, and which ones were still just starting out, and who were the ones in between. And you know, most of the folks are in the “messy middle,” as we call it—the 56% in the messy middle. There were about a quarter of them that were just beginners, and about a quarter of them that were really leaders. And to do that framework, we based it on seven areas of best practice. And so that was about leadership and culture and data readiness and how prepared they were for the future of work and governance and other ideas like that. So we can see how they plotted. And what we found is that a lot of organizations were stuck in that messy middle.
Rebecca Warren 15:02 Yeah, yeah. So, okay, so let’s… I just want to dig into that just for a second. Like, how does that happen? Like, when I look at one of the stats, 92% of companies are doing AI—whatever “doing AI” means—only 21% are actually successfully scaling it. So why is it that so many people get stuck in that messy middle and they can’t get out of it?
Lou Celi 15:31 Well, there’s a few reasons for it, but I think to make it simple, the main reason is not being ready for AI to scale. Okay, first of all, they do not have the technology and data in place. Their data is not reliable enough. It’s still stuck in silos. They’re working with legacy systems that are not well interconnected. They don’t have that infrastructure to support, you know, AI scaling. They also don’t have the governance in place, which is a very important thing, because there are certain risks associated with AI use: security risk, privacy risk, even ethical risk. So you have to make sure you have your governance in place and compliance systems. Not so much in America, where we seem to be going in the opposite direction, but in many other countries, more rules are being placed on AI. So without all of that, you don’t have the foundation.
But most importantly, they don’t have the organizational foundation, because at the end of the day, it’s not about technology; it’s about people and the organization. And they don’t have the commitment of their staff. They don’t even have the commitment sometimes of the leadership team. They don’t have the skills in place. And so without that recognition that AI transformation is basically a people and organizational transformation, they can’t really succeed and get stuck in the middle.
Rebecca Warren 17:10 That makes a lot of sense. There are a lot of hurdles to overcome in order to get to that ability to scale. So Sharee, let me ask you, because there was another specific data point that came out, which I think you are absolutely the right person to ask this question of. So 93% of the AI leaders say that the CHRO is critical to their success. So in your experience, why is the head of people suddenly the most important person in the room in these conversations?
Sharee McLaurin 17:48 Yeah, and if I’m being transparent, at first I was… I did not see myself or any, you know, people leader as that—as being kind of the most important or influential person around AI. I completely saw it as, “Oh, that’s the IT function. What are they going to do about that AI thing?” I kind of disassociated myself from it, if I’m being, you know, transparent. And so over the last, I’ll say, maybe year, I really started to kind of lean in a little bit to how it’s impacting the day-to-day experience of the people, and that’s exactly where my focus is, right? And so what I did was, for our organization, I implemented an AI policy because there was nothing around how we should be using AI responsibly. And I’ve really used compliance as a way to enter into the conversation, to Lou’s point.
And I think now I’ve been able to draw attention to it for our executive leadership team, and also thinking about, you know, how we want to transform, and what that’s going to mean for the people. We already have, like, 100 applications that folks got to use every day to do their work, right? And so when we think about the change management around it—and change fatigue is real, right?—so I’m thinking about that live, day-to-day experience. And I’m able to uniquely speak to that, right, as the head of people. And so being able to partner with the head of IT and Information, I think that that’s going to position us to better equip that overall experience throughout the whole life cycle. And I see more and more of my peers starting these conversations, and I think compliance is a great way to enter into it, and then start to look at what adoption looks like through that lens as well.
Lou Celi 19:50 I think it’s an evolution. I think it’s, Sharee, because you’re talking about a few years ago, where the CHRO didn’t even have a seat at the table. Yeah, okay, which is consistent with our research. And now they have a seat at the table, and it’s a very important seat, sitting right next to the CIO, because they recognize that they have to make sure the technology works for the organization. Correct. But I’ll tell you, in the future, I think it’s going to go even further. The CHRO is going to be crucial to AI transformation, and I already see it happening in some companies.
So, for example, ServiceNow, a company I work with as well… you know, they have a CHRO who they call the Chief People and AI Enablement Officer, because what they’re realizing is that you have to do those things together. As a matter of fact, the more that AI agents become like workers, they have to be managed. They have to be trained, onboarded, just like humans have to be. Yeah, and they have to work side-by-side, so they work closely and productively with different people. And so it’s really interesting. And Moderna also has organized it that way. Yeah. And so I think it’s a trend: we’re not going to look at AI agents as just another piece of technology. We’re going to look at them almost like we look at people, working side-by-side. I think it’s a really interesting development.
Rebecca Warren 21:28 Yeah, I agree. I was chatting with a CHRO yesterday, and Sharee, this goes to what you were saying about paying attention to what impacts the people. He’s been in his role for about six months now, and when I was asking him some just general questions, he said, “I’m a leader who just happens to sit in HR, and my job isn’t HR. My job is to think about overall outcomes of business, right? But through the lens of how it impacts people.” So looking at management systems, looking at network optimization, looking at different structures and strategies. So when you said your role in your head shifted, right? Thinking about the impact of the day-to-day experience of the people, I think that’s where it comes… it shifts from an AI project to a reshaping of how work gets done. Right? Because AI isn’t plug-and-play. It doesn’t just optimize tasks. It reshapes how work gets done, who does it, and where that value comes from.
So what we’re seeing—and Lou, you talked about this too—it’s really that human friction that shows up, and AI amplifies the issues that orgs already have, right? If you just throw AI on top of it, you’re going to see everything in sparkling wonder or disarray, right? So when we see that friction, you know, people… if we’re not being transparent and we’re not pulling in the whole process, people are protecting roles that they think are at risk. Managers don’t know how to measure performance when tasks are shifting or roles change. Teams are resisting tools that create new workflows, but they don’t understand how they fit in the big picture, or what the incentive is to make those shifts. So what you both have said, right? Treating AI as an IT upgrade keeps it kind of in this little box, and that’s where you see that “messy middle.” Adoption stays in that shallow end of the pool, and AI is just another system that someone tries to ignore.
When we think about that as work transformation, it forces—and Sharee, I’m curious to hear what you think about this—it forces folks to live in the uncomfortable spaces where we’re like, “We don’t know what this looks like.” You know, where we have to align technology, talent, leadership, culture. And when all of those pieces align, then you start to see that scaling. So, you know, when we think about companies that treat AI solely as a tech challenge, you know, let’s talk about why the wheels fall off. What’s missing? Is it talent, skill gaps, use cases? Like, what are those things that you’re thinking about when we look at it from a holistic perspective, instead of just as an IT project?
Sharee McLaurin 24:34 Yeah, I think too, when I think about HR, it’s really about: how can we add value for the organization, right? When we look inside-out and outside-in, I think that’s really what I’m always thinking about: How can we add value? And I think when I think about the organization that I’m currently in, we’re not a big tech giant, right? And I think it’s important that organizations really kind of meet their people where they are, meet the company where they are—is something that I always love to say—and not try to kind of keep up with these, you know, Technology Joneses or the AI Joneses. I think sometimes people are looking around and wanting to adopt things that, to Lou’s point, you’re just not equipped for. You’re just not ready for it. You’re nowhere near having the appropriate tech stack to be able to implement something and scale it, for that matter.
I think it’s also important that we think about AI as more of a sociotechnical challenge, rather than just a technological one. Success depends just as much on the people, the processes, and the culture as it does on the technology itself. And so I think that’s the conversation that we’re, you know, having now, because we can find a million AI tools, right? We can look to implement them. But are they meaningful? Do we know what solutions we’re actually trying to solve for? And is it going to add more burden to the workforce and that day-to-day experience? That’s something that I’m always going back to: like, what is it going to do to the day-to-day experience, and is it going to be actually meaningful? So that’s the kind of thinking that I think, you know, is missing for leaders when they’re treating it just as a sole tech challenge, right?
Rebecca Warren 26:23 And Lou, as we think about that… so Sharee, that’s, I think that’s a great way to look at it. How do we meet people where they’re at and push them along, right? Get them out of the space where they’re comfortable. So Lou, talk to us a little bit about that, where AI leaders are able to make that shift.
Lou Celi 26:43 Well, I agree with what Sharee is saying. First of all, the reason that the wheels fall off is really about purpose and people. All right, you’ve got to make sure you’ve got the right purpose. It’s not just technology for the sake of technology; it’s about what is the business problem I’m looking to solve much better, and how can AI enable that? Okay? So it always has to be based on a business purpose and value. They go wrong when they don’t think that way. Okay? And not only just think that way, measure to see if you’re actually delivering on that purpose.
But the other part is the people part, right? So, you know, that’s where it really goes wrong. You can build a wonderful new car that has all the latest devices in it, but what is the good of it if people don’t want to drive it? Yeah. And what if you built it in a way that the people didn’t want it in the first place because you didn’t really talk to people? You didn’t ask the right question, right? “What do you want in the car? How would you prefer working? What are the biggest pain points you experience? Where do you want to add value as a human? Where would you like a machine to take repetitive tasks or problems off your plate?” And so that’s the part that’s really missing, yeah, and that’s why the people part is so important. Because the CHRO has their finger on the pulse. They know what people are thinking about—their pain points, their desires, their aspirations, their frustrations—and that’s what has to be taken into account when you develop your AI strategy.
Rebecca Warren 28:31 Yeah, yeah. So what we’re hearing and seeing is what’s coming out of our conversations. It’s not a shortage of technology. It’s a gap or a disconnect between what AI and other tech tools can do and what people are prepared, supported, and incentivized to do with them. So to what both of you have said, this is where AI stops becoming a technical conversation, and really gets into the leadership, right? Into the folks that are driving those business outcomes and responsible for some pretty big, pretty big goals.
So closing that gap—and I said this before—closing that gap is about redesigning the work, right? How skills are developed, how they’re tracked, how they’re looked at. And then how much permission people have to learn and to experiment and to grow and maybe to fail while they’re doing it. So how can we shift the thinking from “AI will replace you” to “AI will augment you,” without losing trust? Because Lou, in your report, 70% of employees are worried about being replaced. So how do we get to the point where folks feel comfortable to ask questions, to lean in, to practice the skills that they’re learning real-time?
Lou Celi 30:03 Well, let me just say a few things. In studies that I’ve done for you and others, people are now enthusiastic about AI. That has changed; a year or two ago, people were not that enthusiastic. If you had a dinner party—like the dinner party of Sharee—and you brought up the question, “What do you think about AI?” you would have a whole debate going on.
Rebecca Warren 30:36 If we were around Sharee’s table, we might be talking about that for days. It sounds like her family pulls it apart, yes.
Lou Celi 30:47 And that was before ChatGPT, that was before Gemini and Copilot, and before people started looking and learning about it and saying, “You know what? This is great. AI creates videos. I can go out and answer questions immediately and get recipes. I can do so much. AI knows what movies I like. It’s telling me some good suggestions for Netflix.” So, you know, people start to get enthusiastic about it. Okay. And on the job, people realize, “Well, I don’t have to do that grunt work anymore. It’s going to be done for me. That’s great. I don’t have to worry about all that documentation stuff. My productivity… I have time for myself. I have a little bit more time in the day.” So they’re actually saying good things.
However, they also have fears. Yep. As they learn more about it, they realize, “Well, I’m in a more clerical job. Why would they need me if AI becomes agentic AI and can actually do things, right?” And so… “Do I trust the output from this AI? Is it reliable? Is it… and is it going to somehow… Am I going to lose my privacy because of it? Will they all know what I’m thinking and doing?” So, you know, you get these concerns.
So I think the role of the leader—to get to the answer to your question—is to be able to tap into what people want from AI and their concerns about it. I think that means having honest discussions upfront. I think it means… one of the big fears is that they’re not going to succeed in an AI-driven business world. So give them the skills and the training to build their confidence. Gather their sentiment about what they want the solution to do, so that they have ownership of it. They believe in it. Have them share experiences with their teams. You know that nothing sells better than one employee saying, “I’m doing this,” to another employee, because they trust each other. And, you know, build a culture of experimentation: give them tools, tell them to experiment, get them really excited and involved. And I think that’s important. And, you know, some CHROs also have change management programs they put in place, because at the end of the day, it’s about changing the way you get things done.
Rebecca Warren 33:31 Yeah, so all of those things are… I’m taking notes on things that I’m thinking about, and all of the great things we’re learning. And so I think, Sharee, this is where you talked about the value piece. I think it’s that uncomfortable conversation to have with people about what their value is to the organization. And it’s shifting, as you said, Lou, it’s shifting from checklists like, “Hey, I accomplished a payroll run today,” and more about, “What are those human-only things that we can provide?” And honestly, I don’t know that a lot of people have thought that way, especially if they are in a busy admin tactical role. Like, it may be uncomfortable, but to say, “Where can we lean into the human-only pieces that you provide, not just your ability to accomplish lots of tasks?”
So thinking about that shift, it’s hard for people, especially if they are motivated by checklists or checking off things, and we’re asking them to think differently. But it has to be that way because the world of work isn’t going back to 1952, right? We’re not going to take out tech. We’re not going to take out AI. So when we think about those top human-only skills, right? Strategic thinking, creativity, agility, curiosity… Talk to me a little bit, Sharee, about some of those things, about the shift that we need to pay attention to, to get to those… What can AI do well, and what do we need to make sure humans are doing? And not just from a compliance standpoint—I think it’s, in my opinion, I think it has to be bigger.
Sharee McLaurin 35:18 Yeah, I think what I’m seeing is, you know, we still have five generations in the workforce, right? And so with all of the different generations that are in the workforce, there’s different mindsets around things, there’s different things that each of those generations have experienced, and that is kind of informing their response to this revolution, right? And so even on my team, I have a few Gen Xers, and I have observed that they are a little more apprehensive to adopt AI to help them get their work done. And it was almost a pride thing, right? Like, “I know how to write a report. I know how to send… I’ve been doing this for years. I got this.” And what I have started to do is kind of shift the thinking around that. “Oh, we know you can do it, right? We know that you have that skill set. But how do we make your day easier? How do we make the workload lighter for you?” Right? Burnout is a thing for everybody, right?
And so shifting their mindset around how do they look at it as a resource for greater efficiency, and then also signaling to them like, “Hey, your teammates, they’re using it, and they’re getting, you know, things done quicker, more efficiently.” And so we want to make sure that you know you’re not feeling like you’re going to get left behind, or that your job is going to be replaced because you’re like, “I know what I’m doing. I’m not going to embrace it. I’m not going to adopt it.” But how do you start to kind of marry the two, and then also lean in, be able to have more time and resources to kind of lean in on some of the other skills that are going to make a difference in that experience? Right? Having conversations, stakeholder engagement, right? Really looking to help innovate in certain other areas and collaborating differently. I think that when we have more productivity in some of our day-to-day tasks, we can go and have conversations with people in different functions and work on different initiatives together. I think it also gives more space and time when we think about belonging, right? Creating a greater sense of belonging for folks and making sure that we’re being more intentional, especially when we have a distributed workforce and we’re not always all together. I think that we can have so many opportunities to let AI do some of the heavy lifting on the day-to-day tactical things, so that we can be in conversation. And co-creation is really what it’s about, because people really support what they help create, I think, right?
Rebecca Warren 38:09 Well, yes. And so I think when we talk about… so we have a workforce that’s distributed, right? We’ve got different generations. We’ve got all kinds of things with people that are already in the org. Now we have to think about bringing people in and adding to that organization. And I, as a former TA practitioner, I have certainly been involved in searches for the Purple Squirrel, you know, the unicorn, the—I always say—you know, with the top hat and sunglasses and all the things, right? So when we think about bringing people into the organization, and we’re talking about how we need folks who are going to be agile and creative and be able to shift quickly, we’re not actually hiring for that. We’re still hiring… I was looking at… I have a friend of mine who’s looking for a gig. And so something came to me, and I was like, “Oh, this could be interesting.” And it is a very large company that says it is focusing on skills. For an organization that says it’s focused on skills, that job description made me want to, like, literally tear my hair out. It was this laundry list of all of these things you must, must, must… not one place in that entire job description were there KPIs or expectations, or “Here are the skills you’re going to use.” There was no talk about potential in it. Like, I think if we’re also going to shift the narrative, we have to hire differently. Because we’re hiring for this specific job description, then putting people into an organization that is working on shifting. I feel like there’s this big kind of explosion that’s coming, because I think we’re hiring for specific tasks and maybe not the ability to evolve. So Lou, when we say 5% of AI leaders say potential to learn is now more important than past experience, how are you seeing people hire for that and think about it differently when they’re bringing folks into the workforce?
Lou Celi 40:14 Well, let’s look at it from the top down, right? Let’s try to tie everything together. Your first question was about 1952.
Rebecca Warren 40:27 You were barely alive. You said it.
Lou Celi 40:30 Was barely alive. And basically, digital was just not even… not even part of the talking points, right? You didn’t even have computers when you were just starting, so you had a completely different time. So when we worked in the ’50s and ’60s, it was really mostly worked up by people. Now, fast forward to where we are today, yeah, all right, and where we’re going to be in five years. Because where we are today almost doesn’t matter because we have to plan ahead for the next three to five years if we’re thinking about AI as a transformation of the business. Now think about myself, a Boomer. We’re in the workforce now. We’re not going to be in the workforce in 2030; there’s only going to be like 2%, 3% according to the statistics. So we’re disappearing. What I think Sharee was talking about was Gen X? Well, Gen X and Gen Y are the biggest cohort. Yeah, they make up about 65% of the workforce when you look at it internationally, okay? Even so, in three years, the X’s are going to start to move out, and it’s going to be the Gen Z and Gen Y who are going to make up two-thirds of the workforce. So that’s where the hockey puck is moving. You have to be thinking about those people, what motivates them, and starting to plan for that. That’s my first point.
The second point I would make is that you have to be thinking about the future of work. That’s what you’re talking about, Rebecca. What are the skills needed in an AI-driven business world? Okay? Now this is where it gets different from 1952, because now you have to be thinking: what are the computers going to do, and what are the humans going to do? What are the computers best at, and where are the humans going to add value? It’s almost like you’re reversing the script. You’re thinking, “If I build a company all around technology that could do all these things, what would be left for humans? Where would they really add the greatest value?” And I did a study that looked at what people do with the free time AI gives them. First of all, they want to be trained and learn the skills of the future. That’s why learning is so important. Because this is moving so fast, if you’re not open to learning and experimentation, you won’t succeed in a quickly AI-transforming world. But there’s another thing too. People don’t just want technology. Your customers don’t want everything done by technology. If they have a problem, they want to talk to a person. So AI, when constructed right, can make humans better humans. And that’s really important. It can free up their time to talk to their customers, to collaborate with other people—that is so important in AI transformation, right?—and to make decisions. Because no one trusts AI alone; there always has to be a human in the loop who’s actually saying, “Is that the right decision?” So decision making, analysis, interpersonal skills—all the things that make us uniquely human are the things that you have to be thinking about for the future. And so that’s the way I look at it.
That’s why, at the end of the day, being able to manage these models in a reliable, responsible way, and being able to solve problems and deal with other humans in interpersonal skills—that’s what’s really going to be very important, I think.
Rebecca Warren 44:31 You know, Lou, it’s so interesting that you say that, because I heard something the other day when you talk about allowing us to be better humans or more capable humans, and people wanting to talk to folks on the phone. Like, for a while, it was like, “I just want my answer. I want a robot. I want a chatbot.” And now people are like, “No, I want to talk to someone.” And what I was hearing—and I thought this was fascinating—is that more people now want to talk to someone because we are personalizing experiences across so many different platforms and different ways of connecting with folks, and it’s all personal. And folks now believe that their problem, their question, is so unique that the only way it can be understood is by talking to a person. And that just kind of rolled around in my mind a little bit, like, “Oh yeah. Like, I guess that makes sense. Like, I just want to explain to you why this is important to me. I want somebody to know a little bit more than just chatting in a question.” So what you said about being better humans and allowing us to have those conversations, that’s also what people are craving as well—“I want somebody to understand me and to hear me.” Even if the person on the other end has heard that same thing 12 times, the person who’s calling in feels heard and special and connected to. So super interesting that you say that, because that really kind of connected for me; I went, “Oh, that makes sense.”
Lou Celi 45:58 And I think the same is true with employees, okay? Because I think a lot of people—you talked about this before, Sharee may have some more on this—a lot of people have operated in silos. And you can’t work in silos to do customer experience, for example, because it’s not just one department doing it. It’s the service person, it’s the support person, it’s the marketing people, it’s the IT people if it’s a self-service function—it could be a whole range. It’s the product people. So you need to be working together, and AI actually facilitates that, because first of all, you have more time to talk to other people. And secondly, AI allows you to bring data and workflows together so that you can collaborate much more effectively. So that’s why I’m excited by it.
Rebecca Warren 46:47 Yeah, yeah. Okay, so I’m watching the time here. Sharee, did you have anything to add to that before…
Sharee McLaurin 46:52 I was just going to add, just really thinking about: how can we bring value to those client, customer, and employee experiences? I think too, there are so many moments throughout any given experience, and if we’re leveraging AI to, to Lou’s point, do all the busy work, we can decide, you know, how to curate those experiences in ways that are going to be more meaningful. I think we don’t do enough around that anymore. It’s kind of like, you know, we’re just going to answer your question and send you on your way. But how do we sustain our clients? How do we, you know, move them into some other product offerings that we might have coming down the pipeline? Right? Really being a little more forward-thinking and thoughtful about what that connection can do for us if we keep that human kind of connection alive.
Rebecca Warren 47:44 I love that. Took a note on that: “Curating experiences.” Those are such powerful words. Okay, so when we talk about this connection between people and tech, right—CHRO, CPO and the CIO, CTO, whatever those C-suite titles are—I want to get back to that piece of why some organizations are able to accelerate where others get stuck in that pilot purgatory, specifically around AI, but I mean pretty much any project right now. So that alignment is really that ignition. It’s the difference between motion and momentum. So that alignment causes AI to become an opportunity to reshape the work for the good of the organization. So, you know, I used the word “radical” before, kind of that… not just average. So Lou, when we think about Radical Alignment, what does that look like in a Monday morning meeting? Right? You know, one of the stats is that AI leaders are two and a half times more likely to have a joint governance board between HR and IT to oversee the AI roadmap. Is that what it is? Is it having more people in the room, or is it something else?
Lou Celi 49:06 I think there has to be radical alignment to be successful at AI, okay. I think it has to be radical alignment right at the top level, the leadership level, okay? And I don’t think it’s just the CIO and the CHRO, although I think that’s a power duo that’s so important, yeah, but I don’t think it’s just that. Because the thing about AI is that it’s about business transformation. It touches every part of the business, yeah? So when you think about AI, yes, there’s a technology component, yes, there’s a people component, there’s a risk component. So you want the CRO involved, right? There’s certainly an operational and COO component, because it’s all about changing operations and workflows. And there’s a product and service component, because if you really want to transform, you’re going to embed AI into your products and services. Yeah, and certainly there’s a customer experience and marketing component. So when you think about it, it’s a holistic change where everyone needs to be aligned and add their point of view so that you get the best solution. Because imagine if you had an AI solution that didn’t take into account the CISO’s or the CRO’s point of view. That could be a disaster. Yeah. So I think it’s important to have it at the top, and at the same time, you want it at the lower levels, the management levels, because those are the people on the ground. They’re actually doing the work, and they need to work together.
Sharee McLaurin 50:34 Rebecca, I was just going to add real quick as well. I think radical alignment also looks like a shared purpose, right? And, quite frankly, Radical Candor, right? How can we go into those meetings and create safety and curiosity, especially when you’re thinking about your entire executive leadership team being able to, you know, add to that pool of shared meaning and bust those stories, right? Those are concepts from the Crucial Conversations framework, right? I think that when we’re not able to do that, alignment really lags and suffers because we’re not able to be candid and have that trust to move this work forward.
Rebecca Warren 51:19 Yeah, I love it. So it’s funny that you say that, because I’m like, “Oh, here’s Crucial Conversations, here’s Radical Respect,” which is, right, that same idea—books sitting right here. I think that totally makes sense, that shared experience across this business transformation… like we have to create that sense of shared purpose. I love it. Okay, so as we talked about before we jumped on, this time goes really fast, and I have like 1,200 other questions to ask, which I’m not going to be able to ask. But what I wanted to… so as we think about, how do we break that cycle? So we’ve talked about some great things, and I have a whole lot of notes here, but we have to break that cycle of stalled pilots and reactive fixes. There are those deliberate steps of connecting the dots. Sharee, I thought that was a great idea of saying, “Hey, we can come at it from a compliance standpoint and then build on something else.” How are we building that culture? You know, Lou, you were talking about helping make us better humans, and looking across all of the organization. So what’s the first step? Like, what’s the first step to break that reactive cycle and start scaling? You can’t just walk in and say, “Today, it’s Cultural Transformation Day, and we are going to align with purpose,” right? So what’s the first step? I’d love to hear from both of you, and then we’ll do kind of a closing quick question.
Sharee McLaurin 52:50 I can go. I think one of the things that our CEO recently introduced us to as a leadership team were some of the principles from the book Traction, which really outlines the Entrepreneurial Operating System, you know, looking at a framework around aligning people, processes, and priorities. And really what resonated with me was around putting the right people in the right seats and managing around the shared purpose. And what I’ve observed in a lot of organizations is you don’t have the right executives in the right seats, right? So they have their own agenda. They’re not aligned to the corporate strategy or the vision, right? And so that is really going to stall any momentum that you have on any of your strategic initiatives, in addition to AI—right, your AI transformation. So I think that piece around shared purpose and having the right people in the right seats—I’ll say looking in the right direction—is also very important.
Rebecca Warren 53:55 So Lou, before you answer this one, I want to throw out something that came in from the audience, which I think is really interesting… and I think it aligns with what both of you have said. So the overall question was about, “Hey, radical alignment sounds great, but in most organizations, folks aren’t ready for that,” right? So what does it look like when we need to meet people where they are, and we have to do AI, right? So, Lou, you had talked about how it needs to be a top-down strategy, but it also needs to be bottom-up. And this is what Gartner talks about: it’s going to start with those managers, because those are the ones telling the C-suite what’s happening, but they’re also working with your frontline teams. So your managers need to be aligned. So if we can’t get alignment at the leadership level, how can folks think about doing AI right while also meeting people where they’re at? And that kind of derails us a little bit, but I think it’s an important question. So I don’t know, Lou, do you have a thought on that? And Sharee?
Lou Celi 55:04 Clarify that question. I just… What is it?
Rebecca Warren 55:06 Well, so when we talk about alignment from the top—the CIO, the CHRO—like those have to be lockstep. But we also have to think about all of the business transformation, like across the organization. But sometimes that doesn’t happen, and folks still need to move quickly, and still AI needs to happen, even if the C-suite isn’t all aligned. So how are we thinking about organizations that want to move forward with technology but maybe don’t have all of the pieces in place yet? And maybe this is a question for another time, and I can write a blog post on it or something if we say we’re not ready to answer it now, but…
Lou Celi 55:46 I’m just thinking whether that’s really the right way to go.
Rebecca Warren 55:52 Oh, tell me more. Okay, because…
Lou Celi 55:57 That, I think, is what has gone wrong in many companies: that they are trying to do things in a sort of a piecemeal way, and not really planning it out. How can you not… if you don’t have the leadership involvement, if you don’t have the commitment from the top, if they don’t provide the culture for experimentation, if they don’t provide the skills, what are people going to be doing now? Just using ChatGPT in some crazy way that might not even be secure? I mean, I don’t really fully get it. That’s why I had to repeat the question.
Rebecca Warren 56:38 That’s a great point.
Lou Celi 56:40 I’ll tell you what is good, though. I did a study that showed that the managers are on the front lines. They very often know more about what’s going on, how their business is going to be disrupted, and what their customers want than the CEO does. In my view, it’s really important to get them in. In fact, they’re the barometers for change. Getting them involved, getting them to talk about what they want to do, and sharing information—that’s all good. But if that doesn’t go back to management, then it’s hopeless. If you don’t have the commitment of the CEO, if you don’t have the culture in place, if they’re not investing the money or building the skills… it cannot work, in my view.
Sharee McLaurin 57:38 Yeah, love it. I would just add real quick to that. I think we’re in this mindset of always adding on. “Okay, what’s the new flavor of the month? Let’s transform. Let’s do something new. Let’s embark on a new strategy.” And that’s fantastic, because we should always be thinking like that, but I think we have so much to do in refining our current technology stack and our current technology experience. So to your question, Rebecca: what can we be doing when we don’t have everything in place? Start looking at what you can do now with what you have. How do you optimize the systems that you already engage with on a day-to-day basis? And then, adding to what Lou said: talk to the people that are really deep in these systems and processes, and see where you can start to refine, and then you keep going, trying to build momentum to pull in those AI pieces.
Rebecca Warren 58:31 Love it. Unfortunately, we are out of time. I’m so sad my list of questions will stay unanswered, but I think we need to have additional conversations. So thank you so much. Love the answers. Like I said, I’ve got a lot of notes. We’ll do a recap on the things that we’ve learned. Thank you so much for participating, and we have completed our first Talent Table of January. Thanks all for joining us, and have a great rest of your day.
Lou Celi 58:59 Thank you. Good to be here. Bye.