The skills conversation is going on ten years old, yet it still feels like everyone is engaged with it. Some leaders are restructuring entire org charts around skills. Others are still trying to define what a “skill” even is. Wherever you fall on that spectrum, you probably have questions.
In this 45-minute on-demand session, Dani Johnson and Stacia Garr do something simple: answer your questions. No slides, definitely no sales pitch, and no pretending that any of this is easy. Just a real conversation between two people who’ve spent the last several years immersed in this work.
The session pulls from recent research, lessons learned from leaders in the trenches, and common themes from our Skills Strategy workshops.
Speakers: Dani Johnson and Stacia Garr, co-founders of Red Thread Research
Dani Johnson and Stacia Garr from Red Thread Research discussed the evolution of skills data and its integration into organizational decision-making. They emphasized the importance of understanding scope and purpose, starting with immediate challenges, and leveraging data streams. They highlighted the need for transparency, psychological safety, and cultural shifts to effectively implement skills strategies. They also addressed the challenges of validating and integrating skills data, the role of AI in skills assessment, and the importance of involving business leaders in taxonomy creation. Finally, they discussed the significance of skills data in talent acquisition, compensation, and the development of diverse workforce segments.
Introduction
Evolution of the skills movement
Practical applications of skills data
Rebuilding trust and shifting mindsets
Significant changes in skills by 2026
Building a skills-based approach
Managing skills data and cultural shifts
Defining and revisiting skills data
Building a skills taxonomy
Integrating skills data across platforms
Including diverse populations in skills initiatives
Quin Adler 0:00
All right, hello and thank you for joining today’s Q&A webinar with Red Thread Research. We’re excited to have Dani Johnson and Stacia Garr here to share their insights on skills strategy and what comes next. If you have any questions during today’s webinar, please put them in the chat widget at the bottom of your console. If you experience any technical difficulties, you can also note those in the chat, and our team will help you troubleshoot. So first and foremost, Dani and Stacia, welcome. We’d love a little introduction from each of you to start us off today.
Dani Johnson 0:36
Sure, we’re happy to be here. One of us should speak first. I’m Dani Johnson. I’m one of the co-founders at Red Thread Research. We’ve been a thing for almost eight years now, and we’re happy to be here today. We’ve focused a lot of our research in the last two or three years on skills, and we love all the questions that came in from the registration for this event.
Stacia Garr 1:00
I just want to say hi. I’m Stacia Garr, the other co-founder of Red Thread, and we are both super excited to be here. As Dani mentioned, these Q&A webinars that we’ve been doing with you all—I know this is our third one—have been a real privilege. Hopefully, we’ll get to all the questions. Thank you for having us, and thanks to everybody for spending some time today with us to get a little bit smarter on skills and ask us good questions. Let’s get started.
Quin Adler 1:28
I love it. For introductions, my name is Quin Adler. I lead our Solutions Consulting team here at Eightfold, and I’ll be moderating today’s session. Let’s jump in with a bit of a provocative question to start. I’ll turn the floor over to you two: Is the skills movement over, or is it just evolving into something else?
Dani Johnson 1:50
Are those the only two options?
Stacia Garr 2:00
Well, I’ll jump in. No, the skills movement is not over. I think it’s evolving, but I’m not sure it’s evolving into something else, per se. We’ve been talking about skills, as Dani mentioned, for many years. I think our first study on this was in 2019, and we have said consistently that the key is just to get started. The journey is going to lead to other places, and that is what I think we’re beginning to see. We’re uncovering more of the challenges that exist when thinking about skills—about bringing the data together and making good decisions. But that’s not necessarily because the objective, which is to make better decisions about people using data, has changed. It just means that we’re understanding the challenges, limitations, and approaches differently as we’ve progressed. The other thing—and I know we’ll start off by saying AI—is that AI has created even more incentive to figure this out, whether it’s with regard to reskilling or thinking about how jobs might evolve. That changes the urgency, but I don’t think it fundamentally changes the “why,” which is that we want to make better decisions about people with data.
Dani Johnson 3:32
Yeah, I think over the last few years, we’ve seen many organizations spin up skills teams. They want to figure out how to do skills verification, what software they need, and what skills mean in their organization across taxonomies and ontologies. One of the changes I’ve seen in the last six months is that skills are becoming less of a “thing” in themselves and more something you do as part of something else. It’s more granular data that helps us make better decisions. We’re now able not only to collect that data but also to make use of data streams that weren’t useful to us before, when AI was less prevalent. Now we can take all that data and understand what skills we have and what skills we need. I don’t think the movement is evolving into something else; I think it’s just integrating into everything else we do.
Stacia Garr 4:30
I would agree with that. I think the one potential addition we’re hearing more about is the topic of tasks and how tasks relate to skills. We have said in the past that, in many ways, the task is the Rosetta Stone between people and AI. While I think that is true, task data is just another form of data to understand how the work is being done. But that doesn’t mean skills data is any less relevant; it’s actually just as relevant. Even if you were to decide that AI can do 30% of the tasks in a job, you still need to understand what skills people need to take on the remaining tasks, and you need to understand how those skills work to develop people for those roles. Tasks are not a replacement for skills; they are an augmentation to this set of data, just like skills were an augmentation to the data we had before this.
Quin Adler 5:38
Yeah, it feels like we’re starting to actually get to a place where skills data is useful as opposed to just an exercise that HR does. Whether it’s talent acquisition, talent management, upskilling, or reskilling, we are utilizing those skills to make a difference versus just tagging them on a profile. We’re getting more actionable. One of our webinar registrants asked about the “snow day effect”—trends that seem like a big deal but might not actually happen. Is there anything in the skills and talent space that is currently being overhyped?
Dani Johnson 6:22
I don’t know if overhyped is the right word, but maybe wrongly hyped. I feel like we’re in this really strange space where there’s so much going on, and it’s not hype. We’re changing roles based on new technologies and organizing work based on what we know about people. All those things are happening, but I do see people focusing on things that are just part of the solution, not the entire solution. A couple of years ago, the big conversation was whether we needed an ontology or a taxonomy. I would argue that maybe it doesn’t matter as much as people think. It depends on the use case and the skills data you’re trying to get to solve certain problems. I don’t necessarily think anything is overhyped, but in many cases, things are wrongly hyped, and we’re not looking at the bigger picture. We’re super-focusing on little things that don’t make sense as part of a larger whole.
Stacia Garr 7:21
I’ll take a slightly different approach and maybe go a bit more meta. I think what is being overhyped right now is the fear of AI taking away jobs and the concerns about how skills might factor into that. In every major technological change humans have seen, technology has created at least as many, if not more, jobs than what we had. That’s not to say there won’t be disruptions, but there is this fear of jobs being taken away at the same time we’re having conversations about lower birth rates and having fewer people coming into the workforce. These two conversations are not intersecting. In a logical fashion, it might be: “Hey, wouldn’t it be great if AI took some of these jobs because we’re going to have fewer people to do them?” Very few people are actually having that intersection, and if you don’t bring them together, you can enhance fear. It might actually be a comforting story: We’ve got this amazing technology to do some of the work, and the people who are here are going to get to do even better work. I know that’s maybe a little Pollyannaish, but I think there’s too much hype regarding fear right now.
Quin Adler 9:12
It’s definitely an interesting conversation. One thing our CEO at Eightfold always says is that we’re in the era where we can get some time back. If AI is used the right way, we can hopefully automate manual or mundane tasks and focus on more strategic work or cross-departmental collaboration. Whether it’s as simple as writing an email or something a bit more agentic, allowing systems and AI agents to do the things we don’t want to spend time doing is a major benefit.
Dani Johnson 9:52
Yeah, I would argue that maybe we shouldn’t be doing those things, period. Maybe we should use AI for a bigger purpose and get rid of the stuff that doesn’t make sense for us to do anyway.
Quin Adler 10:04
I’m a fan of that. Let’s go to the next question. Many organizations are coming off a long year of restructuring. How can a shift to skills help, or potentially even hinder, rebuilding trust and shifting the mindset from role-based to growth-based development?
Dani Johnson 10:29
That sounds like someone going through a hard time in their organization right now. I remember when we were first digging into what skills strategy meant for an organization, we held some roundtables. We asked participants how they were using skills, and all of them said “just for development.” I don’t know that that’s true anymore. There is a real fear around how skills can be used to aid with decisions about RIFing (Reduction in Force), promotions, and those types of things. More data can be very helpful, but it can also be used in dangerous ways. One thing we see successful organizations do is be completely transparent about how that data is used. In some cases, they even make it opt-in: “You tell us your skills, and we’ll give you this,” rather than insisting all skills data belongs to the organization.
Stacia Garr 12:00
I also do a lot of research on performance management. While skills and performance don’t always seem to go hand-in-hand, they actually do. We found in our most recent study that performance management is the most common source where organizations identify skills. There’s a connection here. I’m mentioning it because we’ve seen a lot of focus in performance management be on assessment and restructuring. Overall, we’re going to have to move back toward performance management and skills being about how we develop and how we have the right skills to do the work in the future. That’s going to require the cultural pendulum to swing back. How do we do that? It starts with core things like psychological safety, messaging about what’s going to happen, and increased transparency about where the organization is going. It requires baking learning into day-to-day work and encouraging people to take risks to learn.
Quin Adler 14:05
Very well said, Stacia. All right, let’s go to the chat. What significant changes in skills are expected in the workforce by 2026, and how do these changes relate to the demand for education and credentials?
Dani Johnson 14:23
I focus a lot of my research on employee development. One thing plaguing L&D (Learning and Development) leaders right now is that we have a history of using work for development. People coming into the organization would be given simple tasks to get used to the culture and build relationships. Increasingly, AI is taking those tasks, so organizations are struggling to level new people up quickly enough to be useful. We’re also seeing people offload work to AI secretly. A KPMG report last year said that 57% of people are using AI but not telling their organization. As that grows, skills around critical thinking—identifying when something is real and when it isn’t—become essential, as does the ability to detect AI output and use AI correctly.
Stacia Garr 16:06
I would add that we need to make sure people feel like they have the resources to make those decisions. I had a call with a client who has major call centers. They said up to a third of their customer service requests coming in were fake—AI trying to convince agents they needed a refund for things they didn’t purchase. How do you figure out what’s real and what’s not? If somebody messes up and thinks a customer is a bot when they’re not, what does that look like? All that goes back to psychological safety and clear decision-making at the edges of the organization.
Quin Adler 17:32
There are so many conversations around AI literacy. Many people think ChatGPT is just for writing emails or summarizing, but those who really dig in know there are unlimited use cases. We’ve seen university courses and minors around artificial intelligence. It’s interesting to see how companies try to guide people or nudge them toward the right usage, because if you’re doing it in the shadows, you could be pushing trade secrets into a public model.
Dani Johnson 18:28
One of our mega-trends this year was that authenticity is a really important thing. Do I know it’s a human? It’s on leaders to determine where the human touch is critical. I’m not sure we completely know right now where it’s important to interact with a human versus just getting information from a robot. That’s a skill I’m hoping we figure out this year.
Stacia Garr 19:09
If people are using AI but not telling their leaders, that goes back to psychological safety. How do we put incentives in place to encourage use in a way that helps people run toward something rather than being fearful of something happening to them? There’s a culture and a structure issue here.
Quin Adler 20:00
Let’s steer the conversation back toward skills. Another question from the chat: Did you encounter a debate on skills versus competencies in terms of definition?
Dani Johnson 20:13
Oh yeah, that was one of the first things we looked into. If you listen to early episodes of our podcast, Workplace Stories, we asked everybody about the difference. Eventually, enough smart people said, “I don’t really care; I just want the data stream that tells me somebody can do this thing,” that we quit having the argument. Is there psychological importance in the idea of a competency? Absolutely. Do some organizations roll skills up into competencies? Absolutely. But it is a much more localized conversation. There aren’t even two companies that agree on what it is or how to define it, so it’s whatever you define it as in your own organization.
Quin Adler 21:08
Where should an organization realistically start when building a skills-based approach? Skills frameworks, learning initiatives, assessments, or manager-led conversations?
Dani Johnson 21:29
Start with the problem you’re trying to solve. In all the use cases we admire, they started with a specific problem, like high turnover or a skill being rare in the marketplace. When organizations try to nail a five-year strategy upfront, it’s almost impossible. It’s best to start with immediate challenges you can solve because that builds resolve within the organization to continue building the database outward.
Stacia Garr 22:24
We can actually pop up our model here. This is our model of the six elements of a good skills strategy. You can see we’ve built it as mostly circular to indicate there isn’t one element you should prioritize over another, with the exception of scope and purpose. Understanding what you are trying to solve for is the starting point.
Quin Adler 23:24
How do you handle managers who are not quite ready for granular skills discussions? Should they keep it high-level?
Dani Johnson 23:42
It depends on the maturity of skills in your organization. If you’re just starting, there are ways around it. We’ve seen organizations go to readily available data first without driving those conversations and then slowly change the language. Instead of a career ladder, we talk about a career portfolio. It involves cultural aspects; you can’t just tell managers they need to do something.
Stacia Garr 24:59
We have this expectation that managers are going to be the be-all and end-all for everything, yet we don’t always compensate or incentivize them for it. Structurally, it’s not set up to happen. You have options: You can incentivize managers to coach, but you can also look at the structure around them. In our study from a couple of years ago, the most important item was that the organization gave managers and employees digital coaches. Does a manager have to be the one having this skills conversation? Could a digital coach, a mentor, or a dedicated coach be helpful instead? We need to provide support so employees get what they need when they need it.
Quin Adler 27:05
That makes total sense. Many more people can help upskill and reskill than just the manager. Let’s talk about data. For an organization that is data-immature and lacks baseline info, what are the most foundational pieces of data they should be collecting first, and what makes for a good pilot group?
Stacia Garr 28:02
It depends on your scope and purpose. You obviously need info on who the employees are from your HCM (Human Capital Management) system. From there, it depends on what you’re trying to do. If you’re showing career options, a baseline through self-identification or an assessment is a good start. If you are making promotion decisions, you need a higher level of validity, like validated executive assessments.
Dani Johnson 29:34
Also, take a look at possible data streams and decide what decisions you can make off them. Some L&D organizations look at signals that help them understand where people are right now versus what the organization needs.
Quin Adler 30:13
The shelf life of skills is dropping. How should HR teams approach defining the shelf life of skills versus the shelf life of skills data?
Stacia Garr 30:55
On the data aspect, the real question is whether there is a way to reliably understand what skills are at a given point and what experiences might have changed them. If I was certified as an underwater basket weaver in 1990 but don’t practice it, there’s no reason to reassess it. But if we have targeted skills we’re trying to develop, that necessitates more frequent data collection.
Dani Johnson 32:14
We recently did a study on skills verification. Variables like recency determine validity. Signals tell us if a skill is being used. It becomes a trade-off: How good do we want the data to be, and how expensive is it to get that signal? Do we recertify people every year, which is expensive, or tap into project management data to see repetition? It’s a complex, interesting problem.
Quin Adler 33:34
It’s helpful because people don’t often think about the waning of skills. Skills are muscles; if you don’t use them, they go away.
Dani Johnson 33:43
Exactly. It used to be “do you have the skill or not?” Now we think about level. A college grad might say they have a skill, while a PhD who has done it for 50 years might say they are only at 40% because they understand how much there is to know. Determining skill level for the local environment is key.
Quin Adler 34:22
Every company is different from a culture standpoint. Should organizations start by building a formal taxonomy, or should they let AI generate one and then refine it? How do you ensure it doesn’t become an HR-owned auditing nightmare?
Dani Johnson 35:20
We’ve seen it done both ways. It cannot be HR-owned; it’s got to be organizationally owned and continuously updated. Use what makes most sense for your organization. Some industries are straightforward; some change regularly.
Stacia Garr 36:09
Many organizations leverage a vendor’s foundation—roughly 80%—and then refine it. AI usually generates that. Keep in mind what degree of accuracy you need. There are validated skills frameworks out there. In tech, skills change fast, and there may not be research behind specific roles yet. It depends on the partnership you have with your vendor.
Quin Adler 37:57
This piggybacks nicely on a question: My organization is building a skills taxonomy for unique technical skills, such as for regulatory agencies. Any tips for success when third-party avenues and AI are not true options?
Stacia Garr 38:24
Business leaders or functional leaders must be responsible for helping create that taxonomy. HR might do the first pass with SMEs (Subject Matter Experts), but business leaders understand the work. Avoid thinking HR can do this alone; it has to be truly owned by the business.
Dani Johnson 39:27
I’d add two things. First, we’ve seen people tie skills taxonomies to their job structure. If you have a good job structure, address the taxonomy in that framework. Second, look at Ericsson. They made sure they had the systems and processes to maintain it. It was owned by business leaders, and SMEs were identified to keep things updated regularly.
Quin Adler 40:37
What does a full transformation look like in practical terms for recruiting and total rewards?
Stacia Garr 41:05
Some organizations start with recruiting to get data early. We see an adjustment of the sourcing process using labor market intelligence. Then, integrate it into assessments and interviews. Finally, make sure those data are connected in the HCM so they don’t just sit in a silo. Compensation and rewards is a bigger question. Few organizations do this yet. We did a podcast with IBM where they made compensation adjustments based on skills to retain talent, but it required a massive amount of analysis. You need a solid philosophy before influencing rewards with skills data.
Quin Adler 44:17
If new job architecture work is in flight, what pre-staging can move skills work forward with the business in parallel without risking duplicate work?
Stacia Garr 44:51
Focus on the six elements. Work with the business to understand the top three problems you need to solve. Make sure you have clarity on partnership, governance, and culture. Use a RACI (Responsible, Accountable, Consulted, Informed) model so the business understands when they will be plugged in. Help leaders come up with language that embeds skills into how they already talk about talent.
Quin Adler 47:03
How have organizations successfully assessed and validated skill proficiencies, and what are the best practices for integrating that data into LXPs or LMSs?
Dani Johnson 47:29
We’ve identified 16 or 17 ways to validate skills. It usually involves a scale. Gigantic challenges exist right now with harmonization of information from different systems. Understand from your vendor how they do harmonization and what algorithms surface proficiency. For integration, you need a defined system of record. Is it the HRIS (Human Resources Information System), the skills platform, or the people analytics platform? If your effort is just learning, an LXP (Learning Experience Platform) or LMS (Learning Management System) is appropriate. Increasingly, people are feeding information into a data lake to harmonize it internally and then feeding it back out.
Stacia Garr 50:35
We have a study coming out on skills data architecture soon that breaks down middleware and pushing data into various systems.
Quin Adler 51:02
Can professional certifications serve as a credible signal for both technical and durable skills like critical thinking or EQ (Emotional Intelligence)?
Dani Johnson 51:21
It depends. AI changes every other day; critical thinking is more durable. New technologies help verify skills in interesting ways, like AI coaches that assess how you respond in a conversation. For technical skills, we see sandbox systems. Does it align with what you are trying to certify, and what is the shelf life?
Quin Adler 52:53
How are leading organizations successfully bringing military talent, healthcare workers, and deskless or unionized employees into a skills-first framework?
Dani Johnson 53:17
Technologies help us. In healthcare, systems hook into medical coding to see if nurses perform tasks like putting in an IV a certain number of times. In frontline operations, data can show if a person on a machine had errors, triggering training. We are gathering information from latent data. Knowledge workers are harder to measure than frontline workers.
Stacia Garr 54:42
Make sure you’re not “othering” these groups. Don’t make assumptions about what they can or don’t want to do. Understand from them what kind of development they want.
Quin Adler 55:56
Thank you both. I’d love to hand the floor back to you to each share one action item the audience can take.
Stacia Garr 56:12
Understand your scope and purpose. Have leaders aligned to it and have a way to measure success. Job number one is constant communication.
Dani Johnson 56:51
Change the mindset around skills. It’s not just a “worth it” investment; it’s granular information about what people can do. Think about it in chunks and use cases rather than having a massive system in place before you start.
Quin Adler 57:32
Thank you so much for joining us. If you missed part of the webinar, the on-demand recording will be available through the login link and on the Eightfold website. Have a great day.
Stacia Garr 57:54
Thank you all. Thanks.