Training Industry research has found that one of the biggest challenges facing learning leaders is measuring the effectiveness of their training. On this episode of The Business of Learning, Kevin M. Yates, a fact-finder for learning and development, discusses this challenge and his tips on evaluating learning and measuring return on investment (ROI).
Find out:
- Why measurement is challenging.
- The pros and cons of common forms of measurement.
- Why smile sheets get a bad rap.
- Whether we can boil down learning programs to ROI.
Don’t miss Kevin’s article “There’s a Data Analyst on the L&D Team?” on TrainingIndustry.com.
To learn more about Training Industry’s Measuring the Impact of L&D Certificate course, download the program brochure below.
A transcript of the episode follows:
Intro:
Welcome to The Business of Learning, the Learning Leader’s podcast from trainingindustry.com.
Taryn Oesch:
Hello and welcome to Episode 19 of The Business of Learning, the Learning Leader’s podcast from Training Industry. I’m Taryn Oesch, managing editor of digital content, here with my co-host, Scott Rutherford, head of digital operations and marketing.
Scott Rutherford:
Hi. Training Industry research has found that one of the biggest challenges facing learning leaders is measuring the effectiveness of their training, and so that’s the focus of this episode: the role of measurement in corporate training.
Scott Rutherford:
Today on The Business of Learning podcast we’re talking about measurement in L&D, and here today to help us with this challenge is Kevin M. Yates.
Taryn Oesch:
Kevin is a fact-finder for learning and development. As the Sherlock Holmes of L&D, he uses facts, evidence and data to answer the question, “Did training work?” His work is focused on using measurement to show training, learning and talent development’s impact on performance, behavior and organization goals. He’s also a contributor to trainingindustry.com.
Taryn Oesch:
Kevin, thanks for joining us today.
Kevin Yates:
Wow, and thank you for that wonderful introduction. Boy, do I sound good?!
Scott Rutherford:
You know, we try.
Scott Rutherford:
So Kevin, can you start us off by giving us a little background about yourself and what you do?
Kevin Yates:
Yes, absolutely.
Kevin Yates:
I like to say, Scott and Taryn, that I grew up in L&D. I’ve been in L&D now for about 20 years, and my career in learning and development started at a small community bank on the south side of Chicago, where I had a role as a trainer, a day-to-day facilitator. That was one of those boots-on-the-ground roles where, almost eight hours a day, I was in front of a class training on bank software and customer service. That ultimately led to a role in instructional design, which led to a role in curriculum development. Then I moved into a leadership role where I was managing trainers and instructional designers, which opened up into a global learning role where I had the opportunity to spend some time in Bangalore, India, setting up training academies. I came back to Chicago and then started working with learning solutions and learning technology.
Kevin Yates:
Then about three quarters of the way into my career, I guess, I was introduced to this idea that you can measure the impact of training and, more specifically, that you can measure the impact of training on people’s performance and organization[al] goals. That was maybe six, seven or eight years ago, and it was a game changer for me. It literally gave me reason to refocus my career, very narrowly and very specifically, on measurement for learning and development, and then more specifically on data and analytics for learning and development as well.
Kevin Yates:
That’s where I’ve been focused for the past few years. I think I have found my niche, if you will. I am just really enjoying the opportunity I have to be a voice in the L&D community for the idea that we can measure the impact of learning, that we can use fact-based evidence to answer the question, “Did training work?”
Kevin Yates:
So I think that I have found the right spot for me, and for my career and my work.
Taryn Oesch:
[It] certainly seems like it.
Taryn Oesch:
So Kevin, when we talk about measurement in learning and development, what exactly are we measuring, and why? And what are some of the common forms of course measurement? We talk a lot about smile sheets, but what are some of the other forms, and what are their pros and cons?
Kevin Yates:
Yes, that’s a great question.
Kevin Yates:
For me, to answer that question, “What are we measuring and why?” I think it’s pretty simple. We are measuring the extent to which our training and our learning solutions are changing anything at all, right? So it’s measuring and determining the extent to which our training and learning solutions are actually changing people’s behavior, changing their performance and changing their actions.
Kevin Yates:
And then it’s measuring the extent to which a change in behavior, performance and actions are actually impacting their performance in a way that helps them achieve business goals or organization[al] goals.
Kevin Yates:
So, we want to take a look at the link between learning, performance and actual impact on an organization goal or a strategy. We want to determine the extent to which learning may be helping people execute in their role in a way that helps them execute on a particular business strategy.
Kevin Yates:
So, that’s the “why.” Simply put, we’re doing it because we need insight on the extent to which our work, our effort [and] our resources are actually producing a change or actually making an impact. So for me, when I am measuring at the highest level, I’m really taking a look at the extent to which there are metrics and data in the business that show how people are performing.
Kevin Yates:
There are different ways to get at that. We can take a look at actual business performance metrics, and we can take a look at causation and correlation that we could show between performance and the learning and training experiences that people have had. So, Taryn, you asked, “What are we measuring? What are some specific measures that we’re looking at?”
Kevin Yates:
That’s just one example. There are so many different measures that we can look at, and what we measure, and how we measure it, is really driven by where we expect to see an impact, because you can’t measure the same thing the same way over and over. You really have to be looking at where you expect to see an impact and then take a look at what measures will determine the extent to which that impact was made.
Kevin Yates:
You reference smile sheets. Smile sheets are getting a bad rap, and I think that’s because of the traditional kinds of questions that smile sheets have asked. The traditional smile sheet asks, “Did you like the instructor? Did you like the classroom, and did you like the food?” That won’t give you any insight into the extent to which you can expect to see a performance change. I’ve developed about … I think it’s 92 … questions that you can use to estimate the extent to which you can expect to see an impact on people’s behavior [and] performance based on their learning experience. It’s not that smile sheets are bad; it’s just that smile sheets haven’t always, or have traditionally failed to, ask the right kinds of questions: the kinds that get at the right type of data, that provide insight or that inform decisions.
Kevin Yates:
Does that make sense?
Scott Rutherford:
Yes, I think what you were saying, if I can interpret a little bit … and add a little bit of my own observation from folks that I’ve spoken with on this topic, too … is that it’s the relationship of how you’re measuring and what you’re measuring, and how those interrelate. The smile sheet is used as an example, or at least I’ve seen it used as an example, of easy measurement. It’s an easy measure. It’s easily administered, it’s easily compiled, but it’s not necessarily meaningful on the other side.
Scott Rutherford:
That’s the balance, isn’t it, to come up with measurement that’s both meaningful and not too difficult to assemble or to execute?
Kevin Yates:
And that it’s not too burdensome to the person from whom you need to collect that data, right?
Scott Rutherford:
Right.
Kevin Yates:
It’s not that smile sheets are a bad thing, but you just need to use people’s time wisely, and you need to take advantage of that limited time to get the right kind of data. So I’m thinking we should stay away from some of those gratuitous questions that really won’t provide the type of insight that informs decisions and allows us to act. But again, to your point, we should be using that precious time that we’re going to ask people to give, in that three- to five-minute survey or however else we’re pushing that out, to ask the kinds of questions that are going to inform decisions and provide insights. For me, I like to use that time to collect data that gives me insight into the extent to which we can expect a change in behavior, performance and action. So the types of questions that I ask are the ones that give us just that.
Kevin Yates:
Does that make sense?
Scott Rutherford:
It does.
Scott Rutherford:
I’m going to ask you maybe to be a little “controversial” here, and I’m putting that in quotes.
Kevin Yates:
I’m all for that.
Scott Rutherford:
Is there … does it matter to assess whether learners liked the program, or is that really not important? Is it more important to focus in on metrics that can really help the business leaders understand whether the program was effective? Is there a role for a happiness score?
Kevin Yates:
Yes.
Kevin Yates:
I’m going to be controversial, and my own view is, not necessarily. I think that for me, at the end of the day, it’s important to determine the extent to which content and experience impacts or has the potential to impact a person’s change in behavior and performance. So there might be a correlation there where someone doesn’t “like” something, I’m using air quotes, that might influence the extent to which there will be a change. But I think that you have to separate like from expectations.
Kevin Yates:
If the expectations for the purpose of the training and learning solution are made clear, if that is clearly expressed so that those who are engaged in that training and that learning solution have very clear ideas about its purpose and its intent, then I think that’s where you want to be looking at the extent to which there’s a connection with that, and not so much the extent to which there’s a connection with whether or not someone likes something. At the end of the day, there are performance expectations that organizations have of people and teams, so that whole idea of liking something, I’m not so sure that it is as important as content resonating with people in a way where they can see the connection between it and the expectation for how they are expected to use that training experience and that learning experience to impact their behavior, their performance and their actions.
Kevin Yates:
Did that answer your question? Does it make sense?
Scott Rutherford:
Yes, I think that makes sense.
Taryn Oesch:
So moving to another big, I guess, I want to say buzzword, but [another] popular idea right now with training measurement: Can we boil down learning programs to return on investment (ROI)? And if not, how do we demonstrate the value of a learning program to the business in terms that the management team will understand [and] that really go beyond ROI or just dollars and cents?
Kevin Yates:
That is an awesome question.
Kevin Yates:
When I talk about return on investment (ROI), I always like to begin the conversation by contextualizing what I mean when I say ROI, because it means different things to different people.
Kevin Yates:
So, when I speak about ROI, return on investment, I am contextualizing that to mean the monetary value you gained or lost as a result of an investment in people’s talent and people’s development. I’m talking dollar to dollar.
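For reference, that dollar-to-dollar framing corresponds to the conventional ROI calculation; the figures below are hypothetical and purely illustrative, not from the episode:

ROI (%) = [(monetary benefit - program cost) / program cost] x 100

For example, a program that costs $50,000 and can be credibly linked to $65,000 in monetary benefit would yield an ROI of ($65,000 - $50,000) / $50,000 = 30%.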
Kevin Yates:
The first thing that I think we want to think about is how far you want to go with measuring the financial ROI of a training solution or a training program, and I would say it really depends on the extent to which that solution or that program is visible in the organization and the extent to which it is strategically connected to a business goal. So for those programs that are highly visible, that are expensive and that have a direct link to achieving a business goal or executing a business strategy, I believe it’s important for us to measure the return on investment, the ROI, for that particular training experience or learning solution.
Kevin Yates:
I don’t think it’s realistic to expect that we should be measuring the ROI for every learning solution and training solution that comes out of L&D. It’s not practical and quite frankly, we don’t have the resources to do that, but if the organization is heavily invested in a particular training program or learning solution, if it is expected to impact a large number of people, and again, if it’s highly visible in the organization, I believe that we have the responsibility to show the return on investment, the monetary return on investment for those types of programs that are in the business.
Kevin Yates:
So then, when we are able to identify the return on investment, not only does it show our impact, but we are also then speaking the language of the business. Other parts of the business are already being held accountable for [their] return on investment, and again, I mean the monetary return on investment, so I don’t think it’s unrealistic to expect that learning and development should be doing the same thing.
Scott Rutherford:
On the topic of ROI, and looking at it from the perspective of a business owner, there are parts of a business where you could look at ROI in an immediate term. If you’re doing a point-of-sale promotion in a retail circumstance, you can understand immediately what the ROI of a two-week point-of-sale promotion would be.
Scott Rutherford:
I’m wondering if you agree with what I think about ROI in L&D — to use a mess of acronyms there — return on investment in learning and development … it’s a time function as well. Behavior change is a time function, and I don’t know if it’s possible for a training program, or, let’s take it to the extreme, you can’t measure, or can you, ROI from a single course. Isn’t it something that has to be measured in performance change over time?
Kevin Yates:
You are so spot on, and I think that what you’re hitting on there, Scott, is a way in which we need to manage our expectations of the business, of the organizations we serve, and we need to manage expectations for ourselves. Because, as you said so eloquently and so accurately, the essence of what comes out of a training or learning experience is a change in behavior. If you’re looking for signs or evidence or facts that show the extent to which changes in behavior and performance have impacted some type of business goal, that’s going to happen over time.
Kevin Yates:
So what I am saying, and what I have learned from my experience, is that if you want to really see that change, and you want to see how that change has made an impact, you’re really talking nine to 12 months post-experience. And that assumes that the performance is changing, and that the performance is ongoing, after that learning experience.
Kevin Yates:
So, that training experience … you are absolutely right in that, more often than not, you’re not going to see an immediate return on investment. That’s usually going to happen over time. The caveat to that would be those instances where depending on industry, and depending on the work in that industry, you might see some immediate return on investment.
Kevin Yates:
For example, if you are in the service industry or in a manufacturing industry and you have employees going through a training program that shows them how to execute on a performance, or rather a process, you can see that [ROI] pretty immediately, right? Because if it’s a rote process on which somebody has been trained, and it’s a repeated process, you can measure the extent to which they have that process down. But when you’re talking about some of those higher-level, performance-type situations, where you might be talking about leadership or even some types of sales training, yes, you’re so right, Scott.
Kevin Yates:
You’re talking six, nine, 12 months to really measure the real impact and measure the real return on investment, the ROI.
Kevin Yates:
Does that make sense?
Scott Rutherford:
Yes, I think so. I think that it also hints at one of the challenges in reporting ROI, because one of the reasons that any business function wants to focus on ROI is that you’re fighting for your place at the table at budget time. So if we’re talking about a measurement cycle that extends to nine, 12, 18 [or] 24 months, we’re talking multiple budget cycles and, at some point, you have to build the trust with management. Don’t you have to say, “Well, look, we have confidence that what we’re doing is going to bear fruit in the two-year, three-year, five-year timeframe,” so that there’s the support and patience to allow it to mature?
Kevin Yates:
You’re right, and it’s also a matter of building trust and showing evidence for your approach.
Kevin Yates:
So here’s what I mean by that: You are spot on in that those cycles don’t always line up with when we expect to see that change, but we’re talking about a culture change. We’re talking about a change in mindset, and we’re talking about a change in the way that we work. So, if over time you are repeatedly using credible, reliable methodologies to show return on investment for the impact of your training and your learning programs, then you have shown and demonstrated to the organization that you are working in a way that is aligned with how the business uses ROI.
Kevin Yates:
So what I mean by that, Scott, is if you are early in that ROI journey and that measurement journey, it might feel as though there’s a disconnect between the budget cycle and the time over which you are able to show impact. But if you are continuously doing that over time, if you are an L&D organization that is repeatedly using ROI methodologies to show impact and provide insight on the extent to which your training and your learning solutions are actually producing a return on investment, if you’re engaged in that over a period of years, then you have been building trust, and so, when those budget cycles come up, you will have already demonstrated the way in which you are measuring ROI.
Kevin Yates:
So it might not be as difficult to get those dollars, because you’ve already shown to the organization that you are diligent in the way in which you are calculating ROI and the way in which you are showing ROI.
Kevin Yates:
Does that make sense?
Scott Rutherford:
Oh, for sure. Yes, and I want to take maybe three steps back, because I was listening as you were talking about your own career path, and it occurred to me that your entry to learning and development is a trajectory that I’ve seen and heard before. And I’m wondering if there’s anything we can learn from that to say … [what’s the best way to navigate that challenge?] Because learning leaders, if we wanted to paint [them] with a very broad brush, typically would cite measurement as a challenge, as an area that they maybe struggle with a little bit.
Scott Rutherford:
Is that, do you think, related to the fact that, going into a learning or a training career path, maybe the business savvy or the business focus [isn’t always there], or [maybe] there’s a skill set that’s required for measurement that doesn’t necessarily get [taught] … that you’re not prepped for when you’re starting your first position?
Kevin Yates:
You’re absolutely right.
Kevin Yates:
For me, when I think about it, this idea that we measure impact, and that impact is the deliverable, is a totally new concept. So again, if you think about the fact that I’ve been in L&D now for about 25 years, I go back pretty far and I remember at the beginning of my career, there were never any conversations or thought about impact being the deliverable.
Kevin Yates:
The deliverable was the training program, the deliverable was the class, the deliverable was the e-learning, or the deliverable was the PowerPoint that the facilitator was going to use to deliver the class. So, that was the deliverable. We never had any conversations about impact being the deliverable, and if you’re going to know what impact is, then you have to measure it.
Kevin Yates:
So on some levels, Scott, measurement is not in our DNA, because it’s just not been what we consider to be what we’re producing. And that’s okay, because there’s a shift and I’m excited to see the shift. But your point here is that acumen, that expertise, that skill, that capability has not traditionally been part of what we’ve been expected to do. So what I believe we’re seeing now is a shift, and it’s interesting because I started doing research a couple of years ago, taking a look at the evolution of this new role that has emerged and [shaped] learning and development teams.
Kevin Yates:
So now you’re seeing things like learning analyst, learning data scientist, learning measurement manager. You see all these new types of roles that are popping up that were just not part of the L&D organization even as recently as 10 years ago. So I think that, as an L&D community, what we are beginning to recognize is that there is a need for a unique and very specific skill set within our L&D teams that is focused on measurement, focused on data and focused on analytics.
Kevin Yates:
Now to be sure, I believe that as a community and as an L&D organization, we need to be data literate, but to expect, say, the instructional designer to devote his or her time, I don’t know, 75% to instructional design and 25% to measurement, I believe that’s unfair, because the ability to measure impact is an art, a science and a skill. And I think that you need time to develop that art, that science and that skill. I also think that you need time and opportunity to focus on it.
Kevin Yates:
So the essence of what I’m saying here is that I believe it is important that we have roles embedded on L&D teams that are narrowly and specifically focused on measurement. I think it’s important that [an] L&D organization commit[s] headcount to having minimally one role on the team that can help answer the question, “Did training work?” with facts, evidence and data.
Kevin Yates:
So, what I’m saying is the[re’s] a rise, in some of our biggest, most recognizable brands and even [in] some smaller organizations, of L&D organizations and teams now creating roles and recruiting for roles that are focused on measurement, because measurement has not traditionally been part of the skill and capability requirements for L&D professionals.
Taryn Oesch:
Kevin, you wrote an article on this topic. It was a couple of years ago now: “There’s a Data Analyst on the L&D Team?” and-
Kevin Yates:
I remember that.
Taryn Oesch:
Do you think there are more data analysts on the L&D team[s] now than there were two years ago when you wrote that article?
Kevin Yates:
I do, Taryn, and actually that’s a great connection to my last point, because when I wrote that article, I was inspired to write it because, as I was taking a look at job postings and doing different types of research, I was beginning to see these roles emerge in L&D with titles that were just unheard of. Some of those titles that I mentioned earlier, like learning data analyst and learning measurement specialist, I had never seen those types of roles before.
Kevin Yates:
So when I wrote that article two years ago, I was inspired to do so because of the emergence of those roles that I was starting to see … and I would say that trend is continuing now, two years later, and I expect it to grow even more as we go forward, because I expect that more learning and development teams will begin to see that it’s important and valuable for us to answer the question, “Did training work?” with analytics and data and facts.
Taryn Oesch:
Right.
Taryn Oesch:
Can we get a little bit into the weeds here? Do you have an example from your work, maybe, of how measuring a learning program has led to improved support from senior leadership for continued investment in learning and development?
Kevin Yates:
Yes, and actually, I’m going to answer that question by going in the opposite direction, in a way that I think will actually answer it. So here’s what I mean.
Kevin Yates:
There was a time when I worked in a business where, within that business, there were five separate business units, and the business wanted to go to market as one, which meant that each business unit would have to sell the other business units’ products and services. So the first response to that was that we needed a training solution that was focused on product sales and product knowledge. So we put that sales training program into place in the business, spent quite a few dollars on it and put quite a few people through it. When I took a look at the performance data for those individuals who were part of that program, I didn’t see any changes in performance. When I took a look at client engagement scores, I didn’t see any changes in client engagement. And then, the bottom line, when I took a look at sales data, I didn’t see any changes in sales.
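As a minimal sketch of the kind of pre/post check described here (comparing participants’ performance data before and after a program), something like the following could be run. The column names, figures and the choice of a paired t-test are hypothetical illustrations, not taken from the episode:

import pandas as pd
from scipy import stats

# Hypothetical input: one row per training participant, with average monthly
# sales before and after the program (column names are assumptions).
df = pd.DataFrame({
    "employee_id": [101, 102, 103, 104, 105],
    "pre_training_sales": [42000, 38500, 51000, 44250, 39900],
    "post_training_sales": [41500, 39000, 50750, 44000, 40100],
})

# Paired t-test: did the same people's sales change after training?
change = df["post_training_sales"] - df["pre_training_sales"]
t_stat, p_value = stats.ttest_rel(df["post_training_sales"], df["pre_training_sales"])

print(f"Mean change per participant: {change.mean():,.0f}")
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
# A large p-value here would be consistent with "no measurable change," which is
# what this example describes finding for the program.

In practice, figures like these would come from the business’s own performance systems rather than being constructed by hand, and would sit alongside the client engagement and sales data Kevin mentions.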
Kevin Yates:
So the initial facts, evidence and data said that training program made zero impact. That’s a little scary. As a training professional, that’s a little scary. Then, going back to what you and I were talking about, Scott, there was zero ROI for that. There was no return on investment, because there was no change. So after taking a look at sales data, performance data and client data, and just seeing that no behavior had changed, no performance had changed, that really, that training program made zero difference, I then needed to go back and collect some qualitative data. After going back and doing a few focus groups and a few interviews, having some conversations and collecting that qualitative data, what I discovered is that the business had not changed its P&L structure.
Kevin Yates:
Which meant that if I were in business unit A and I sold business unit B’s products and services, I would not get revenue recognition for that. I was only being held accountable for the sales number that I needed to hit for my respective business unit. So I was able to take the qualitative data that came from the interviews and the conversations, and I was able to marry that with the quantitative data that came from the performance data out of the business, the sales data from the business and the client engagement data. And I was able to tell a story. I was able to take that story back to senior leadership to say, “The money we invested had zero impact on sales, on people’s performance and on client engagement, and here’s why.”
Kevin Yates:
We didn’t see a change, because what was not done prior to this sales training program was having conversations around changing the revenue recognition system. So my response, and I’m paraphrasing here, was that we could train until the cows came home, but if we did not change the revenue recognition system, we would never see a change in behavior, performance, sales or client engagement.
Kevin Yates:
That was a different story to tell, Taryn, not quite the one you asked for … but what I wanted to do here was just demonstrate how you can use fact-based evidence to tell stories, and to tell them in a way that is compelling. So it wasn’t me going to senior leadership saying, “I don’t think that this program worked and here’s why.” It was me saying, “Here’s what the data says. Here’s the story, oh, and by the way, here’s my recommendation for how we can really see a shift and how we can really achieve the goal that we have here of increasing revenue, improving client engagement and increasing sales.”
Scott Rutherford:
And it’s a way to act as a consultant to the business, too — rather than just staying in your lane and looking only at learning and development, you’re saying, “Well, look, learning and development can do many things but it can’t do everything on its own.”
Kevin Yates:
Man, you are so right, Scott, and to be honest with you, I don’t really see myself as a trainer or a quote-unquote “L&D person,” air quotes again. I really see myself as a performance consultant to the organization, one who is using training and learning solutions to help improve people’s performance. So I see myself as a performance consultant first, and it just so happens that what I have in my bag of tricks, if you will, are training and learning solutions.
Scott Rutherford:
Well, Kevin, thanks for your time. We really appreciate you spending a few minutes with us here on the podcast. I wanted to give you a chance to share any final thoughts or advice you have for other folks who are in L&D and maybe feel like they’re just starting up the hill toward measurement. What advice do you have?
Kevin Yates:
So here’s my advice. My advice is focus on performance first; focus on a change in behavior, a change in action, a change in performance as being the deliverable, as being the outcome, as being the result for your training and your learning solutions. So my mantra, and my guiding principle, is: Find at least one thing about a person’s behavior or performance that you can attribute to training and learning, and let that lead to the facts about impact.
Taryn Oesch:
Kevin M. Yates, the Sherlock Holmes of learning and development. Thanks for joining us today.
Kevin Yates:
This has been an awesome discussion. Thank you both.
Scott Rutherford:
Thank you.
Taryn Oesch:
For more information on this topic, we’ve got lots of great content on Trainingindustry.com. Don’t forget to check out the podcast episode page for this episode, Episode 19. We’ll be linking to some great resources there as well: trainingindustry.com/trainingindustrypodcast.
Scott Rutherford:
And we do hope you’re enjoying listening to The Business of Learning. If you are, please consider rating and reviewing us on Apple Podcasts, and make sure you subscribe for notifications of upcoming episodes at trainingindustry.com. Thanks for listening.
Outro:
If you have feedback about this episode, or would like to suggest a topic for a future program, email us at info@trainingindustry.com or use the “Contact Us” page at trainingindustry.com. Thanks for listening to the Training Industry podcast.