Floor 10: The wood shop
Get your tape measure ready – we’re headed to the wood shop with Laurent Balagué of Formetris! This week we’re unpacking the best practices for measuring the effectiveness of your learning program.
Resources from the episode:
Transcript of the interview:
Kerri Moore 0:06
Hi everyone, welcome back to the Learning Elevated podcast brought to you by Docebo, the show where we help you elevate your learning efforts and move up in the world of enterprise learning and development. As always, guiding you on your journey up the tower will be your elevator operators, myself, Kerri Moore, and my co-host…
Rob Ayre 0:22
Rob Ayre. Each week, we stop off at a new floor, and today we're getting off on the 10th floor, the wood shop. Here, we make sure to measure twice and cut once, since we're measuring the impact of our learning programs. So this is just a really important episode. It's one of those topics that comes up almost every single time we have a conversation with a prospect or a customer, really anybody who's in the L&D realm: how do I measure the real impact of my program? This has become even more important in a world where the watercooler is gone. Right?
Rob Ayre 0:52
So how we're going to track those really important social learning moments matters a lot. The interview we're going to have later in the show is with a gentleman, Laurent from Formetris, and he mentions a survey that they built out for this, delivered via an LMS. It's one of those things that becomes incredibly important in a world where you don't have one-to-one interactions to measure somebody; you need to figure out digital ways to do it. Right?
Kerri Moore 1:19
That’s right, exactly. And I think we’re all sick of the smile sheets, you know?
Kerri Moore 1:23
How many customers do we speak to where that's the only way they're actually measuring whether they're hitting their mark? We need to change from that. And as you say, Rob, now it's more important than ever to make sure this is still happening with people at home, and that we're still seeing some kind of impact from all the work we're putting into this.
Rob Ayre 1:41
Yeah, absolutely. So we're gonna kick this one off with another article from eLearning Industry. This one's called "Measuring the Impact of Corporate Employee Training Programs, and the Role of KPIs," written by Isha G. What it kicks off with, and this is something that Laurent, our guest today, also discusses, and something just about everybody listening to this podcast is probably familiar with, is the Kirkpatrick Scale.
That's one of those first starting places that really introduced people to the whole idea of measurement, is that right, Kerri? It's sort of that initial place.
Kerri Moore 2:15
Exactly. Yeah. I mean, it's getting a little old fashioned, but it's something everybody understands, so I think it's a great basis for measuring how things are going for you. Another thing I really liked about this article is that it really focuses on making sure you're aligning your KPIs with your learning goals as well. That means you're staying focused, you're staying on track. Sometimes people don't do that, and I think it's a super important step.
Rob Ayre 2:41
Yeah, there are different things you have to be cognizant of, especially as an LMS administrator, in terms of what you're measuring within your learning programs. There are your basic metrics: are people enjoying your program? Are they taking it? Are they completing the courses? But that ends at a certain point. It's like, okay, so what? So the conversation has to become: what impact is my learning program having on my business? We always talk about that as business impact, and aligning those KPIs with learning goals is something the article goes into in some really nice detail.
Kerri Moore 3:11
Exactly. That's incredibly important, as you say. Another thing that we'll hammer home as much as we can: make sure you know exactly what you're going to measure, and how you're going to measure it, before you actually start on a piece of learning content or any of your learning programs. Otherwise, you're going to get iffy metrics pulled out of thin air, and they're not going to mean anything. So you need to, again, relate it back to the business and to how it's creating impact for you.
Rob Ayre 3:39
Yeah, and one final thing I'd touch on from this article, which, as always, is available on the website: really think about your employees' performance, and think about how you can isolate the training evaluation and measure its impact on that performance. There are always going to be other factors that could influence an employee's performance, let's say a bonus structure.
So it's really good to have control groups and take a methodical approach to how you're going to measure these things. Really think about: is there a control group? Is there an outside variable that might be making a difference? If so, can we take it out of the equation, then look and see, okay, is this training helping them be better at X part of their job or X part of their performance?
Kerri Moore 4:21
Exactly. That's right. I think we're getting off on the floor now, Rob, so I'd like to introduce our guest today, and I hope I'm pronouncing his name right: Laurent Balagué, CEO and Co-Founder of Formetris. Hope you guys enjoy this as much as we did.
Rob Ayre 4:41
So today, we're joined by Laurent Balagué, CEO and Co-Founder of Formetris. Laurent, welcome to the podcast!
Laurent Balague 4:49
Hello. Thank you for having me on your podcast.
Rob Ayre 4:52
We're so excited to talk to you today. This season, we've been focusing on setting up for success when you're getting a new learning platform or launching a new strategy, and with Formetris having such deep knowledge and experience in the measurement space, we wanted to kick off by asking you: how important is measurement within enterprise learning?
Laurent Balague 5:14
Well, for me, the question "how important is measurement?" is equivalent to "how important is reaching high performance in what you do as an organization, as a team, as a function?" So yeah, it's critical. Measuring allows you to prove and to improve your performance. For both of these reasons, and many others, it's absolutely critical.
Kerri Moore 5:39
Right. And what would you say companies can do early on to set themselves up for success in terms of measurement?
Laurent Balague 5:47
So there are two ways to understand your question: at the program level, or at a more global level, typically when an organization installs a new learning platform. But in both situations, I would say the reasoning is the same.
Before doing anything, you must know how you will measure your success. You must define what success will be for you, for your internal clients, for all the sponsors of what you do. This is true at the program level: as an internal L&D team, you have internal clients who expect you to help them achieve higher sales, better quality, better client satisfaction, and so on. So you have a business objective that you need to translate into skills and behavior objectives, and that ends up defining a learning program. Normally, that should be done before you even start designing the program. And it's the same when you install a learning platform. Why are you doing it? How will you, your clients, and the people financing the project measure the project's success? That should be defined in the first step of the project.
Rob Ayre 7:09
When it comes to defining what success really means, how important is that executive-level buy-in, that executive-level involvement and understanding of how you're measuring?
Laurent Balague 7:20
It is absolutely critical because, once again, at the program level, there is a common myth shared by many, many people: that training is magical, that it's a magic wand. When you don't know how to improve a situation, you call the training department, and hop! A learning program is a magic wand: on Monday morning, people don't know something, and by Monday evening they're perfectly capable of doing what they weren't able to do a few weeks or a few days ago. Unfortunately, that only happens in The Matrix, where, you know, Keanu Reeves saves people in a few minutes. In real life, it doesn't happen that way.
So there are many illusions about what learning can achieve on a short-term basis without the involvement of other functions. This is why you really need to engage your internal clients, the executives, on what they really expect, and have a sort of contract between you, the L&D department, and them. That's true when it's internal, and it's also true when you're a training organization selling training, or other things, to outside clients. In both situations, you really need a detailed contract that starts with the business objectives but also details precisely what learning's share will be.
Kerri Moore 8:45
Would you be able to share some of those best practices with us? Maybe how you plan something out, how you should expect the metrics to come in, or how you're going to spread them, whatever that would be?
Laurent Balague 8:56
Oh, yeah. Oh, so the list could be long but…
I'm still a big fan of the Kirkpatrick Scale. I know it's from the '50s, from the previous century, and because it's so old it looks suspicious. And it's true that the scale is not scientifically perfect. But the great thing about it is that everybody understands it. Level one is reaction: satisfaction, engagement. Level two is learning: what people learned in terms of knowledge and skills. Level three is transfer: what they do with what they've learned, so now we're talking about performance observed on the daily job. And level four is impact, in terms of business and operational KPIs. Every executive will understand it in five minutes. So a best practice is to explain that chain of consequences, so they understand that from learning to business impact there are different steps.
So that's the first thing. The second thing is that, no matter what L&D professionals say, in their DNA they're obsessed with satisfaction. We have to admit that; it's a natural tendency. And it's funny, because I do a lot of training and conferences on the importance of learning impact, but when I facilitate and organize that training, my obsession is that people laugh at my jokes and that they're very happy at the end of the day. So even though I'm saying the opposite, saying no, we should not focus on satisfaction, for me as a trainer it's the only thing that matters, at least on that day. Then, of course, I intend to sell some things, but that comes later. So it's important to recognize that it's very hard to have an impact-focused learning organization, because of this natural tendency by all the stakeholders to focus on satisfaction. Even CEOs, actually: when CEOs are in the classroom, they're obsessed with satisfaction, and the rest doesn't really matter.
Rob Ayre 11:14
So when organizations are ready to take that plunge and say, okay, I'm going to really focus on measuring this learning impact, what's one thing people often forget when it comes to that measurement that they probably shouldn't?
Laurent Balague 11:26
Okay, because of what I said earlier, learning organizations will not have enough time to deal with the learning impact of all their courses; that's a very ambitious level. So you need to be pragmatic, because most of L&D's time is spent on the future, on the next programs to design and roll out. Looking backward is necessary to measure: you need the learning to happen first, and then you can measure things. So with the little time that learning teams have, they need what I'd describe as an 80:20 approach.
They will spend 80% of their time and resources on the top 20% of programs, the ones most critical for their organization. For these programs, they need to be more ambitious and have a detailed measurement strategy, to really build the chain of evidence: what people learn, how it affects their daily work, and, in the end, the consequences in terms of business. Detailed enough that the continuous improvement process starts very early, at the pilot phase of the project, and they have enough data and stories to bring back to their internal sponsors or external clients. For the remaining 80%, you can still go beyond satisfaction, but in a more automated and scalable way. So really have two different measurement strategies for these two types of programs. That's one of the best practices we strongly recommend.
Kerri Moore 12:58
Okay, so I have a little scenario for you. If you were to launch a learning project in say, the next few months, what would be the first three things that you would make sure that you could, you know, definitely track accurately?
Laurent Balague 13:13
Okay, well, if I don't have anything more specific, I'll stay general, but I would say: who is sponsoring the project? Who is the top stakeholder of the project? And what is his or her vision of the program's success? That would be point number one, and that's important for the project and important for my personal future.
The second is: what, concretely, is this learning program expected to achieve within the overall project? That would allow me to build the contract I was talking about between the learning team and the sponsor. And the third is a really clear vision of the timing. When is the program expected to start? How much time do I have between the first pilot and the rollout? Is it possible to have enough time and data to improve it at that phase? Or is it an instant rollout, meaning I won't have time for any continuous improvement process? That's probably the third thing I would look at.
And maybe I can define a measurement strategy.
Rob Ayre 14:38
Right. And having those three things in place is your foundation for figuring out how to put that measurement strategy together, right?
Rob Ayre 14:47
Yeah. So is there one thing that makes a really great organization stand out from the rest when it comes to measuring their impact?
Is there one thing you've seen that really says, okay, this is what's elevating this organization over their competitors?
Laurent Balague 15:02
Yes, it's the capacity to use data. I don't measure the performance of an L&D organization by the quantity and quality of the data they collect; it's what they do with it. If you can prove the business impact of all your courses but you never act on it, you should do something else, because you're wasting your time.
And in the opposite case, if you only collect satisfaction but you're super reactive and actually do a lot to improve the sessions based on it, well, that second case is better. So for me, what matters most is what you do with the data you collect, not the data you could collect. In an ideal situation, with a typical solution like ours, you can do both. But the worst situation is to annoy people collecting data, asking for feedback, let's say asking the managers for feedback, and never doing anything with it. That's worse than doing nothing.
Rob Ayre 16:03
So what are some of the things you've seen those high-performing organizations do with the data? Can you give us some examples our listeners can take away and say, okay, maybe this is what I need to start doing with the data I collect?
Laurent Balague 16:15
Yes. First, run a strong continuous improvement process inside L&D, with accountability clearly defined: who is in charge of collecting the data, looking at it, treating it, initiating actions to improve and to correct, and maybe going back to your clients if you're a training provider. That's absolutely critical. A real best practice is to have a quality process clearly defined and operational, with accountability defined very precisely.
The other thing is what I mentioned earlier, the 80:20 approach: don't launch the same evaluation every time. Take into consideration the importance of the specific course. Third, make sure that the way you engage the learners and their managers makes the evaluation process useful for them too. Of course, you collect data on learning, on behavior change, on skills, on performance improvement, to do what I said in the introduction: to prove and improve the impact of all your learning programs. But in doing so, it's also possible to make this whole data collection process, typically the surveys and assessments you initiate at the learner level and the manager level, useful for the learners and for their managers as well.
So turn the evaluation into something that becomes useful for the learners. I'll give you an example: ask the learners what they intend to do. As you collect insights on satisfaction, what went well and what didn't, also ask them, okay, what do you plan to do with everything you've learned? Then send that list of commitments, that action plan, to their manager: you know, John has recently attended this communication program, and John intends to do this and this in the coming weeks; please help him. It's a very simple thing, but it can be extremely helpful for the learners and for the manager, so the manager can play the role of helping, of putting the team in a position to apply what they've learned, facilitating the learning transfer and the application of what was learned.
And then, with the same action plan, come back to both of them a few months later and ask the learner and the manager: okay, has this list of actions been done or not? That's a way to measure the actual application of learning, to go back to level three in this example, but turned into something that increases the accountability of the learner and the manager, and in the end actually improves the impact of the course.
Rob Ayre 19:19
The feedback loop is so important with learning measurement, isn't it? Actually making sure you circle back and say, okay, we put this in place, we wanted to achieve this. And oftentimes what I've seen is that some organizations don't look back and say, oh, we actually did or didn't do this, or here's an area we need to improve.
Laurent Balague 19:39
Absolutely. So there are different levels of feedback. First, as I said earlier, you collect data, but after the data collection and the data analysis there must be actions: to improve the course, to improve the content in the LMS, to improve the next sessions if it's ILT or VILT. The feedback should also be directed back at the people who gave it: the learners, the managers, maybe the trainers. And finally, and this is the most critical for an L&D organization, coming back to the sponsors, coming back to all the other functions.
I said that measurement is used to prove and improve your performance, but there is something in the middle: proving that you have improved your performance. That's communication, and it's political, but it's critical for L&D organizations to show that even though they are one of the softer functions of an organization, they also apply quality rules and quality processes, that there is a consistent measurement strategy, and that in the end there is permanent improvement of their performance, their content, and so on. Communicating on that is also critical.
Kerri Moore 20:55
You've given us some fantastic takeaways there, but I wonder if there are any other best practices or tips our listeners can take away from this?
Laurent Balague 21:03
Well, I'll open another field, because so far I've been focusing on formal learning. But you're from Docebo, so you know that learning can happen in something less formal, where you just have a platform and you learn what you want, when you want, from whoever you want, inside the organization or outside it. Then the whole measurement strategy becomes much, much trickier, because the internal sponsor is no longer there; it's just an individual doing something with the platform that's available. For this, the whole measurement strategy has to be redefined. Kirkpatrick doesn't apply anymore.
There are no business objectives, and we've been working on that; it's a new generation of analytics. It's possible to measure things, but it's very different, a totally different philosophy and methodology. On our side, we focus on learning culture: measuring the dynamics of daily learning at the team level and at the skill level, to see what works well, in what situations, in what countries, what teams, what functions. Is there a real dynamic on job-related topics, on technical topics, on soft skills, on management, and where is it not as intense? That helps our clients find the levers to improve this learning culture. So it's a new world of analytics, a new world of measurement, but it's becoming more and more central for learning organizations.
Rob Ayre 22:38
And have you seen some best practices from organizations that are starting to figure out the best ways to measure that social learning, those in-the-flow-of-work, discover-your-own-learning opportunities?
Laurent Balague 22:50
So you're asking about best practices in terms of measurement, was that your question?
Rob Ayre 22:56
Yeah, when it comes to social learning.
Laurent Balague 23:00
So, yes, of course you can look at the LMS data, and, talking to an LMS team, I won't say that's something people shouldn't do. But it's not sufficient. The best technology for social learning is the coffee machine, not the LMS. And sadly, we live in a world where the coffee machine doesn't work while the LMS still works, and is probably overloaded with requests, but the coffee machines are desperately alone in offices worldwide. It's a very sad image that shows that social learning has been dramatically damaged by the current crisis, even though the technology and the online content have never been so rich and so diverse. So the measurement strategy also has to reach beyond the basic, standard learning technologies, because most of the learning happens outside of them.
So you need to take that into consideration to really measure the reality of daily learning, informal learning, social learning. This is why we've developed a new approach to measure this learning culture, mimicking engagement surveys. It's a quick survey that can be launched at a very large scale (we've done that globally with many of our clients), so that we don't get cornered into one view; we're sure to monitor the reality instead of being fooled by just one technology, while people actually share on Slack ("well, I thought it would be on the LMS"), or they use Teams and nobody was aware of it, and so on.
Rob Ayre 24:45
That makes so much sense, and we really appreciate all of your insight, Laurent. Thank you so much for joining us today.
Laurent Balague 24:50
You're very welcome. Have a great day and a good weekend.
Kerri Moore 24:52
You too. Thanks so much.
Rob Ayre 24:58
Thank you, Laurent, for joining us for that interview. You know, I don't know if we can really overstate the importance of measuring. It becomes one of those things where it's like, why are we doing what we're doing, right? And one of the points Laurent brought up a bunch, something I'm even going to apply to my own marketing thinking, is that measuring allows you to prove, but it also allows you to improve. I think that's a really good mantra to stand by when it comes to doing just about anything in business: if I put a system in place to measure what I'm doing, what am I trying to prove? And then what am I trying to improve once I collect that data?
Kerri Moore 25:38
Exactly. I mean, how can we know how to better ourselves unless we look back, assess, and go over where we were stumbling and what we were doing right? So yeah, completely agree, a really important point there. Another one I wanted to bring up, actually my favorite part of the call, was when Laurent spoke about how L&D pros are obsessed with satisfaction, and how that can make it hard for them to focus on impact. He mentioned that even when he's giving talks, he wants people to laugh at his jokes, you know? And he's not thinking about the impact he's having on them or the knowledge that's actually being passed on. It's very human, I think, to want that kind of response and to feel like, "ahh, people liked me!" Or, for L&D: people liked my course, they thought it was really cool, but did they actually take something away? What was the impact of that course? So it's really important to bring yourself back to that.
Rob Ayre 26:28
Yeah, it is. And, you know, maybe it's a tough thing to say on an L&D podcast, but we do have to recognize that training just isn't the magical be-all and end-all; a program's not gonna fix every issue that exists unless you get that buy-in from the C-level and those changes are made throughout the entire business. So yeah, you can put in a training program to try to change some specific thing within your business, and it might improve that one specific thing, but if you don't think about it from a full-landscape view, I guess you could say, you're probably not going to have the total impact you want to have. So really getting that buy-in from the executives, and making sure the things you're doing are, like I said, across the entire landscape of your operations, is gonna make all the difference.
Kerri Moore 27:14
That's exactly right. I mean, it takes a village, right? If you're working in your silo and you're trying to see change, but you don't have the people above you buying in and agreeing with you, then it's only going to go so far. So yeah, I completely agree.
Well, thanks again for joining us for this episode. It was fantastic to speak to Laurent, what a great guy. So funny. I was laughing so much throughout.
Next week, we're going to be stepping off on floor 11, The Lounge, and we'll be speaking to the guys over at the 70:20:10 Institute. Join us then!
Rob Ayre 27:44
For more information on what we’ve discussed today, including links to resources and downloadable assets, go to docebo.com/podcast. That’s D-O-C-E-B-O.com/podcast and subscribe to our newsletter.
Kerri Moore 27:59
You can also find us on iTunes, Spotify and all other places where you get your podcasts by searching Learning Elevated, so don’t forget to click “subscribe” so you know when we’re disembarking on another floor.