What’s wrong with learning measurement today (and how to fix it)
Learning measurement is more important than ever, especially when it comes to moving the learning function forward within today's diverse, dispersed, sometimes hybrid and otherwise complex workforce. Yet despite its acknowledged importance, measurement remains one of the biggest challenges facing L&D. Why?
Although technology has made it easier to track massive amounts of learner data, leaders still struggle to understand exactly which metrics they should collect. And according to Kevin M. Yates, The L&D Detective and recent guest on “In The Know,” a big part of the problem is that many don't think about analyzing the impact of training and development solutions until after programs have launched.
How can you (finally) fix learning measurement in your organization? It’s time to refine data practices and think about impact proactively instead of reactively.
Learning measurement is broken
Good data practices are essential for delivering clear value to stakeholders and discerning how training and other learning solutions influence performance and business outcomes.
But so many organizations either don’t measure—Learning Technologies points out that 60% of surveyed respondents don’t have analytics set up for their L&D teams—or collect the right data at the wrong time.
“What usually happens is there’s a training program that’s created, or learning solution that’s launched, it’s consumed and then the question becomes, ‘What was the impact?’” says Yates.
This makes it harder for learning professionals to overcome one of the biggest barriers to measurement: securing budget for analytics amid competing priorities, without the concrete proof needed to show that learning is really working.
“The best way [to approach measurement] is proactively planning for impact intention, so rather than trying to figure out what the impact was, you’re already thinking about the expected impact proactively. So before anything is designed or utilized,” he explains.
Design with the end goal in mind
Keeping the desired outcome of a learning program in mind throughout the entire design and delivery process is key to achieving the intended results. This may feel obvious, but the perpetual learning measurement struggles we've all encountered prove otherwise.
“In the training, learning and development industry, we talk about designing with the end in mind. We don’t always do a great job with that, but I believe that when solving measurement mysteries—in terms of measuring impact and results—the biggest challenge is not being proactive about desired outcomes that are measurable until after programs are designed and used, and searching for the impact later,” says Yates.
Bonus: being proactive and transparent with your workforce about the reasons behind the training content you deliver shows employees you care about solving their daily struggles, and helps them feel prepared and confident at work.
Impact vs. fulfillment of purpose
Not all data is important to every learning campaign. Different solutions serve different purposes, and for some, you simply may not be able to measure impact in the traditional way.
“If you proactively plan for goals, the essence of what you’re determining or measuring is the extent to which purpose was fulfilled,” says Yates. “What that means then is when you’re designing and creating your training and learning solutions, you’re determining the purpose for which they’re being built.
“In our industry, the word ‘impact’ has become a cliché and a buzzword. But when we contextualize it differently, it gives us a bit more intention, allows us to be more strategic and focused on exactly what it is that we’re trying to do. It’s not just semantics—the language that we use, how it shapes our mindset and how it informs our actions matters.”
Taking a proactive approach to data practices makes measurement more effective and, ultimately, more meaningful. Start by deciding what you want a program to accomplish before you roll it out, rather than making impact an afterthought. That way, when the time comes to measure success, you'll already know which data matters and how it should be used. Then collect data points that are purposeful and aligned with your business goals. In short: don't try to measure before figuring out what you really need, and why.