Useless vs. Useful Learning Data
Posted on: January 29, 2016
An Interview with Learning & Performance Consultant JD Dillon on metrics L&D should value to prove learning success in business
I recently had the pleasure of speaking with JD Dillon, an experienced, active and well-respected proponent of improving corporate learning and development. JD authors his own Just Curious Learning Blog and speaks regularly at the industry’s top learning conferences. Most recently, he served on the ATD 2016 TechKnowledge Conference Program Advisory Committee, where he led a session entitled Reworking the Puzzle: How to Build a Smarter Learning Ecosystem. Here’s what JD had to say about learning metrics and how to tie learning to bottom-line business results.
Axonify: Most large organizations have an LMS in place. What are the top ten learning metrics you can pull from a typical LMS?
JD Dillon: I’d really have to stretch to find ten useful pieces of data that I can pull from a traditional LMS. This data is typically limited to logistical details about specific training events and includes items like:
- Level 1 survey feedback
- Level 2 assessment scores
- Training session dates/times
- Training hours associated with specific learning objects
- Assignment/registration info
- Organizational hierarchy info pulled from HRIS
Axonify: How does this data help you understand and measure learning?
JD Dillon: It doesn’t. While you can tell who attended and completed training and what they scored on any assessment, that doesn’t necessarily indicate learning, or more importantly, performance improvement. A traditional LMS doesn’t offer effective ways of tracking knowledge growth over time, or correlating training activity to real-world performance.
Unfortunately, the limited data collection and reporting capabilities of an LMS prompt L&D organizations and their stakeholders to value the wrong data simply because it’s available. While I do care about resource utilization and formal training attendance, I really care about the connection between those data points and business KPIs, a connection that LMSs do not make easy to establish.
Axonify: How do you or other professionals in your industry use this data (or is it just data that’s collected but not used)?
JD Dillon: Based on my experience and ongoing conversations with peers across industries, I’d say that we don’t leverage data very effectively as an industry.
This starts with a lack of effort in designing learning around the collection of meaningful data. Then, because our systems do not support effective data collection and analysis, we are left with limited options. We also do not stress data analysis as a core L&D competency, which limits our ability to improve the situation by selecting better systems and designing for measurable outcomes. The order-taking nature of L&D further disrupts our ability to use data meaningfully, because meaningful measurement requires a focus on long-term objectives rather than short-term deliverables and check-the-box completions.
Finally, we limit ourselves by separating “training data” from “business data.” It can be extremely challenging or impossible to locate and correlate information from various business units, including sales, customer service, quality, HR, L&D, etc., and therefore people often don’t make the effort. This means we fail to gain valuable insights into performance.
Axonify: What data would be meaningful for measuring learning and tying it to business results?
JD Dillon: Trends. An effective measurement strategy must take long-term changes in performance into account and connect them to learning opportunities. This starts with subjective and objective performance measures. In addition to hard metrics like sales, customer satisfaction, and quality feedback, we must collaborate with managers to capture behavioral insights into performance that don’t specifically appear in reports.
At the same time, we must select and effectively integrate systems and processes that help us collect data on how employees are using support resources, including learning content. This includes everything from employee traffic flow through content repositories to knowledge assessment scores and practice session observations. We can then identify trends over time from this “learning data” and connect these patterns to changes in performance. This will help us ask more informed, meaningful questions when performance gaps are identified as we look for the best ways to support the organization.
Axonify: How would you recommend L&D professionals get started on identifying and measuring the most important learning data?
JD Dillon: Get more comfortable with data, especially information outside the traditional realm of L&D. Do your research. Take advantage of shared resources. Enroll in online classes to enhance your measurement, reporting, and analysis skills.
Then, focus on performance. Find partners who have access to and experience with high-value performance data within your organization. Work to become expert with this reporting so you can not only speak the language of the business, but also design learning and support strategies that target the right data for the purpose of finding meaningful connections. Continue to use learning data to inform your questions and decision-making, but focus on the measurements that are most valued by your partners across the business.
Is this your challenge? Next week, we’ll feature a post about how to turn learning measurement on its head. We’ll take you through a business-first approach that focuses on key business objectives and then uses those objectives to drive your learning programs.