By leveraging learning data and analytics, organizations can better understand whether their programs drive desired business outcomes and improved employee productivity. Learning analytics provide the insights necessary to better meet learner expectations. Leveraged properly, learning data reveals learners’ preferences, including which modalities and design features they value. It can also expose underperforming training programs and segments by showing whether the learning has been understood, or even consumed, by the learner. These discoveries allow learning leaders to prioritize learning that works and eliminate programs that waste the organization’s time and money.

We understand that leveraging learning analytics is often easier said than done. The potential and availability of data in organizations present ample opportunities and plenty of challenges. Some organizations may not have the resources to collect learning data, and those that do may struggle to determine whether they are collecting the right metrics. You may even have the necessary resources and the right data but struggle to accurately interpret the results. Where do you go from there?

To gain a greater understanding of learning analytics, we asked a group of leading learning and development companies and knowledgeable learning leaders to share their best practices and solutions for learning data collection and analysis. We hope the insights you gain from this report will provide actionable tips that will inform your organization’s learning analytics practice, as well as offer a comprehensive understanding of learning analytics’ role in your organization’s success.

What Are Your Business Partners Measuring?
Bonnie Beresford, Director of Performance and Learning Analytics, GP Strategies

Training departments will forever be challenged if they continue to try to show value by exclusively using their own data. Learning data provides insights and helps learning leaders manage the efficiency of the training department. However, the data is not sufficient to prove the value of training; you must reach outside the learning organization to credibly show value. You need to meet with business partners to discover what they measure and what KPIs are on their dashboards. Then align your learning solutions with these business KPIs during the design stage. Collaborating up front with business partners pays great dividends when it comes to measuring and showing value. Your partners will recognize your interest in their business, understand how your solutions are going to help, and get you the business data you need to show training’s impact on the KPIs that really matter.

Operational & People Metrics
JD Dillon, Chief Learning Strategist, Axonify

L&D must expand its definition of learning data to include a variety of operational and people metrics. Start with a clear, measurable business goal, and avoid the temptation to include a wide range of topics in your solution. Instead, focus on a specific result that can be measured using existing business data. Then, work with subject matter experts (SMEs) to identify the employee knowledge and behaviors required to achieve your goal. Look for existing data collection opportunities within the operation, especially related to behavior observation. Finally, determine the right-fit learning solution based on your identified knowledge, behavior and result targets. Ensure that your design, regardless of content modality, provides opportunities for you to measure changes in these elements before, during and after implementation. This process will allow you to determine your impact: the direct connection between your training and changes in business results.

Situational Judgement Assessments
Dr. Ian Stewart, Executive Director of Learning and Design, Kaplan Leadership and Professional Development

Imagine learning data that reveals:

• The current level of understanding and application across an organization.
• Those people or groups who are a training priority.
• How to personalize the learning provision.
• The effect the training has made.

Begin by harnessing the contextual richness of situational judgement assessments, presenting scenarios that simulate the real-world application of knowledge, replete with ecological factors such as ambiguity, time pressure and social friction.

Combine the measure of the respondents’ competence with the level of confidence in their responses. Although competence is a moderate indicator of behavior, competence allied to confidence makes for a much stronger indicator.

Delivered and managed online, situational judgement assessments can provide a data map of current practice, identifying respondents with low competence and high confidence, who are a training priority. These assessments also reveal each respondent’s strengths to inform personalized learning and offer a simple test-retest measure of training effectiveness.
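As a sketch of how the competence-and-confidence pairing described above might be operationalized, the following quadrant classifier uses hypothetical 0–1 scores and an arbitrary threshold; real assessments would calibrate both scales empirically.

```python
# Classify respondents into competence/confidence quadrants.
# Low competence + high confidence is the top training priority,
# since these respondents are most likely to act on flawed judgement.
# Scores and the 0.6 threshold are hypothetical.

def classify(competence: float, confidence: float,
             threshold: float = 0.6) -> str:
    """Return a quadrant label from 0-1 competence/confidence scores."""
    high_comp = competence >= threshold
    high_conf = confidence >= threshold
    if not high_comp and high_conf:
        return "training priority"      # confidently wrong
    if not high_comp and not high_conf:
        return "develop with support"   # aware of their gaps
    if high_comp and not high_conf:
        return "build confidence"       # capable but hesitant
    return "strength"                   # competent and confident

respondents = [
    ("A. Lee", 0.45, 0.90),
    ("B. Osei", 0.82, 0.40),
    ("C. Ruiz", 0.75, 0.85),
]
for name, comp, conf in respondents:
    print(name, "->", classify(comp, conf))
```

Retesting after training and comparing quadrant membership gives the simple test-retest measure of effectiveness mentioned above.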

Performance Mindset
Sales Performance International

The sales training industry would benefit tremendously from a more definitive, evidence-based approach to understanding the return on capabilities development for sales. Sales organizations need to shift from a training-focused mindset (knowledge acquisition) to a performance mindset (outcome attainment).

To begin, organizations need to translate their growth strategy into specific sales goals and metrics. Then data-driven models can deduce the most important competencies for attaining those goals, which also assumes the existence of well-defined sales competency models. With a capabilities model in place, organizations can apply a five-step process to drive continuous, measurable improvement.

Through the use of well-defined models, integrated learning and development content, and data-driven technology, it is possible to efficiently collect data at each step of the process. Organizations can also apply integrated analytics to measure how sellers are progressing in their development, and how their changing capability levels relate to their attainment of specific outcomes.

Get Aligned
Tim Dickinson, Director of Strategy, Watershed

To effectively leverage data to improve business outcomes, the first and most important practice is establishing a shared understanding of how learning programs align with strategic business priorities.

Creating alignment with strategic priorities requires cross-functional discussions about selecting or defining priorities. It also begins the process of gaining stakeholder buy-in and provides the foundation for a chain of evidence to demonstrate the eventual impact of learning.

Granular Learning Data
Christopher Cameron, Valamis

“Feel-good” statistics, like completion metrics, are dead. Training teams need to look at granular learning data, like learner behavior, in conjunction with business metrics to understand the true efficacy of their training. Data science allows companies to analyze huge swaths of learning data and key business metrics to see the behavioral influences on learning. If you can influence a desired behavior to meet a business goal and accurately measure the impact learning had on that outcome, management teams will be armed with the information necessary to make strategic decisions regarding training and their business.

Measurement from the Outset
Edward Trolley, Senior Vice President of Managed Training Services, NIIT

The measurement of training value can be elusive, but it is important. Begin by establishing a measurement methodology. Next, have the measurement discussion at the outset, not at the end. It is not necessary to measure everything, but it is important to measure what matters (e.g., large projects, high-visibility projects and business-issue-driven projects). With the advent of analytics and integrated systems, we can tie the measurement of L&D to various metrics, like employee performance. The objective of measurement should be to consistently build confidence in your customers, so they know they will get an acceptable ROI. To reinforce this, develop a results contract at the beginning of customer discussions to define the scope of the work, the results expected, the metrics that will determine the results, what needs to be done to ensure the results and any other external factors that could impact the results of training.

Just Get Going!
Andrew Joly, Director of Strategic Design, LEO Learning

For the past three years, we’ve asked L&D professionals to share their perspectives and progress on learning data and analytics in our annual measurement survey. Examining the top barriers to success provides interesting insights into the first steps toward accurate measurement analysis:

1: Get stakeholder buy-in
Early on in the process, engage senior stakeholders in benefits-focused conversations.

2: Build team capabilities
Buy, build or borrow data and analytics skills, and begin to build these skills across your team.

3: Start small
A “narrow-but-deep” approach is a useful focus to implement when creating your chain of evidence.

4: Understand what tools are needed
The key tool in this area is a learning record store (LRS), which collects Experience API (xAPI) statements from different learning events and connects them to business data from elsewhere in the organization.

Short and sharp. Our advice: just get going!
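To make the xAPI piece concrete: a statement is a small JSON document built around an actor-verb-object triple. The sketch below hand-builds one in Python; the learner, email and course ID are hypothetical (the verb URI is standard ADL vocabulary), and a real deployment would POST this to the LRS rather than print it.

```python
import json

# Minimal xAPI statement: actor, verb, object is the core triple an
# LRS stores; "result" is optional. Names and IDs are hypothetical.
statement = {
    "actor": {
        "mbox": "mailto:jane.doe@example.com",
        "name": "Jane Doe",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/courses/safety-101",
        "definition": {"name": {"en-US": "Safety 101"}},
    },
    "result": {"score": {"scaled": 0.85}, "success": True},
}

print(json.dumps(statement, indent=2))
```

Because every tool emits the same statement shape, the LRS can line these records up against business data without per-system integration work.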

Understanding First
Raytheon Professional Services

For L&D professionals, improving business outcomes comes down to improving the work output of individual employees. The upfront analysis needs to take the whole performance system into account: it is not only about learning management system (LMS) data but about collecting and analyzing learning, workforce and performance measures to extract the insights that will guide solution design.

L&D professionals need to adopt a consultative approach combined with an analytical mindset:

  • Ask the right questions, with a good dose of critical thinking, in order to unveil the real issues.
  • Collect and analyze new types of data that will highlight the most difficult and most frequent tasks that have the greatest impact on business outcomes.

Let’s call it, “Understanding First.” This approach not only provides insights to inform the learning strategy but also establishes the baseline measures to prove the value of our solutions.

On-the-job Performance Data
Jim Kirkpatrick and Wendy Kayser Kirkpatrick, Kirkpatrick Partners

Most training organizations gather data about the training itself, such as measurements of participant reaction and learning. While this type of data will not directly improve business outcomes, training organizations frequently report this data and attempt to claim some credit for the outcomes. However, these stories have no credibility because numerous factors influence organizational results.

Fewer organizations gather data useful to the business, that which relates to on-the-job performance after training and changes in the key company results that the training was designed to improve. The single most important thing training organizations can do is refocus their resources on supporting and measuring what happens on the job after training. This is where training value is created, and it is the area that is most often overlooked.

On-the-job performance data connects training and organizational outcomes and creates the story of value required to sustain training investments.

Apply Data Science
Stephen Young, Ph.D., Manager of Leadership Analytics, Center for Creative Leadership

In 2018, organizations spent $34.3 billion outsourcing leadership development initiatives, yet only 10% of global CEOs reported that those initiatives had a clear business impact. Is your organization one of the 10% that can demonstrate impact or the 90% that can’t? If you’re in the 90%, you need to apply data science to the data you already have.

Data only becomes an asset when it’s leveraged to inform decisions. Most organizations capture data about their leader behaviors and employee engagement but rarely tie that data to business outcomes. The solution: Use predictive analytics to tie your leader data to tangible business metrics. The resulting models can help you prioritize leadership investments most likely to bring improvement in critical business metrics. Then, once you’ve made your training investment, you can track improvement both in behaviors and in business outcomes to demonstrate return on investment (ROI). Predictive analytics help remove risk from decision-making, so you can make strategic bets on your leadership development investments.
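A minimal version of the idea is regressing a business metric on a leader-behavior score to see how strongly the behavior tracks the metric. The sketch below fits a one-predictor line by hand; the coaching ratings, retention rates and variable names are invented, and a real model would use many predictors plus validation before guiding investment decisions.

```python
# Simple one-predictor linear fit: does a coaching-behavior rating
# predict team retention? All numbers are hypothetical.
coaching = [2.1, 3.4, 2.8, 4.0, 3.1, 4.5]         # 1-5 rating per leader
retention = [0.78, 0.85, 0.81, 0.90, 0.83, 0.93]  # team retention rate

n = len(coaching)
mean_x = sum(coaching) / n
mean_y = sum(retention) / n

# Ordinary least squares for a single predictor
slope = (sum((x - mean_x) * (y - mean_y)
             for x, y in zip(coaching, retention))
         / sum((x - mean_x) ** 2 for x in coaching))
intercept = mean_y - slope * mean_x

# Predicted retention if a leader's coaching rating improves to 4.2
pred = intercept + slope * 4.2
print(f"slope per rating point: {slope:.3f}, prediction at 4.2: {pred:.3f}")
```

The slope is the quantity that supports a strategic bet: it estimates how much of the business metric a one-point behavior improvement is worth, which can then be checked against post-training data to demonstrate ROI.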

Engagement, Experience, Effect
Richardson

Analytics are more than a way to verify outcomes at the end of training. Effective measurement guides the training process as it unfolds. Participants discover where they need to focus their attention and where their blind spots reside. Measurement is for the learner as much as it is for the leadership.

Despite these benefits, many organizations fail to generate and measure data. Often, the data sits in disparate systems, making it difficult to connect to business impacts. Moreover, the available data varies in accuracy, forcing users to validate it before trusting the analytics. Even after meeting these challenges, it is difficult to know what to measure.

Simplify the process by segmenting measurement into three key parts:

  1. Engagement: How are learners progressing through training?
  2. Experience: How memorable is training, and how excited are learners to participate?
  3. Effect: How has training improved business outcomes and skill adoption?

More Than Technology Alone
Explorance

The top-performing corporate training organizations recognize that learning analytics requires more than technology alone: it is a transformation process rooted in human cooperation and geared toward business objectives. L&D leaders first align with strategic business goals. Establishing a common language is then crucial when agreeing upon the leading indicators that describe how training results lead to improved business outcomes. For a proven and accelerated process, L&D professionals derive KPIs from a validated learning impact model.

The right technology optimizes the transformation by automating data collection and reporting processes, allowing personnel to focus on analysis and actions. Internal and external benchmarks enhance the credibility of measures and shape performance expectations. These benchmarks further reinforce business executives’ confidence that L&D programs function adequately and maximize impact.

Finally, this process results in building practical data literacy skills at all levels in the learning organization, so all stakeholder groups can effectively leverage role-based reports, dashboards and data exploration tools to monitor progress and transparently recognize the value of training programs.

Valid & Reliable Assessments
Kristin Bernor, Marketing Manager, Questionmark

With the growing demand to leverage learning data to improve business outcomes, training organizations can develop better measurement practices to prove the value of training by implementing valid and reliable assessments. Assessments are a valuable tool before, during and after training that define learning success. Organizations can gain a better understanding of the effectiveness of the learning and make the appropriate adjustments to meet goals and objectives.

Key drivers for measuring learning include improving the effectiveness of learning and deliberately linking learning programs with organizational performance, individual performance and employee engagement. By utilizing assessments and the analysis of those data points, organizations can prove learning impacts these areas. Observational assessments can also be used to check on-the-job performance. Valid and reliable assessments result in meaningful measurement outcomes that prove the value of training.

Turn Away from Event-driven Metrics
Sam Shriver, Executive Vice President, The Center for Leadership Studies

Old habits die hard, and learning professionals have long been identified as event-driven creatures. As such, much of our collective design and development energy remains focused on the experience itself and the Level 1 reactions that experience produces.

Developing better measurement practices begins with putting those habits in the rearview mirror. The primary stewards of your organization simply do not care what learners think or feel about company-sponsored training experiences. They seek evidence of change, tethered to training, that positively impacts the results they are responsible for delivering.

The implications of that reality suggest that the target of your measurement practices should be pretty much anybody but the trainees themselves. What are those trainees doing differently that is attributable to the experience, and what impact is it having on the outcomes key executives have a vested interest in achieving?