

leolingham2000

For trainers and HR managers, a piece of useful information.

Measuring Effectiveness With Learning Analytics

May 2005 - Chris Moore



The term “learning analytics” has been thrown around the training and development industry by everyone from technology heavyweights to analysts, consultants and practitioners. But what does it mean, how does it work, and where’s the value? In short, learning analytics means the study of the impact of learning on its learners. One of the most common ways to study the impact of learning is by measuring its effectiveness. Many people use the terms “effectiveness” and “impact” synonymously when it comes to measuring learning, but there are differences.

The origins of measuring the effectiveness of learning lie in its pedagogy: seeking to understand whether learning effectively meets its original design objectives. Furthermore, CLOs seek to understand whether learning is effective based on the manner in which it is delivered. Studies by Richard Clark have shown that there is little difference in effectiveness between traditional and technology-delivered learning, so the real focus in measuring effectiveness returns to pedagogy.

There are good arguments that position learning effectiveness as a learning measure, not a business measure. However, in the business of delivering learning, the line between the two becomes less clear. What learning executives are really after is measurement of the impact learning has on the learner. Effectiveness plays a key part here. If the learning is not properly designed and aligned to the business goals of the organization, the impact on the learner is less likely to produce the desired results. Learning analytics can help CLOs measure and monitor both effectiveness and impact, as well as the overall goals of their learning organizations.

Why Learning Analytics?

Analytics is the breakdown or decomposition of a system through logical analysis. In the case of CLOs, that system is learning and the myriad data points that accompany it through its evolutionary phases of planning, design, development, delivery, implementation, maintenance and retirement. Each phase introduces numerous dimensions, such as learners, instructors, resources, objectives, actions, processes and outcomes. The logical part of analytics provides the ability to ask questions about the learning system and the relationship between the various dimensions. Keep in mind that “learning system” refers to the whole learning ecosystem, including the business systems it touches—not just a single system like a learning management system (LMS) or learning content management system (LCMS).

Why do you need analytics at all when you’ve invested in an LMS or LCMS? Don’t these systems have turnkey and custom reporting capabilities? They do, but most of these systems are data-rich and information-poor. They collect and store thousands of data points of learning activity and process, but rarely present the data back in an informative way that allows business decisions to be made. Most of the reports included in these systems are activity-based. Such reports are valuable because they help CLOs manage the learning environment, but they don’t help them lead.

In order to lead learning organizations, CLOs must have information that shows what they did, what they’re doing now and where they’re heading. This same information must dissect each phase of the learning system and call out trends, not just anomalies, so that decisions can be made in advance of an unseen failure. Learning executives must know if they’re meeting goals, and if not, what they can do to correct course. This is where studying impact and measuring effectiveness come into play.

Measuring effectiveness can be split into two distinct areas: effectiveness of learning content, or learning effectiveness, and effectiveness of the learning organization, or operational excellence. Let’s think of these two areas as indicators in terms of achieving goals, such as delivering the most effective learning programs and operating learning as a business. The ability to maximize learning effectiveness and operational excellence indicates that goals are being met. These two measurement supersets are key learning indicators. They are supersets because there are numerous individual measures, such as Kirkpatrick’s four levels of evaluation, financial and staffing metrics, and business measures, that aggregate to each key learning indicator.

Key learning indicators measure goal attainment. Learning executives can watch these indicators and the key measures that aggregate to each to assess the health of their organizations. Analytics helps CLOs forecast trends and identify patterns or anomalies in both indicators and underlying measures so they can see where they are heading. Use the information to make decisions early to adjust or accentuate the work processes that influence your measures, which, in turn, impact your learning goals and objectives.

Planning for Learning Analytics

Before implementing learning analytics, you first need to plan what you want to measure and monitor. Starting with the key learning indicators—learning effectiveness and operational excellence—identify the most important measures for each that, when combined, will provide a complete picture of how well goals are being attained. There are conceivably dozens of measures you could watch for a given indicator, but there are some that are more important than others. The crucial measures you select should correlate to specific objectives you set that align upward to your goals.

Remember, the key learning indicators measure goal attainment, whereas key measures gauge objective attainment. A sample objective for learning effectiveness might be “ensure 95 percent completion rates on all learning activities.” The key measure correlating to this objective is the completion rate. For operational excellence, an objective might be “reduce the average cost per student day by 10 percent.” The key measure for this objective becomes the average cost per student day.
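As a minimal sketch, the two sample key measures could be computed as follows in Python. The record layout, values and field names are invented for illustration, not taken from any particular LMS.

```python
# Minimal sketch: computing the two sample key measures from hypothetical
# activity records. Field names, values and targets are illustrative only.
records = [
    {"learner": "a1", "completed": True,  "cost": 420.0, "student_days": 2},
    {"learner": "b2", "completed": True,  "cost": 360.0, "student_days": 1},
    {"learner": "c3", "completed": False, "cost": 500.0, "student_days": 2},
]

# Learning effectiveness objective: 95 percent completion rate.
completion_rate = sum(r["completed"] for r in records) / len(records)

# Operational excellence objective: cut average cost per student day by 10 percent.
avg_cost_per_student_day = (sum(r["cost"] for r in records)
                            / sum(r["student_days"] for r in records))

print(f"Completion rate: {completion_rate:.0%} (objective: 95%)")
print(f"Average cost per student day: {avg_cost_per_student_day:.2f}")
```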

You need to be able to quantify your key measures and, most importantly, influence them through your work processes. Key measures should provide meaningful feedback, support challenging targets and allow you to benchmark performance either internally or externally. You want to select key measures that are important to you, your colleagues and your customers. For example, the reaction survey results of Kirkpatrick Level 1 may be meaningful to you, but less significant to your customer and less important than the test results or the delta between pre- and post-exam scores of Kirkpatrick Level 2. This doesn’t mean you should not choose Kirkpatrick Level 1 as a key measure for learning effectiveness. It simply means you may not want to give that measure as much weight as other factors when calculating the overall key learning indicator score.
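To illustrate that weighting idea, here is a hedged sketch assuming each key measure has already been normalized to a 0-to-1 scale; the measure names and weights are hypothetical, not a prescribed scheme.

```python
# Hypothetical roll-up of key measures into one key learning indicator
# score. Each measure is assumed pre-normalized to 0..1 (e.g., actual
# divided by target); the weights are illustrative, not prescriptive.
measures = {
    "kirkpatrick_l1_reaction": 0.88,        # Level 1 survey result
    "kirkpatrick_l2_pre_post_delta": 0.72,  # Level 2 normalized score gain
    "completion_rate": 0.95,
}

# Per the discussion above, Level 2 results are weighted more heavily
# than Level 1 reactions.
weights = {
    "kirkpatrick_l1_reaction": 0.2,
    "kirkpatrick_l2_pre_post_delta": 0.5,
    "completion_rate": 0.3,
}

learning_effectiveness = sum(v * weights[k] for k, v in measures.items())
print(f"Learning effectiveness indicator: {learning_effectiveness:.2f}")
```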

Additional key measures for learning effectiveness should include progress indicators, the ratio of completions to starts, and the business metrics expected to be influenced by strategic learning activities (e.g., time-based, dollar-based, outcome-based). For operational excellence, consider measures such as average cost per student day, average number of students per session delivered, ratio of variable to fixed expenses, annual training expenditure per student and annual training expenditure as a percentage of payroll.
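For illustration, a short sketch computing several of the operational excellence measures just listed; every figure below is made up purely for the example.

```python
# All figures here are invented to illustrate the ratios named above.
total_training_cost = 1_200_000.0   # annual training expenditure
total_student_days = 4_800
sessions_delivered = 240
students_trained = 3_600
variable_expenses, fixed_expenses = 700_000.0, 500_000.0
total_payroll = 48_000_000.0

avg_cost_per_student_day = total_training_cost / total_student_days
avg_students_per_session = students_trained / sessions_delivered
variable_to_fixed_ratio = variable_expenses / fixed_expenses
expenditure_per_student = total_training_cost / students_trained
expenditure_pct_of_payroll = 100 * total_training_cost / total_payroll

print(f"Cost/student day: {avg_cost_per_student_day:.2f}, "
      f"students/session: {avg_students_per_session:.1f}, "
      f"variable:fixed ratio: {variable_to_fixed_ratio:.2f}, "
      f"spend/student: {expenditure_per_student:.2f}, "
      f"% of payroll: {expenditure_pct_of_payroll:.2f}%")
```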

After you identify what you’re going to measure, you then need to determine the dimensions by which you want the metrics reported. For instance, you may want to see a given measure by location, business unit, delivery method or category. Keep in mind that the process that captures the measurement (e.g., survey tool, LMS, LCMS) must also capture the desired dimension attributes, or be able to relate each transaction to them. Think about your dimensions in terms of charts and graphs, and how they might display your data points in side-by-side bars or pie segments.
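As a small illustration in Python, slicing one key measure across two reporting dimensions might look like the sketch below; the transaction records and dimension names are invented.

```python
from collections import defaultdict

# Invented transactions; each row carries its dimension attributes so the
# measure can be sliced by location, delivery method, and so on.
transactions = [
    {"location": "Mumbai", "delivery": "classroom", "completed": True},
    {"location": "Mumbai", "delivery": "online",    "completed": False},
    {"location": "Pune",   "delivery": "online",    "completed": True},
    {"location": "Pune",   "delivery": "classroom", "completed": True},
]

def completion_rate_by(dimension, rows):
    """Aggregate the completion-rate key measure across one dimension."""
    buckets = defaultdict(list)
    for row in rows:
        buckets[row[dimension]].append(row["completed"])
    return {key: sum(flags) / len(flags) for key, flags in buckets.items()}

print(completion_rate_by("location", transactions))   # side-by-side bars
print(completion_rate_by("delivery", transactions))   # pie segments
```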

Implementing Learning Analytics

A data warehouse is a central repository for all or significant parts of the data that an enterprise’s business systems collect. Information from various learning and business systems is selectively extracted and organized within the data warehouse for use by analytical applications and user queries. The data warehouse is commonly modeled within a relational database management system (RDBMS). Many organizations already have one or more data warehouse initiatives under way. Unfortunately, integration of learning systems data is not always near the top of the prioritization list. But it should be, and it’s to your advantage to influence and educate your chief information officer about the value of integrating learning data with other business data, if for no other reason than to prove that training is an investment and not an expense.

At a minimum, the blueprint or model of a learning data warehouse should support the key measures and dimensions determined in the planning exercise. As transactions occur day to day in learning systems, all or part of each transaction is sent to the learning data warehouse for consolidation and subsequent analysis. The data sent to the warehouse should contain attributes for the various dimensions. In many cases, the dimension attributes will be part of the original transaction; in other cases, the programs sending information to the warehouse will need to collect and format the dimension attributes before passing the transactions along.

When the data arrives in the warehouse, it is organized according to the model so that transactions can be aggregated into measures and accessed across the various dimensions. Analytics practitioners call the resulting multidimensional structures “cubes.” The data warehouse and the underlying RDBMS typically provide only crude tools for selecting and displaying statistics. It is the job of the analytic software to provide the presentation interface and the algorithms to aggregate, compare and forecast the data, turning it into information from which strategic decisions are made.
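A minimal sketch of such a model, using Python’s built-in sqlite3 as a stand-in RDBMS; the table and column names are illustrative assumptions, not a standard warehouse schema.

```python
import sqlite3

# Star-schema sketch: one fact table of learning transactions joined to a
# dimension table. All names and rows here are invented for illustration.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_course (
    course_id INTEGER PRIMARY KEY,
    delivery_method TEXT,
    business_unit TEXT
);
CREATE TABLE fact_activity (
    learner_id INTEGER,
    course_id INTEGER REFERENCES dim_course(course_id),
    completed INTEGER,  -- 1 = completed, 0 = started only
    cost REAL
);
""")
con.executemany("INSERT INTO dim_course VALUES (?, ?, ?)",
                [(1, "classroom", "sales"), (2, "online", "sales")])
con.executemany("INSERT INTO fact_activity VALUES (?, ?, ?, ?)",
                [(101, 1, 1, 400.0), (102, 2, 0, 250.0), (103, 2, 1, 250.0)])

# Aggregating the fact table across a dimension yields one slice of the
# analysis cube: completion rate and cost by delivery method.
query = """
SELECT c.delivery_method,
       AVG(f.completed) AS completion_rate,
       SUM(f.cost) AS total_cost
FROM fact_activity AS f JOIN dim_course AS c USING (course_id)
GROUP BY c.delivery_method;
"""
for row in con.execute(query):
    print(row)
```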

The Learning Dashboard

Analytics software comes in various shapes and sizes, ranging from broad business intelligence tools to niche products geared specifically for measuring the success of learning initiatives. The solution you choose may depend on the level of support from and involvement with your information technology department.

Chief learning officers need a dashboard that allows them to visualize their key learning indicators and monitor the underlying key measures that support them. Learning managers need access to more detailed information, such as project-based goals and measures. The analytic software you choose will provide either a framework from which you can build a custom dashboard or a turnkey solution that presents information in a defined yet somewhat configurable format. Ideally, the learning dashboard integrates directly with the data warehouse. In the absence of a warehouse, many of these analytic systems can also retrieve and aggregate measures on their own.

As you design your dashboard, think in terms of not only what you want to see, but also the decisions you want to influence. Dashboards should be visual. Remember, a picture is worth a thousand words. The key learning indicators are best represented as gauges, whereas key measures are best expressed in terms of time-based graphs with multiple Y-axis plots for current, targeted and projected values. Consider different views for the different audiences. As an executive, you want quick, at-a-glance indicators that tell you in the blink of an eye whether you’re hitting the mark or not. For each indicator, you want to drill down (remember the analytics definition: break down or decompose) to the key measures supporting the indicator. For each key measure, you want to drill down again to see how the measure is performing over time.
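A toy sketch of that drill-down structure in Python; a real dashboard would render gauges and graphs, and every figure here is invented.

```python
# Toy drill-down: an at-a-glance indicator (the gauge), its supporting key
# measures, and each measure's trend with current, target and projected
# values. All numbers are invented for the example.
dashboard = {
    "learning_effectiveness": {
        "score": 0.87,
        "measures": {
            "completion_rate": {
                "current": 0.91, "target": 0.95, "projected": 0.93,
                "history": [0.84, 0.88, 0.90, 0.91],  # one point per quarter
            },
        },
    },
}

for indicator, detail in dashboard.items():
    print(f"{indicator}: {detail['score']:.2f}")      # gauge view
    for name, m in detail["measures"].items():        # drill down once
        gap = m["target"] - m["current"]
        print(f"  {name}: {m['current']:.0%} vs target {m['target']:.0%}"
              f" (gap {gap:+.0%}); history {m['history']}")  # drill down again
```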

Leading the Way

“If you can’t measure it, you can’t manage it.” This quote has been attributed to a number of great thought leaders, including David Norton, co-author with Robert Kaplan of “The Balanced Scorecard: Translating Strategy into Action.” The most successful balanced scorecards have multiple measures linked to objectives, just like our examples. In addition, they incorporate cause-and-effect relationships between the measures and their movement.

As with balanced scorecards, when you define your learning analytics solution, it is imperative to establish thresholds against key learning indicators and key measure values so that you are notified of any approaching problems. Don’t set thresholds at or beyond your breaking point. The thresholds you set should point you or responsible parties to the work processes that influence the measures. By channeling the energies, abilities and knowledge of your workforce against those processes, you will not only be managing what you’ve measured, but also leading the way to an effective learning organization and, in the end, an effective workforce.
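For instance, a hedged sketch of threshold checking, with invented warning and critical levels tied to the work processes that own each measure:

```python
# Invented thresholds: each key measure gets a warning level set before
# the breaking point, a critical level, and an owning work process.
thresholds = {
    "completion_rate": (0.90, 0.85, "scheduling and learner reminders"),
    "avg_cost_per_student_day": (260.0, 300.0, "vendor and venue contracts"),
}

def check(measure, value, higher_is_better=True):
    """Flag a measure that is drifting toward its breaking point."""
    warn, critical, process = thresholds[measure]
    if higher_is_better:
        level = "CRITICAL" if value < critical else "WARNING" if value < warn else None
    else:
        level = "CRITICAL" if value > critical else "WARNING" if value > warn else None
    if level:
        print(f"{level}: {measure} = {value}; review process: {process}")

check("completion_rate", 0.88)                                    # early warning
check("avg_cost_per_student_day", 310.0, higher_is_better=False)  # critical
```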

Chris Moore is president of Zeroed-In Technologies and the creator of CLO Dashboard and other innovative solutions for visualizing and measuring learning strategies.

regards

LEO LINGHAM

pranks
hi
I'm doing a summer project at Maruti Udyog Limited, and my project is about mapping organisational needs to external training interventions. Please give me some tips regarding this.
I also want to do another project in training and development, one in which I could learn more. Please suggest a project title as soon as possible.
regards
pranks






