For designers, developers, and trainers, "measurement" is a loaded word. Employee performance is difficult to measure objectively - as is the success of a training program. And for those of us whose strength is creative communication, data analytics lies far outside our comfort zone.

Many instructional designers have also been put off analytics by seeing data used out of context or treated as the sole measure of a complex initiative. As a result, measuring learning initiatives often falls into the pattern of measuring production - "we've provided X resources" or "we've distributed Y manuals" - because these are factors L&D can control.

What really matters to an organization, of course, is not how many manuals L&D creates, but how many behaviors and outcomes L&D changes. With all the demands on our time, it’s easy for L&D departments to be consumed by “putting out fires”: meeting short-term needs, creating resources for squeaky wheels, and solving immediate performance problems often come before taking time for strategic decision-making.

So the need is greater than ever for learning and development to embrace data, measurement, and analytics: to target initiatives that efficiently meet business objectives, to make the best possible use of our limited time, and to demonstrate the efficacy of our work to organizational leadership.

Tough Economies Put Pressure on the Numbers

Immersing yourself in data-driven decision making will enable you and your team to make strategic decisions that meet broad organizational goals. Ultimately, the ability to harvest, discuss and process this information correlates to improved financial outcomes for large organizations:

A recent study by KnowledgeAdvisors and Bassi Investments illustrated that a group of companies with high learning and development measurement acumen outperformed the Standard & Poor’s 500 Index in terms of share price appreciation by more than 15 percent.

Source: http://www.knowledgeadvisors.com/archives/emerging-issues-in-measurement/

Many L&D departments dread the measurement and ROI discussion. As a cost center (not a direct revenue generator), L&D is constantly challenged to justify its connection to organizational success. L&D professionals who use sound measurement programs to demonstrate improved productivity, changed behaviors, and better outcomes will be in a stronger position in the next budget cycle.

This month, I’ll outline a framework for measuring changed behavior and share examples of this model’s success in a social change program I worked with in Anchorage, Alaska. I will also showcase how leading organizations use learning analytics to meet organizational goals. We’ll continue next week with a discussion of connecting learning design to a data framework from the beginning.

Further Reading on Measurement: