A growing disconnect between the metrics most trainers use and those that corporate leaders prefer is driving new interest in making a better case for training's impact. While CEOs may grant that measures of course satisfaction or learning transfer are appropriate, even necessary, within the learning and development function, those measures don't fly in the executive suite.
Learning consultant Jeanette Harrison recalls working with the learning unit of a large company on its balanced scorecard. When the unit shared it with the CEO, he said, "Those metrics are important for you to know, but here's what I'm interested in: revenue progress against plan and factory productivity." Harrison adds, "We've trained our industry to use metrics that have no meaning to the people who pay our salaries."
While most organizations measure their profitability, revenue, and growth in similar ways, other performance metrics vary by industry. Call centers track average call-handling times, sales organizations look at sales volume and expansion, manufacturers measure WIP (work-in-process) turns, and consulting firms measure how quickly new associates can begin billing.
Trainers have exhorted one another for years, even decades, to make their case for training in business terms. While the Kirkpatrick four-level model has been useful for showing effects at the level of the course or program, it has been less useful for measuring the business impact of complex processes such as integrated talent management, in which learning is threaded through a number of key functions such as recruiting, compensation and rewards, and performance management.
A 2011 study by ASTD and the Institute for Corporate Productivity (i4cp), Value of Evaluation: Making Training Evaluations More Effective, found that only about one-quarter of respondents agreed that their organization got a solid "bang for the buck" from their training evaluation efforts. And while 91.6 percent of the organizations surveyed use Kirkpatrick Level 1 to evaluate trainees' reactions, only 35.9 percent said it had high value. Only 36.9 percent of companies use Level 4, evaluation of results.
Jack and Patti Phillips breathed new life into training measurement, starting in the 1980s, by developing formulas for calculating the return on investment (ROI) of training. But is this what CEOs really want to know?
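The Phillips-style calculation itself is simple arithmetic: net program benefits (monetized benefits minus program costs) expressed as a percentage of program costs. A minimal sketch, with hypothetical dollar figures:

```python
def training_roi_percent(benefits: float, costs: float) -> float:
    """ROI in the Phillips sense: net program benefits
    (monetized benefits minus costs) as a percentage of costs."""
    return (benefits - costs) / costs * 100

# Hypothetical program: $80,000 in costs and $120,000 in isolated,
# monetized benefits yields a 50 percent return.
roi = training_roi_percent(benefits=120_000, costs=80_000)
print(f"ROI: {roi:.0f}%")  # prints "ROI: 50%"
```

The hard part, of course, is not the division but isolating and monetizing the benefits in the numerator, which is exactly what the sources below call difficult.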
A 2006 ASTD-IBM report titled C-Level Perceptions of the Strategic Value of Learning noted that while chief learning officers and CEOs have different needs with regard to learning evaluation, at a high level, they agree that learning is strategically valuable. Both groups also agreed that isolating and measuring learning's financial contribution to business is difficult, and often, perceptions of stakeholders (employees, business unit leaders, and executives) are a key indicator of learning's success.
"CEOs don't believe ROI numbers, especially when they seem to be extreme," says Leslie Joyce, vice president of global talent management at Novelis and former CLO at Home Depot. "If there is change in behavior or improvement in performance, most CEOs I've worked with will agree that training has had an impact."
In 2002, ASTD created metrics that show the size and scope of the training industry expressed in part by the amount of investment in training and the types of training delivered.
"Industry metrics from ASTD and other organizations are useful benchmarks," says Susan Burnett, senior vice president of talent and organization development at Yahoo, "but they're transactional. They don't get at value."
In CLO positions at HP, Deloitte, and Yahoo, she has used ASTD's metrics for training investments to show how these companies compared to a national average. When she showed Yahoo's executive committee that the company's training investment, expressed as a percentage of payroll, was below the average, CEO Carol Bartz told her, "I don't want to drive to a percent. I'm going to drive to value. So what are you going to do that will move the needle on our company goals? If I'm investing in this, what am I getting?"
Burnett met that challenge by focusing on a key corporate goal: strengthening leadership. "We targeted our leadership development efforts at accountability and goal alignment because we knew that they could influence the quality of leadership," she says. Before the training began, Burnett used an employee survey to set a benchmark for accountability and alignment. One year later, the 800 leaders who received the training in alignment and accountability scored higher in those areas than leaders who hadn't been through the training.
Burnett explains, "Now I could say to the exec staff, 'When you made this investment, you actually strengthened leadership. And here are the metrics to prove it.'" Bartz's response was to continue to invest in the program, the only one that wasn't cut in that round of budget negotiations. Bartz said, "We know when we invest here, we get results, so let's keep investing."
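The comparison Burnett describes, a pre-training benchmark followed a year later by trained-versus-untrained survey scores, reduces to a simple group comparison. A sketch with invented numbers; the actual Yahoo data and survey scale are not public:

```python
from statistics import mean

# Hypothetical 1-5 survey ratings on accountability and alignment,
# one year after the program. The real Yahoo figures are not public.
trained = [4.2, 4.5, 3.9, 4.4, 4.1]    # sampled from the 800 trained leaders
untrained = [3.6, 3.8, 3.5, 3.9, 3.7]  # comparison group, no training

lift = mean(trained) - mean(untrained)
print(f"trained mean {mean(trained):.2f}, "
      f"untrained mean {mean(untrained):.2f}, lift {lift:.2f}")
```

In practice one would also check that the lift is statistically significant, and that the two groups are otherwise comparable, before presenting it to an executive committee as proof.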
Obsession is futile
For Joyce, a dialogue about the real value of ROI is long overdue. "Given what we know about adult learning and development after all these years, does it make sense to continue to try to demonstrate a monetary return on the investment in the training itself and for the most part fall short?" To Joyce, years of dissatisfaction suggest that the obsession with ROI is futile.
"When so many other functions don't have to show ROI, are we playing victim?" asks Joyce. "Leadership development is very expensive. But what's the alternative? Not developing leaders? Does the constant search to prove the value of investment in learning harm the learning function or help it? I think it's harmful." She points to the number of times the training industry has renamed itself (think performance consulting or human capital improvement) in an attempt to change the perception of its usefulness to business leaders.
For Dan Pontefract, the metrics wake-up call came from working at TELUS, a Canadian telecommunications company, as an evangelist for learning 2.0, which he describes as "the formal, informal, and social ways we all learn."
He believes that industry metrics that cover training spend as a percent of payroll, or the number of courses run, are fixated on the formal: the premise that learning occurs solely in a classroom. "Because informal and social are inextricably part of learning, how do I capture their impact so that my boss knows I'm still relevant?" he asks.
Kirkpatrick's model is a thing of the past at TELUS. "I don't disagree with the Kirkpatrick model," says Pontefract, "but I abolished its use here because it focuses exclusively on formal learning."
At TELUS, measurement is evolving from counting events to determining how a person's contributions to a social network or their network depth and breadth relate to items on the TELUS scorecard. Instead of surveying employees after a learning event, Pontefract fields quarterly surveys that ask people how their use of formal, informal, and social learning has improved their performance and moved them along their career path. Are they microblogging to put information back into the community; are they gaining knowledge through social media; are they coaching and mentoring or being coached and mentored?
"We use an algorithm to convert these qualitative measures into a quantifiable percentage that goes on the scorecard." In a nod to social learning, Pontefract says, "The algorithm is a work in progress, but our intention is to eventually put it out like Linux (as free, open-source software) and let others refine it."
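TELUS's algorithm has not been published, so the following is purely an illustrative sketch of the general idea: weight the formal, informal, and social survey dimensions and normalize the result to a scorecard percentage. The weights and the 5-point scale are assumptions, not TELUS's actual parameters.

```python
# Illustrative only: the weights and scale below are assumptions,
# not the unpublished TELUS algorithm.
WEIGHTS = {"formal": 0.4, "informal": 0.3, "social": 0.3}
SCALE_MAX = 5.0  # assumed maximum of the survey rating scale

def scorecard_percentage(ratings: dict) -> float:
    """Collapse per-dimension survey ratings into one percentage."""
    weighted = sum(WEIGHTS[dim] * ratings[dim] for dim in WEIGHTS)
    return weighted / SCALE_MAX * 100

pct = scorecard_percentage({"formal": 4.0, "informal": 3.5, "social": 4.5})
print(f"{pct:.0f}%")  # prints "80%"
```

A single percentage makes the measure easy to place on a corporate scorecard, which appears to be the point; the refinement Pontefract mentions would presumably go into choosing better weights and inputs.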
New measures emerging
The quest for better metrics has led i4cp to conduct research with a number of large companies. "The first stop for organizations on the analytics journey, when they really want to know their impact on the business, is quality of hire," says an i4cp report titled The Metrics of High Performance: Quality of Hire. The firm's research has also identified three other metrics that contribute to performance:
- Quality of movement: When an employee is shifted within the organization, what is the success rate?
- Quality of separation: Who is leaving the organization? Is the organization losing key talent? Is the termination rate a problem?
- Time to full productivity: How long does it take to master a new role and become productive within the organization?
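As a rough illustration of how the last three metrics might be quantified (the field definitions below are assumptions for the sketch, not i4cp's published formulas):

```python
# Hypothetical operationalizations of the i4cp metrics above;
# i4cp's actual formulas are not given in the article.
def movement_success_rate(successful_moves: int, total_moves: int) -> float:
    """Quality of movement: share of internal moves judged successful."""
    return successful_moves / total_moves * 100

def key_talent_share_of_exits(key_departures: int, all_departures: int) -> float:
    """Quality of separation: share of departures who were key talent."""
    return key_departures / all_departures * 100

def avg_time_to_full_productivity(days_per_hire: list) -> float:
    """Time to full productivity: mean days from start date to mastery."""
    return sum(days_per_hire) / len(days_per_hire)

print(movement_success_rate(18, 24))                  # prints 75.0
print(key_talent_share_of_exits(5, 20))               # prints 25.0
print(avg_time_to_full_productivity([90, 120, 150]))  # prints 120.0
```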
Whatever direction metrics take in the future, says Joyce, "they should reflect the outcome of high-impact learning design on the business. We owe our companies that much."