There is no stakeholder in the learning and development process more important than the CEO of an organization. In the August 2009 issue of T+D, we examined the results of a significant survey of CEOs. This study revealed what CEOs want, and do not want, as measures of success for learning and development. In this article, we'll combine those survey results with other executive input to develop prescriptions for what learning professionals must do to meet the expectations of this critical group.
All of the data points to several critical concerns and, consequently, opportunities for learning and development leaders. As expected, CEOs want to see value in terms that they can appreciate. They view the value of learning and development in terms of business impact, business alignment, and return on investment (ROI). They do not see value in inputs (how many people attended training) or reaction (participants rated overall satisfaction 4.2). Very little data at the business-contribution level is presented to them, yet that is the data set they value most. Ninety-six percent of the CEOs said that they wanted to see learning and development connected to business impact data, but only 8 percent see this now. Seventy-four percent wanted to see ROI data, yet only 4 percent see it now.
They see little value in the lower levels. As anticipated, top executives see very little value in having a tremendous amount of input data. Although it is needed (they want to know how many people are involved, how many hours, and the cost), this information does not show value in terms of contribution; it's only input. The lowest-ranked data item is reaction data. Executives view this as people enjoying their experiences. A few suggested that it is important, because satisfied customers are necessary to make the process work; however, they do not see the connection with actual business results. Additionally, there is not much appetite for learning data. As a few executives commented, that's an operational measure.
Executives want to see the higher levels of evaluation. Interest picks up with application data, but there seems to be a disconnect here. Sixty-four percent of executives said that they would like to see application data (change in behavior, use of skills, use of technology), yet only 11 percent actually have this data. The disconnect is even more pronounced for business impact and ROI data, as discussed earlier. In comments on these surveys, executives routinely mention terms such as "business contribution," "business alignment," "business value," and "connection" to the business. Eighteen percent of CEOs said that they determined the funding for learning and development based on the payoff of the investment.
Most CEOs do not see a learning and development scorecard. If learning and development organizations are using a scorecard, it's not making its way to the top executives. Only 22 percent of executives said that they have a learning and development scorecard. Executives noted that if there were a scorecard, they would probably use it and use it properly because they're applying it across many parts of their organization. The scorecard, however, must not be dominated by inputs (Level 0), reaction (Level 1), and learning (Level 2). It must have information that includes application and that connects to the business in terms of both tangibles and intangibles. And yes, awards and ROI would be appropriate, too.
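The balance of levels that executives describe for a scorecard can be sketched in code. This is an illustrative outline only, not the authors' template; every measure named below is a hypothetical example of what might appear at each level:

```python
# Illustrative learning and development scorecard, organized by evaluation
# level. All measures are hypothetical placeholders, not prescribed metrics.
SCORECARD = {
    0: ("Inputs", ["participants", "hours", "cost"]),
    1: ("Reaction", ["overall satisfaction", "relevance rating"]),
    2: ("Learning", ["knowledge gain", "skill demonstration"]),
    3: ("Application", ["behavior change", "use of skills on the job"]),
    4: ("Impact", ["business measures moved, tangible and intangible"]),
    5: ("ROI", ["net program benefits versus fully loaded costs"]),
}

def executive_view(scorecard):
    """Return only the levels executives say they value most:
    application and above."""
    return {level: name for level, (name, _) in scorecard.items() if level >= 3}
```

The point of the sketch is the filter: a scorecard dominated by levels 0 through 2 leaves `executive_view` nearly empty, which is exactly the situation the survey describes.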
Other concerns. CEOs are not as actively involved in learning and development as they need to be, and like many other functions in an organization, learning and development needs executive support. Direct involvement by top executives is an excellent way to stimulate the interest and ownership of others and helps drive results for the various programs. More effort is needed to increase executive commitment. This often creates a dilemma because executive commitment is needed to drive results, yet one of the best ways to secure additional commitment is to show results. This will represent a challenge for the learning and development team.
Fewer than half of the CEOs refer to the learning and development function as a corporate university, at least by the traditional definition of the term. Some are pleased with that concept, while others believe that it can go astray and create an entity that is not relevant to, and perhaps even detached from, the organization. Most of the executives were concerned that learning must show its value in terms that can be clearly seen.
Most CEOs consider their companies to be "learning organizations," and they discuss the possibilities of learning from all perspectives. But as companies use more learning vehicles and approaches, there is a need for those vehicles to be relevant, connected, and driving results in some way.
What caused this situation?
Many professionals in this field know how to connect learning and development to the business. Practitioners have been exploring and discussing this issue for more than half a century, and in that time practitioners, researchers, and professors have improved evaluation practices and written dozens of books on the evaluation of learning and development.
However, most observers in this field conclude that there has been little actual progress when it comes to connecting learning and development to the business. Our primary stakeholders - top executives - who must fund, support, and make a commitment to learning and development, are not necessarily happy about it. Unfortunately, real barriers stand in the way of addressing this, and those barriers often are not identified directly and explicitly in benchmarking and survey data.
Fear of results. Perhaps the strongest barrier is fear of disappointing results, whether that means a negative ROI study or results that are far less impressive than executives expect. It's a fear of the consequences. We often encounter this comment, "If my key program is not delivering business value, why should I conduct a study to show my top executives that it's not working?"
The fear is that the results may be used for a performance review of the chief learning officer, lead to budget cuts, or be used to hold specific team members accountable. If a program is not delivering results, the client probably already knows it; they just don't yet have the study to show the details. After all, business results are generated in the business units, and if the business measures are not improving, the executives are aware of it. So a negative or disappointing study may not surprise them.
A much better approach is to tackle the issue proactively, with process improvement in mind. Take a high-profile strategic program and analyze its success with the goal of improving the process: if the program is not working, plan changes to make it successful in the future. This proactive approach wins points with senior executives and helps to improve the partner relationship.
Waiting for the request. Closely linked to the previous barrier is waiting for the request to show the business contribution. Often, we see CLOs interviewed on the issue of ROI and business connection. Some CLOs state that they've never been asked for ROI, and thus, there is no need to pursue it. There is faulty logic at work here because if you wait for the request, it will often be too late.
When a request for ROI (or business impact) on a particular program is made, results are often expected as soon as possible. Without prior planning for a study, an executive waiting for results will become impatient. It takes time to develop processes, build capabilities, change practices, and collect and analyze data. When you wait for the request, you are on the executive's timeline and agenda, which is an uncomfortable place to be.
Smart CLOs are taking the initiative to develop this capability before it is requested. They are controlling the agenda and the timeline, providing the executives with a healthy dose of accountability, routinely and consistently.
Lack of investment. Let's face it: We have not invested enough in measurement and evaluation processes. To determine an appropriate level of investment, estimate the cost of measurement and evaluation as a percentage of the learning and development budget. This includes any expenditure for staff and resources for collecting data (measurement) and using the data to make adjustments (evaluation). For most organizations, the expenditure is 1 percent or less. Annually, we benchmark with organizations using a comprehensive measurement and evaluation system, and these best-practice organizations spend between 3 and 5 percent of their budgets on measurement and evaluation.
To make the case for additional funds, show executives the benchmarking data of best practices, and show that your investment in measurement and evaluation for learning falls far short of other processes in the organization. Then, fund measurement and evaluation gradually with the success generated by the process. For example, the first impact or ROI study will show how a program can be improved or how it can be done at lower cost or in less time. In either case, there is added value because you've actually evaluated the program. With results in hand, you have a great opportunity to make the case for additional measurement and evaluation funds.
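The arithmetic behind an ROI study is simple, and it helps to have it explicit when making the case to executives. A minimal sketch follows, using the standard net-benefits formulation; the dollar figures are invented purely for illustration:

```python
# Standard ROI arithmetic: net program benefits divided by fully loaded
# program costs, expressed as a percentage. Figures below are hypothetical.
def roi_percent(benefits, costs):
    """ROI (%) = (net program benefits / fully loaded costs) * 100."""
    return (benefits - costs) / costs * 100

def benefit_cost_ratio(benefits, costs):
    """BCR compares total monetary benefits directly to total costs."""
    return benefits / costs

# Hypothetical program: $750,000 in monetary benefits, $500,000 in costs.
print(roi_percent(750_000, 500_000))         # 50.0 -> a 50 percent ROI
print(benefit_cost_ratio(750_000, 500_000))  # 1.5  -> a 1.5:1 BCR
```

Note that the same study is useful even when the number is disappointing: a low or negative ROI points directly at where costs can be cut or the program redesigned, which is the process-improvement framing argued for above.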
Think about evaluation early. The time to think about evaluating a particular program at the business level is at the time of conception. Unfortunately, by habit, practice, or teaching, we don't think about evaluation early enough. For years, learning and development professionals used the ADDIE model - analyze the need, design the solution, develop the solution, implement the solution, and evaluate the solution. Unfortunately, this model causes us to think about evaluation after the program has been implemented. For most projects, this is too late.
By thinking about evaluation much earlier, we ensure that the solution is connected to clearly defined business measures. Objectives are developed at multiple levels, including application and impact, to provide the proper focus throughout the program. Expectations are created for participants and others to clearly see why the program is being offered and what their role is in its success. Data collection is built into the program to make it more palatable to the participants. Planning for data collection is accomplished early, making the process much more efficient while assigning more responsibilities to others.
They asked for it. The learning and development team is a support group, supporting the needs of the organization by delivering learning and development to satisfy those needs. When a top executive requests a leadership development program or an operating executive requires technical training, the programs usually are provided with the anticipation of results. After all, they are requested by someone who should understand the requirements and needs, and they are designed to meet those needs. So why should we evaluate these programs? Isn't it assumed that the value exists? And why waste resources trying to prove value when the requestor already understands that value?
This seems logical; unfortunately, the logic breaks down. Executives often request a learning solution when they see a problem. If something is not working in the organization, those executives assume that employees don't have the knowledge or skills that they need. Yet research continues to show that when there's a dysfunctional or ineffective process, the most appropriate solution is often a nonlearning solution.
Unfortunately, managers are not experienced in the analysis techniques needed to determine the cause of a problem and the appropriate solution. The managers are not the bad guys, though they may appear to be; we have forced them into this process, and we'll have to change their behavior. But we have to do it diplomatically, subtly, and gradually, teaching them to ensure that each request is appropriate.
Lack of preparation. Unfortunately, learning and development professionals often don't possess the skills and knowledge needed to implement evaluation to demonstrate business impact. Some have the luxury of a degree in learning and development, instructional technology, or human resource development. Most do not. Even then, the curricula usually focus very little on measurement and evaluation - perhaps one course on evaluation and maybe one that covers needs assessment. The bottom line is that most learning and development team members have no formal education in this area.
Many learning and development professionals transfer from other areas in the organization. When individuals enter the field, they have almost no insight into, nor experience with, evaluation. Preparation to address these issues is minimal at best. The problem is then exacerbated by the excessive number of evaluation books offered to these unknowing practitioners, often leaving them confused.
Changes are needed
Not all of the comments and data from the executive study point to a negative situation. Many CEOs are proud of the learning functions in their organizations and note the contributions that are made. These executives are supportive and encouraging, and they are thoroughly committed to making learning a part of their growth and profit strategies.
At the same time, even these CEOs see a great deal of room for improvement, leaving the learning and development team, particularly the CLO, with some important challenges. Before taking action, it is helpful to examine the barriers described earlier to see if you agree with our assessments. Explore, examine, and evaluate the barriers; make sure you clearly understand the issues; and be prepared to minimize or remove them.
The CLO is in a critical role here. We've yet to see a successful evaluation system implementation without the support, commitment, and involvement of the CLO. We've seen some enthusiastic learning and development team members try, but eventually fail without CLO support.
The challenge is to focus on results throughout the learning and development cycle. The chart shows the 10 key areas where the focus on results should be addressed, detailing what actions are needed, what's involved, and the payoff. These are basics - the fundamentals - but they are often ignored in some learning and development functions.
Move quickly to make changes, and take action in this area. Don't put it off to next quarter or next year. Here are a few actions to take now:
- Assess where you are now with the results-based approach. Samples of assessment instruments are available directly from the authors. With this assessment you can plan specific actions.
- Invest more in measurement and evaluation by using one of the more specific strategies suggested here.
- Change practices and address evaluation early and often in the process, building evaluation into learning and development.
- Focus on objectives and expand them beyond the learning objectives. Require application and impact objectives for 70 percent of new programs. This alone will drive more results than any other action.
- Take a fresh look at a learning scorecard, perhaps building one that reflects some of the data in this article. Don't let the top executive team design your learning and development scorecard; they don't know how to do it, and in all likelihood the result will be almost impossible to deliver. Instead, show them something they can review.
- Invest in technology, or at least use the technology at hand, to assist with the evaluation challenge.
- Build a measurement culture. Ask questions. Require data. Ask people to think about results, accountability, measures, metrics, and analytics, but not to an extreme. Make accountability a routine part of conversations, expectations, and ultimately, the reward structure.
- Conduct a few impact studies and maybe occasional ROI research for those programs that are substantial, strategic, expensive, and high profile. You know the ones - they attract attention from the senior team, and they often require higher levels of accountability.
- Start providing information about successes to the appropriate executives. Feed their appetites. Don't promise too much, but deliver more. Make it routine. Preempt any request to show the value of your entire function; if that request is coming, it may already be too late.
- Get your executives more involved. Executive involvement helps to keep the focus on results and accountability.
- Never miss an opportunity to speak to an executive about the success of programs in their area. After all, it is their team that delivered this great performance, and these successes are important to them.
Now is the time
As we emerge from the recession, there will be a greater focus on accountability for all types of projects and programs. Functions across the entire organization will be making a plea for increased funding.
Those programs that make the best business case will ultimately prevail. Executives will require more evidence that increased funding will drive business results. It is absolutely critical that leaders of the learning and development function prepare to face this challenge and demonstrate the connection to the business. We have not done so well with this in the past, but it's fundamental, and it can be done. Now is the time to pursue it.