According to our survey of L&D departments in the U.S., one of the key differentiators of top-performing departments, regardless of industry or size, is a close relationship with senior executives and a commitment to measuring the ROI of corporate learning.
We asked Jack Phillips, Ph.D., Chairman of the ROI Institute, who has helped thousands of global companies and public-sector organizations create powerful business cases for their learning functions, to share his expertise and offer practical guidance you can use to keep and increase your L&D budget.
What do L&D professionals lose when they don’t demonstrate the impact of learning to leadership?
They experience three consequences.
1. Budgets are sometimes in jeopardy. Every executive realizes that organizations must have learning. I’ve never heard an executive say, “We don’t want people to learn.” They’ll always say, “People are our most important investment,” but those same executives will announce the next day that they’re cutting 3,000 jobs. People are important and learning is critical, but we have to make the connection of learning to business. If we don’t, we never secure the funds we need, and we’ll be subject to unnecessary budget reductions and budget controls. We usually need more money for learning.
2. They lose support from senior executives. Executives often think of learning as a necessary evil. We want them to see learning as a business driver; when that happens, they’ll support L&D more.
3. L&D teams need to see the business results of learning. Everyone likes to know that they make a difference in the organization. If you have no data showing that you make a difference, it creates a mystery: “What’s the value of what I’m doing?” We started working on this issue when I was head of Learning & Development for a Fortune 500 company. We wanted to see the value we contributed to the organization. It made us feel much better about our work, and it made us feel respected by the senior team. We want the image of L&D to be very positive. In some places, it’s not so positive right now.
We do have some great executives who see the value of learning, sometimes in spite of a lack of data, and they still support it in a big way. When times get tough, however, many of them will cut the budget; executives simply say we can’t afford it. If they see the business connection, we can make the case to keep the budget.
What issues are preventing L&D professionals from showing value to their organizations?
There are three persistent issues.
1. The reality is that most learning in an organization is wasted. By wasted, we mean that it’s not used in participants’ work. There is plenty of research to support this dilemma. When I’m delivering a keynote presentation, I ask, “How many of you would agree with the statement that most training is wasted?” It’s depressing to see that sometimes 80 to 90 percent of the audience raise their hands. They’re saying that most employees don’t use what they learned.
As a follow-up, I ask, “How many functions or departments in your organization would admit that half of their budget is wasted?” They start shaking their heads, suggesting that no department would admit it - but they just did. I’m trying to emphasize that we must address this issue. The good news is that we’re getting better.
2. Most L&D departments don’t deliver the results that executives would like to see. When I ask audiences how many would agree with that statement, unfortunately the majority do. We’ve conducted studies with ATD and other associations, asking top executives: “What kind of data would you like to see from the learning function?” The number one data set they value is business contribution. They want learning professionals to connect with the measures that interest top executives - productivity, production, sales, quality, timeliness, efficiencies, and the hundreds of measures managers track to see how the organization is performing. Executives expect learning to be connected to them.
3. There’s a good chance that most learning professionals don’t have data showing that they make a difference in their organization. Unfortunately, I see a huge percentage of agreement with this statement. If you think about how learning is evaluated, there are five classic levels of outcomes: reaction, learning, application, impact, and ROI.
Here’s the problem: only the application and impact data show the difference made to the organization. When you measure reaction and learning (which are very important), you still don’t know whether learners are using what they learned. If you don’t collect data at the application level, you don’t know if you’ve made a difference.
Many organizations stop their evaluation at level two. Where there’s a heavy focus on technology-based learning, they often stop at the inputs to the process - who’s there, how long they’re there, and what it costs. This is really depressing. Application data shows that we make a difference, and impact is what executives want to see.
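To make the framework concrete, here is a minimal sketch in Python of the five levels and the question each level answers. The question wording is ours, not Dr. Phillips’s; the point is simply that an evaluation that stops at level two never collects the application or impact data discussed above.

```python
# The five classic evaluation levels, with the question each level answers.
# Question wording is illustrative, not taken from the interview.
EVALUATION_LEVELS = {
    1: ("Reaction",    "Did participants find the program relevant and valuable?"),
    2: ("Learning",    "Did participants gain the intended knowledge and skills?"),
    3: ("Application", "Are participants using what they learned in their work?"),
    4: ("Impact",      "Did business measures (sales, quality, productivity) move?"),
    5: ("ROI",         "Do the monetary benefits exceed the program's costs?"),
}

def levels_covered(stop_at: int) -> list[str]:
    """Names of the levels an evaluation covers when it stops at `stop_at`."""
    return [name for level, (name, _) in EVALUATION_LEVELS.items() if level <= stop_at]

# An evaluation that stops at level two never sees the data that proves a
# difference was made (application) or the data executives want (impact):
print(levels_covered(2))  # ['Reaction', 'Learning']
print(levels_covered(4))  # ['Reaction', 'Learning', 'Application', 'Impact']
```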
What are some of the biggest pitfalls that prevent learning professionals from measuring and reporting the value of learning?
There are five barriers that stop learning and development professionals from tackling this issue:
1. Fear of results. If a program is not working, the last thing program owners want to see is negative results. They are concerned that the results will reflect on their performance, so they may not want to see the data. In reality, the client probably already knows the program is not working. It is better to be proactive and correct it than to wait for someone to ask for the results.
2. Not enough time. This approach does take additional time, but not as much as one might think. There is a perception that impact and ROI studies are very time-consuming, with a tremendous amount of data and a lot of reports. To save time, the tasks necessary to deliver results are shared by the entire group, and only a few programs are pushed up to the impact and ROI levels. This keeps the time commitment manageable.
3. It’s too complex. There is a perception that ROI involves too much finance and accounting or detailed statistics, which most people don’t want to tackle and most executives don’t want to see. But this is not true - it is a simple, systematic, logical flow of data that does not go beyond fourth-grade mathematics. Its only connection to finance and accounting is the actual ROI formula.
4. Dealing with disappointing results. When a program is not working, the data collected can usually pinpoint several reasons. The program may have broken down earlier in the process: something wasn’t done that should have been. Perhaps it wasn’t connected to the business at the beginning, impact objectives were not developed, or it was not facilitated with the end in mind. In any of those situations, it is difficult to get the individuals involved to change. A better approach is to make sure that everyone does their part, as a team, to design for the results that are needed, so that disappointing outcomes are minimized.
5. I don’t know how to do it. Unfortunately, much of the preparation for learning and development professionals has not focused on this level of accountability, so they don’t know how to do it. The good news is that there are ways to learn. Our Institute, for example, has produced more than fifty books on this topic, many of them case studies and guidebooks to help readers make it work. ROI Certification is also available to prepare individuals for this challenge.
Why do you think misalignment between L&D and senior leadership is such a pervasive issue? Why is the L&D function so isolated from business goals?
There are three issues:
1. Programs are started for the wrong reasons. We suggest that programs start with the why: Why are we doing this? And these days, the why has to be a business contribution.
So many learning programs are initiated because the learning function says they’re needed, or because an executive has read a book and thought, “We might need this.” In other cases it’s, “They’re doing this at Google, so maybe we should do it.” Executives will start with a solution without understanding the business need for it. If we don’t start with the business need, it becomes difficult to end up with a business contribution.
2. We tend to measure what we think we can control. We measure levels one and two because participants are captive in our system, whether it’s eLearning or classroom training. We capture that data because it’s convenient, and we stop the evaluation there. We have a philosophy that we can’t control what employees do with learning, so it’s not our fault if they never use it. That may be true, but it’s the organization’s budget we’re spending. We’d better be concerned about it, because it’s our budget that will be cut if top executives don’t see the business connection.
3. We have to redefine learning success. Success for a learning program doesn’t occur in the classroom, on a keyboard, or on a mobile phone. Success occurs when participants actually use what they learned and make an impact in the organization - in their work, with customers, with the people around them. That’s the business connection. Until you have a business impact, you’re not successful. In our presentations, we ask participants to define success as they see it now. They usually list input (the people in programs), reaction, and maybe what was learned. They rarely define success as employees using what they’ve learned (application) and the corresponding business impact.
How have you seen executives respond to L&D departments who don’t show business impact?
If we don’t connect to business measures, executives question the value of what we do. If we haven’t influenced business measures, it’s easy to cut the budget. I’ve been a top executive - as head of a banking system, I approved budgets based on the perceived value added.
If you don’t have a way to connect what you do to business measures, you will likely be considered a necessary evil, or your executives will take it on faith that your programs need to be offered. If it’s a faith-based initiative, executives may cut the budget during tough times. We wind up spending money on learning when there are excess funds, but not when funds are unavailable.
That’s not the right approach. During a recession, a downturn, or a period of anxiety and uncertainty, we need more investment in learning, not less. But if L&D departments don’t have the data to show that, they have to face the consequences.
We have record economic performance in the United States. Yet despite a favorable stock market, a low unemployment rate, and high consumer confidence, learning and development budgets in some major companies have been frozen. Why would they do that? It’s because they want to run a tight ship going into the future. In the face of uncertainty, executives want to make sure they’re spending the budget wisely. They don’t let new programs into the system unless they add value.
What are some simple steps L&D professionals can take to ensure business impact from learning?
Here are three simple steps:
1. If you want results from L&D, design for them. Let’s plan for results so that when we measure, the results are there. When we initiate a new program, let’s make sure we see the business reason for it and that the executive who is sponsoring, supporting, or funding it understands that reason.
2. Let’s make sure we have the right solution for the business need. If improvements are needed in productivity, what’s the right solution? This might require some analysis and discussion, but we can keep it simple and connect what’s been requested, what we think will help, and the business measure involved.
3. Let’s think about expecting success from learning. By expecting success, we first have to define it in a way that connects to the business.
What does expecting success in learning look like in action?
There are three simple things that we can do here.
1. Define the success of learning. We let everyone know that the program they’re involved in as participants is not successful unless they’ve used what they learned in their work and there’s been a corresponding impact. For example, one retail chain’s corporate learning center was called the Center for Learning Excellence. It has since been renamed the Center of Business Excellence, to remind participants that they are not there to learn but to drive the business; learning is the means to get there. Participants realize they’re not learning just for the sake of learning. They understand that if they don’t connect learning to something in the organization, it has been a waste of time.
2. Develop objectives at multiple levels. The learning community is good at developing learning objectives; they’ve been trained on how to do that. What they’re not so good at is developing application and impact objectives: defining what participants will do with what they learn, and the corresponding impact. That exercise alone will have the team really thinking through what they’re doing. It’s basically asking: if I teach this content, what does successful use of it on the job look like? And if they’re using it, what does the impact look like? Will sales improve, productivity increase, waste decline, errors drop, absenteeism fall, compliance discrepancies shrink? Expect a successful business impact.
3. Hand off to key stakeholders. When you have defined success and developed objectives, you’ve put a powerful message in front of the designers, developers, and facilitators. Now they have application and impact objectives, and they will step up to the challenge and design, develop, and facilitate with application and impact in mind. With these objectives in place, even the managers of participants will provide more support, because they can see how learning affects their departments.
We’ve talked a lot about how important senior leadership’s investment in learning initiatives is. Can you talk more about how to get buy-in from learners at all levels of the organization?
Make it matter. Make your programs meaningful to participants and important to the organization. Make sure the content is powerful, relevant to their work, and something they can use. They must see it as valuable to them in some way and believe they can apply it successfully.
Do you have some tips for L&D professionals to ensure that learning is applied on the job?
A large body of work exists on the transfer of learning to the job. When trainers and learning professionals suggest that half of their budget is wasted, it means that learners don’t use what they’ve learned. How can we design learning that sticks on the job? The participant’s manager is critical. Managers have to be part of the process, because the number one influence on participants using learning on the job is their immediate supervisor. We show managers the power of employees actually using new knowledge and skills, and the resulting impact, and we help them support it. Additionally, we develop tools, tips, and templates for participants to use. Finally, all stakeholders design and implement learning with application and impact in mind.
What is the best way for professionals to package this impact data for the executive team?
Make it credible. Best practice is to push ten percent of your programs to business impact evaluation. We collect the impact data driven by the impact objectives, and participants know about this because it’s how we’ve designed the program. We monitor that data, or have the learners monitor it. Remember, we’re only doing this ten percent of the time, so it’s not a heavy, burdensome task.
What are some steps to make your ROI calculation credible?
The first issue is to sort out the effects of the program on the data. Take the simple example of sales training. Imagine teaching new sales representatives how to sell in a different way. They use it with their customers and sales improve - the big question is: how much of the improvement was connected to the training? If we don’t take that step, we really have no credibility.
If the evaluation is taken to the ROI level, the data must be converted to money. The value added by a sales increase is not the value of the sales but the profit on the sales. The value added by reducing employee turnover is the avoided cost of turnover. We suggest evaluating at the ROI level in half of the programs evaluated for impact, so five percent of programs are pushed to the ROI level - that’s best practice. That’s when we convert data to money, take the monetary value added, and compare it to the cost of the program, calculated the way a chief financial officer would calculate the ROI of a capital investment, like a building. That is the ultimate accountability: ROI, where costs are compared to monetary benefits.
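As a minimal sketch of that arithmetic, consider the sales-training example above. Every figure below is hypothetical, and the flat 40 percent attribution stands in for whatever isolation technique is actually used; the benefit-cost ratio and ROI formulas are the standard ones a CFO would recognize.

```python
# Hypothetical sales-training example; every figure below is illustrative.

# Step 1: isolate the effect of the program on the impact measure.
sales_increase = 500_000        # observed annual sales increase ($)
attribution = 0.40              # share credited to the training, from an
                                # isolation step (e.g., a comparison group)

# Step 2: convert the isolated impact to money. For sales, the value added
# is the profit on the new sales, not the sales themselves.
profit_margin = 0.20
benefits = sales_increase * attribution * profit_margin    # $40,000

# Step 3: compare benefits to the fully loaded program costs, the way a CFO
# would evaluate a capital investment.
program_costs = 25_000          # design, delivery, participants' time, etc.

bcr = benefits / program_costs                              # benefit-cost ratio
roi_pct = (benefits - program_costs) / program_costs * 100  # ROI, in percent

print(f"BCR: {bcr:.2f}")       # BCR: 1.60
print(f"ROI: {roi_pct:.0f}%")  # ROI: 60%
```

On these hypothetical numbers the program more than pays for itself; the same arithmetic with real, credibly isolated figures is what makes the case to a CFO.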
You must be credible if you’re going to this level, because you’ve got to satisfy your executives and your chief financial officer. One of your biggest supporters can be the finance and accounting team, particularly the CFO. The CFO is being asked by the CEO to show the value of all of these support functions, and you probably have some data that will make this person happy.
For L&D departments who have never done this before, where do you recommend starting?
We recommend starting with one project evaluated all the way to impact, and perhaps to the ROI level. Present the results and see the reaction from the executive team. With the design-for-results emphasis, we’re almost ensuring there are results, because we have designed for the outcome we want.
When reporting, should L&D departments include impacts that can’t be tied to money?
We recognize that some measures connected to programs cannot easily be converted to money, such as teamwork, reduced stress, reputation, image, collaboration, and job satisfaction. If you cannot convert a business impact to money credibly with a reasonable amount of effort, leave it as an intangible; it’s still impact, just not converted to money. That gives us six types of outcome data: reaction, learning, application, impact, ROI, and intangibles. That is a balanced profile of success, with quantitative and qualitative data taken at different times and from different sources, managed in a way that keeps it credible. That’s powerful data.
Do you have any tips for presenting your data?
The next step is to summarize your data and tell a story. We all have a story to tell, but it’s a much better story when we’ve got data. Data is what the audience likes to see - it’s powerful.
This step is about getting the data to the right people and the right story in front of them. We suggest briefing the team: showing and telling them what you have, and bringing in anecdotal comments along the way. We know they want to see the impact. They may not want to see the other data sets, but we explain how we achieved the business results. Disappointing results usually break down at the lower levels, and we need to show them how we can improve.
This leads us to the last step, which is to optimize the results. Show them how the program is connected to the business in a credible way. Then you can add, “We want to make sure that you don’t have an urge to cut our budget in the future. Maybe you can give us a larger budget, or an additional budget to do this type of analysis.”
You’ll make changes along the way as you see things not working - this is optimizing the return on investment. When you optimize your ROI, you’re making a great case for allocating more funds to invest in the future (and getting executives out of the mood of wanting to cut your budget). The decision to invest in L&D is made when executives see it as an investment rather than a cost, all because you’ve shown a return on investment the same way a chief financial officer would.
Interested in learning more?
Co-written by Jack Phillips, Ph.D., The Business Case for Learning: Using Design Thinking to Increase Results and Increase the Investment in Talent Development offers a complete guide to obtaining and proving the ROI of your L&D initiatives, including practical templates you can put into action at any organization.
About the Author
Jack Phillips, Ph.D. is the Chairman of ROI Institute, Inc. and a world-renowned expert on accountability, measurement, and evaluation. He has provided consulting services for Fortune 500 companies and major global organizations. As the author or editor of more than 75 books, he has conducted workshops and presented at conferences throughout the world, receiving several awards including the Society for Human Resource Management award.
His work has been featured in the Wall Street Journal, BusinessWeek, and Fortune magazine. With more than 27 years of corporate experience in the aerospace, textile, metals, construction materials, and banking industries, he has served as training and development manager at two Fortune 500 firms, as senior human resource officer at two firms, as president of a regional bank, and as a management professor at a major state university.
This interview is part of the U.S. L&D Report 2018.
Download the full report below:
What's included?
- The employee training budgets, training topics, and training methods of organizations in 2018.
- Practical advice from L&D leaders to help you adopt new technologies, nurture a learning culture, and measure and promote the value of workplace learning.
- How learning professionals rate executive engagement in learning, assess the impact of training, and more!