Proving the ROI of Leadership Development
“Just getting started” is one of the biggest challenges leadership, learning, and talent development professionals face when conducting an ROI study, says Jim Irvine, global dean of leadership & professional development at Nissan.
“Most people assume they need to create a study as rigorous as one they’d publish in an academic journal, but that’s not always necessary,” says Irvine. “Senior leaders mostly want to know that the money being spent on development initiatives is having a positive impact on the bottom line. We’ve recently started doing ROI studies that are more manageable in scope, with maybe a hundred participants, and they accomplish this objective perfectly.”
To help with the process, Irvine brought in ROI expert Dr. Paul Leone of MeasureUp Consulting.
“Paul suggested we scale back and focus on just one of our leadership development programs—Blanchard’s SLII® program—and measure how the training impacted participants’ job behaviour, how the new behaviour impacted performance, and how that performance impacted business metrics.
“The good news is that planning ahead made the process easier than we expected, especially because we were able to use existing business metrics,” says Irvine. “By starting with those measures and working backwards, we could more easily show the financial impact of the training.”
In Irvine’s case, the proposed ROI study looked at how the SLII® leadership training program impacted business metrics across three countries, five locations, and four functions—engineering, sales, marketing, and finance. The study was structured not only to measure self-reported data from the participants but also to include corroborating data from their bosses and direct reports.
“Paul helped us create a post-training survey that we sent to participants, their direct reports, and their bosses. The survey takes less than five minutes and asks each group if they observed changed behaviour on the job, whether that changed behaviour impacted their performance, and, if so, by how much.”
Irvine and Leone started with a standard five-level evaluation methodology that included traditional measurements such as:
- Did participants like the training?
- Did participants learn anything?
- Did they adopt new behaviours?
- Did the behaviours impact performance?
- Did the bottom-line impact exceed the costs of the training?
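For readers who like to see how answers to that short survey might be rolled up against these levels, here is a minimal sketch in Python. The field names, scales, and helper function are hypothetical illustrations, not Nissan’s actual instrument; the 64% and 61% figures quoted below are reused as sample values, and the boss’s estimate is a placeholder.

```python
# A hypothetical sketch of how the five-minute survey responses might be
# recorded and compared across rater groups. Names and scales are illustrative.
from dataclasses import dataclass

@dataclass
class SurveyResponse:
    participant_id: str
    rater: str                     # "self", "direct_report", or "boss"
    behaviour_changed: bool        # Level 3: new behaviours observed on the job?
    performance_impact_pct: float  # Level 4: estimated performance improvement (%)

def average_impact(responses, rater):
    """Average estimated performance improvement for one rater group."""
    impacts = [r.performance_impact_pct for r in responses if r.rater == rater]
    return sum(impacts) / len(impacts) if impacts else 0.0

responses = [
    SurveyResponse("p01", "self", True, 64.0),
    SurveyResponse("p01", "direct_report", True, 61.0),
    SurveyResponse("p01", "boss", True, 60.0),  # placeholder value
]

# A gap much larger than ~10 points between self and observer estimates would
# be a flag; here it is only 3 points.
gap = average_impact(responses, "self") - average_impact(responses, "direct_report")
print(f"Self vs direct-report gap: {gap:.0f} points")
```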
Irvine expected significant differences between participant-reported data and the observations of the participants’ direct reports and bosses, but that wasn’t the case.
“Going in, Paul told us that the difference would probably be within 10%, and that’s what we found. For example, on one metric, participants estimated they improved their performance by 64% versus the direct reports’ estimate of 61%, only a three-point difference. That surprised me.”
The study also looked at a sixth level of measurement, which Leone created, and which helped one of his Verizon case studies win the 2019 Brandon Hall Gold Award for best advance in measuring the impact of leadership training. Level Six looks at climate factors that help or hinder the long-term impact or sustainability of training. The factor that had the most impact on results was manager support: participants whose managers actively supported their training generated the most financial impact.
“Paul showed us a simple way to run the data for employees who received high manager support separately from employees who received low manager support,” says Irvine. “From there, we looked at the impact of high manager support on behaviour change and how that change affected performance metrics. Isolating these differences and monetising their value told a compelling story.
“Overall, we achieved a 452% ROI on the SLII® training investment. When we looked at the Level Six climate factor, we found a 1.3 times magnification factor on that ROI for employees who had high manager support compared with employees who received low or fair manager support. That was more than a $3,000 difference in ROI alone. Those are the kinds of numbers senior executives are looking for when making budgeting decisions about training.”
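To make the Level Six comparison concrete, here is a hedged sketch of the arithmetic: compute ROI separately for the high and low/fair manager-support groups, then express the difference as a magnification factor. The dollar figures are placeholders chosen only so the ratio lands at the 1.3 reported above; they are not the study’s data.

```python
def roi_pct(benefits, cost):
    """Classic ROI: (benefits - cost) / cost, expressed as a percentage."""
    return (benefits - cost) / cost * 100

# Placeholder figures, per participant; not from the Nissan study.
cost = 1_000                  # hypothetical fully loaded training cost
benefit_high_support = 6_200  # hypothetical monetised benefit, high manager support
benefit_low_support = 5_000   # hypothetical monetised benefit, low/fair manager support

roi_high = roi_pct(benefit_high_support, cost)  # 520%
roi_low = roi_pct(benefit_low_support, cost)    # 400%

magnification = roi_high / roi_low              # 1.3x in this illustration
print(f"High support: {roi_high:.0f}% ROI, low/fair support: {roi_low:.0f}% ROI, "
      f"magnification: {magnification:.1f}x")
```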
For L&D professionals who might still be sitting on the fence about conducting their first study, Irvine has some encouragement.
“If you are new to ROI studies, the first lesson is to partner with an expert. We do that in so many other things—if you're not good at auto mechanics, you hire a mechanic. If you’re not good at information technology, you outsource it. Do the same with ROI.
“Here’s the second lesson: don’t wait to do an ROI study until the end of the year after you’ve run all of your workshops. Plan your study at the beginning so you can schedule your workshops around collecting your data. Identify existing performance metrics that your organisation is already collecting on a regular basis. Then schedule your leadership training around those measurements, with those participants in mind.
“Finally, use short, five-minute surveys when conducting your post-training evaluations. Administer them to participants, their direct reports, and their bosses.”
From there, it’s a simple ROI calculation, says Irvine. “Take the benefits of the training, subtract the cost, and then divide by the cost. With a little help, you can do this. Plan your ROI study before you deliver the training and then follow the plan. Get a partner, plan early, and focus on existing metrics to start. This process will help so much with demonstrating the impact of leadership development to others. Don’t wait—start planning your first study today!”
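Spelled out in code, that calculation looks like the sketch below. The dollar amounts are placeholders chosen only so the result matches the 452% figure quoted earlier; they are not the programme’s actual costs or benefits.

```python
# ROI = (benefits - cost) / cost, expressed as a percentage.
training_cost = 100_000       # hypothetical total cost of the programme
monetised_benefits = 552_000  # hypothetical monetised performance gains

roi = (monetised_benefits - training_cost) / training_cost * 100
print(f"ROI: {roi:.0f}%")     # -> ROI: 452%
```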
About the author:
David Witt
David Witt is a Program Director for The Ken Blanchard Companies. He is an award-winning researcher and host of the companies’ monthly webinar series. David has also authored or coauthored articles in Fast Company, Human Resource Development Review, Chief Learning Officer, and US Business Review.
First published on Blanchard LeaderChat
4 November 2019