
Martin Couzins


Key theme: Measuring the ROI of L&D


The CIPD’s Learning and Talent Development Report 2011, published earlier this year, revealed that measuring the impact of learning and development remains a minority sport for organisations across the UK, with 48% of participating organisations measuring return on expected outcomes and 42% assessing the impact on business key performance indicators.

Measuring the return on investment of L&D activities remains a priority for L&D teams as economic pressures continue to be felt across all areas of the economy.

So what is ROI and how should organisations apply it to L&D activities? TrainingZone talked to training evaluation guru Jim Kirkpatrick, who has worked with his father Don Kirkpatrick on developing the Kirkpatrick Four Levels of training evaluation.
TZ: What’s your definition of return on investment (ROI)?
JK: A lot of people talk about ROI, but one of the things that troubles us is the often narrow perspective that the ‘I’ means financial investment. One of the problems with that view is that we do not want executives thinking the answer to getting training results is simply to write a cheque, because that is not enough.
The ‘I’ as we consider it is all about the investment we need from the business - resources, time, accountability, support and finances. We need a broader perspective - it is not just money and training. Research shows that pouring money into training indiscriminately is not going to maximise results.
We are all for ROI as long as the ‘I’ is expanded beyond a cheque being written and executives sitting back waiting for results.
The definition of investment is making sure that once the end result is agreed upon - once there is agreement on what success looks like - there is then a discussion about the level of effort required, and that effort is translated into the ‘I’. This effort includes the resources and time required, managers being encouraged or mandated to act as coaches, and executives holding people accountable.
TZ: Why is demonstrating the value of training such a big issue for L&D?
JK: Up until 2007, training pretty much had a free pass because the focus was on employees being your greatest asset. The 2007/8 economic climate exposed the fallacy that a training session alone was enough to bring about a change in application and results. Now everybody is under scrutiny and everybody needs to demonstrate their worth.
TZ: The Kirkpatrick model provides a template for measuring training value. In your experience of working with clients, what stages do organisations find hardest to succeed at?
JK: Three areas that are difficult for organisations are:
  • Starting out with the destination in mind - you can’t demonstrate value if you have not identified what “value” is in the first place.
  • Developing and executing the after-training support network - helping people across the gap between learning and doing.
  • Validating that training is the type of intervention that will impact results. Training is often chosen almost immediately as the solution to problems that are not training-related.
TZ: You say that return on expectations (ROE) is the ultimate indicator of value. Why is that?
JK: It is important to focus on expectations because they are the highest level goals and mission of an organisation. They are the observable, measurable business-level results that stakeholders have their eye on all the time. The ‘RO’ represents the collaborative effort required to meet stakeholder expectations (the ‘E’) for a given initiative.
We need to be clear that expectations are not the number of people trained or customer satisfaction scores. They must be linked to highest level business metrics or mission accomplishment. 
TZ: Now that employees are learning through multiple channels all at once (formal, informal, knowledge sharing, and so on) how can L&D professionals measure the value of all these interventions?
JK: In the feedback, ask people to what degree they are applying what they are learning - which is a Level 3 question. Also, list all of the channels and ask to what degree each helped them be successful.
For example, formal learning, informal, self-directed learning, coaching, social media, helpdesk support and so on. You are trying to determine the relative importance of the different components to on-the-job performance.
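For readers who collect this kind of channel-rating data, the weighing-up Kirkpatrick describes can be done with a very simple calculation: average each channel's ratings across respondents and rank the channels. This is a minimal illustrative sketch, not part of the Kirkpatrick methodology; the channel names, the 1-5 rating scale and the `rank_channels` helper are all assumptions for the example.

```python
from statistics import mean

# Hypothetical survey data: each respondent rates, on a 1-5 scale, how much
# each learning channel helped them apply what they learned on the job
# (a Level 3 question). Channels and scores are illustrative only.
responses = [
    {"formal training": 5, "coaching": 4, "social media": 2, "helpdesk": 3},
    {"formal training": 4, "coaching": 5, "social media": 3, "helpdesk": 2},
    {"formal training": 5, "coaching": 3, "social media": 2, "helpdesk": 4},
]

def rank_channels(responses):
    """Return (channel, mean rating) pairs sorted highest first."""
    channels = responses[0].keys()
    averages = {ch: mean(r[ch] for r in responses) for ch in channels}
    return sorted(averages.items(), key=lambda kv: kv[1], reverse=True)

for channel, avg in rank_channels(responses):
    print(f"{channel}: {avg:.2f}")
```

The ranked output gives a rough picture of the relative contribution of each channel to on-the-job performance, which is the comparison the survey is trying to surface.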
This defines the success factors not just for this initiative, but hopefully provides direction for future endeavours. Hopefully training, along with coaching, will come out shining as a major contributor to that success. Whatever the survey results say, it is important for training professionals to view their role broadly, as performance enhancement consultants with a variety of tools and interventions to impact on-the-job behaviour and contribute to bottom-line results.
At the end of the interview/survey process you then ask what signs they have seen that show what they are doing is paying off. These may become Level 4 leading indicators. It is like holding a mirror up to training graduates - they start to see that their customers are smiling more, or that they are catching more errors, or that they are growing in confidence. By doing this we identify factors that will contribute to ultimate results.
TZ: Is the collection of the data in this process - through feedback, surveys etc - a weak link?
JK: It is. Feedback is so important for establishing what is working, what isn't, and why. The Level 4 leading indicators are the cornerstone of the process because they are the early signs that we are starting to see dividends. If so, that's great; if not, we need to determine why the initiative isn't working and revise the plan. This applies at both the individual and programme levels. So often the support doesn't happen because the feedback from the L&D group to managers is not there.
TZ: Are there ways of gathering feedback that are more effective than others?
JK: One thing we do is encourage L&D professionals to take a trip “across the bridge” so that they are seen in the business and inspire others by being there. We encourage L&D professionals to gather qualitative and quantitative feedback using methods such as surveys, direct observation and manager feedback.
The qualitative feedback includes taking a look around the business and asking questions of colleagues - formally or informally - and sharing success stories. There is a school of thought that says if it isn't measurable it is not evaluation, but often you get better feedback from walking around the business talking to people than you do from surveys.
TZ: People are learning in different ways - social learning, for example - and neuroscience is shedding more light on how we learn. How do you see the Kirkpatrick Model developing as we discover more about how adults learn?
JK: In the New World Kirkpatrick Model (below), the green section in Level 3 includes on-the-job learning. That is where the Kirkpatrick Model has been expanded, because when my dad developed it in the 1950s, learning occurred in the classroom and then people went on the job. I think that more and more learning is going to occur on the job through social media and technology. We are learning how to reach different people. I think the key will be expanding the learning professional's definition of learning - we will have to be professionals of learning on the job - and then integrating learning and performance.
Jim Kirkpatrick PhD is a senior consultant for Kirkpatrick Partners. Jim consults for Fortune 500 companies around the world including Harley-Davidson, Booz Allen Hamilton, L’Oreal, Clarian Health, Ingersoll Rand, Honda, the Royal Air Force, and GE Healthcare. Jim welcomes your feedback and comments; information@kirkpatrickpartners.com.
