
Charles Jennings

TULSER, 70:20:10 Institute, Duntroon Consultants

Director


How not to train


Despite the recession, companies are spending millions on ‘pointless and ineffective’ training systems and elearning courses. Charles Jennings says it’s about time learning managers pulled their heads out of the sand.

There’s one simple lesson that training and development managers and their business stakeholders could learn that would save their organisations large amounts of money - and saving money is an important consideration for almost all of us in the current climate. The lesson is this: stop wasting money on instructor-led training (ILT) for system, process and product rollouts and upgrades, and cut down on eLearning courses, too.

So what impact will this have? Well, the amount of time, effort and money spent on formal ILT prior to the rollout or upgrade of enterprise platforms (particularly ERP and CRM), processes, and other new software systems and products is huge. Of the US$134bn (yes, 134 billion!) that the ASTD reported as being spent on employee learning and development in the US in 2007 (the latest figure available), a conservative estimate is that at least 5% goes on this type of system, process and product training. That’s more than $6bn every year that could be used much more effectively, or saved.

However, some managers and L&D people just don’t seem to get it. When they are presented with the data that shows front-loaded training in these situations is almost invariably ineffective, they generally just put their heads down and continue on regardless. It reminds me of the remarkable insight of the author Aldous Huxley when he wrote the words “I see the best, but it’s the worse that I pursue.” Maybe most managers and L&D people are secretly drinking Huxley’s soma – the hallucinogen in his novel Brave New World – “Half a gramme for a half-holiday, a gramme for a week-end, two grammes for a trip to the gorgeous East, three for a dark eternity on the moon”.

"Managers and L&D people just don’t seem to get it. When presented with the data that shows front-loaded training in these situations is almost invariably ineffective, they generally just put their heads down and continue on regardless."

Yet the evidence has been around for a long time that formal training on detailed task and process-based activities in advance of the need to carry out the task or use the process is essentially useless. Both the logic and the available evidence point to the fact that the “we’re rolling out a new system, so we’ve got to train them” approach employed by many organisations and offered as a service by a myriad of training suppliers across the globe is both inefficient and fundamentally ineffective. It might feel comforting to attend a class prior to rollout of your shiny new HR or Financial system; participants may feel that they’ve learned, but the evidence points in a different direction.

In fact, you might as well throw the money spent on these activities out of the window. Actually, a better option would be to spend the diminishing L&D budget on approaches that do work. Not only would new rollouts and upgrades come into use more smoothly, but I’m prepared to bet that it would leave budget over to use for other employee development activities or to offer up as savings (perish the thought!).

So where’s the evidence that the standard systems and product training approaches don’t work or are, at best, sub-optimal? Even if you’ve never run training for a rollout or upgrade and then watched users demand re-training, or simply call the help desk in large numbers as soon as go-live happens, it helps to be aware of some fundamental truths about this flawed model.

Truth 1: Too much information!

Most pre go-live training is delivered through ILT or eLearning and is content-heavy. Instructional designers and SMEs usually feel the need to cover every possible eventuality in the training and load courses with scenarios, examples and other ‘just-in-case’ content. During my training and development career I have seen multiple PowerPoint decks of 200-300 slides prepared for delivery over 2-3 days for CRM upgrades.

Few humans can recall even a fraction of that information for later use. Maybe those with photographic memories can, but planning for photographic memories is not really a sensible strategy for instructional designers. Those of us without them just park most of what we do remember at the end of the session in the ‘clear out overnight’ part of our brains. We may also be presented with supporting material running to 250 pages (I’ve seen it), printed in full colour to augment the training.

"Do you think Tiger Woods’ brain retained the details of how to arrange his body to hit a golf ball 400 yards without lots of practice and reinforcement?"

However, all those expensively-produced user guides and manuals are simply a waste of the earth’s limited natural resources. They tend to be too detailed, linear, full of screen-grabs that the user will never refer to, impossible to navigate, and the last thing people reach for when they need help using a new system. They are far more likely to reach for the phone and call the help desk than to use the manual. And who could blame them? Training user guides are quintessentially shelfware. Usually the only time someone picks them off the shelf is to throw them in a bin (hopefully one marked ‘recycling’) during a clear-out, an office move, or when they are moving to another organisation or retiring.

Truth 2: Too much time between the training and performance

Embedding knowledge in short-term memory and embedding it in long-term memory are two very different processes. Even the information that can be recalled immediately after training - and that’s likely to be a small amount - will be lost if it isn’t reinforced through practice within a few hours. One of the keys to learning is practice. Practice and reinforcement are required for our brains to build new and persistent connections and patterns. This is what’s called ‘long-term memory’.

The neurological processes involved in building long-term memory – involving chemicals in the brain such as serotonin, cyclic AMP, and specific binding proteins – are different from the processes that allow us to recall information from short-term memory. The importance of practice in the process of learning can’t be stressed too much. Do you think Tiger Woods’ brain retained the details of how to arrange his body to hit a golf ball 400 yards without lots of practice and reinforcement?

Similarly, anyone who wants to develop proficiency needs to practise, practise again, and then practise some more. Not just sports people, but anyone who needs to develop an ability to do anything that is in any way complex, whether it’s applying mathematical formulae or carrying out microsurgery. Unfortunately, most systems, process and product training offers only cursory practice opportunities. There’s usually not enough time allowed for extensive practice because there’s so much ‘knowledge’ to impart, or there is no access to the system or product prior to rollout, so any practice that does occur is often on a cut-down system, the ‘training’ servers or a simple simulation.

Truth 3: Post-training drop-off has a major impact

Post-training drop-off is another reason why formal pre-rollout training simply can never be the most effective approach to developing user competence. Harold Stolovitch & Erica Keeps carried out some very interesting research on desired vs. actual knowledge acquisition and performance improvement. The data from their research uncovered some important facts about human memory and processing.

Following an initial dip early in the training event (the ‘typing/golf pro dip’), where performance drops as new ways of carrying out tasks are tried and tested, knowledge and performance then improve over the course of the training session. The individual then walks out the door knowing more, and with higher levels of task performance, than when they started the training. Then the problems start.

The drop-off following the training event (called the ‘post-training re-adjustment’ by Stolovitch and Keeps) can kick in very quickly, perhaps within a matter of hours if there’s no opportunity to reinforce and practise. So, you finish a day’s training course, go home, sleep, and by the next morning a lot of what you had ‘learned’ has been cleaned out of your short-term memory. Bingo!

Then, the next day, you get back to your workplace and try to implement what you garnered in the class. The trouble is you can’t remember exactly what to do, and you don’t have any support (the trainer you called over to prompt you through the exercises in class yesterday isn’t there today, and the user guide is unfathomable), so you try a few things, find they don’t work (unless you’re lucky) and simply go back to doing what you did previously. The result? Performance improvement = zero, value added by the training = zero, and ROI = zero (actually, it’s negative).

Upwards - following the dotted line

The only way knowledge retention and performance can follow the dotted line in Stolovitch and Keeps’ graph (continued improvement rather than drop-off) is if plenty of reinforcement and practice immediately follows the training. Better still if this is accompanied by some form of support: from line managers setting goals and monitoring performance, from SMEs providing on-demand advice and support, or even from learning professionals providing workplace coaching. Unfortunately, this rarely happens, despite the best of intentions. An even better (and certainly cheaper) option is simply to cut out the training altogether and replace it with a support environment from the start.

Where performance support trumps training every time

There are some very good ePSS (electronic Performance Support Systems) or BPG (Business Process Guidance) tools available now. They are economical and generally straightforward to implement, and they trump training every time for supporting learning and the development of competence in using the defined processes found in ERP and CRM systems and other software products.

"The failings of ILT and the lack of opportunity for ‘real’ experience and practice in most formal training design have been highlighted over the past decade and more."

ePSS/BPG tools provide context-sensitive help at the point of need. According to Davis Frenkel, the CEO of Panviva, the company that produces the very impressive SupportPoint BPG tool, they act more like a GPS system than a roadmap: “When you’re learning to follow a process you just want to know the next 2-3 steps you need to take. You don’t want to have to remember the entire 20-30 process steps and all the options,” says Frenkel, and I think he’s absolutely right. A GPS in your vehicle instructs incrementally; it doesn’t tell you every turn you’ll need to take on the journey when you first set out. When there’s no GPS and the driver has to revert to a map, they will tend to read and memorise just the next 2-4 turns, then stop and re-read the map for the next set of instructions, and repeat until the destination is reached.
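To make the GPS analogy concrete, here is a minimal sketch of that ‘next few steps’ idea, written in Python. The names and structure are entirely hypothetical illustrations, not Panviva’s SupportPoint API: the guide holds the full process but only ever surfaces the next few steps at the point of need.

from dataclasses import dataclass

@dataclass
class ProcessGuide:
    # Holds a complete business process, but only ever surfaces the next few steps.
    steps: list           # the full 20-30 step process
    position: int = 0     # the step the user is on right now
    window: int = 3       # like a GPS: show only the next 2-3 steps

    def next_steps(self):
        # Return just the upcoming steps the user needs at this moment.
        return self.steps[self.position:self.position + self.window]

    def advance(self):
        # Move on once the current step is done; the next slice appears on demand.
        if self.position < len(self.steps):
            self.position += 1

# Usage: a long CRM process, delivered a few steps at a time instead of memorised up front.
guide = ProcessGuide(steps=[
    "Open the account record",
    "Verify the customer details",
    "Log the enquiry",
    "Assign a case owner",
    "Set the follow-up date",
])
print(guide.next_steps())  # ['Open the account record', 'Verify the customer details', 'Log the enquiry']
guide.advance()
print(guide.next_steps())  # ['Verify the customer details', 'Log the enquiry', 'Assign a case owner']

The point is not the code itself but the design choice it illustrates: the knowledge lives in the support tool rather than in the user’s memory, and it is delivered incrementally at the moment of need.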

So why don’t organisations, business managers, project managers, and L&D folk wake up to the failings of using the wrong approaches to achieve their required outcomes? Why are billions spent every year training employees on systems, processes and products in this way when there is ample evidence that it simply doesn’t work? Sadly, I think the answer may lie in Huxley’s line: “I see the best, but it’s the worse that I pursue”.

Certainly training and development professionals should be aware of the facts and the research. The failings of ILT for systems, process and product training, and the lack of opportunity for ‘real’ experience and practice in most formal training design, have been highlighted over the past decade and more. For the past 30 years research has indicated that only somewhere between 10% and 30% of all ILT results in changed behaviour and improved performance in the workplace, with the lower figure predominating (Baldwin & Ford, Ford & Robinson et al.).

However, I think that in ten years’ time, when the majority of support for the rollout of new systems, processes and products is delivered through on-the-job performance support in one way or another, we will look back and wonder why we didn’t see the best and pursue it earlier.

Charles Jennings was chief learning officer at Reuters and Thomson Reuters. He now works as an independent consultant on learning and performance. A version of this column was first published on Charles’ blog, which is also linked from his website www.duntroon.com.
