
A brief history of e-learning


"A brief history of e-learning" was contributed by Brian Holley, Managing Director of Sherpa Integrated Learning Ltd.


It all began with a dog and a pigeon - well, several of each actually. In the early twentieth century, the Russian physiologist Ivan Petrovich Pavlov demonstrated the ‘conditioned response’ through his work with dogs. The American psychologist J. B. Watson built on Pavlov’s findings to lay the foundations for a school of ‘behavioural’ psychology. Then came B.F. Skinner, another American, who, with the help of some willing pigeons, developed Watson’s ‘stimulus-response’ thinking into a theory of ‘stimulus-response-reinforcement’ and went on to create ‘programmed learning’.

By the 1970s programmed learning was being used widely in the training field (among humans rather than pigeons). Seafarers were taught the ‘Rule of the Road’ using a programmed learning text, and teaching machines were used to support maths teaching in many secondary schools. Programmed learning was not without its critics, but even so the results obtained would bring tears to the eyes of many modern training buyers. Well-designed programmes could achieve 90/90 validations (90 per cent of those taking the post-test getting 90 per cent right). This achievement was due to excellent analysis and design skills, and thorough testing and revision using proper statistical techniques.
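
The 90/90 criterion is easy to state and easy to check. Purely as an illustration, here is a minimal Python sketch of that check; the function name and the example scores are invented for the purpose, not taken from any programmed learning text.

```python
# Minimal sketch of the classic 90/90 validation criterion:
# at least 90 per cent of learners score at least 90 per cent on the post-test.
# The function name and example data are illustrative assumptions.

def meets_90_90(post_test_scores):
    """Return True if >= 90% of learners scored >= 90% (scores given as percentages)."""
    if not post_test_scores:
        return False
    passes = sum(1 for score in post_test_scores if score >= 90)
    return passes / len(post_test_scores) >= 0.9

# Example: a pilot group of ten learners
scores = [95, 92, 88, 97, 91, 93, 90, 96, 94, 99]
print(meets_90_90(scores))  # True - nine of the ten scored 90 per cent or better
```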

Then came the recession.

As usual, when the economy catches a cold, the training profession gets the ‘black plague’. Many trainers were made redundant. By the time ‘micro-computers’ became popular in the early eighties, the expensive programmed learning market had dwindled and programmed learning professionals had gone to work elsewhere. Computer Based Training was therefore developed largely by IT specialists and they have continued to dominate ‘technology based training’ ever since. It seems that, in the gap between programmed learning and CBT, some fundamental knowledge was lost. If e-learning has lost sight of its roots, it is not surprising that it has gained such a poor reputation.

In the spring of 2000, the Department for Education and Employment report, Authoring for CBT and Interactive Media, said that:

"A typical TBT author will not have received specific training in the many skills required for TBT development, such as analysis skills, instructional design, writing aims and objectives, question techniques and answer analysis."

Since then, the Institute of IT Training and others have begun to redress this dire skills shortage. But much of the currently available material has been written by people without the appropriate training and experience. With thousands of courses already on the market, and e-learning corporations needing to recoup their multi-million dollar investments, it will take some time before new standards are established.

In recruiting e-learning authors, we have had to turn down eight out of ten applicants. Most of them had written material for leading blue-chip organisations but they knew little about the theory and practice of e-learning design, even to the point of being unable to produce behavioural objectives.

Hardly surprising, then, to read that analysts Biz Media discovered that 61 per cent of those spending money on e-learning rated the outcome as only fair to poor. Only 1 per cent rated it as excellent, 5 per cent as very good and 33 per cent as good.

So what does history teach us?

Fundamentally, that we need to have a sound theoretical basis in at least five areas for the design of e-learning material:

1. Meticulous training needs analysis
2. Meticulous subject matter analysis
3. Meticulous learning design
4. Meticulous and appropriate IT design
5. Meticulous testing for validation

1. Meticulous training needs analysis
Too often, training needs analysis (TNA) is generated by a bottom-up approach, via an outdated appraisal system. If the annual appraisal is the trigger for identifying training needs, then it’s too late: by the time training takes place, the need may well have changed. Many people are undergoing training to resolve last year’s problems!

Training needs analysis needs to be driven top-down, beginning with the strategic objectives of the organisation. This will enable the activities necessary to achieve those objectives to be identified. It is not then too difficult to describe the skills required to carry out strategically sensitive activities and to establish criteria against which to measure the current skills of the workforce.

Unfortunately, too few HR people are privy to strategic plans and too many have to be reactive to managers’ demands, rather than proactive to the demands and needs of the customer. Only a major cultural change can alter this.

2. Meticulous subject matter analysis
Designing e-learning content requires a quite different approach from designing classroom courses. In the classroom you can read body language, respond to questions immediately and even change the agenda if it becomes apparent that a particular group needs something slightly different. Obviously, you can’t do much of that on-line.

Many excellent trainers, who are very professional in designing classroom courses, may find the detailed analysis demanded for e-learning design more tedious than knitting. It takes a particularly meticulous mind to tease out all the teaching points from a learning scenario and then assemble them into an appropriate logical order.

3. Meticulous learning design
It is at the point of learning design that the knowledge of learning theory becomes vital. At the micro level, Skinner’s stimulus-response-reinforcement cycle is, to my mind, essential if people are to construct knowledge and skill from the building blocks of the programme - the teaching points.

At the macro level, another cycle kicks in - the ‘cycle of self-development’ (cf. Kolb et al): visualise, act, reflect, adapt. These ‘wheels-within-wheels’ (stimulus-response-reinforcement and visualise-act-reflect-adapt) help the designer to avoid pressing on in a purely linear way, ignoring the needs of learners. It is essential to incorporate new learning consistently into what has gone before. The experience is considerably enhanced when a real live tutor is involved, albeit on-line. Providing email feedback on exercises, and using chat rooms, forums and conferencing facilities, brings learning material alive in a way that cannot be achieved by text or sound alone. Stand-alone courses that have not been designed with human intervention in mind are doomed to failure.

Everyone now agrees that e-learning is not the solution to everything, and so the designer needs to consider whether part of a learning programme would be better delivered in a classroom workshop, from a book, by video, in a virtual classroom or by some other, more appropriate method.

We have so much more in our training repertoire than Skinner and his pigeons. Knowledge of how the human brain works has grown exponentially in the last few years, and we can draw on work by Kolb, Honey and Mumford, Reinert, and Adler, and more recently, Clark and Mayer, to name but a few. Many theories and practices from NLP and Accelerated Learning can be applied, so there is no excuse for poor learning design.

‘Blended’ learning is a popular term, but ‘blended’ sometimes means cobbling together two otherwise disparate components - like Marmite and ice-cream! I prefer the term ‘integrated’ because it implies that the learning elements are different, yet integral to the whole learning programme, and that they are designed to fit together - like Marmite and toast. Indeed, ‘learning integration’ is another new piece of jargon to hit the training world. It is the training world’s equivalent of the IT world’s ‘systems integration’, and there are parallels. Just as it is essential to have different IT systems ‘talking’ to one another using a common language or protocol, so it is essential that e-learning modules are designed to fit a programme of learning, not dumped as a pile of CDs on the trainer’s desk along with the outline of a classroom session.

4. Meticulous and appropriate IT design
Too often, site designers are more concerned with multi-media facilities than with design that will enhance the learning experience. Multi-media design can produce visually stunning sites, but that’s not necessarily what learners need.

Basically, learners need a secure environment in which to work. They need to know where they are in a course, how far they have to go and what are the appropriate stopping points in between. Learners need to be able to log in and resume a course automatically from the point they left off. They don’t want to have to search for that point.
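
As a purely illustrative sketch of that ‘resume where you left off’ requirement, the fragment below records a per-learner bookmark and returns it at the next log-in. All of the names (Bookmark, ProgressStore and so on) are invented for the example and imply nothing about any particular learning platform.

```python
# Illustrative sketch: remembering each learner's position so a course
# resumes automatically at log-in. All names here are invented assumptions.
from dataclasses import dataclass


@dataclass
class Bookmark:
    course_id: str
    module: int        # module the learner last opened
    screen: int        # position within that module
    completed: bool    # True once the post-course test has been passed


class ProgressStore:
    """Keeps the last-known position for each learner on each course."""

    def __init__(self):
        self._bookmarks = {}  # (learner_id, course_id) -> Bookmark

    def save(self, learner_id, bookmark):
        self._bookmarks[(learner_id, bookmark.course_id)] = bookmark

    def resume_point(self, learner_id, course_id):
        # Return the saved position, or the start of the course if none exists,
        # so the learner never has to search for where they left off.
        return self._bookmarks.get(
            (learner_id, course_id),
            Bookmark(course_id, module=1, screen=1, completed=False),
        )
```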

If the site is too heavily populated with ‘flashy’ stuff, users will be distracted from learning. It’s like listening to ‘hi-fi’ instead of music. Even Mozart can’t get through to the ‘anorak’ sitting among his quadrophonics twiddling with his tweeters. It’s too easy for learners to focus attention on what’s happening on the screen and to forget to think about the subject in hand.

5. Meticulous testing for validation
The purpose of validation is to test the efficacy of the course, not the knowledge of the learner. If a learner does not succeed on an e-learning programme, it is usually the fault of the designer, not the learner. Often, failure is due to the course never having undergone stringent validation testing to ensure that it teaches what it purports to teach. More often still, it was never designed correctly in the first place.

Initially, validation consists of piloting the course with a group of learners who represent the course’s target population. Before they begin, learners all complete a pre-course test which must be exactly the same test that they will take as a post-course test. The pre-course test identifies what they know already or can guess.

The results of the two tests are then analysed and statistics produced for raw score and percentage gains for each question. Where there are consistently low or high marks, the relevant elements of the course are analysed to see if they need simplifying or toughening up. If they do, repairs are effected and the course is tested again.
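
As a rough illustration of that analysis step, the sketch below produces raw and percentage gains per question from a pilot group’s matched pre- and post-test results, and flags consistently low or high marks for review. The data, names and thresholds are assumptions invented for the example.

```python
# Illustrative sketch: analysing matched pre- and post-test results per question.
# Data, names and thresholds are invented assumptions for the example.

def analyse_questions(pre_correct, post_correct, group_size):
    """For each question, report raw and percentage gain and flag low or high marks.

    pre_correct / post_correct map question id -> number of learners in the
    pilot group who answered that question correctly before / after the course.
    """
    report = {}
    for qid in pre_correct:
        raw_gain = post_correct[qid] - pre_correct[qid]
        pct_gain = 100.0 * raw_gain / group_size
        pre_pct = 100.0 * pre_correct[qid] / group_size
        post_pct = 100.0 * post_correct[qid] / group_size

        if post_pct < 50:
            flag = "low marks - consider simplifying this element"
        elif pre_pct > 90:
            flag = "already high marks - consider toughening up this element"
        else:
            flag = "ok"
        report[qid] = {"raw_gain": raw_gain, "pct_gain": pct_gain, "flag": flag}
    return report


# Example pilot group of 20 learners
pre = {"Q1": 4, "Q2": 19, "Q3": 2}
post = {"Q1": 18, "Q2": 20, "Q3": 6}
for qid, stats in analyse_questions(pre, post, group_size=20).items():
    print(qid, stats)
```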

Validation tests are only used when piloting the course. Another, much simpler, and perhaps totally different, type of test may be used to find out what individual learners have learned.

Many of Watson’s, Skinner’s and Pavlov’s ideas have been, at least in part, superseded. Nevertheless, just as Edison and Bell made contributions without which our modern computers could not exist, so theories about learning, absolutely fundamental to e-learning, are ignored at our peril.
