Evaluation of training delivery pt2: What to do next


Sean Errington of People Projects concludes his look at the best ways to evaluate that most elusive of metrics, trainer performance.

Observation methodology

Let us also consider how observation can help to collectively develop an organisation's training capability. In organisations with only a small number of trainers based in the same place, there is often a natural process of discussing training practice. Where there are large numbers of trainers, or where trainers are dispersed geographically, it is less likely that this sharing will take place. Observation provides a process for - and can be a very effective way of - gathering examples of good and better training practice. Once identified, these examples can then be shared across the training workforce.
It is likely that many managers have considered observation and been put off by the apparent complexity of the task. Most people would agree that training is a complex activity, so it is reasonable to expect that observing training to determine its effectiveness is also difficult to do well. The question is whether the products of observation justify the effort required to manage the complexity. We will now explore this complexity.
Observation is particularly complex where it is an exercise in looking at how well a tutor delivers, rather than simply recording whether competencies were demonstrated or not. The latter approach is perfectly adequate where the objective is simply to determine whether a given standard is being achieved. It does not assist an organisation which is pursuing an 'excellence' approach. This takes us into the debate about whether an organisation wants competent or inspiring trainers and training. A competency approach will not measure inspiration. Inevitably, where an observer must make judgements about degrees of effectiveness, the task is more challenging. Observer training which focuses on evidence collection enables observers to make such judgements accurately and with confidence.
Another dimension to this complexity is the extent to which an observer must be a 'subject expert'. Not all managers of trainers will have knowledge of the subjects the trainers deliver. If managers are undertaking observations, can they have credibility without the same or a higher level of subject knowledge? This is not a straightforward question. Practice in the public education and training sector suggests that the observer must be able to understand what the participants will learn, in order to judge the extent to which they are learning in a session. This does not require them to be a subject expert.
That said, this may limit the topics or subjects a manager could credibly observe. However, tutors often teach a range of topics, and managers can call upon other colleagues or external observers to meet this challenge. Observation credibility is primarily achieved by observers accurately identifying what tutors do well and less well, and having effective methods to communicate this to tutors.
A particular problem that can arise in poorly designed observation processes is what can be called 'personal preference'. This is where observers make judgements about what the trainer is doing well or better, the aspects of the session that were adequate, and any aspects that were unsatisfactory, based upon what they like. The observer is then judging the trainer's effectiveness simply in relation to the types of training they prefer to experience as a learner, or the techniques they prefer to use as a trainer. You might suggest that this is a reasonable application of experience. Consider, however, what the fundamental purpose of a training session is.
I am sure we can agree that it is for learning to take place - specifically, that participants learn at least what it was intended they should learn. If this is true, then the effectiveness of any part of a session, or the delivery of a given session topic, must be judged in relation to the extent to which the delivery enabled the intended learning. Effective process design and effective observer training prevent personal preference from distorting these judgements.
Observation has to have a measurement focus, otherwise the process does not contribute to the corporate evaluation of training effectiveness. It must also have a development dynamic if it is to help individuals improve their performance. If tutors do not perceive the process to be about their development, they will be wary, defensive, and quite possibly hostile towards it. This can be avoided by ensuring that the process plays an important part in supporting their professional development. The post-observation dialogue between observer and trainer, which should be a genuine exploration of the observed practice, also plays a significant part in shaping trainer perceptions. If the experience delivers value to trainers - which it is more likely to do if a coaching style is used to manage the dialogue - then most trainers are likely to embrace the process.

Round Up

The proposition presented at the outset was that observation of trainer performance adds a significant dimension to, and complements and enhances, established evaluation processes. It has been claimed that it does this in two respects: first, it provides information about the effectiveness of training delivery that other evaluation techniques do not; second, it enables better informed performance management of trainers. However, only you can decide whether the case has been effectively made. If you feel it has, then perhaps it is time to evaluate how observation of trainer performance can be implemented in your organisation.

Sean is a passionate educator and advanced skills trainer delivering training organisation improvement training. Sean has worked at all levels in public education from primary schools to universities, and is involved in the inspection of publicly-funded learning. He also works with organisations as diverse as Hanson Aggregates and the Football Association.
