Following the roll-out of a new organisation-wide piece of training, I have been asked to organise an evaluation/review session for all the facilitators who delivered the course.
Any suggestions as to what questions I should be asking, other than the obvious 'what went well / what could be improved?', etc.? Should this be much different from the evaluative questions asked of the participants after the course?
Thanks
Louise
Louise Cole
13 Responses
Tutor Assessment
I have found that the following topics need to be addressed:
* Changes Needed to the Course
* Prior Learning (starting point)
* Future Learning Needs
* Remedial Teaching
Changes Needed to the Course are obvious and include changing the timetable, teaching methods, course sequence and integration between sessions.
Prior Learning raises the question of the participants' knowledge and skill levels, how these vary between people, and hence the appropriateness of the course. This is a more fundamental question than the first one – in other words, did the delegates attend the right course?
Future Learning Needs – presumably the course is one in a series – what are the learning paths of the delegates? This question is both for the trainers and the delegates as it links into lifelong learning.
Finally, there is the question of remedial teaching and reinforcement – how are you ensuring that delegates can and do use the course material? How do you protect and ensure your investment in training?
Regards
jeremyhall@simulations.co.uk
Self-praise is no praise at all!
Louise,
Even if the application forms have already given answers, I ask each participant what they hope to get from, and are willing to give to, the training.
I can then assess whether the individuals, the team or the organisation would most benefit from remedial, generative or evolutionary changes.
In reality, most people focus on and will settle for the merely remedial.
I encourage people to set their sights higher, to evolve into different kinds of organisations, teams, individuals!
At the end (and often in the middle) of each training day, delegates reflect on what they’ve got and given so far.
In relation to their desired outcomes, they identify:
* one key learning point
* one personal insight
* any disappointments
I ask myself:
* What do I like about what I’ve done so far?
* What did I learn from what we did together?
* What might I (need to) do differently to meet the needs and criteria of the delegates, the organisation, and my own values and desired outcomes?
My training allows me to be endlessly creative and flexible because I make it all up as I go along.
Working in the private or public sectors – be it working with teams in crisis, running Negotiation Skills or Clear and Critical Thinking for Decision Makers, teaching Presentation Skills for the FCO, Supervisory Management, Running Effective Meetings or whatever – I always work with the people in the room and what they bring.
I do not want people to fit into my agenda and, fortunately, I am competent at creating exercises on the spot based on the fears, frustrations, passionate hopes, deepest despair or highest aspirations of the delegates.
An important criterion for me is that, when people leave, they have some of the skills and the real will to think, feel and DO something different.
Lamentably, that is not what all (or even most) organisations want, and, all too often, it is not what the delegates want.
My approach enables me to assess whether people really learnt something useful, whether they are committed to using it, and in what areas they might need more direction, coaching or support.
Initially people find it difficult not to have a timetable, but if I argue my case effectively, and I usually do, they get a different perspective and, often, different paradigms for listening, leaning, looking and learning.
This, along with evaluation forms, provides abundant data for evaluating and reviewing the training.
Reviewing what?
Louise,
You don’t talk about the purpose of the review.
Is it to improve the course (to make sure it better meets business/delegate needs), to celebrate the end of a programme, to see if facilitators need help, or to learn how to develop/organise a more effective programme next time?
Questions will differ with each purpose. E.g. to see whether the course better meets business needs, you could present feedback from the business and ask “What could we change to address x issue?”
If it’s a celebration: “What did we most enjoy about the programme?” “What was the most significant moment for us?”
If you’re clear about the purpose, the questions will “drop out” easily and the attendees will find it a useful and enjoyable exercise.
Hope that helps a little.
Carole
Trainers’ Feedback
Ask if the learning objectives were appropriate to the audience and furthered the Agency Mission; did participants identify with the objectives, and were they clear; did trainers feel competent delivering this training; did the curriculum follow and fulfill the objectives? Was it consistent with the Agency Mission and culture?
If not, describe obstacles and ask for recommendations for changes.
Evaluation comes before training
Whatever questions you want to ask now should all have been asked before the training was designed – then the post-evaluation is a very easy job.
see Bitesize no. 4
Maintaining a ‘clear line of sight’ with organisational objectives
Louise,
During an international conflict in the 1990s, a spokesman for the UK’s Ministry of Defence stated that the MoD evaluated the effectiveness of their aerial attacks by asking three questions: Did the missile hit the intended target? Did the damage caused by the attack materially reduce the enemy’s functionality? Were the combined effects of the various strikes advancing the overall aims and objectives of the war? These criteria provide a metaphor for assessing the validity and effectiveness of targeted initiatives in other areas, not least organisations.
Applying this thinking to the business world provides a useful strategy for validating and evaluating development interventions and/or ‘service’ provision in general. The critical insight here is to recognise the importance of maintaining a clear line of sight between specific actions and the ultimate aims and objectives being pursued. I agree with Paul Kearns that evaluation comes before training. Too often, evaluation processes begin with the interventions themselves and work backwards, in an attempt to prove the business benefit; whereas the MoD story emphasises the need first to understand the context within which the specific actions are taking place. It is the context that validates the intervention, not the other way round.
I have an 8-point checklist, which identifies key issues that need to be taken into account if development activities are to be validated and evaluated effectively. It does, though, require you to think of evaluation in relation to the organisation’s objectives over time, rather than solely as a post-event assessment of participants’ perceptions. I’m happy to give you a copy, if you think it might be of value.
Overall, making sure that you are ‘doing the right things’, that is maintaining a clear line of sight between business performance and development interventions, is usually more critical to the long-term success of the organisation than ‘doing things right’. Doing the wrong things expertly well is a sure-fire way to destroy value and put the business at risk.
Regards,
Chris Rodgers
Pre & Post evaluation
Louise
Evaluation needs to start with the manager and learner before they attend an event. Having a pre/during/post evaluation system gives you valuable analysis which can be measured against the criteria set from the business needs identified. It also enables you to review the course content to ensure it is delivering what was required.
Post-evaluation should consist of “happy sheets” immediately after the event and a follow-up after a set period (three months) to identify how effective the knowledge transfer has been.
The questions for the facilitators would be similar to the questions for the learners, but from the other side, so to speak.
Regards
Loraine Sawyer
Training Advisor
Evaluate or validate?
Picking up on Paul Kearns’ comments, if you are after *evaluation* (as in VALUE), you should have started here way before doing the training, asking questions about which key performance indicators (e.g. profit, customer/employee satisfaction, waste, quality) need to change, by how much, and why. This gives you a baseline to compare with after the event, and a clear link to business performance. You are now looking at evaluating the net value added (benefits less costs) to the organisation.
If you are interested in whether the right training objectives were delivered, whether they were delivered effectively, and in feedback from the trainers themselves, I’d suggest you are looking at *validation* (as in VALID).
Happy sheets have a role to play, but not in evaluation. They tend to be easy to administer and analyse, but tell you little or nothing about the value added to the organisation. There is a perception that finding out the real value added (in cash) is significantly more difficult and costly to do, especially for the soft skills or behavioural stuff. It isn’t, and I’d be happy to send interested parties a note on how I do this.
Martin Schmalenbach
martin@p-nrg.com
I find involving all in a 360-degree evaluation helpful.
For many years we just did post-training/event evaluations, but this did not move our learning on and I am not sure that it was that helpful to participants. We work primarily in the not-for-profit sector, where organisations also have their own evaluative methods – mostly simple one-page question sheets.
We now ask a series of questions of the commissioning organisation and, if possible, participants. These comments/requirements feed into the trainers’ briefs. At the start of any training session, all participants are asked their expectations and exactly what they want out of it – trainers carry out the same exercise. We get participants to complete a rather long evaluation, plus trainers do a post-event report. All these are analysed and sent with a summary to the commissioning organisation. We recontact the commissioning organisation and a random sample of participants after three months to see what outcomes, particularly soft outcomes and movement, have been achieved.
Chris Richards
trainer evaluation
Hi Louise,
I agree with the comments posted so far – we normally do either a face-to-face or telephone debrief with the trainers after each event, and ask them things like what went well and whether the event met the objectives from their perspective.
We also ask the trainers how effectively we set up the day and if there was anything that we, the Company, could do differently. This has been the most useful question, as the trainers and facilitators commented on things like the participants’ preparation for training events, and as a result we have taken care to factor that into the training process.
Good luck with your project.
Vicki Edgington
Begin with the End in Mind (Stephen Covey)
Louise
I agree with most of the comments others have provided. In my view, those of Paul Kearns and Chris Rodgers are the most relevant to evaluation in general terms. You do, however, ask about an evaluation/review for the facilitators who delivered the course. To my mind these are the least important people to ask as far as evaluating the effectiveness of training is concerned, as they will almost certainly have a vested interest in providing positive feedback. I notice that you don’t mention the participants’ line managers in terms of getting evaluation information. These are the people who would (should) most likely notice a difference in work performance as a result of learning, and this is crucial evaluation information.
I regret that if you didn’t think of the evaluation questions that you needed to ask when you were planning the course, you have probably missed the opportunity – hence the quotation I provide at the head of these comments.
Regards
Chris Cordery
Aurora Training & Development Services Ltd
peer evaluation also
The one thing I would add: do you have a mechanism to help the facilitators coach each other?
If the roll-out is finished, fine, but if it is in progress, it can be good to have facilitators working in pairs (or observing each other) and then providing structured feedback and discussion to improve the delivery of the training…
Sali
evaluating outside the box
Amongst this sound advice, what seems to be missing is anything creative, radical or outside the box.
It depends whether you just want to tidy up round the edges or use the combined wisdom and creativity of these facilitators as a springboard for radical change and spectacular results.