Hi
I have recently started working in a company of approximately 700 staff, based across 4 UK sites, and am the only official trainer. My responsibilities include the delivery of training for 3 different applications and support for a few more. I have 2 colleagues who provide training and support for other applications.
When I joined, training delivery wasn't being evaluated. I am trying to encourage feedback from delegates who have attended drop-in (face-to-face but not scheduled), Skype and scheduled face-to-face training sessions. We are able to evaluate the effectiveness of the training based on questions received by SMEs and IT tickets raised due to errors, which is all useful data; however, we are struggling to get Level 1 feedback from delegates unless we hand them a paper form. We are fast becoming a paperless environment, so forms will be phased out for our face-to-face sessions, and they aren't practical for our Skype sessions anyway.
We have tried individual emails with evaluation forms attached (no more than 6 questions), surveys and an SME forum for feedback. Nothing seems to work. The main reasons given are that people don't have time to complete the forms or that they receive too many emails.
It is mainly down to the culture change; we have invested a fair amount of time engaging and communicating with staff about the reasons for evaluating training and how it benefits them.
How have you reached out to your audience? Have you faced a similar challenge, and how did you overcome it?
Any suggestions would be greatly appreciated.
L
6 Responses
A couple of ideas. Break the learning down into learning points and ask people before AND after to rate their current knowledge for each on a scale of 1 to 10 – clearly the ratings should have increased afterwards. Secondly, Level 1 doesn't have to happen after the session; it can happen during it. So rather than make it a giant chore afterwards, break it down and do it during the session. You can do it anecdotally and video it if you want to. Also, have a short session at the end, get anecdotal feedback and video that. Don't forget Level 1 is about how much they learned, not about what they thought of the trainer, the room, etc.
Thanks for the response, Clive.
Our Level 1 covers how timely the training was, how relevant it is to their role, the standard of the materials, etc. The only question specific to the trainer is from a knowledge perspective, i.e. whether the right person was delivering the training.
I’ll try contacting people after the Skype/drop-in sessions and recording the feedback – thanks for the suggestion.
We use A5 introduction cards with specific learning points listed and space for delegates to add any personal learning they wish to achieve.
These are filled in at the start of the session, collected by the trainer immediately and handed back out at the end. The delegates give marks out of 10 for each learning point they feel they have achieved.
There is also a 0-to-10 expertise line, completed at the start and at the end, to show the journey.
Delegates will never have time to fill in forms. They will have time if you phone them or go round to see them and ask a few basic questions about the training they received. You can also encourage them to give you the ‘nitty gritty’ detail you need for incremental improvement.
Why not do this with a sample, say 1 in 10 of your delegates?
I use a highly effective list of questions that elicits specific improvement ideas from participants and also asks them to think about their own contribution. It can be issued on the day (face to face), as an email or as an online survey. If you would like a copy of the questions, please contact me at nick.hindley@marhsalladg.com. Cheers, Nick
Over the last few years, I’ve whittled the Level 1 questions down to just the NPS (Net Promoter Score) question, with a comments box asking people what would need to change to improve on the score they recorded. People will only comment on what they really thought needed changing, rather than being faced with questions asking them to rate this and that, about which they may have no strong opinion. With Level 1 taken care of by just one question, the remaining two or three questions can all focus on Level 3 and 4 predictions.