We run a mandatory introductory programme for new staff which gives them an overview of our organisation and also includes Manual Handling, Fire and Security, Infection Control etc.
At the moment participants give feedback on "happy sheets", but I am unsatisfied as to how useful these are. Please could anyone share with me some evaluation tools that all staff could use to give meaningful feedback on whether learning outcomes were met.
I look forward to hearing from you.
Alison Bayne
11 Responses
Evaluation Forms
I have two forms (each a single page) that you are welcome to use. One is for your evaluation of the class, and the other is for the participants' evaluation of the training.
Email me and I'll send them through to you.
Feedback on Assimilation
With any induction programme, I guess the overarching objective is to assimilate the new person into the organisation quickly and ensure they enjoy a safe and productive working environment. Without seeing the learning outcomes, it's difficult to comment on the type of evaluation: for some of those topics, aside from evaluating retention of the subject matter at some point in the future, it's only when an incident arises that you get to "evaluate" the true effectiveness of the training.
But I guess in some areas you may be able to seek the feedback of the new starter's line manager at some point after the programme and obtain their observations about how quickly and successfully the person settled into their role, how compliant they were in those regulated areas, and whether there were things they should have known, but didn't, that could be included in the programme.
Follow up
I guess there is a slight difference with mandatory courses in that the learner may not really want to learn, but you want and need them to, so evaluating whether learning goals have been met is difficult.
When I have been involved in setting up induction courses, I have gone back to the attendees after they have been in the company for a few months and asked them to reflect on the initial course: whether they thought the content was right, or whether additional things would have helped. I would actually tell them while they were attending the course that I was going to do this, so that hopefully they would keep it in mind.
I offer a “happy sheet” that seems to work and some wider ideas
Dear Alison
I have found three questions work well. These are:
“What was the most significant thing you learned?”
“What was good about the event?”
“How could it be better next time?”
The first question gives you some idea of what people take away from the training. The second gives you some appreciative feedback. Training is tough, so it is good to have something positive to take home. The third may give some useful ideas for improvement. These are much easier to hear when you are feeling good.
I agree with other correspondents that you need to ask people in the few weeks after the event for more considered feedback, and go back to the sponsor or manager to see if what should have happened has happened.
Appreciative Inquiry by Watkins and Mohr has an example of an appreciative evaluation of training. Essentially, they asked people to describe the best outcomes from the training and the conditions and learning that supported those outcomes. This then enabled the organisation to strengthen and develop what was already working. This seems to me to be a lot more fun than finding out what goes wrong and trying to fix it.
Best wishes
Nick Heap
nick.heap@ntlworld.com
End of course evaluation.
Alison
We had the same problem with several courses when we were taken on as members of a brand new QA Department at a national training organisation about two years ago. Our 'happy sheets' asked all sorts of questions, none of which fed back into the development of the courses.
Our response was to get together and work out what information would actually be useful, e.g. relevance of content, effectiveness of training and learning methodologies, currency of training equipment, etc. We then identified the most effective ways of gathering this information and arrived at a two-tiered approach:
(1) An end-of-course questionnaire completed anonymously by students. This asked whether or not we had met their needs in all areas that impacted their learning.
(2) Student group interviews conducted by a member of the QA Department. These were semi-structured interviews designed to explore any emergent issues.
We never used both methods for the same group of students. The information is then fed back to the Heads of all relevant departments.
In addition to the above, we conduct level 3 evaluations of our core courses, asking how often students have used what was taught on the course and, if so, how relevant it was to their workplace role.
Your question doesn’t say whether or not you work alone. I have the luxury of being one of three quality assurance staff who evaluate all aspects of training full time.
If it helps I can send you a copy of our end of course questionnaire and give examples of questions we use at level 3. I can be contacted on 01480 401850.
Cheers
Alex Paterson
Right Questions, right time…
I believe there are two issues embedded in your question.
Firstly, there is the issue of asking for the feedback that you need to help evaluate the success or otherwise of your programme. I'm not going to comment on this beyond saying that there ought to be transparency between the stated outcomes of the training and the evaluation questions.
Secondly, there is the question of when you ask for feedback. My view is that any decent trainer ought to be able to get good scores on happy sheets issued at the end of a course – they tell us more about the style and atmosphere generated than the real outcomes. I don't issue sheets at the end but hold them back for a week, so that participants complete them in the cold light of day.
Hope this helps
Geoff
evaluation or assessment?
Hi Alison
I think Tim Drewett has hit the nail on the head for me.
Seeking the future feedback of the line managers rather than the delegates will tell you whether the course has produced the appropriately skilled people that they need. This provides a realistically orientated evaluation tool, as opposed to an assessment tool of the opinions of the delegates. It also keeps your Training Objectives up to date and helps to get buy-in from the line managers.
Emotional response
Happy sheets are never the full picture; they always capture an emotional response when issued immediately after the course. Furthermore, there are some mandatory items that delegates may not enjoy learning, and they may 'mark down' the trainer as a consequence without being able to see the wider value of the learning.
Evaluation comes later. Have a look at some of the techniques, read some books and have a look on the search engine here and the Evaluation zone.
Induction
Immediately after the course I use a combination of a course evaluation (questions relating the material taught to the course aims) and a course critique (how it was presented – instructor, atmosphere etc). I then do a post-course evaluation by e-mail, using an electronic form (compiled using desktopauthor), 4 months after the course has ended. In addition, we contact managers with a post-course evaluation every 6 months to see if the course continues to meet the needs of the departments we teach. It amounts to a bit of work, but it keeps the courses relevant and keeps students and managers in the loop.
Simple but effective
Of course, there is a range of solutions, which can include immediate feedback as well as follow-up information on the ultimate effectiveness of the training.
In some situations, where immediate feedback is required, the two main problems are a) getting a reasonable number of forms filled in at all before the participants rush away (not so much of a problem on mandatory courses) and b) getting critical feedback (the "happy sheet" effect).
Over the years I have used a simple but effective form which has produced useful feedback. It usually gets filled in because it is short, confidential (no name is requested), and has questions written in plain English. It asks for a grading of the course on a 1-5 scale (Very Good, Good, Satisfactory, Unsatisfactory, Poor). Then it asks three simple questions: 1) What did you like best about the course? 2) What did you like least about the course? and 3) How could this course be improved in the future? Questions 2 and 3 encourage people to be critical and say what they would have liked, and the answers I get have been helpful both in confirming what works well and in coming up with ideas for improvements. It does not, of course, capture the effect of the training over time, but it does give a useful snapshot at the end of a course.
Structured debriefing as an evaluative tool
I find that structured debriefing, which uses sticky notes and prepared aims/objectives on a flip chart, is an attractive alternative. It allows everyone to have a say and to listen to other views too. Finally, it asks delegates to say what they have learnt and to give ideas for course improvement.
More info on http://www.structured-debriefing.co.uk