How to Establish Standards in Training Evaluation Using Kirkpatrick's Model

1st question: How have you established your standards for participants' training satisfaction at Kirkpatrick's Level 1? We use a 5-point scale and have set 4 as the success threshold, but management is now asking whether we should set our standard higher. Thoughts?
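
For illustration only, here is a small Python sketch (the ratings are invented, not our actual survey data) showing how the same set of Level 1 responses looks against the current standard of 4 versus a stricter one:

    # Hypothetical Level 1 satisfaction ratings on a 5-point scale (invented data)
    ratings = [5, 4, 4, 3, 5, 4, 2, 5, 4, 4]

    average = sum(ratings) / len(ratings)
    print(f"Average satisfaction: {average:.2f} / 5")

    # Compare the current standard (4) with a stricter one (5)
    for threshold in (4, 5):
        met = sum(1 for r in ratings if r >= threshold)
        print(f"Responses at or above {threshold}: {met / len(ratings):.0%}")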
2nd question: We also use Kirkpatrick's Level 2 to evaluate the knowledge gain of our participants.
Here is one of our dilemmas:
When pre-test results are high because participants already had some prior knowledge, there is sometimes little room between the pre and post results and we obtain a low knowledge gain. Is there a way to present a low knowledge gain in that situation and convince management that it is not negative (i.e. not a sign of a bad course)? Could we do that by presenting the correlation between the pre results and the knowledge gained, and arguing that a significant positive correlation would mean the course seems to favour participants who have more prior knowledge of the subject, as judged by the pre score?
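
For illustration, the check could be computed along the lines of the Python sketch below (the scores are invented and the scipy library is assumed to be available; this is only a sketch, not our actual analysis):

    # Invented pre/post scores out of 10 -- not real course data
    from scipy.stats import pearsonr

    pre  = [8, 9, 7, 6, 9, 5, 8, 7]
    post = [9, 9, 9, 9, 10, 8, 9, 9]
    gain = [b - a for a, b in zip(pre, post)]   # knowledge gain per participant

    # Pearson correlation between prior knowledge (pre score) and gain
    r, p_value = pearsonr(pre, gain)
    print(f"Correlation between pre score and gain: r = {r:.2f}, p = {p_value:.3f}")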
 Thank you for your help,
Carole Morin
 

4 Responses

  1. Answer to Question 2

    Hi Carole

    On your second question, can I ask what you use to capture Level 2 data?

    We use an online testing application across all our technical training courses. If we feel that our audience already has a basic level of knowledge before attending, we either hike the content up a notch or raise the level of the questions we test them on afterwards.

    It’s a simple solution, but it works for us.  I hope this helps.

  2. Thanks for the answer

    Hi Steve,

     

    Thank you very much for answering and for sending the businessballs website. I apologize for not having answered sooner. I already knew about their material, but for other readers it may certainly be a good starting point. We have been inspired by the way the World Bank assesses its courses at Level 2. We use pre and post multiple-choice questionnaires based on the knowledge objectives (10 questions). The pre and post questions are matched and reflect the same knowledge content. We calculate the difference between the pre and post scores and present the results in percentage points.
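
    As a small worked example of that calculation (the scores below are invented, not actual course results), in Python:

        # Invented scores, just to show the arithmetic we report
        questions = 10
        pre_correct, post_correct = 7, 9           # 7/10 before, 9/10 after

        pre_pct  = pre_correct / questions * 100   # 70%
        post_pct = post_correct / questions * 100  # 90%
        gain_pp  = post_pct - pre_pct              # 20 percentage points

        print(f"Pre: {pre_pct:.0f}%, Post: {post_pct:.0f}%, Gain: {gain_pp:.0f} percentage points")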

    Very recently, we got in touch with Jim Kirkpatrick (Donald Kirkpatrick's son) and he told us that there are no standards for what counts as a good knowledge gain. He said that our analysis of the pre and post results is good because we use it to detect where there may have been mishaps or a lack of focus on certain elements during delivery. So by using this kind of analysis to improve the course we are on the right path, but presenting the results to management, who are eager for numbers only, is too risky, and this way of doing Level 2 is not advised for reporting to management.

     

    Jim Kirkpatrick advises using a retrospective pre and post instead. The idea is to send questions after the course: based on the course objectives, they ask participants, on a five-point scale, where they stood in their understanding of specific knowledge objectives before the course and where they stand now. The results of these retrospective pre and post questionnaires are the best reporting for management when they want numbers and are not really concerned with improving the course.
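
    To make that concrete, here is a rough Python sketch of how such retrospective ratings could be summarised (the ratings are invented, purely to show the shape of the reporting):

        # Invented self-ratings on a 5-point scale, both collected after the course
        then_ratings = [2, 3, 2, 1, 3, 2]   # "where I stood before the course"
        now_ratings  = [4, 4, 5, 3, 5, 4]   # "where I stand now"

        mean_then = sum(then_ratings) / len(then_ratings)
        mean_now = sum(now_ratings) / len(now_ratings)

        print(f"Mean 'then' rating: {mean_then:.1f} / 5")
        print(f"Mean 'now' rating:  {mean_now:.1f} / 5")
        print(f"Average shift:      {mean_now - mean_then:+.1f} points")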

    Thanks again,

    Carole

  3. How we capture level 2’s

    Hi,

     

    We use paper-based pre and post multiple-choice questionnaires based on the knowledge objectives (10 questions). The pre and post questions are matched and reflect the same knowledge content. We calculate the difference between the pre and post scores and present the results in percentage points.

    According to Jim Kirkpatrick (Donald Kirkpatrick's son), there are no standards for what counts as a good knowledge gain. He said that our analysis of the pre and post results is good because we use it to detect where there may have been mishaps or a lack of focus on certain elements during delivery. So by using the analysis and results to improve the course we are on the right path, but presenting the results to management, who are eager for numbers only, is too risky, and this way of doing Level 2 is not advised for reporting to management.

    Apparently it is best, when reporting to management, to use a retrospective pre and post. The idea is to send questions after the course: based on the course objectives, they ask, on a five-point scale, where you stood in your understanding of specific knowledge objectives before the course and where you stand now.

    Carole
