ICT Training Feedback Questionnaire

Hi,

Does anyone have any good examples of ICT training feedback questionnaires that they wouldn't mind forwarding to me? I have just come back from a trip to our South African office, where I delivered ICT training, and I would like to gauge what they thought of it. I am trying to draft something myself, but wondered if anyone had any examples they wouldn't mind sharing? My e-mail is nwykes@brunswickgroup.com.

Thanks very much!

Nat
Natalia Wykes

4 Responses

  1. ICT – is it all the same?
    Hi Nat
    You may need to be more specific – the feedback form for a Word course will look very different from one for a bespoke application.

    Indeed, the types of things you need to measure may well be different – unless you want to go the very generic (meaningless) route of:
    pace, content, style, objectives met etc. – each rated on a scale – but don’t expect to be able to do anything with this – it is just ‘feel good’ data.
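
    If you do go down that route, summarising the sheets is trivial – purely as an illustration (the question names and scores below are invented), it amounts to averaging each item:

        # Rough sketch only: averaging 1-5 ratings from generic 'feel good' sheets.
        # The question names and scores are invented for illustration.
        sheets = [
            {"pace": 4, "content": 5, "style": 4, "objectives_met": 3},
            {"pace": 3, "content": 4, "style": 5, "objectives_met": 4},
        ]
        for question in sheets[0]:
            average = sum(sheet[question] for sheet in sheets) / len(sheets)
            print(f"{question}: {average:.1f} / 5")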

    There are generic forms like this on http://www.businessballs.com and http://www.trainerbase.co.uk

    Mike
    http://www.rapidbi.com

  2. Learning Management Analyst – Aimeric
    Hi Nat,

    The design of your evaluation sheet will depend on whether you are asking the questions immediately post-training or at a later stage. I have some suggestions for either stage, so drop me an email or give me a ring if it would be easier to talk directly.
    Susan
    info@aimeric.co.uk
    07930 183920

  3. Plan evaluation from the outset (part 2)
    (Part 2)

    I’d like to suggest that you consult one of the many excellent articles and books that set out in more detail how to approach the evaluation of your own training at these four levels.

    Each successive level provides a more precise measurement of the effectiveness of your training, but at the same time requires more time and effort on your side.

    As for timing:

    Level 1 evaluations usually take place immediately after training; otherwise time will blur learners’ subjective impressions.

    Level 2 assessment can be formative — i.e. ongoing and taking place during the training, to give you feedback as you go — or summative, i.e. done at the end, to give you a final measure of learning, possibly against a pre-test that you conducted before training started. Level 2 evaluation is often done via pen-and-paper tests, practical evaluations, or a host of other methods. Unless you conduct level 2 assessment of the learners, you will have no idea whether they actually learned anything as a result of your training.
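
    As a simple illustration of the pre-test/post-test comparison (the learners and scores here are entirely invented), the calculation is nothing more than:

        # Rough sketch only: level 2 learning gain from invented pre- and post-test scores (%).
        pre_scores = {"Ann": 40, "Ben": 55, "Cara": 35}
        post_scores = {"Ann": 75, "Ben": 80, "Cara": 70}
        for learner, before in pre_scores.items():
            after = post_scores[learner]
            print(f"{learner}: {before}% -> {after}% (a gain of {after - before} points)")
        average_gain = sum(post_scores[n] - pre_scores[n] for n in pre_scores) / len(pre_scores)
        print(f"Average gain: {average_gain:.1f} points")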

    Level 3 can be evaluated using behavioural checklists (based on task analysis) or other forms of assessment, usually in the workplace, and usually at some interval (e.g. one month, then possibly again at three months) after training was concluded.

    Level 4 is essentially a value analysis, aimed at determining ROI on a training investment. In practice, this step is often skipped, as it requires a careful estimation of all the costs and benefits associated with a training program. (Determining this kind of result in financial terms is hard, and linking it directly to training is even harder.)
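
    To illustrate the arithmetic only (the figures below are invented, and estimating them is the hard part), a level 4 ROI calculation boils down to something like:

        # Rough sketch only: ROI on a training investment, using invented figures.
        training_costs = 5000.0      # delivery, travel, learners' time away from work, etc.
        estimated_benefits = 8000.0  # e.g. value of time saved or errors avoided over a year
        roi_percent = (estimated_benefits - training_costs) / training_costs * 100
        print(f"ROI: {roi_percent:.0f}%")  # (8000 - 5000) / 5000 * 100 = 60%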

    I’d like to suggest that you always, without exception, evaluate at levels 1 and 2, and that you increasingly aim to build level 3 into your ADDIE design from the outset, while trying to make the business managers see the sense of doing so.

    When approached in this way, evaluation is not an afterthought but becomes an integral and essential component of your training, providing you with vital information on how effective your training efforts are.

    All the best
    Johan

  4. Plan evaluation from the outset (part 1)
    (Part 1)

    Hi Nat

    As Susan and Mike have pointed out, your approach to evaluation would depend on questions such as:

    1) Is the evaluation taking place immediately after the training, or at a later time?

    2) What was the nature of the content (bespoke, generic, etc.)?

    Here are some more thoughts to consider:

    Your aim of assessing what learners thought of the training corresponds to level one of the Kirkpatrick four-level model for evaluating training effectiveness — this aims to answer the question: “Did they like it?”

    This kind of assessment will provide you with useful information, but the other three levels are where you will gain really valuable feedback, which will enable you to improve your future training sessions.

    These four levels are:
    1) Level 1: Did they like it? (often measured with “smiley sheets”, available in template form from many Web sources)
    2) Level 2: Did they learn it?
    3) Level 3: Did they apply it?
    4) Level 4: Did performance improve?

    (… continued)