According to the Cheshire Cat in 'Alice in Wonderland', "if you do not know where you are going, it does not matter which way you go". Mike Morrison says the same can be said for training - but it's time to stop meandering in training wonderland, he says, and put real evaluation on the map.
We often talk about evaluation and strategic impact, but are we taking the appropriate first steps? Often we know, or at least have a good idea of, where we want to get to, but as the old saying goes: 'If I were going there, I would not start the journey from here'.
That is a very logical, left-brain reply, even if it is of little immediate practical value to the questioner. If you do not know where you are going, you are not likely to get there. As the Cheshire Cat said to Alice, "if you do not know where you are going, it does not matter which way you go". Equally, if you do not know where you are, it is difficult to know which way to start travelling.
It is an important step to know where you are and where you intend to end up before starting the journey. Is this why all tourist maps have a 'you are here' marker?
As individuals it is great to focus on the journey. However, as a business we need to focus on the destination. It is not about one or the other but both.
Where are we now and where do we want to be?
This is a simple but fundamental step in any intervention, at all levels of our respective organisations. Yet I wonder to what extent we really do it. Where is the 'you are here' marker in our organisations? Sure, some of us have tools like customer satisfaction and staff engagement data (as well as the basic business financial measures), but that is only a small part of the picture – like seeing only the places north of you on the tourist map and ignoring east, west and south. In organisations, the full picture is the holistic and strategic data.
Do you know where you are?
In the 2007 survey 'Develop the Developer' (by Morrison & Ritchie), respondents gave the following answers about their development activities:
Use of diagnostic approaches in development...
Always (8%) Usually (33%) Sometimes (46%) Rarely (10%) Never (4%)
Use of evaluation approaches in development...
Always (37%) Usually (43%) Sometimes (15%) Rarely (2%) Never (2%)
This to me highlights why much of what we do in organisational development (OD) and human resource development (HRD) fails on a regular basis to make the desired (and recognised) strategic impact.
We have read many threads on community forums such as TrainingZone.co.uk about the difficulties of evaluating training and development activity. For example, how to calculate a return on investment (ROI) or show value for money (VFM) is a commonly recurring theme.
How can we ever hope to evaluate any intervention effectively if we do not know where we started from? We will only know this by taking the same measures at the beginning as we intend to use for measuring success at the end. In finance we do it: we look at the financial position (profit, turnover etc.), we set a plan to achieve it and then we measure again after an agreed period of time. In medicine, before a person starts treatment we take some measures – pulse, respiration, blood pressure etc. – and we measure again after (and often during) any treatment. Why in our profession do we not do the same? Often we do, for things like retention, sickness and attendance – but not for the more strategic and integrated elements.
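The before-and-after principle, and the ROI calculation mentioned earlier, can be sketched in a few lines. This is a minimal illustration only: the measure names and all figures are hypothetical, and a real diagnostic would of course use your organisation's own baseline data.

```python
# Illustrative sketch (hypothetical data): evaluate an intervention by
# comparing the SAME diagnostic measures before and after, then apply
# the standard ROI formula to the costed benefit.

baseline = {"staff_engagement": 61, "customer_satisfaction": 72, "absence_days": 9.4}
follow_up = {"staff_engagement": 68, "customer_satisfaction": 75, "absence_days": 7.1}

def change_report(before, after):
    """Change in each measure since the baseline diagnostic."""
    return {measure: round(after[measure] - before[measure], 2) for measure in before}

def roi_percent(benefit, cost):
    """Standard ROI formula: net benefit as a percentage of cost."""
    return (benefit - cost) / cost * 100

print(change_report(baseline, follow_up))          # movement on each measure
print(roi_percent(benefit=50_000, cost=20_000))    # 150.0
```

The point is not the arithmetic, which is trivial, but that the evaluation is only possible because the baseline was captured with the same measures before the intervention began.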
What is a diagnostic process?
Often it is simpler than it sounds. A diagnostic is a tool that identifies 'where you are now' – the dot or arrow on the map, if you like. Tools like SWOT (evaluating strengths, weaknesses, opportunities and threats) and PESTLE (looking at political, economic, social, technological, legal and environmental factors) are fine to start with, but they are often not used as effectively (or as broadly) as they were originally designed to be.
Diagnostic tools that only look at the area of the business you are interested in, like culture surveys, have their place, but how do you know that culture is the issue? Where is the diagnosis to show that a specific tool like a culture survey is the right one? There may be a need with a higher priority. A doctor wouldn't send you for a special test or scan until they have undertaken a general diagnosis. In training and organisational development (OD) we need to do the same. We need to use holistic diagnostic tools to help us orientate to real needs – too often we react to the symptoms. In medicine it is easy to treat the cut to the hand from a fall, but if we miss the reason for the person falling – a minor stroke, say – the hand will get better while, in the meantime, the stroke does more damage in the short, medium and long term.
Making evaluation easier
The more robust the diagnostic process, the easier the evaluation. Some would argue that an evaluation is just a repeat of the diagnostic, but with a different emphasis on the results: the diagnostic is looking for an action plan, while the evaluation is looking for change since the last measure. So a regular, effective organisational diagnostic process not only evaluates previous actions; the same data can also be used (in association with a robust business plan) to identify immediate and future needs.
Insanity in our world?
As Albert Einstein is often credited with saying, insanity is doing the same thing over and over again and expecting different results. It can be a bit like watching a replay of a race and expecting someone else to win! Obvious when we think about it... but why do we do this with our business activity?
Looking back at the results of the 'Develop the Developer' survey, I wonder why so many interventions are evaluated while little or no formal diagnosis is undertaken at all. Then we (the profession) wonder why evaluation is so difficult and why we are criticised for not being strategic or – worse – for being the first to be cut in tough financial times.
Do we as professionals not learn? Why do we keep doing the same things (evaluation without diagnosis) and then wonder why we do not add as much value as we expect? Are we 'mad'... or maybe we are just reluctant learners?
The good news is that the results of the 'Develop the Developer' survey suggest that more of us will be doing diagnostics in advance of activity, although unfortunately that increase is only at the level of specific initiatives, not the organisation as a whole. I'm sure we will learn eventually!
Mike Morrison is director of RapidBI Ltd, a consultancy specialising in helping individuals and organisations improve their business performance through people and organisational effectiveness. Mike is also a founding member and director of TrainerBase. For more information go to www.rapidbi.com/bir.