
Seb Anthony



Measuring the impact of soft skills training for managers


OK - your company has seen the light and decided to invest in soft skills and development training for your much-maligned and long-ignored managers. Great - but how do you measure its impact? How can you get a good measure of the ROI?

Is it the role of training & development to develop (or purchase) and administer performance assessments? Or should access be provided to the performance measurements already in place and administered by HR?

What is your experience? What would you advise?

Thanks!
Mike Kelliher

12 Responses

  1. Measuring Management Development effectiveness
    Hi Mike
    The easiest way to evaluate is to revisit the TNA that identified the need. If the need is no longer a problem, you have successfully invested in the individual. That is, you can show a behavioural return for your investment.

    Now, ROI is a different thing – when identifying the need, it needs to be valued. What is the value of the activity to the business? What would it cost to hire a person with the skills the individual concerned is missing? (A simplistic approach.)
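
    The simplistic valuation above can be sketched as a calculation. This is a minimal illustration of the idea, not a prescribed method, and all figures are hypothetical placeholders.

    ```python
    # A minimal sketch: compare the cost of training an existing manager
    # against the cost of hiring the missing skills in. All figures are
    # hypothetical placeholders.

    def simple_roi(benefit: float, cost: float) -> float:
        """Return on investment as a fraction: (benefit - cost) / cost."""
        return (benefit - cost) / cost

    # Hypothetical: training costs 2,000; hiring a manager who already has
    # the skills would cost an extra 10,000 in salary and recruitment.
    training_cost = 2_000.0
    avoided_hiring_cost = 10_000.0

    roi = simple_roi(avoided_hiring_cost, training_cost)
    print(f"ROI: {roi:.0%}")  # (10000 - 2000) / 2000 = 400%
    ```

    The point is only that the need must be given a value before the training happens; without that, there is nothing to compare the cost against.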

    Is it the role of T&D to administer performance assessments – well that depends on the culture you have or want to develop. Personally I would say no. It should be a line management function – supported by T&D experts/ advisors. Who owns the budget? Who should own the budget?

    It just depends what your brief is and what the culture of the organisation is and requires.

    If SMART objectives are being used throughout the process (at all stages) the evaluation is often easier.

    Mike
    http://www.rapidbi.com
    http://www.pairoftrainers.co.uk

  2. Not asking much !
    The topics of ROI and measuring management development have, I seem to remember, been covered in the past few months, so do a few searches. For me it’s more about what the business drivers are and what results are expected to be impacted. Even more so given the implication that the decision to invest here may be more whimsical than goal-directed. If it is, you won’t get commitment to the amount of time needed to get quality ROI numbers.

    Having said that, it IS your job, as you have most to lose. Look at what HR & business measures are available to you that are related to the business goals. Agree which should be impacted and to what extent it would be attributable to training. Ensure you can translate measures (eg sickness levels) into ROI. Gain agreement to new measures where there are gaps.

    Don’t just focus on ROI, because there are always people who will be sceptical about your ability to attribute success. In addition, get your senior managers to look for evidence of behavioural change. If they see it for themselves, ROI may be less important (except to the beancounters). Use your competence framework, if you have one, within a 360 feedback process to measure before and after.

  3. Hard or Soft?
    This is my view Mike: “OK – your company has seen the light and decided to invest in soft skills and development training ~” I am not certain what ‘the light’ is in this instance. Perceiving that training managers is a good thing is probably correct, but that view should not necessarily lead to training being delivered, and not necessarily ‘soft skills’ training. As with any other group within an organisation, training should not immediately follow a decision that training is a good idea; that decision should prompt an analysis of where organisational demands and requirements exist and where management responses to those demands perhaps fall short. This is a task often supported, aided or even driven by the HR department; it should be a field of their expertise and competence.

    From this should be identified the key performance areas (KPAs) where improvements and developments are important. Against these KPAs, managers (not HR) should be able to define key performance indicators (KPIs): specific, traceable and financially grounded. KPIs will be about deliverable factors such as absence, return rates, errors, downtime and performance failures. (Feeding into these might be ‘soft skills’ which will support their resolution or achievement, but the KPIs will not in themselves define soft skills.)

    Based upon these performance shortfalls the HR department could then define a response; it could be training, or it could be something else. But if the end result is training, and that training is delivered, and God forbid some of it is ‘soft skills’ based, then the results will be traceable against a financial metric. You would only necessarily undertake an ROI analysis if the training were not compulsory, i.e. driven by law or sector-specific requirements. Who undertakes this ROI? It should be the managers, with aid and guidance from the HR department; both need to know the results.

    In my view there are no such things as soft skills; they are all hard and should deliver hard results. Soft skills are often only called soft skills because the trainer or consultant has not got a grip of where the skills will and should produce hard returns. Defining ‘soft skills’ as financially untraceable in the workplace is pure BS, but bear in mind it may not always be appropriate to conduct an ROI for reasons already outlined. A Value For Money (VFM) exercise, however, is still justifiable.

  4. VFM ?
    Garry, could you explain what you mean by a value for money exercise in a training context? Is it different to an expected ROI?

  5. VFM
    A VFM exercise is an analysis and comparison of the training delivered against comparable alternative training methods and the work-based outcomes achieved, i.e. could we achieve the same results with an alternative training strategy for less money? Three metrics are therefore compared: cost, benefits and methods.

  6. Simple measures
    Hi Mike

    I don’t know if you have an Employee Satisfaction Survey in your company, but that’s one great way to measure any improvement in the soft skills of managers.
    As you say, the other way is to make sure there are some measurable objectives in their performance review and to get access to that (rather than start a new process from L&D).
    Another thing we do is send out questionnaires to all delegates two months after a course asking if it has made them more effective, which gives some basic metrics per course. Could also do that with their manager.

    Cheers
    Chris

  7. RoI? Probably not.
    I’m conscious that some contributors are the same as those who normally offer an ROI take whenever ‘evaluation’ raises its head; however, at the risk of opening the can again:

    In my view, ROI would be a dangerous path to follow for the type of training you mention. The main difficulty is that its methodology can always be questioned by those who may challenge the need for a programme – particularly if their pet project can demonstrate ‘better’ ROI using the sort of project appraisal techniques most accountants understand.

    Far better, in my view, to return to the original TNA (or other driver) and identify behaviours that needed to change, and why. Impact can then be measured by assessing the degree to which this has happened, and the (often subjective) improvement in departmental outcomes. To be done properly this requires a before-and-after measurement and a benchmark or control group, and is quite difficult for soft skills! However, most sponsors are happy with subjective (but independent) data such as 360 feedback, senior managers’ views or even the participants’ own take, so long as this is gathered after they have had a chance to apply their learning back at work (I suggest 2-3 months as a minimum).

    Ultimately, like so many other decisions, the investment in a learning programme will often be best justified using a fair degree of trust rather than pound signs. This is entirely consistent with most organisations’ needs.

    Dave

  8. ROI on training, and its measurement
    If you start from the premise that:
    1. the goal of training is to equip individuals with knowledge and skills that will raise performance; and
    2. that evidence of change comes only through observed, changed behaviours,
    then the evaluation of the impact and benefit of training is likely to be through two routes.

    a) Anecdotal evidence of the trainee’s performance before and after the training. Is s/he doing things differently (and better), in ways we can reasonably infer – but cannot prove – are a result of the training intervention? Or,

    b) A more structured (but nevertheless subjective) approach to measuring pre- and post-training, using multi-source feedback (e.g. ‘360’), which is perhaps a little more effective as a diagnostic tool at the front end (i.e. you define the behaviours & knowledge that will deliver the business results you want, so you can target your training at the area that needs ‘fixing’).

    The difficulty is that it is very hard to attribute ‘cause and effect’ to training interventions, as there are so many variables at work. An obvious one is the Hawthorne effect.

    And what happens when the employee returns to work? Is there a de-brief? Is the boss supportive and interested? Is there an opportunity to try out the newly-acquired know-how? Was the timing of the training right? Is there a supportive environment in which to try out and practise the new skills? What are the peer pressures to conform to ‘old practices’? What’s the workgroup culture like?

    Brinkerhoff and Apking (High Impact Learning, Perseus, 2001) conclude: “Almost all organisational training is a marginal intervention and has only slight effects on performance improvement.” Further: “If we define ‘training impact’ as simply the transfer of knowledge and skills to on-the-job performance, research indicates that impact of training is realised only for about 15 percent of all training participants.” They aren’t saying that training never transfers to on-the-job performance. There will always be self-starters and lifelong learners who believe in what they’ve learned and persist in spite of barriers to change. But these exceptions can’t deliver the return on investment that business executives are looking for.

    This all sounds depressing, but you can make training more effective and deliver a better return on your investment if you take an active approach as an HR/L&D function:
    – undertake a TNA with the senior team
    – get their buy in to the development intervention (including resources)
    – measure behaviours, both before and after
    – provide support for the ‘trainees’
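
    The “measure behaviours, both before and after” step can be sketched simply: average 360 ratings per targeted behaviour, taken pre- and post-training, then compare. Behaviour names and ratings below are hypothetical.

    ```python
    # A sketch of before/after behaviour measurement: compare averaged 360
    # ratings per behaviour, pre- and post-training. All values hypothetical.

    pre  = {"listening": 2.8, "delegation": 3.1, "feedback": 2.5}
    post = {"listening": 3.9, "delegation": 3.4, "feedback": 3.6}

    changes = {b: round(post[b] - pre[b], 1) for b in pre}
    for behaviour, change in changes.items():
        print(f"{behaviour}: {change:+.1f}")
    ```

    As noted above, a positive shift still cannot prove cause and effect, but agreed before/after measures at least make the conversation about impact concrete.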

    If you’d like an interesting article on “a reinforcement-based approach to learning and development that achieves permanent, measurable changes in behaviour”, drop me an e-mail at [email protected] and I’ll send you the article.

    Harvey

  9. Measuring What?
    Concerning soft skills, there are two measurements involved just as there are with quality.

    First, there is measurement of the manager’s compliance with using soft skills. This requires precisely defining those skills just as one would define quality standards.

    Second, there is measuring the effect of the result of the use of soft skills or the use of quality standards. This would mean measuring productivity for soft skills or increased sales and consumer satisfaction for quality.

    I think that we know how to make the second measurements, but that we don’t have a defined set of measurable soft skills. As a starting point, I would recommend the following test.

    This is a simple test of 10 questions. Rank a manager on a scale of 1 to 10, 10 being the best or almost always, 1 being the worst or almost never. Add up the points for each question.

    If the score is close to 100, I would expect that employees will be over 3 times more productive than if the score was 30 or less. In addition, with a score close to 100 employees will unleash their full potential creativity and innovation, love to come to work and have very high morale. 🙂

    DOES THE MANAGER

    -provide regular and frequent opportunities for employees to voice complaints, suggestions and questions, provide reasonable and timely responses, and give employees what they say they need to do a better job? (At least weekly?)

    -elicit answers/responses from the team and get them to use their brainpower to solve problems?

    -listen to employees with 100% attention without distraction, without trying to figure out a response and with the use of follow-up questions to obtain missing details and suggested fixes?

    -refrain from giving orders, since by their nature they are demeaning and disrespectful and destroy innovation and commitment?

    -treat members better, in terms of humility, respect, timely and high-quality responses, forthrightness, trust, admission of error, etc., than they are expected to treat customers and each other?

    -publicly recognize employees for their contributions and high performance and never take credit him/herself?

    -openly provide all company info to employees to the extent they need/desire?

    -use values and high standards to explain why certain actions are better than others?

    -use smiles and good humor with subordinates, not frowns or a blank face?

    -generate in employees a sense of ownership?
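
    The scoring scheme above – ten questions, each rated 1 to 10, summed into a score out of 100 – can be sketched as follows. The example ratings are hypothetical.

    ```python
    # A sketch of Ben's scoring scheme: ten questions, each rated 1-10,
    # summed into a score out of 100. The example ratings are hypothetical.

    def manager_score(ratings: list) -> int:
        """Sum ten 1-10 ratings into a score out of 100."""
        assert len(ratings) == 10 and all(1 <= r <= 10 for r in ratings)
        return sum(ratings)

    ratings = [8, 7, 9, 6, 8, 7, 9, 8, 7, 8]  # one rating per question above
    print(manager_score(ratings))  # 77
    ```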

    Hope this helps, Ben
    Author “Leading People to be Highly Motivated and Committed”
    http://www.bensimonton.com

  10. measuring impact of soft skills training for managers
    Here’s a different spin. I am writing not from the perspective of a training professional, but as someone who writes training award submissions for a living.

    I want to add to David Scott, Peter D and Mike Morrison’s responses where they say it’s key to refer back to the TNA.

    Judges of training awards love seeing impact from management training proven beyond doubt. They are bang on when they say that it is not about blindly doing training and hoping for a positive impact; it is about defining UP FRONT what success looks like, from both a business and an individual performance improvement perspective. And I completely agree with the comments earlier that 360s and staff satisfaction surveys are great here, but only if they are agreed as measures up front.

    This might all sound blindingly obvious but is extremely rare from my experience. So, if you do have a training programme with SMART objectives at business and individual level, and you have used the measures agreed to prove you achieved your objectives, then job done, no more proof is needed (and you should be entering for an award).

    Chris Robinson

    http://www.boost-marketing.co.uk

  11. ROI’s, VFM’s and how to waste more time/money
    Thank you Chris for injecting some good old common sense into this debate.

    Measuring the impact of soft skills training really is as easy as Chris says. Trouble is, common sense is not always common practice.

    And yes Gary P, soft skills do exist and can be measured. In my experience those who resist the idea of soft skills and their importance in the workplace are those people who actively demonstrate their own lack of them. Instead, they may rely on their grasp and use of hard skills relevant to their job, their power base and authority levels to push themselves and others to perform. Measure the “push” style in the faces, demeanour and behaviour of those around them. Not an exact measurement science but still a powerful one.

    Provided a soft skills training initiative is appropriately implemented and measured afterwards, then a “pull” style should be seen and be measurable; 360s, satisfaction surveys etc. plus the “feel-good” factor should be adequate.

    In my opinion ROI’s, VFM’s and the like are overrated and mainly serve the personal needs of the people who instigate them: keeps ’em busy and employed.

    Incidentally, has anybody ever done an ROI on an ROI exercise? If not, why not? Equally, if VFMs analyse and compare alternative training methods and outcomes, does that mean that for a truly measurable comparison you have to actually implement and compare all of the alternatives? If you don’t, then surely you are only measuring an analysis against itself, unproven.

    Measuring the value of soft skills in the workplace is about more than trying to shoehorn them into financial measurements and returns. Ask people who are impacted by them questions around feelings, emotions and behaviours. Now try attaching pound notes to those answers! TNA still wins the day for me.

    Grrrr.

  12. Common Sense ~ If Only
    I am not quite certain how to read Ray Loftus’s input on this discussion, as it is unclear to me whether he has misunderstood my posting or is just being obtuse. When I said soft skills do not exist, I went on to state clearly that soft skills have hard effects and, as Mr Loftus has so clearly confirmed here, can be tracked and measured. So soft skills are actually hard inputs, hence not that soft.

    It also appears to me that Mr Loftus does not share my experience or grasp of how to undertake an ROI or VFM analysis. It is not, as he seems to imply, a large undertaking that requires significant amounts of time nor indeed massive calculations. They are both relatively straightforward exercises which, if the TNA has been done properly in the first place, require little subsequent follow-up.

    Mr Loftus also asks: “Equally, if VFM’s analyse and compare alternate training methods and outcomes then does it mean that for a truly measureable(sic) comparison you have to actually implement and compare all of the alternatives?”

    The answer is no, it doesn’t, and from this comment I infer that Mr Loftus doesn’t understand what a VFM analysis is: it is a comparative analysis of potential developmental strategies and their projected outputs, to determine which one might be worthy of consideration and subsequent implementation.

    I should also like to comment on Harvey Bennett’s remark that: “The difficulty is that it is very difficult to attribute ‘cause and effect’ to training interventions, as there are so many variables at work. An obvious one is the Hawthorne effect.”

    ROI does not determine what it is within a developmental intervention that has led to better outcomes; it simply determines from a financial model whether it has had an impact or not. Personally I am not concerned about what effect the phase of the moon has had on the work-based results; common sense should prevail and act as your guide here, and it works when you employ it.

    Alternatively you end up with this sort of rubbish:

    https://www.trainingzone.co.uk/cgi-bin/item.cgi?id=162846&d=680&h=608&f=626&dateformat=%25e-%25h-%25y
