
Krystyna Gadd
Founder, How to Accelerate Learning

Soft skills: is it really that hard to measure their effectiveness?

Measuring soft skills may seem like an almost impossible feat, but by asking the right questions before any training takes place you can produce measurable outcomes to assess the impact of your learning solutions.

In the noughties I decided to move my career from IT training to soft skills training. In those days (as you may have read in my last article) I did not fully measure the effectiveness of the IT training I delivered, but since then the world of L&D has changed.

Studying for my Certificate in Training Practice and moving into ‘soft skills’ made me reconnect with my engineering brain, leading me to ask myself:

“Why don’t we know exactly what we want to get out of this?”

‘This’ could be many things: a leadership programme, customer service training, diversity training, a buddying scheme or coaching, to name but a few. The last one, coaching, is often seen as a particularly sticky learning solution to measure – I think most people would agree. What I would like to suggest, though, is that the ‘stickiness’ is more to do with our mindset than anything else.

Back in 2008 I won a ticket to attend the CIPD L&D conference and, having never attended before, I looked at the sessions on offer with relish and chose seminar A2 ‘Coaching with Impact’. There were a couple of speakers, but the one who stood out for me was Rick Woodward, Global L&D Director for Kimberly-Clark. Listening to him, I weirdly thought to myself:

“He thinks like I do”.

Looking at his biography I discovered he had also studied Chemical Engineering as I had done at Sheffield, but several years earlier.

What was it about his thinking, his mindset, that set him apart from the other speaker? For me it was his clarity and a refusal to be defeated in the face of so many saying it was impossible to measure the impact of coaching.

In engineering you would never embark on any project without first having a clear definition of what the outcomes might be. Neither would you complete a project without checking that you had achieved those outcomes. Very simple… no nonsense!

So Rick marched on and did it: he measured the impact of coaching

In his presentation he showed conclusively that coaching significantly improves performance and positively impacts business results.

He also showed that business improvements were four times as likely to happen when leaders mentor employees and support their coaching efforts.

This led me to ask: if Rick can do it for a global company that trained 800 of its team leaders in coaching skills, then why can’t we?

I truly believe it is all about mindset. We are told it is difficult and we believe it. So how do we make it easier?

Important questions you need to ask

My very good friend Kevin M. Yates has shared three key questions to ask that will help you unearth what to measure in a particular learning solution.

  1. What’s happening in the organisation?

  2. What is the organisation’s goal?

  3. What performance requirements are needed to achieve your organisation’s goal?

These questions might just be enough to help you on your way to discovering what to measure for soft skills training/learning. You might also want to use some of the questions from the HIRE model, which I developed to help in these circumstances.

The HIRE model

HI
  • Tell me about what is happening just now.
  • Tell me about what is not happening just now.
  • Why now?
  • Who is (not) involved?
  • Who should be involved?
  • How long has this been going on for?
  • Who are the key people?
  • What have you tried?
  • What are the underlying issues?
  • What might have triggered this/these?
  • What has gone well?
  • How did it happen?
  • Why did it happen?
  • What would you do to prevent this in the future?
  • What changes might have triggered this?
RE
  • What are the ramifications of not addressing these problems?
  • If you solve this what will you achieve?
  • If you do not solve this what will happen?
  • What is the worst that can happen?
  • What is the cost of not doing this?
  • What is the cost of doing this?
  • How is this impacting the stakeholders?
  • What are the implications to the wider organisation?
  • How do people feel about this?
  • What do you expect to get from this?
  • What are your priorities?
  • What will really make a difference?
  • Where do you want to be in six months?
  • What does ‘good’ look like?
  • What is the budget?
  • What are the constraints?
  • Is there anything else that we have not thought of?

By the end of any investigation, what you want to get out of it are some measurable outcomes. These need to be focused on the organisation and some aspect of performance that needs to be improved.

Here is an example on YouTube, explained simply. It is about setting objectives for a ‘Feedback Skills for Managers’ course.

Here is another example (taken from my soon-to-be-published book, ‘How not to Waste Your Money on Training’).


Source: Krystyna Gadd, ‘How not to Waste Your Money on Training’

Let’s look at this in more detail...

Aim

  • The customer experience needs to improve

This can be classed as an aim, as it is vague and woolly and difficult to measure. It is, however, useful as a guide towards reaching some achievable outcomes. Some of the specific measures that need to be improved by the organisation are:

Organisational outcomes

  • To reduce the number of complaints from 45 to 30 per month by 3Q

  • To increase the overall customer satisfaction score from 76% to 85% by 3Q

These are the organisational outcomes – and it does not take much effort to turn these into performance objectives for a team or individual (a rough sketch of how you might track progress against them follows the list below):

Performance objectives

  • To reduce the number of complaints per week from three to one by 3Q

  • To increase the customer satisfaction score from 80% to 88% by 3Q

  • To resolve 9/10 complaints without needing to escalate to a supervisor by 3Q
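
To make ‘measurable’ concrete, here is a minimal sketch (in Python, with entirely hypothetical field names and sample figures – nothing here comes from the example above beyond the three targets) of how a team’s tracked weekly figures could be compared against those Q3 performance objectives.

```python
from dataclasses import dataclass


@dataclass
class WeeklyFigures:
    complaints: int                    # complaints received this week
    satisfaction_pct: float            # customer satisfaction score for the week (%)
    resolved_without_escalation: int   # complaints resolved without a supervisor
    complaints_handled: int            # complaints handled in total this week


# Q3 targets lifted from the performance objectives listed above
TARGET_COMPLAINTS_PER_WEEK = 1
TARGET_SATISFACTION_PCT = 88.0
TARGET_RESOLUTION_RATE = 0.9  # 9 out of 10 resolved without escalation


def on_track(weeks: list[WeeklyFigures]) -> dict[str, bool]:
    """Compare the period's averaged figures against each Q3 target."""
    n = len(weeks)
    avg_complaints = sum(w.complaints for w in weeks) / n
    avg_satisfaction = sum(w.satisfaction_pct for w in weeks) / n
    resolution_rate = (
        sum(w.resolved_without_escalation for w in weeks)
        / sum(w.complaints_handled for w in weeks)
    )
    return {
        "complaints_per_week": avg_complaints <= TARGET_COMPLAINTS_PER_WEEK,
        "customer_satisfaction": avg_satisfaction >= TARGET_SATISFACTION_PCT,
        "resolved_without_escalation": resolution_rate >= TARGET_RESOLUTION_RATE,
    }


if __name__ == "__main__":
    sample_period = [
        WeeklyFigures(complaints=1, satisfaction_pct=89.0,
                      resolved_without_escalation=9, complaints_handled=10),
        WeeklyFigures(complaints=1, satisfaction_pct=88.0,
                      resolved_without_escalation=8, complaints_handled=10),
    ]
    print(on_track(sample_period))
    # e.g. {'complaints_per_week': True, 'customer_satisfaction': True,
    #       'resolved_without_escalation': False}
```

Run as a script, it simply reports which of the three targets the sample period met – the point being that once objectives are written this precisely, checking progress against them is trivial.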

If it is product knowledge and an inability to work on their own initiative that are the main barriers to performance, then reasonable learning outcomes would be:

Learning outcomes

  • Individually in a role play, be able to describe all the key aspects of our main products

  • Without reference to notes, individually identify the key areas of your customer interactions, from your job description, where you feel your confidence is below average

  • Using an action plan, individually describe what you need to do to improve your confidence in the areas listed in your action plan

The objectives are written using Robert Mager’s PCS framework, which makes writing performance and learning outcomes a doddle.

So what’s stopping you from measuring the impact of your soft skills training/learning? Is it all in your mindset?

Interested in finding out more on this topic? Read Kevin M. Yates' content series on how to become an L&D data detective.

 
