Robin Hoyle

Head of Learning Innovation at Huthwaite International

Why are we still ignoring the data?

With an abundance of data and metrics to help evaluate learning in organisations, how can L&D truly understand it and actually improve learning outcomes?

Learning metrics overload

You’ve heard about learning culture, I guess. But how that seemingly simple phrase translates into something you can touch, feel, monitor and support is tricky, isn’t it?

The challenge is that ‘learning culture’ means different things to different people. Its imprecision as a phrase makes it pretty much useless, but it is still bandied around in books, conference presentations and self-help articles for the lonely learning leader.

Defining the undefinable

In order to think more effectively about learning culture, we need some kind of handle on what it might actually mean. While trying not to define it (I’ve done my fair share of fog knitting, thanks), there do seem to be some commonalities in what this term describes, such as:

  • Individual responsibility for managing learning – employees who ‘pull’ their engagement in learning rather than having to be directed.
  • A focus on continued development and continuous performance improvement by employees.
  • A support structure, including line managers, which encompasses a focus on individual and team development, coaching and discussions about current skill gaps and future aspirations.

I don’t think this is an exhaustive list, nor do I expect everyone reading this to agree with it (which rather emphasises my point about ‘learning culture’ being an L&D concept likely to trigger eye-rolling and despair elsewhere in our organisations).

Increasingly, I see learning analytics, and the data generated through rapid and extensive adoption of digital learning activities and resources, being touted as a route to supporting ‘learning culture’. But is this true? As the pandemic has seen an increase in digital access to courses, resources and modules, the amount of data generated has increased exponentially. 

But because we can count it, should we? Does it give us any information we can use? Does it – importantly – provide management information which can support moves towards a ‘learning culture’? Not if we think about data in the way we have previously utilised information from our digital platforms. 

Lose the LMS mindset

The first challenge here is what I call the ‘Learning Management System (LMS) Mindset’. We all know what an LMS is. It exists to monitor completion of tasks, success in tests and compliance. These are the modules which no one really wants to do but, for reasons of health and safety or regulatory compliance, we have to have some evidence that our people have been told what not to do.

The data from the LMS has traditionally been used to prompt people to complete their modules – swiftly followed by directives to line managers, league tables of completion statistics and, finally, stern words in private meetings. Nudging rapidly gives way to nagging, and the integrity and usefulness of the L&D offer are undermined by compliance policing.

However we explain it away, the reliance on these simple measures of conformity has had a negative effect on how L&D, and especially digital programmes, platforms and resources, have been viewed by the remainder of the enterprise.

Getting beneath the numbers

Now we have a world of many and varied digital learning resources – from user-generated videos to podcasts, and work-based task briefs to ‘how to’ guides. There are also generic modules, bought from a library of similar offers, to create a critical mass for our fancy new learning experience platform (LXP).

The LXPs (especially those which describe themselves as ‘The Netflix of Learning’ – urgh!) generally offer a kind of click-through measurement, derived from the marketing industry, which tells us how many of our colleagues have engaged with that video, how long they watched for and which other links they followed. There are a couple of issues here.

Marketing data about how many eyeballs were focused on that particular digital module tells us nothing except that the title was interesting. We don’t know whether it was any good or whether any meaningful action followed - we simply know how many people clicked the intriguing image we presented them with.

When I was judging some awards a couple of years ago, the entrants were asked to provide evidence of impact. Most of the impact measures relating to digital resources were focused exclusively on numbers of downloads and ‘engagement’. No performance measures were provided linking completion to business impact.

Those with the LMS mindset still want compliance – they want everyone to complete something – regardless of whether what we are offering, and measuring, addresses the development needs of those who are expected to complete it.

It seems to me that these two issues are significantly hampering our approach to learning data and, by extension, our drive to build a learning culture, however it is defined. So, what do we do about it?

1. Before collecting data, define what you want to do with it

The minimum requirement for learning data is that we can correlate use of (digital) learning resources with improvements in performance. 
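
What might that look like in practice? Below is a minimal sketch, assuming a hypothetical data export with one row per employee, a 0/1 flag for completion of a given resource, and a performance metric captured before and after. The file name and column names are illustrative, not taken from any particular platform.

    # Minimal sketch: does completing a resource correlate with a change
    # in performance? Assumes a hypothetical CSV export with columns:
    #   employee_id, completed_resource (0/1), perf_before, perf_after
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("learning_export.csv")  # hypothetical export file

    # The outcome we care about is the change in performance, not clicks.
    df["perf_change"] = df["perf_after"] - df["perf_before"]

    # Point-biserial correlation: binary predictor vs continuous outcome.
    r, p = stats.pointbiserialr(df["completed_resource"], df["perf_change"])
    print(f"correlation r={r:.2f}, p-value={p:.3f}")

Correlation is not causation, of course – a promising result here is a prompt to investigate further (and, per point three below, a story worth telling), not proof of impact. But even this crude check forces us to decide, before collecting anything, which performance measure the data is supposed to relate to.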

2. Focus on outcomes not inputs

What should matter to the L&D team is what matters to the rest of the organisation – performance metrics which show improvement, however that is defined. Whether we talk about improved business results (usually financial), better outcomes for service users or patients, or greater efficiency (essentially doing more with the same resources), we need to know which L&D inputs support the achievement of the goals and strategies of the organisation.

3. Share success stories and reasons to engage

If we can correlate completion of a programme or engagement with a suite of learning resources with success in terms the organisation understands, then we should encourage further involvement in those resources and activities by telling those stories.

Measure what matters

What we should guard against is the tendency to fall back on the out-of-date, retrograde actions associated with the LMS mindset – checking up on people and driving compliance and completion, not because it is effective but simply because we said so. Compulsion and persuasion are different things.

Whenever I see data being discussed in L&D teams, I ask myself a question: ‘Are we measuring what matters, or merely counting things which are easily quantifiable?’ Whenever we focus on data, we should count what matters, and invest resources in making sure the data gathered is robustly related to our key purpose – the continuing and continual improvement of the performance of people and their teams.

One Response

  1. Well said Robin.
    The end game is desirable behaviour change and that is only loosely correlated with bums on learning seats.
    And I do like the alliteration in ‘lonely learning leader’ 🙂
    Cheers, Paul
