In part four of a six-part series, Trish Uhl argues that learning and development professionals today are awash with data – the challenge, however, is how to access and use it effectively.
We talk a lot about the discipline of workplace learning analytics – but what about the data that fuels it?
Collecting the right data at the right time from the right people is critical to assessing and evaluating learning and talent development programmes, as well as to the efficacy of the L&D function.
We’re no longer restricted to capturing and analysing data from training-related systems, thanks to technological advances like new devices, ambient data sources, and AI-driven analytics platforms.
We can now link learning environments directly to operational work environments. Upstream, downstream – we can measure everything.
We don’t need everything, however, and while our first impulse is typically to design a way to generate the data we do need, this is no longer the best way to start.
So, where do we start? Not with data, but with questions.
Asking effective questions
Training effectiveness can only be measured in terms of the positive people impact and/or business value delivered.
This requires an understanding of which measure(s) the business uses to determine whether training is successful. Answer three questions:
- What is the enterprise goal?
- What business result(s) and people impact lead to this organisational outcome?
- What factors contribute to this business result and people impact?
Then establish what analyses the business needs. Determine whether there is a need for:
- Operational efficiencies, e.g. making a function or operation more efficient.
- Growth initiatives, e.g. increasing sales or launching revenue-generating projects.
- Compliance reporting, i.e. keeping track of required training.
- A better workplace for employees, e.g. wellness programmes, diversity initiatives, or recognition and rewards.
After this, select the appropriate data and data sources necessary for measuring against these key performance indicators (KPIs).
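As a rough sketch of that selection step – where every KPI name and source system below is an illustrative assumption, not a prescribed set – the mapping might look something like this in Python:

```python
# Hypothetical mapping of analysis needs to KPIs and candidate data sources.
# Every name below is an illustrative assumption, not a standard.
kpi_map = {
    "operational efficiency": {
        "kpis": ["average handle time", "first-contact resolution rate"],
        "sources": ["call centre platform", "workforce management system"],
    },
    "growth": {
        "kpis": ["revenue per rep", "lead conversion rate"],
        "sources": ["CRM", "sales pipeline reports"],
    },
    "compliance": {
        "kpis": ["required-training completion rate", "time to certification"],
        "sources": ["LMS records", "HRIS"],
    },
    "employee experience": {
        "kpis": ["engagement score", "voluntary attrition rate"],
        "sources": ["pulse surveys", "HRIS"],
    },
}

def data_needed(analysis_need: str) -> list[str]:
    """Return the candidate data sources for a given analysis need."""
    return kpi_map[analysis_need]["sources"]

print(data_needed("compliance"))  # ['LMS records', 'HRIS']
```

The point is the discipline, not the code: every KPI earns its place by tracing back to a named business need, and every data source earns its place by feeding a KPI.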
Using data to improve operations: how people work
Former CLO David Vance, in his work at the Center for Talent Reporting, cites two different types of KPIs – those we monitor and those L&D can actively manage.
To illustrate this point, learning executive Alwyn Klein and I often refer to our shared experiences working in emergency medicine, where there’s a need to distinguish between ‘vital signs’ that we monitor (e.g. pulse rate) and those that we measure in order to directly manage or influence (e.g. activity or exertion).
Even a medical professional can’t directly affect another person’s pulse, but they can recommend interventions that contribute to the desired outcome.
In L&D, KPIs are our ‘vital signs’. Some KPIs are outcomes we monitor, and others are drivers we can measure and manage.
Distinguishing between the two helps us focus on where to target our investments for the most impact at the lowest cost.
Determining which KPI(s) to monitor and which to measure, when, how often and to what end(s), tells us what data we need.
Here’s an example of how learning analytics can link a key performance driver with a key performance outcome in service to an enterprise operational goal.
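As a purely hypothetical illustration – the figures and column names are invented – suppose the enterprise goal is to reduce order-entry errors, the outcome KPI we monitor is the weekly error rate, and the driver we manage is completion of practice modules:

```python
# Invented data: weekly practice-module completions (driver we manage)
# alongside the order-entry error rate (outcome we monitor).
import pandas as pd

df = pd.DataFrame({
    "week": range(1, 9),
    "modules_completed": [12, 18, 25, 31, 40, 44, 52, 60],
    "error_rate_pct": [6.1, 5.8, 5.2, 4.9, 4.1, 3.8, 3.5, 3.1],
})

# A simple correlation is a first look at the linkage, not proof of causation.
r = df["modules_completed"].corr(df["error_rate_pct"])
print(f"Driver-outcome correlation: r = {r:.2f}")  # strongly negative here
```

Even a toy example like this shows the shape of the linkage: manage the driver, monitor the outcome, and report both against the operational goal.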
Workplace learning analytics becomes the ‘glue’ bonding tactical and strategic objectives and providing visibility into downstream outcomes.
This in turn makes it possible to gauge progress against a goal the business cares about, in real time – or as near to it as possible.
This is how we can now instrument our learning solutions and link them to the people and the business in their operational (work) environments.
“Tracking progress is akin to flying an airplane; it requires proper instrumentation and environmental data,” remarked Gene Pease and Caroline Brant in a recent blog.
“Otherwise, L&D is simply ‘flying blind’ and is at risk for making costly mistakes.”
How far behind the curve am I?
At this point, you’re probably not. Most organisations are using data in the most fundamental ways today, as input into descriptive and inferential analyses to generate reports.
This is changing rapidly, however – according to research from i4cp and the ROI Institute, many anticipate deploying more sophisticated mechanisms within the next 12 months.
Experimental design is not the standard yet, but it will be soon.
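For a sense of what that shift looks like, here is a bare-bones sketch of an experimental comparison – the scores and group sizes are invented, and in practice you would randomise assignment and control for confounders:

```python
# Invented assessment scores for a randomly assigned trained group vs control.
from scipy import stats

trained = [78, 82, 75, 90, 85, 88, 79, 84]
control = [70, 74, 72, 80, 69, 76, 73, 71]

# Two-sample t-test: is the difference in means likely to be chance alone?
t_stat, p_value = stats.ttest_ind(trained, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```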
Where does data come from and how do I prepare to use it?
The first answer is easy: data come from everywhere. The second answer is: that’s the wrong question.
We naturally want to start with what we know - collect data, clean data, compile reports, etc.
Instead, we need to focus on the hard work of clarifying our questions and defining our terminal and enabling metrics.
Discard any notions about cleaning the data first, before deciding which to use and how to use it. Instead, clean data as needed and stay focused on how to use it.
As your workplace learning analytics practice achieves higher levels of maturity, you will reach a point of diminishing returns on ad hoc data cleaning.
At that point it will be time to invest in tools for comprehensive, ongoing clean-up and a workbench of data already normalised for use.
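A minimal sketch of ‘clean as needed’, assuming a hypothetical LMS export with duplicate rows, a missing date, and stray whitespace – we tidy only the fields the current question touches:

```python
import pandas as pd

# Hypothetical messy LMS export: duplicated rows, a missing completion date,
# and scores exported as strings with stray whitespace.
raw = pd.DataFrame({
    "learner_id": [101, 102, 102, 103],
    "completed_on": ["2023-01-05", "2023-01-09", "2023-01-09", None],
    "score": [" 88", "91", "91", "76 "],
})

def clean_for_completion_question(df: pd.DataFrame) -> pd.DataFrame:
    """Tidy only what this question needs: who completed, when, how well."""
    out = df.drop_duplicates()                 # duplicate export rows
    out = out.dropna(subset=["completed_on"])  # this question needs dates
    out["completed_on"] = pd.to_datetime(out["completed_on"])
    out["score"] = out["score"].str.strip().astype(int)
    return out

print(clean_for_completion_question(raw))
```

Other columns stay dirty until a question actually needs them; the comprehensive, always-clean workbench comes later, once the volume of ad hoc clean-up justifies the investment.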
Who’s in charge?
You are! With data available from so many different sources, it’s critical to establish trust and transparency around accessing and using it.
Formulate clear, rational policies and get stakeholders on board early.
Making sure all sponsors and stakeholders are aligned on ethical practice and the rules of engagement from the beginning helps to prevent people from sabotaging a programme after significant investments have been made.
Recommendations for ethical standards in people analytics are summarised by global expert David Green in the figure below.
Finally, consider your employee value proposition (EVP).
“An EVP is a crucial component of a fully humanised people analytics practice,” says Lexy Martin, principal of research and customer value at Visier.
“Rather than having the organisation merely interpret data about an individual, a fully humanised people analytics practice gives a worker the tools to interpret and analyse their own employee data to provide explicit value to them”.
When all is said and done, we’re using data analytics not to do something to people but, rather, to do something with and for people – to help them improve performance and materially move the business forward, working in service to people, business results and organisational outcomes.
Interested in this topic? Read Data-driven L&D: how learning analytics can enhance the employee experience.