I opened up the participation report for our finance graduate programme and my blood ran cold. A quarter of a million had been spent on this 12-month programme, and at the halfway point we were looking at less than 50% participation per month. That meant around 50% of the graduates weren’t participating at all. I scanned the report to see if I’d made a mistake – nope, no such luck. On the Teams call, the finance director and the talent director were waiting for me to share my screen and show how the programme was going. “Sorry, there’s a glitch with the report, I’ll need to run it again later and send it on,” I lied. I knew I couldn’t share this data right now; I needed a plan. The instant the call ended I rang the talent director.
“Emma, hi, we have a problem.”
I shared the report on screen and her face dropped. “Well, to be honest, I’m not surprised. We’ve been telling them for years that they need to improve their training materials, but they just won’t give us the time of day,” she responded.
It was true that there was a significant history between the finance department and the learning and development team. The finance training materials were old: large walls of text set against complex diagrams on overcrowded slide decks, either presented in person by the regional finance teams or delivered by a finance department lead via webinar. There were no polls, no group activities, no breakout rooms; just plain old slides with someone talking over them. Not surprising, then, that people were losing interest as the programme progressed. Or were they?
When your gut betrays you, data can save you
I couldn’t help but notice that some sessions were better attended than others. Plus, there was no data at all for the in-person regional sessions. Nor were we seeing a consistent downward trend: month one was circa 60% participation, month two 70%, month three 40%. Even the onboarding call had only 75% attendance – onboarding!
Suddenly I realised what we were looking at. If only three-quarters of our participants attended the onboarding, how did the others know what was involved in the programme and what was expected of them? In fact, some people hadn’t participated in the programme at all, so did they even know the programme had started?
Drawing up the battle plan
The data we had was good, but we only had a partial picture. We needed more data points to better understand the participant experience and why we were having engagement issues. We decided to look at:
- Who was supporting the participants in region?
- When were the onboarding comms sent out?
- How were we communicating the monthly sessions?
- When were the regional in-person sessions happening and how were we recording participation?
And of course, for good measure, we sent out a survey to gather feedback on the speakers, presentations and content.
Divide and conquer was the plan. Emma and I parted ways with our respective tasks, setting up a meeting with the finance director in three days’ time to discuss our findings.
Unravelling the data and addressing the elephant in the room
The meeting was awkward, politically sensitive and fraught with carefully worded questions tip-toeing around the humongous elephant in the room – why had no one been managing the programme for six months?
Our new data told us a lot. A litany of errors throughout the onboarding process meant the business was completely unaware of what the programme entailed. Some participants didn’t know they were part of the programme (which explained a lot), and some sessions were communicated less than 24 hours before they began. No wonder people weren’t attending.
We also saw a diverse range of responses relating to the regional in-person sessions. Some regions were working really well, some were falling behind and some had decided not to run the sessions at all. There was, however, one unifying detail: they had never been asked to record attendance, so they hadn’t.
The big surprise
While our new data told us a lot, the most important thing it revealed was that we were wrong. The content was not the problem.
- Yes, the slides were ugly
- Yes, the style was chalk and talk
- Yes, we could think of a million and one ways to improve the sessions
But the graduates who attended the sessions praised them highly. They appreciated how knowledgeable the presenters were. They took screenshots of the complex slides and annotated them. And they didn’t want polls or breakout rooms; they valued using that time to absorb the content so they could ask questions at the end.
A close call
We finished up the meeting with the finance director, agreeing on a series of next steps designed to correct our communications and re-engage with both the business and participants. But one thing we weren’t changing was the content. That was a close call.
In our first conversation, the talent director was ready to rip up all the slide decks and start again. In our opinion as learning professionals, there was a lot of room for improvement. But for the finance graduates it was perfectly acceptable. Had we pushed ahead without seeking out more information, we would have been another quarter of a million in the hole with no tangible improvements to speak of – try selling that one to the board.
A final word
The gut feeling you develop as you progress in your career is invaluable. Your instincts are built up over years of experience, situations you’ve worked through and challenges you’ve overcome. But without seeking to validate your reflex response you can easily come undone.
Sometimes your gut is right, and sometimes it’s full of sh**.