What do you do when the boss doesn’t want feedback data?
Yes, you read that right. A stakeholder who doesn’t want to know how effective the training is, how worthwhile it is, or even whether people like it. Weird, yes, but it happened.
“Wow! I can’t believe how far we’ve come. I almost don’t recognise the new hires’ induction training.”
We had just finished presenting the internal pilot feedback on our brand new, blended induction training programme, transformed from a once drab and boring tick-box exercise into an engaging learning experience rooted in practical application. The feedback from our willing pilot participants was amazing: five out of five across the board. We were ecstatic.
This was six months of hard work spurred on by damning feedback on our current programme and a mandate from the head of operations to do something, anything, to stop us looking so woefully behind the times.
We don’t need feedback data!
“The Learning Platform has been loaded with new modules and the managers have been briefed. There’s just one more thing before we go live. Are you happy to use the pilot feedback questions for the learner feedback survey or would you like to change them?” I asked confidently.
“Oh no, don’t worry about those, we don’t need them.”
My confidence turned to confusion.
“Finding out what real learners who are new to the business think about the programme can be really insightful. It can show us how it impacted their onboarding and their view of the company. It can give us some great material to help attract the best talent. And it can help us focus on the right things when we review the content in six months.” Like any seasoned learning professional, I know my feedback benefits like the back of my hand.
“No,” came the answer, “no, we’re not doing surveys, they put people off.” My confidence returned; I now had an idea of what the issue might be and had an answer ready to go.
“Don’t worry, we can consolidate the survey down into just a few questions. We’ll keep it entirely optional and it will only appear after the learner has completed the full programme. They won’t even know the survey exists before they’ve completed.”
“No.”
Well, that certainly knocked the wind out of my sails. But I wasn’t about to give up. I hadn’t spent six months creating digital escape rooms, whodunnit scenarios, burn-down-the-office games and even writing an entire module in verse just to let evaluation fall into the void. No way. But I had definitely lost the argument in this room.
Time to change tack
Fast forward two weeks and I’m talking to a cash flow analyst about some new training modules to help standardise the cash flow approach and documentation process across a number of sites. I ask about evaluation and feedback data.
“As long as it doesn’t affect completion, do what you like. Don’t expect any responses though. I know these guys, they won’t care.”
OK, not quite the level of enthusiasm I was hoping for, but I’ll take it! Given carte blanche to create my own evaluation questions and implement them across a whole department, I set to work crafting the most insightful questions I had ever written. A few weeks later we launched, and I waited.
The results were incredible. I asked the cash flow analyst for a meeting to discuss the feedback and he was astonished. I mean, who is so inspired by finance training that they take the time to fill in evaluation surveys – these people, apparently!
When asked if the participants would recommend the course to someone else in their role we received a resounding “Yes” along with great comments about how the bite-sized videos helped them to navigate the real-life system and the practical activities built their confidence.
Perfect. We even had an incredible example of insight. One region gave us low scores across the board, and comments along the lines of “we don’t do it like that around here”. Bingo! Whilst it might sound quite negative, it was exactly the kind of feedback I’d been hoping for.
The learners had revealed to us that they were using nonstandard processes and believed those processes to be correct. Therefore, they had scored our training poorly as they thought it was riddled with errors. Cue some intensive, on-site change management to get the team aligned with what should have been their standard way of working all along.
The moment of truth
Later that month I was presenting our team successes in front of our induction programme stakeholder. When it came to the cash flow modules I could demonstrate with clear data the amount of time learners spent on the programme, which sections were most effective, whether learners felt it was worth their time and, combined with data from finance, how much of an impact we’d made on their challenge of standardisation.
I also explained how the negative feedback data had shaped our approach to ongoing engagement and further training.
“That’s amazing!” exclaimed my stakeholder. “Now what I’d really like to know, is how do you get that kind of insight from our induction programme?”
Well, it’s funny you should ask…
NB: in case anyone is wondering, no, I never got to the bottom of why there was such resistance to feedback in the first place.