Robin Hoyle is a writer, trainer and consultant. He is the author of Complete Training: From Recruitment to Retirement and Informal Learning in Organizations: How to Create a Continuous Learning Culture, both published by Kogan Page. Robin will be chairing the World of Learning Conference at the NEC on 19th and 20th October, 2016.
As we load up the hand cart and set off for an unknown destination, it can’t have escaped anyone’s attention that we are in a world where expertise is little valued.
When senior politicians announce that “we have had enough of experts” we can be sure that the stock of experience, knowledge and insight which has built towards whatever expertise we lay claim to is somewhat devalued.
Learning and development practitioners have long traded on their expertise.
In some cases they know how to do the role for which they now provide training. In others they have built a career on understanding how individuals learn and how organisations develop.
Others have developed their expertise in the use of a particular tool, software or process – adapting their instructional approach to match the specific function to which the tool is to be put.
But maybe all that expertise is no longer required or – even if more vital than ever – no longer valued.
Writing in the Harvard Business Review last year, Bill Fischer, Professor of Innovation Management at IMD, Lausanne, wrote: “expertise is losing the respect that for years had earned it premiums in any market where uncertainty was present and complex knowledge valued.”
Now I don’t think we can easily describe our current circumstances as certain and simple.
We are at a point in our working, cultural, economic and especially political lives where uncertainty has never been greater and complexity assured.
Despite this, individuals both in and out of their working lives seem happier to accept unfounded theories which accord with irrational biases over the tried and tested, the researched and referenced.
There are three forces at play here which I think we in L&D would do well to reflect on.
The ‘me-shaped’ world
The internet in general, and social media in particular, have created what the Open University referred to as a ‘me-shaped world’ in which filtered content based on past preferences and friendship groups emerges from the sea of information, opinion and conjecture which makes up the modern world of communications.
We are now so carefully targeted by algorithms and our own preferences and profiles that we only see a sliver of what is available and what may be possible.
I have described this in the past by drawing on the experience of Labour Party supporters in the UK general election in 2015.
Many of them were horrified by the eventual result because they never saw it coming – they had only ever shared stories, opinions and memes with like-minded individuals.
If everyone you know is voting one way, it can come as a shock that so many people you don’t know are voting differently. I hardly need to make the point about the seismic disbelief of many Remain voters on June 24th.
Being ‘me-shaped’, the internet world we inhabit is populated by our personal opinions and biases.
We seek out those things which confirm us in our beliefs, whether that is about the state of the economy in the event of our leaving the EU, climate change, or that the Royal family are actually descendants of a species of interplanetary lizards.
We believe things because everyone we connect with and everything we see and read reinforces those beliefs. Everything else is ‘other’ and, as the fallout of the EU referendum has shown, otherness is not always welcome.
Our networks at work, and the beliefs they support and engender, are no different.
Anti-intellectualism
I have seen this commented on repeatedly in recent times, including in Psychology Today.
To be described as academic is now, far from being a compliment, a form of abuse. It denotes being out of touch, ivory-tower dwelling and head in the clouds.
At its worst it signifies a belief that the learned think of themselves as a cut above and collect long words and impressive arrays of facts only so that they can somehow belittle others. Bizarrely, the geeky kid in glasses is suddenly cast as the class bully.
The challenge that anti-intellectualism creates is that received wisdoms go unchallenged and peer-reviewed proofs are easily dismissed. Where evidence suggests something more nuanced or complex than the pat answer and the appeal to common sense, we run the risk of becoming stupid.
The definition of stupidity? Do the same things and expect different results. Do we keep doing the same things and hope that things will turn out better next time? You bet.
A desire for the quick fix
Not everything is simple. Some things are more nuanced and difficult than the slogan, the hashtag or the 140 character summation.
Sometimes arriving at an answer to a question or problem requires us to step back, consider, reflect and unpack the circumstances that led up to where we are right now.
This is especially true when we are looking at performance improvement and changing behaviour.
To expect someone to abandon one way of doing things in order to embrace a new, uncomfortable and unfamiliar way of working is challenge enough.
To try to do so while arming ourselves with a couple of homilies and an action plan which fits on a t-shirt is asking for trouble.
What can we, as L&D folks, do about this state of affairs?
Do we abandon any sense of our own expertise or do we come out fighting?
I think there is a stealth approach which I commend to you. We need to work with those forces where we can and circumvent them where we must. Fighting these forces head-on seems to me to be bound to fail. As I said above, doing the same things and expecting different results is the definition of stupidity – we owe it to ourselves to try something different.
So, how do L&D teams respond to these drivers of irrational thought and resistance to change?
We need to embrace a Socratic approach
Socrates famously said that he was more interested in finding the right answer than in being right.
Adopting a sense of questioning humility can help to expose bias and beliefs based on shaky foundations.
I’ve rarely found anyone whose shaky beliefs rest on little or no understanding of their basis who can hold to them when asked a few politely probing questions.
However, beware the smug rant and the aggressive questioning. I can attest as someone who is occasionally prone to an “I do not believe it” rant that that approach rarely works.
We need to try and see things from the other’s perspective and acknowledge that their beliefs are sincere.
Tricky, as I know only too well, but I’ve seen others do it and it is remarkably effective. Giving ground and accepting the truth of some of the things that people tell us is part of the skill, and one that I hope one day to acquire.
Einstein is reputed to have said: “If you can’t explain something simply, you haven’t understood it well enough.”
(Please note: There is no documentary evidence he said it and versions of this same sentiment have been found which significantly pre-date Albert Einstein – just in case anyone accuses me of being anti-intellectual).
Despite the dodginess of the quote’s attribution, the idea behind it is an interesting one.
Being intelligent human beings employed in a branch of the education industry, we do have a responsibility to make sure that what we say is intelligible without being dumbed down to absurdity.
When we are framing complex and nuanced arguments we should recall the other Einstein-attributed aphorism: “Everything should be as simple as possible, but not simpler”.
(Which he also didn’t say in so few words but he came up with something pretty similar if more long winded in a lecture in 1933.)
Speedy v quick fixes
Finally we need to give people speedy fixes if not quick ones. Kathie Dannemiller popularised a tool called the Change Equation or Change Formula in the 1980s. It looks like this:
D x V x F > R
Where D = dissatisfaction, V = vision for the future and F = first steps. The product of these three factors should be greater than R, which represents resistance or the cost of change.
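To make the multiplication point concrete, here is a purely illustrative sketch in Python. The 0–10 scoring scale, the example values and the function name are my own assumptions for demonstration; Dannemiller’s formula is a qualitative heuristic, not a precise calculation.

def change_likely(dissatisfaction, vision, first_steps, resistance):
    # True if D x V x F outweighs R (resistance, or the cost of change).
    # Scores here are assumed to sit on an arbitrary 0-10 scale.
    return dissatisfaction * vision * first_steps > resistance

# Because the factors are multiplied, a zero anywhere stops the change:
print(change_likely(8, 7, 0, 50))   # False - no agreed first steps, nothing happens
print(change_likely(8, 7, 3, 50))   # True  - even modest first steps tip the balance

The point of the sketch is simply that the formula is multiplicative: if any one factor is missing, no amount of the others will overcome resistance.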
If you think about attempting to change a long-held belief, practice or behaviour by an individual, this equation has relevance.
We first have to help them understand and articulate any areas of dissatisfaction; we need to help them see a compelling vision which they want to be part of and we need to define the first steps with them – helping them on the journey, but not necessarily defining the whole route map.
Sometimes I think we confuse people by outlining too many future steps when only one or two are necessary. When confused, an individual retreats to the certainty of their own, comforting biases.
The idea of marginal gains
The other ‘quick fix’ antidote is the idea of ‘marginal gains’. This is shamelessly stolen from the work of Sir Dave Brailsford with the GB Cycling team, where he and his coaching staff focused on specific areas of advantage and took care of those to help improve performance or to reach performance potential.
In L&D terms this can mean observing individuals and finding the one or two small things which – if changed – would deliver significant results. This can be hard. It requires watching and listening to individuals and understanding their key performance issues.
It’s not exactly a quick fix, but it enables individuals to start to see benefits from even a minor change and, as we know, each journey starts with a single step.
I bet Albert Einstein said that as well.
He didn’t.
I know that.
Only kidding.
5 Responses
Great article!
There is perhaps also something to be said about cognitive dissonance – the idea that the more we hear arguments to the contrary of our held belief, the more we rail against it and the further entrenched we get in those beliefs (this happened time and again during the referendum debate – when people were presented with facts and evidence, they often responded by attacking the presenter’s credibility and/or objectivity).
Matthew Syed recently wrote an excellent article proposing the idea that this was driving Tony Blair’s determination to go to war in Iraq. He postulated that making people aware that cognitive dissonance exists at all (or, in current social media parlance, telling people that cognitive dissonance is a thing), can enable them to at least begin the process of overcoming it.
I also think the Concorde fallacy is a big deal in organisations but we fail to address it – mostly people think of it as the financial sunk cost, but the cognitive sunk cost is sizeable too. People do not find it easy to move on to new ideas if they have invested creative time in creating the old ones.
Good point Jamie. I used to ask groups to build a Lego wall and then change it. It took them 10 minutes at most to build the wall, but the level of resistance to deconstructing it was huge. Sometimes the level of investment doesn’t need to be that big!
Excellent point about cognitive dissonance, Toby. And I agree that awareness of this as something to be considered can make people question just why they cling on to their beliefs in the face of overwhelming evidence to the contrary.
Thanks for commenting and your kind words about the article.
Insightful article with some great, digestible advice for all – Thanks
Jackie