Along these lines, Clint Eastwood has a nice quote relating to political views: “Extremism is so easy. You’ve got your position and that’s it. It doesn’t take much thought.” (quoted in Schickel, 2005, and cited in Fernbach et al., 2013, p. 939). We all have strong opinions, and often, as managers and leaders, those views can have significant consequences for those who work for us. But the question is: are your opinions based on clearly reasoned arguments, or did you simply settle for whatever you thought at the time? Or whatever your friend said? Or whatever your newspaper told you to say…
Work by Fernbach and colleagues explored this issue (Fernbach et al., 2013). They were particularly interested in the fact that many people have strong views (on climate change, on immigration, etc.), yet evidence suggests these same people are relatively uninformed about the issues. Is it simply ignorance driving opinions? Or is something else going on?
One hypothesis Fernbach and colleagues tested was that, by and large, we don’t understand the complexities of issues, and rather than trying to work them out, we give up and opt for the easy (but probably false) opinion. In a way we’re caught by an illusion of understanding: we think we know how things work, but we don’t. So we have confidence in our views when, in fact, if we looked a bit deeper, we’d realize how wrong we are!
Earlier work by Rozenblit and Keil (2002) looked at this. They asked participants how well they understood simple mechanical things like combination locks and toilets. Participants often confidently said they understood these things, but when asked to explain them, they suddenly realized that they didn’t understand after all! Their illusion of understanding was broken and they recognized their own ignorance.
Fernbach took this further, to look at political views. I guess the question is: if most of us don’t understand how a toilet works, how are we to get a grasp on immigration policy? In one study, they asked participants their opinions of six political policies. Then they asked them to explain two of those policies. Finally, the participants gave their opinions on all six again. The researchers hypothesized that asking for explanations would uncover the ‘illusion of explanatory depth’ and so help individuals become more aware of their own ignorance. They also predicted that participants would give more moderate opinions the second time round for the policies they had tried to explain, because appreciating their own ignorance would make them (paradoxically) a little wiser in forming their opinions.
This is precisely what they found, along with a correlation between the extremity of the original view and the likelihood of change. In other words, it is exactly those extreme views that appear to become moderated when individuals appreciate their own ignorance.
In a follow-up study, the authors showed that uncovering the illusion of explanatory depth also affected decision-making. People’s decisions to donate money to groups advocating particular policies were affected by asking them to explain those policies. Essentially, individuals become wiser by appreciating their own ignorance, and this then affects their subsequent donating behaviour. Powerful, eh?
Socrates apparently said that one route to wisdom was to “know thyself”, and this in some way reflects the fact that it is easy to meander through life without ever really questioning our own beliefs and opinions. As a result, we think we know the world and we think our opinions reflect it accurately. If we choose to explore our own ignorance, we can become wiser as a result. [In fact, Wikipedia has this to say: “One of the best known sayings of Socrates is ‘I only know that I know nothing’. The conventional interpretation of this remark is that Socrates’ wisdom was limited to an awareness of his own ignorance.” And Socrates was a wise old chap…]
The Fernbach study focused on the political world, but we’re interested in the workplace. It is important that we make correct decisions, and often those decisions depend upon our attitudes and beliefs. What this study shows is that we may be acting upon false premises no matter how confident we are (indeed, the study suggests that the more confident we are, the more likely we are to be wrong!). Making bad judgement calls in the workplace can have significant consequences for ourselves, our teams and the company. So what can we do?
If we uncover our illusions, we can act more wisely, and we can also help others to make reasoned judgements. First, we can help our employees understand our own management policies and actions by walking them through them: being transparent about how we arrived at the principles we hold.
Second, we can be wary of messages that try to confuse or overload our reasoning. Don’t put the defences up and opt for an easy, strong belief. Instead, discuss the issue with someone and try to understand the mechanisms underlying it. Also beware of people touting simple opinions in complex situations. In fact, next time you witness a strong opinion, perhaps try asking about the mechanisms that underlie it. Does the individual really know and understand the policy in question? What are the assumptions and processes that have led to that particular stance? I am not saying you’ll be popular, but you might just be right…
– Dr John –
Fernbach, P. M., Rogers, T., Fox, C. R., & Sloman, S. A. (2013). Political extremism is supported by an illusion of understanding. Psychological Science, 24, 939–946.
Rozenblit, L., & Keil, F. C. (2002). The misunderstood limits of folk science: An illusion of explanatory depth. Cognitive Science, 26, 521–562.
Socrates. Wikipedia: http://en.wikipedia.org/wiki/Socrates