I was recently reading an article in The New Yorker by Elizabeth Kolbert ("Why Facts Don't Change Our Minds: New discoveries about the human mind show the limitations of reason"), which had some fascinating insights into how people think.
The article cites a number of experiments at Stanford and elsewhere looking at how people form their views, the role of peer pressure in their thinking, and how reason and evidence don't always come into the decisions we make.
A number of observations stand out:
Firstly, “Once formed,” the researchers observed, “impressions are remarkably perseverant.” Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs.”
Secondly, the article asks us to consider what’s become known as “confirmation bias”: the tendency people have to embrace information that supports their beliefs and reject information that contradicts them.
In one such experiment, students were asked to respond to two studies about capital punishment. One provided data in support of the deterrence argument, and the other provided data that called it into question. The students who had originally supported capital punishment rated the pro-deterrence data highly credible and the anti-deterrence data unconvincing; the students who’d originally opposed capital punishment did the reverse. At the end of the experiment, the students were asked once again about their views. Those who’d started out pro-capital punishment were now even more in favour of it; those who’d opposed it were even more hostile.
Thirdly, researchers see an effect they call the “illusion of explanatory depth” just about everywhere: people believe that they know far more than they actually do.
“As a rule, strong feelings about issues do not emerge from deep understanding,” two of the researchers write. “And here our dependence on other minds reinforces the problem. If your position on, say, the Affordable Care Act is baseless and I rely on it, then my opinion is also baseless. When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views.”
So reason is not what we thought it was. It seems to be much more about our social interactions and social standing than it is about objectivity and factual evidence. It is often about being right in the sense of being able to win an argument, rather than being “right” in terms of objective facts.
People use “reason” to reinforce beliefs and confirm their opinions, and then this bias becomes deep-seated and difficult to shift.
There are lessons here for marketers and anyone involved in developing successful brands. For instance, to what extent do we use market research to truly uncover new insights, or to what extent do we use it to confirm existing opinions and prejudices in our marketing strategy?
The experiments also reveal the importance of creating the right “first impression”, and of recognising that, once formed, opinions about brands will be difficult to change, and that information will be filtered depending on how individual consumers perceive those brands.
It also speaks to the power of social media to amplify confirmation bias, as large numbers of people come together to agree on a point of view and filter out any conflicting evidence.
People are not as rational as they would like to believe – and that is why marketing has always been an art, not a science.