One significant experience pushed Tali Sharot to formulate the thread of an idea that would ultimately form the backbone of her new book. “I was trying to convince my kids not to kill themselves.”
Director of the Affective Brain Lab at University College London and mother to two small children, Sharot wanted to teach her offspring the basics of staying alive: eat well, be safe, and so on. At the same time, she was attempting to guide students in her lab and convince them of best practice. There was nothing controversial in her desires – but how do you get someone to agree with them and follow through?
Being a professor of cognitive neuroscience, she looked to the brain for answers.
It turns out we are ill-equipped to convince others to do as we ask; we usually resort to methods that repeatedly fail us, such as scare tactics. Sharot decided that by equipping herself with this knowledge, she could learn to counter it. A series of principles began to emerge that would enable her to engage and influence her children for the better. “I try to use some of the principles daily,” she says. “For instance, expanding their sense of control: instead of telling them to do something, I give them a choice. Do you want me to choose your outfit or do you want to choose yourself? Do you want to make your own fruit salad? I try to navigate their actions by showing the benefits in the future – instead of saying ‘wear your jacket, you’re going to get ill’, say ‘if you wear your coat you will be nice and healthy’. Highlight how things will get better, not worse.”
These learnings rapidly took shape, and began to form the basis of her book, The Influential Mind. It not only demonstrates the failings of the human mind to learn from our mistakes – for instance, resorting to fear-mongering – but carries a practical series of lessons in overcoming those habits. And those habits are everywhere we look.
The more we lie, the easier it is for our brains to grow more deceitful
Long before fake news became a punchline in the US president’s bag of tricks, Sharot found herself sitting among a crowd of scientists, entrepreneurs and philosophers at Ted2012, growing increasingly uncomfortable about the power others have to influence our thoughts. Sharot, in attendance to talk about the concepts explored in her book at the time, The Optimism Bias, was listening to a series of talks about the ‘wisdom of the crowd’.
“I was concerned because we know about all these biases people have – put people together and those will expand, they won’t find the truth,” she argues. “I called it the ‘group delusion’.” She had named the phenomenon, but realised it would be better to untangle it and give people the ammunition to recognise and fight against it. “If we know why people have biases, can we use that to change minds for the better? If we have an active, rational approach, maybe it can help us work with these features of our brain to make things better.”
In 2012, Sharot found she was also having to convince people of the science behind The Optimism Bias. It was a problem she hadn’t expected to encounter – facts and science should surely speak for themselves? But she routinely discovered people who thought they were the exception to the rule, based on personal narratives and anecdotes. We have powerful, everyday examples of people ignoring the facts thanks to Donald Trump. The final draft of the book was written before the US elections, but Sharot said she couldn’t have predicted how well Trump – his win and his ensuing rhetoric – would make her point for her about the dangers of influence.
She points to this in the book’s introduction, giving the example of Trump going against science, accepted advice and even the words of Ben Carson, a former neurosurgeon who would go on to become part of the president’s administration, when Trump said: “Autism has become an epidemic… it has gotten totally out of control… you take this little beautiful baby, and you pump – I mean, it looks like it’s just meant for a horse, not for a child.” Trump was referring to the much debunked “debate” that vaccines are linked to autism.
Sharot says: “Trump tapped into my very human need for control and my fear of losing it. He gave me an example of someone else’s mistake and induced emotion, which helped align the pattern of activity in my brain with his, making it more likely that I would take on his point of view. Finally, he warned of the dire consequences of not following his advice.” Sharot argues that inducing hope is a far more effective method of persuasion. Fear, however, works well when someone is trying to induce inaction, and when the person you are speaking to is already anxious. Much like a concerned parent.
“Emotions are much more powerful in changing minds and action than pure data,” admits Sharot. “With Trump, there was a lot of talk about ‘he’s lying’. But it’s unclear to me that that really matters. That starts with the assumption that people care about what the facts are. There are studies that show even when Trump supporters get information about the correct facts, it doesn’t change how they feel about supporting him.”
Knowing your target’s desires is far more important than the facts when it comes to influencing them.
“It’s not just the emotional intensity. It’s about whether the message conforms to what I want in life; does it give me a sense that I have control over my life and I’m part of a group. Those things are more important in changing minds than something being 100 per cent correct.”
So, people simply believe what they want to believe.
When it comes to vital issues such as climate change, Sharot argues it’s not enough to show people maps of the Earth getting warmer or colder. That won’t change an already sceptical mind. “You need to give people the sense that they can have control and can do something about it, and understand what is really driving their beliefs and desires before we can address it.” Like the fruit plate, we need to give the public a sense of agency and optimism to encourage uptake of an idea.
One chapter in the book looks at curiosity, and the extent to which we actually avoid potentially negative information. In a study of 396 women who gave a blood sample, 169 chose not to later find out if they had genes that would predispose them to breast cancer, despite the fact it would allow them to take precautionary measures. The blood was taken for an unrelated matter but the women were told it was tested for the gene and all they had to do was say yes to get the results. “This may seem surprising,” writes Sharot. “But… while the benefit of knowing would be to reduce the uncomfortable feeling of uncertainty, the cost of knowledge would be not having the option to believe what you would like to believe. As long as we are ignorant of the test results, we can continue believing that we are healthy; we can fill our minds with positive thoughts.”
Beliefs affect our wellbeing, points out Sharot. “So it’s not necessarily all irrational – in some ways, avoiding negative knowledge can have immediate positive results.”
In theory, being able to influence people en masse shouldn’t be too hard. Sharot argues that the basic needs of people are the same wherever they are: “we want to feel loved and feel successful; we want money to buy food and we want to be respected.” One chapter discusses how our idea of ourselves as original is a complete falsehood. People who think they are choosing an unusual or unique baby name frequently find their child’s name on the list of the 100 most popular names. These commonalities are occurring all the more frequently thanks to the internet, Sharot points out. “We are exposed to the same information even in different countries, and are conforming to it.”
Beyond the tricks of mind influence, there is a great deal of promising news in the book about what we as humans value. One experiment, carried out by Ethan Bromberg-Martin and Okihide Hikosaka, shows how highly even monkeys value knowledge. In the study, monkeys were willing to sacrifice part of a potential reward just to ‘pay’ to find out whether they were going to get a large or a small reward of a few drops of water. Incredibly, when looking at what happened in the brain during the experiment, the researchers found that the neurons responsible for releasing dopamine, a reward chemical, fired at a similar rate when the monkeys were expecting information as when they were given 0.17ml of water. “In other words, the neurons were as excited by advance knowledge as they were by drops of H2O, which are necessary for our existence.”
Knowledge, it seems, is life. Quite literally. It is key to survival, and we as a species have learnt this. There are just an awful lot of other biases we sometimes need to overcome in order to make the smartest use of that knowledge.