Many people in unions and progressive politics talk about “framing” — made famous by George Lakoff’s Don’t Think of an Elephant — but it’s difficult to see what that means in practice. Effective communication is made even more challenging by the broad left’s obsession with “rationality” and fact-based arguments.
I’ve previously written that progressives should give up on fact-based arguments if their objective is to change someone’s mind. The reason is that many progressives start from the assumption that a person’s non-support for a progressive position is due to a lack of information. It follows, on this view, that giving the person more and better information about the issue (e.g. global warming, asylum seekers, debt, etc.) will change their mind to agree with the progressive position.
Unfortunately, more often than not, this “deficit” model of communication not only results in no change in attitude, but can actually entrench existing (non-progressive) views. A common default response to this is that people who don’t support progressive views are “dumb” or are somehow being “duped” by conservative forces.
An interesting aside is that multiple studies have shown almost no relationship between intelligence (measured by IQ) and an individual’s ability to consider arguments that contradict their prior beliefs. The same studies, however, showed a strong relationship between intelligence and the ability to defend one’s own point of view. In other words, the reason people don’t agree with progressive views is unlikely to be that they’re stupid.
A related challenge is “myside bias”, or confirmation bias: you readily see and agree with all the strong arguments in favour of your own viewpoint, and you defend that viewpoint by seeking out information and facts that confirm it. Again, people with high intelligence are no less likely to display confirmation bias than anyone else.
The reason I give these asides is that very intelligent people in the union movement, and in progressive movements more generally, are stuck advocating communication strategies that just don’t work. Fact-based communications are ineffective at changing people’s views and attitudes.
What’s more, today’s breed of conservatives don’t care about facts, and nor do their mates in the conservative mainstream media. The likes of Fox News, The Australian and the Spectator engage in “post-fact” argumentation. Facts are contestable, and when no amount of distortion or cherry-picking will work, many conservatives are content to resort to bald-faced lying.
Against this, a progressive who goes into a fight armed with facts is taking a fruit-peeler to a knife fight.
University College London has an excellent report on climate change science, communication and action, called Time for Change. Although it is climate-focused, the section on communication has much wider appeal and application.
It includes a desk review of the latest evidence-based communication science, and for this reason it’s worth reading in detail. I’m summarising the main points here.
“Framing” and Context?
“Framing” is a much misused term, largely because many people don’t really understand what Lakoff means by it. I find conservative spin-master Frank Luntz’s description much better: he talks about “context”.
“The observer never sees the pure phenomenon with his own eyes; rather much depends on his mood, the state of his senses, the light, the air … and a thousand other circumstances.”
Goethe, Empirical Observation and Science
What this means, according to the UCL paper, is that meaning is assigned to stimuli in a context of prior knowledge, and is powerfully influenced by mood, emotions and social factors. Our brains, simply, have not developed to find the “unvarnished truth”; rather, they work with a series of heuristics or “rules of thumb”. Simple environmental or social cues, like whether it’s raining or whether someone is dressed in a uniform, can change how we give meaning (and thus form attitudes and respond) to stimuli. The unfortunate reality is that these influences affect how we think whether or not we are aware of them, and more often than not their influence is unconscious. What’s more, once we form an opinion (even an insignificant one), it’s hard to change, even in the face of overwhelming evidence.
The classic example is the shade illusion.
Context is a powerful thing, and in the image above the environmental cues tell us, despite the evidence, that the squares are different shades because of the shadow cast by the apple. Even knowing that the “light” square in the shadow is in fact the same shade as the dark square outside the shadow, it’s hard to accept.
Rationality is much more subtle than we think
Although progressives appeal to reason a lot, as Lakoff says, this view of how we think and decide is based on 17th Century Enlightenment attitudes that do not reflect how we actually think.
In fact, as the UCL report highlights, cognition — including “higher cognition” — involves both intuitive and reflective reasoning. More often than not, intuitive reasoning comes before reflective reasoning. The frequent assumption is that reflective reasoning trumps intuition, but intuition is autonomous and effortless, and operates outside conscious awareness.
This is where heuristics come into their own: the vast bulk of our thought is actually unconscious. Our reflective reasoning is often used to justify or rationalise our unconscious intuitions rather than to analyse them.
The corollary is that our rational decisions are not separable from our emotions. In fact, emotions are essential for cognitive decision-making, and give us our feelings of “rightness”. The idea of separating decisions from emotion is wrongheaded, and often results in poor decisions.
This is important, because attitudes, opinions and beliefs have an “affective” (emotional) dimension. When we recall an opinion or belief, our brains give us an emotional sense of “rightness” to reduce our cognitive burden. We feel that our view is right.
Being challenged on our views and opinions conflicts with this subtle but powerful dimension, and can trigger a negative sense of worry or rejection. The feeling is stronger the more ardently the view or belief is held.
Progressives often dispute the views of those whose minds they seek to change, using facts as the basis for the challenge. But not only are facts interpreted and given meaning according to environmental and social context; a person’s feeling of rightness about their opinions can cause them to rationalise strongly in favour of retaining that feeling when their opinion is challenged.
How attitudes and opinions are formed and strengthened
Confirmation bias is one of the most important concepts to understand from the UCL report, because the phenomenon doesn’t just help explain how people seek out information, but how views and attitudes are intensified and polarised.
The phenomenon of “confirmation bias” also helps explain why progressives so commonly default to the “deficit model”. By nature, we all seek out information when presented with a situation or circumstance. On the face of it therefore, it is eminently sensible for progressives to aim to provide accurate information to people, for example when they seek out information about global warming. The problem arises when you realise that most people seek out information that fits with their prior knowledge, attitudes or opinions. Therefore, the information progressives provide is often useless in changing the person’s opinion. In fact, when presented with information that disconfirms existing attitudes, we are more likely to submit it to critical examination.
As social psychologist Thomas Gilovich wrote, “for agreeable propositions, it is as if we ask ourselves: ‘Can I believe this?’ For disagreeable ones, it is as if we ask: ‘Must I believe this?’”
This primarily holds when someone has an existing conviction or attitude — and on almost all major public policy issues, whether climate change, public debt, Obamacare, immigration or refugees, most people do have a pre-existing opinion or conviction.
Confirmation bias is a very powerful thing. It not only drives attitudes, but can affect behaviour.
When there is a discrepancy between someone’s actions and internal attitudes, this gives rise to a process of self-justification to bring the internal attitudes in line with the outcome of the behaviour. Over time, this cumulative effect can lead from initial indifference to a position of strong conviction and polarization. …
A key insight is that people strive towards internal consistency, particularly in the beliefs they have about themselves (e.g. ‘I am an intelligent, kind and competent person’). A challenge to our behaviour or attitudes acts as a potential threat to our self-image (‘How could I, an intelligent, kind and competent person, hold an illogical belief or have done something hurtful?’). It gives rise to ‘dissonance’, a state of discomfort and distress, which acts as a powerful motivating force to rationalise the offending evidence. Dissonance flows, in other words, from the desire to be right in our beliefs about ourselves — and from its opposite, the discomfort we want to eradicate when those beliefs are challenged.
The UCL report does look at what works when communicating, though since the paper is about climate change, it only considers communication on that topic. It finds that:
- Fear appeals work, when they point to specific dangers and are accompanied by solutions.
- In other situations, fear appeals likely lead to avoidance and desensitisation.
- Alarmist messages that fail to materialise contribute to loss of trust in the scientific community.
I think union communicators will be intuitively familiar with these three take-aways.
Overall, the clear message from the UCL report is that communicators must be aware of, and accept as real, the multiple cognitive biases and fallacies that we all experience. Communicators must have humility and be introspective when developing messages; there’s no point in building communications plans from fundamentally flawed premises (e.g. the deficit model).
There is a wealth of useful, evidence-based science on effective communications, and the UCL report is a great desk review of it. Read the entire thing here. For further reading:
- Predictably Irrational – Dan Ariely
- Everything is Obvious – Duncan Watts
- Mistakes Were Made (But Not By Me) – Carol Tavris