Every decision we make is influenced by countless external factors. But some of the most powerful influences on our decisions, attitudes and behaviours are internal. These internal influences can be major roadblocks for progressive organisations and causes, especially unions, because they can affect our judgements and actions in subtle and not-so-subtle ways.
It’s important for union campaigners to be aware of these biases when we’re thinking about our campaigns and communications plans. They can be a terrible hindrance but also a useful ally.
George Lakoff is probably the most important proponent of progressives understanding how the brain works. He argues that many progressive campaigns have failed because we are unconsciously trapped in an “Enlightenment” mode of thinking, which assumes all decision-making is a rational, logical process. Lakoff — and modern cognitive science — demonstrates that humans are deeply emotional, intuitive beings. (If you’re interested in this, also read The Political Brain by Drew Westen.)
Awareness of how we actually think, decide and act will make us more effective and ensure we achieve real impact for our causes.
What follows is a short list of seven behavioural biases that all progressive campaigners should be aware of.
1. Focalism (anchoring)
Focalism, also known as anchoring, is “when individuals overly rely on a specific piece of information to govern their thought-process”. It takes effect when people use an arbitrary reference point to make a judgement about something.
This is useful because the arbitrary point can be suggested by someone else. For example, researchers Amos Tversky and Daniel Kahneman asked subjects to guess the percentage of African nations in the UN. Those asked “was it more or less than 10 percent?” guessed a lower percentage than those asked “was it more or less than 65 percent?”
For progressive causes, the possibilities are substantial, especially in helping frame problems or solutions. The focus need not be numerical, but should be something that people can observe and give weight to.
2. Availability bias
This is a fairly well-known cognitive bias: a mental shortcut whereby people use examples that come easily to mind to judge the probability of events.
A classic example is parents’ attitudes to children walking to school and the risk of kidnapping by a stranger. All the evidence suggests that children are very unlikely to be kidnapped by a stranger, but because of highly publicised instances of it happening, we (mostly) perceive the risk to be high.
Similarly, lottery companies and casinos rely on media hype and publicity to use the availability bias to make people believe that the probability of winning is much higher than it really is.
The implicit logic runs: if you know something, or can easily remember it, it must be important.
Availability bias matters for progressive causes because it shows why building awareness of an issue or danger is so important. It also underscores the value of specific examples. People remember the story of the miners in Chile who were rescued more readily than the miners in New Zealand who were not, because the Chilean rescue is more easily brought to mind thanks to the saturation coverage it received.
It is easier for people to believe the economy is in dire straits because of the one person they know who lost their job than because of the many people they know who are still in work. For unions, people remember bad union experiences more readily than good ones.
3. Bandwagon effect
Also known as “group think”, the bandwagon effect helps explain trends and fads, and boils down to the tendency of people to follow the actions of others in order to conform. Author Mark Earls explains the effect in terms of the “herd effect”, because our individual desires or preferences can be overridden by those of large numbers of our peers.
In politics, the idea of momentum can help build on the bandwagon effect, as people want to back the winner. In the US, we see this during primary season, where early wins can help propel candidates to further success.
Unions can benefit from this as they build membership in a workplace: as more and more people join, the tendency for others to want to join in order to fit in increases.
4. Choice supportive bias
Have you ever bought something that you were unsure about, and then retrospectively searched out or came up with reasons why it was a good decision? Chances are, that was choice supportive bias.
Choice-supportive bias is an effect whereby positive aspects of a choice are remembered as part of the chosen option, whether or not they were part of the original decision-making process. An interesting example: when people are asked to remember their high-school grades, they are likely to overestimate their good grades and downplay their bad ones.
People like to feel that the choices they make are the best ones for them; that they pick the right options. It helps reduce regret and promotes feelings of well-being.
Think about your new and recently joined members, and how this effect could work for them. You want to provide evidence, a demonstration, that helps them conclude they made the right decision.
5. Curse of knowledge
Imagine you are asked to tap out the tunes of several well-known songs, using only a pencil on a table, while a listener tries to guess each song from the tapping alone. Before the experiment, tappers are asked to predict what proportion of the songs listeners will guess. Most predict around 50 percent; the actual figure is less than 2.5 percent. Most interesting of all, tappers became infuriated that listeners could not guess the songs: in their heads, they heard each song as clearly as if someone were playing it.
This is the curse of knowledge. Once you know something, you can’t unknow it.
In Made to Stick, the authors Dan and Chip Heath make the point that everyone suffers from the curse of knowledge on a daily basis. Whenever we use jargon or insider language, we hear that tune in our head and then get frustrated that our member or delegate or client doesn’t understand what we mean. To us, the acronym or jargon makes perfect sense.
The curse of knowledge is something that campaigners need to be aware of.
6. Halo effect
Edward Thorndike identified the halo effect when he noticed that people rate another person’s character more highly when they rate their physical traits more highly. In 1915 he examined officers’ ratings of their juniors and found that a taller stature, deeper voice and more impressive physique led the officers to rate a man’s courage, intelligence, loyalty and leadership skills much higher.
In another example, study participants were asked to evaluate the writing of an author; some were shown a photo of the author as a beautiful woman. Those who saw the photo rated the writing higher than those who didn’t.
The halo effect has been noticed repeatedly in a wide variety of situations, even in reverse. For example, ugly criminals are perceived as worse than good-looking ones.
The halo effect is one of the reasons advertisers put photos of models next to their products. Similarly, the iPod “halo” contributed to the early success of the iPhone.
For unions in particular, there is a risk of being tarred by a negative halo because of the HSU scandal.
On the positive side, unions should try to identify positive traits or characteristics to emphasise. And at the risk of sounding superficial, it’s worth thinking about the halo effect when choosing photos of members.
7. Loss aversion
Loss aversion is “the combination of a greater sensitivity to losses than to gains and a tendency to evaluate outcomes frequently”. The aversion to loss can be up to twice as powerful as the desire for gain.
The crux of this behavioural bias comes down to risk. The more frequently people evaluate outcomes, the more risk averse they become; when the potential losses are reduced, people accept more risk.
When choices are posed in terms of what someone stands to lose, people are more likely to take the less risky option, even when the potential rewards are quite high. This was observed in 1981 by Tversky and Kahneman, who posed the same problem phrased two different ways. In a scenario where the US was preparing for an outbreak of a disease expected to kill 600 people, subjects were asked to choose between two options. The first phrased the solution as “200 people will be saved”; the second as “there is a one-third probability that 600 people will be saved and a two-thirds probability that no people will be saved”.
Then the problem was posed differently: option three, “400 people will die”, and option four, “a one-third probability that nobody will die and a two-thirds probability that 600 people will die”.
The study found that 72% of people preferred the first option to the second. When the options were between three and four, 78% of people preferred option four to option three.
The reversal comes from framing. When the outcomes are described as gains, people prefer the certainty of saving 200; when the same outcomes are described as losses, people prefer to gamble on the one-third chance that nobody dies rather than accept the sure loss of 400.
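What makes the experiment striking is that all four options are mathematically identical; only the framing differs. A quick sanity check of the arithmetic (a sketch based on the numbers in the text, not part of the original study):

```python
# Expected number of lives saved (out of 600) for each option in the
# Tversky & Kahneman disease problem described above.

TOTAL = 600

# Gain frame
option_1 = 200                        # "200 people will be saved" (certain)
option_2 = (1/3) * 600 + (2/3) * 0    # gamble: 1/3 chance all 600 saved

# Loss frame
option_3 = TOTAL - 400                # "400 people will die" (certain)
option_4 = (1/3) * 600 + (2/3) * 0    # gamble: 1/3 chance nobody dies

# Every option works out to the same expected outcome: 200 lives saved.
print(option_1, option_2, option_3, option_4)
```

Despite the identical expected values, majorities picked the sure thing in the gain frame and the gamble in the loss frame.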
For unions, this behavioural bias suggests that a loss looms roughly twice as large as an equivalent gain. It favours inaction over action and the status quo over change. When campaigning against something (e.g. a restructure), highlighting the potential losses that members or workers face is likely to tap into this bias.