You are biased!

By Matthew Welsh


Complex, real-world problems require cooperation or agreement amongst people of diverse backgrounds and, often, opinions. Our ability to trust in the goodwill of other stakeholders, however, is being eroded by constant accusations of ‘bias’ – made by commentators about scientists, by politicians about media outlets, and by people of differing political viewpoints about one another. Against this cacophony of accusation, it is worth stepping back and asking “what do we mean when we say ‘bias’, and what does it say about us and about others?”

When used in this accusatory manner, bias is a loaded word. It comes with an assumption of deliberate, motivated deceit. When we make an accusation of bias, we are accusing our targets of being bad people. We assume that they know better but continue to deliberately misrepresent the truth.

While this may be true in some instances, deliberate deceit is often not the case. If you examine people’s cognition – how they make decisions, interpret information or search for options – something becomes increasingly apparent: many biases arise without any need for us to posit conscious motivations. That is, biases occur not because people are bad or dishonest but simply because they are people – and have cognitive limitations that impact how they interpret and understand the world.

To take a well-known example from decision-making research, confirmation bias is often described as people’s tendency to seek out and preferentially accept evidence that conforms to their pre-existing beliefs (see, e.g., Nickerson, 1998). Read this way, it sounds like the result of a person’s conscious decision to deliberately avoid information they know is relevant. Discussed in terms of how people process information, however, it becomes less sinister.

For example, to assist their understanding and reduce cognitive effort, people simplify the complex world by constructing interconnected, causal explanations and categories – so-called schema. The entire point of such schema is that, when a new piece of information is presented, if it fits within a schema it can be uncritically accepted. If, however, new information contradicts existing beliefs, it triggers cognitive processes designed to determine whether the new information should be disregarded or the schema changed. Obviously, changing a schema involves significantly more effort and, so, makes sense only in exceptional circumstances. Thus, we subject ‘surprising’ information to greater scrutiny and regard it as more doubtful than unsurprising evidence. This holds true whether a scientist is reviewing a paper that presents evidence contrary to their own theories or a non-scientist is scanning the internet for information about the efficacy or dangers of vaccines – their initial beliefs will, unconsciously, bias how plausible data seem.

Similarly, when we need to select among sources of information, we do not know which sources are reliable across all areas of knowledge. Instead, we select sources that we trust to provide plausible information – and, for exactly the reason outlined above, these tend to be ones that present information that accords with our pre-existing beliefs. This is because such information is easier to digest and our minds use ‘fluency’ (the ease with which we understand something) as a marker for truth. Thus, without any motivation beyond seeking information, our cognitive limitations result in us preferring evidence that reinforces what we already believe – producing the confirmation bias and potentially amplifying any initial errors in our beliefs.

So, when you are next inclined to bemoan other people’s bias, remember that ‘biased’ is the natural state of people – resulting from our attempts to grapple with a complex world using limited cognitive abilities. Rather than assuming malfeasance, a better use of your time may be trying to understand which limitations might be causing biases and considering how to present information to assist others in overcoming our shared cognitive limitations.

That is, if everyone is biased, what can we do about it? Simply explaining that someone’s belief is false does not work – think of how difficult it has been to displace the idea that vaccinations cause autism once it took hold. There is no evidence for this relationship, but removing it leaves a ‘hole’ in the person’s schema. Because the actual cause of autism is unknown, removing vaccines as the cause results in an incomplete and thus incoherent account, and people revert to their simple, coherent, false explanation because it is, subjectively, superior (remembering that ‘easy = true’, so a simple explanation is seen as more likely to be true).

This, then, illuminates a key to debiasing effects like confirmation bias. Rather than trying to insert facts into pre-existing schema they are at odds with, we need to focus on creating alternative accounts that are complete in and of themselves – building on a person’s existing beliefs. Imagine a process like a Socratic dialogue, where you start with simple, readily agreeable facts and then build on these towards your final conclusion. Another cognitive limitation assists in this – knowledge partitioning: the observation that people construct different schema to explain different parts of the world and can be unconscious of discrepancies between these. For example, rather than repeating evidence for climate change from various scientific sources (which a sceptic may regard as unreliable when interpreted through their political schema), you could attempt to build on their potentially separate schema for basic science (the physics of the greenhouse effect, etc.) towards the scientific consensus that human activity is driving climate change.

The process described above is, of course, time consuming and difficult – requiring deep thought about how and why a person might be displaying biased thinking, and about what other aspects of their knowledge you might be able to leverage in helping them avoid this bias. It is, however, far more likely to work than standing at ten paces shouting “bias!” at one another.

Are there any strategies you have found useful for identifying and countering your own or other people’s cognitive biases? Do you have examples of ways to present information that might assist a recipient in avoiding confirmation (or other) biases?

To find out more about confirmation and other biases and how they can be countered:
Welsh, M. (2018). Bias in science and communication: A field guide. IOP Publishing: Bristol, United Kingdom. Online: http://iopscience.iop.org/book/978-0-7503-1311-7

Reference:
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2): 175-220.

Biography: Matthew Welsh PhD is a Senior Research Fellow at the Australian School of Petroleum, University of Adelaide, Australia. His research and teaching focus on how people’s inherent cognitive processes affect their judgements, estimates and decisions and the implications of this for real-world decision making.

4 thoughts on “You are biased!”

  1. Thank you Matthew for this thoughtful post. I first read it one and a half years ago and have now reread it. While reading and thinking about it this second time, I discovered several new aspects which might be important to me in general, but also for my project, which is concerned with decision support for infrastructure management in rural communities. Nice. I need to block some time in my schedule to dive deeper into your book. Thanks again for inspiring me!

  2. Thanks Matthew for this interesting post. Debiasing is a challenging and time-consuming process. But before thinking about how to carry out a debiasing process, we need to identify biases and their sources and effects, which I find even more challenging. Also, not all biases need to be reduced.
    Which methods can be used to identify and understand biases and their characteristics?
  3. Thanks for your comment Fateme.

    Identifying biases is, as you point out, extremely hard. Part of the problem, of course, is that there are so many different biases – even counting only cognitive rather than motivational ones – and these can act in concert, or conceal and influence one another. In academic work, we have the advantage of being able to ask people questions to which we already know the answer – allowing us to compare their responses with what we know is true and, thus, highlight systematic errors (biases) in their responses. These situations are artificial but give us insights into how people form opinions or try to make estimates, and into the cognitive limitations that produce biases that then apply to real-world decision making. For example, the tendency for people’s estimates to be anchored by any number they have just seen gives you a natural opening for querying a person’s estimate – i.e., “what was your starting point?” – which allows you to probe their understanding and have them potentially improve that estimate.

    In general, though, making people aware of the existence of these biases, and of when and how they affect judgements, is a necessary first step – so that they understand that you aren’t attacking them, just trying to account for universal human cognitive tendencies. If you are worried about biases affecting complex, real-world decisions, I would suggest that a good starting point is maintaining records of decision making processes (e.g., using a Decision Analytic approach), which then allows you to identify cases where a decision doesn’t accord with the decision maker’s stated objectives and the available information. The ways in which the decision differs from what you would regard as optimal given those inputs will be informative about the biases that may be affecting it. In the absence of documented evidence of a decision making process, we can make assumptions about how a person reached a decision and what biases may have caused this, but may never be able to unpick exactly what they were doing. Even the decision makers themselves are unlikely to be able to reconstruct their decision making process – as hindsight bias can result in our updating our memories with knowledge and beliefs that we didn’t actually possess at the time.
