You are biased!

Community member post by Matthew Welsh

Complex, real-world problems require cooperation or agreement amongst people of diverse backgrounds and, often, opinions. Our ability to trust in the goodwill of other stakeholders, however, is being eroded by constant accusations of ‘bias’. These are made by commentators about scientists, politicians about media outlets and people of differing political viewpoints about one another. Against this cacophony of accusation, it is worthwhile stepping back and asking “what do we mean when we say ‘bias’ and what does it say about us and about others?”.

When used in this accusatory manner, bias is a loaded word. It comes with an assumption of deliberate, motivated deceit. When we make an accusation of bias, we are accusing our targets of being bad people. We assume that they know better but continue to deliberately misrepresent the truth.

While this may be true in some instances, deliberate deceit is often not the case. If you examine people’s cognition – how they make decisions, interpret information or search for options – something becomes increasingly apparent: many biases arise without any need for us to posit conscious motivations. That is, biases occur not because people are bad or dishonest but simply because they are people – and have cognitive limitations that impact how they interpret and understand the world.

To take a well-known example from decision making research, confirmation bias is often described as people’s tendency to seek out and preferentially accept evidence that conforms to their pre-existing beliefs (see, e.g., Nickerson, 1998). Described this way, it sounds like the result of a person’s conscious decision to deliberately avoid information they know is relevant. Discussed in terms of how people process information, however, it becomes less sinister.

For example, to assist their understanding and reduce cognitive effort, people simplify the complex world by constructing interconnected, causal explanations and categories – so-called schema. The entire point of such schema is that when a new piece of information is presented, if it fits within a schema, it can be uncritically accepted. If, however, new information contradicts existing beliefs, it triggers cognitive processes designed to determine whether the new information should be disregarded or the schema changed. Obviously, changing a schema involves significantly more effort and, so, makes sense only in exceptional circumstances. Thus, we subject ‘surprising’ information to critical examination, giving it greater scrutiny and making it seem more doubtful than unsurprising evidence. This holds true whether a scientist is reviewing a paper that presents evidence running contrary to their own theories or a non-scientist is scanning the internet for information about the efficacy/dangers of vaccines – their initial beliefs will, unconsciously, bias how plausible data seem.

Similarly, when we need to select among sources of information, we do not know which sources are reliable across all areas of knowledge. Instead, we select sources that we trust to provide plausible information – and, for exactly the reason outlined above, these tend to be ones that present information that accords with our pre-existing beliefs. This is because such information is easier to digest and our minds use ‘fluency’ (the ease with which we understand something) as a marker for truth. Thus, without any motivation beyond seeking information, our cognitive limitations result in us preferring evidence that reinforces what we already believe – producing the confirmation bias and potentially amplifying any initial errors in our beliefs.

So, when you are next inclined to bemoan other people’s bias, remember that ‘biased’ is the natural state of people – resulting from our attempts to grapple with a complex world using limited cognitive abilities. Rather than assuming malfeasance, a better use of your time may be trying to understand which limitations might be causing biases and considering how to present information to assist others in overcoming our shared cognitive limitations.

That is, if everyone is biased, what can we do about it? Simply explaining that someone’s belief is false does not work – think of how difficult it has been to displace the idea that vaccinations cause autism once it took hold. There is no evidence for this relationship, but removing it leaves a ‘hole’ in the person’s schema. Because the actual cause of autism is unknown, removing vaccines as the cause leaves an incomplete and thus incoherent account, and people revert to their simple, coherent, false explanation because it is, subjectively, superior (remembering that ‘easy = true’, so a simple explanation is seen as more likely to be true).

This, then, illuminates a key for debiasing effects like confirmation bias. Rather than trying to insert facts into pre-existing schema they are at odds with, we need to focus on creating alternative accounts that are complete in and of themselves – building on a person’s existing beliefs. Imagine a process like a Socratic dialogue, where you start with simple, readily agreeable, facts and then build on these towards your final conclusion. Another cognitive limitation assists in this – knowledge partitioning: the observation that people construct different schema to explain different parts of the world and can be unconscious of discrepancies between these. For example, rather than repeating evidence for climate change from various scientific sources (which a sceptic may regard as unreliable when interpreted through their political schema), you could attempt to build on their potentially separate schema for basic science (the physics of the greenhouse effect, etc) towards the scientific consensus opinion that human activity is driving climate change.

The process described above is, of course, time consuming and difficult – requiring deep thought about how and why a person might be displaying biased thinking and about what other aspects of their knowledge you might be able to leverage in assisting them to avoid this bias. It is, however, far more likely to work than standing at ten paces shouting “bias!” at one another.

Are there any strategies you have found useful for identifying and countering your own or other people’s cognitive biases? Do you have examples of ways to present information that might assist a recipient in avoiding confirmation (or other) biases?

To find out more about confirmation and other biases and how they can be countered:
Welsh, M. (2018). Bias in science and communication: A field guide. Bristol: IOP Publishing.

Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175-220.

Biography: Matthew Welsh PhD is a Senior Research Fellow at the Australian School of Petroleum, University of Adelaide, Australia. His research and teaching focus on how people’s inherent cognitive processes affect their judgements, estimates and decisions and the implications of this for real-world decision making.

Ten steps to strengthen the environmental humanities

Community member post by Christoph Kueffer and Marcus Hall

How might the environmental humanities complement insights offered by the environmental sciences, while also remaining faithful to their goal of addressing complexity in analysis and searching for solutions that are context-dependent and pluralistic?

There is a long and rich tradition of scholarship in the humanities addressing environmental problems. Included under the term ‘environmental studies’ until recently, fields such as the arts, design, history, literary studies, and philosophy are now gathering under the new umbrella of the ‘environmental humanities’.

A checklist for documenting knowledge synthesis

Community member post by Gabriele Bammer

How do you write up the methods section for research synthesizing knowledge from different disciplines and stakeholders to improve understanding of a complex societal or environmental problem?

In research on complex real-world problems, the methods section is often incomplete. An agreed protocol is needed to ensure systematic recording of what was undertaken. Here I use a checklist to provide a first pass at developing such a protocol specifically addressing how knowledge from a range of disciplines and stakeholders is brought together.


1. What did the synthesis of disciplinary and stakeholder knowledge aim to achieve, which knowledge was included and how were decisions made?

Foundations of a translational health sciences doctoral program

Community member post by Gaetano R. Lotrecchiano and Paige L. McDonald

How can doctoral studies be developed to include innovation in practice and research, as well as systems and complexity thinking, along with transdisciplinarity? This blog post is based on our work introducing a PhD in Translational Health Sciences at George Washington University in the USA.

Innovation in Practice and Research

We suggest that innovation in practice and research is achieved by the integration of knowledge in three key foundational disciplines:

  • translational research
  • collaboration sciences
  • implementation science (Lotrecchiano et al., 2016).

Using the concept of risk for transdisciplinary assessment

Community member post by Greg Schreiner

Global development aspirations, such as those endorsed within the Sustainable Development Goals, are complex. Sometimes the science is contested, the values are divergent, and the solutions are unclear. How can researchers help stakeholders and policy-makers use credible knowledge for decision-making, which accounts for the full range of trade-off implications?

‘Assessments’ are now commonly used. Following their formal adoption by the Intergovernmental Panel on Climate Change (IPCC) in the early 1990s, they have been used at the science-society-policy interface to tackle global questions relating to biodiversity and ecosystem services, human well-being, ozone depletion, water management, agricultural production, and many more.

Five principles of holistic science communication

Community member post by Suzi Spitzer

How can we effectively engage in the practice and art of science communication to increase both public understanding and public impact of our science? Here I present five principles based on what I learned at the Science of Science Communication III Sackler Colloquium at the National Academy of Sciences in Washington, DC in November 2017.

1. Assemble a diverse and interdisciplinary team

  1. Scientists should recognize that while they may be an expert on a particular facet of a complex problem, they may not be qualified to serve as an expert on all aspects of the problem. Therefore, scientists and communicators should collaborate to form interdisciplinary scientific teams to best address complex issues.
  2. Science is like any other good or service – it must be strategically communicated if we want members of the public to accept, use, or support it in their daily lives. Thus, research scientists need to partner with content creators and practitioners in order to effectively share and “sell” scientific results.
  3. Collaboration often improves decision making and problem solving processes. People have diverse cognitive models that affect the way each of us sees the world and how we understand or resolve problems. Adequate “thought world diversity” can help teams create and communicate science that is more creative, representative of a wider population, and more broadly applicable.
