How Can We Know Unknown Unknowns?

By Michael Smithson

Michael Smithson (biography)

In a 1993 paper, philosopher Ann Kerwin elaborated a view on ignorance that has been summarized in a 2×2 table describing crucial components of metacognition (see figure below). One margin of the table consisted of “knowns” and “unknowns”. The other margin comprised the adjectives “known” and “unknown”. Crosstabulating these produced “known knowns”, “known unknowns”, “unknown knowns”, and “unknown unknowns”. The latter two categories have caused some befuddlement. What does it mean to not know what is known, or to not know what is unknown? And how can we convert either of these into their known counterparts?

Figure: The relationship of known to unknown (after Kerwin 1993)
Source: Adapted from Kerwin (1993) by Smithson in Bammer et al. (2008)

In this post, I will concentrate on unknown unknowns, what they are, and how they may be identified.

Attributing ignorance

To begin, no form of ignorance can be properly considered without explicitly tracking who is attributing it to whom. With unknown unknowns, we have to keep track of three viewpoints: the unknower, the possessor of the unknowns, and the claimant (the person making the statement about unknown unknowns). Each of these can be oneself or someone else.

Various combinations of these identities generate quite different states of (non)knowledge and claims whose validities also differ. For instance, compare:

  1. A claims that B doesn’t know that A doesn’t know X
  2. B claims that A doesn’t know that A doesn’t know X
  3. A claims that A doesn’t know that B doesn’t know X
  4. A claims that A doesn’t know that A doesn’t know X

The first two could be plausible claims, because the claimant is not the person who doesn’t know that someone doesn’t know X. The last two claims, however, are problematic because they require self-insight that seems unavailable. How can I claim I don’t know that I don’t know X? The nub of the problem is self-attributing false belief. I am claiming one of two things. First, I may be saying that I believe I know X, but my belief is false. This claim doesn’t make sense if we take “belief” in its usual meaning; I cannot claim to believe something that I also believe is false. The second possible claim is that my beliefs omit the possibility of knowing X, but this omission is mistaken. If I’m not even aware of X in the first place, then I can’t claim that my lack of awareness of X is mistaken.
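The four claims above differ only in who occupies each role. A toy enumeration makes the pattern explicit (this is purely illustrative; the role names and the "problematic" test are my own labels, not a formalism from Kerwin or Smithson): a claim is self-defeating exactly when the claimant and the person said not to know are the same.

```python
from itertools import product

def is_problematic(claimant, unknower):
    """A claim of the form "claimant claims that unknower doesn't know
    that possessor doesn't know X" is self-defeating exactly when the
    claimant and the unknower are the same person, since it requires
    self-attributing a false or absent belief."""
    return claimant == unknower

# Enumerate all assignments of persons A and B to the three roles.
for claimant, unknower, possessor in product("AB", repeat=3):
    status = "problematic" if is_problematic(claimant, unknower) else "plausible"
    print(f"{claimant} claims that {unknower} doesn't know that "
          f"{possessor} doesn't know X -> {status}")
```

Of the eight combinations, the four with the claimant as the unknower come out problematic, matching claims 3 and 4 above; the rest remain at least coherent, like claims 1 and 2.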

Current unknown unknowns would seem to be claimable by us only about someone else, and therefore current unknown unknowns can be attributed to us only by someone else. Straightaway this suggests one obvious means of identifying our own unknown unknowns: cultivate the company of people whose knowledge-bases differ sufficiently from ours that they can point out things we don’t know that we don’t know. However, most of us don’t do this. Instead, the literature on interpersonal attraction and friendship shows that we gravitate towards others who are just like us, sharing the same beliefs, values, prejudices, and therefore blind spots.

Different kinds of unknown unknowns

There are different kinds of unknown unknowns, each requiring different “remedies”. The most important distinction is between matters that we mistakenly think we know about and matters that we’re unaware of altogether. It matters because these two kinds have different psychological impacts when they are attributed to us, and they require different readjustments to our view of the world.

1. False convictions

A shorthand term for the first kind of unknown unknown is a “false conviction”. This can be a matter of fact that is overturned by a credible source. For instance, I may believe that tomatoes are a vegetable but then learn from my more botanically literate friend that, botanically, they are a fruit. Or it can be an assumption about one’s depth of knowledge that is debunked by raising the standard of proof—I may be convinced that I understand compound interest, but then someone asks me to explain it to them and I realize that I can’t provide a clear explanation.

What makes us vulnerable to false convictions? A major contributor is overconfidence about our stock of knowledge. Considerable evidence supports the claim that most people believe they understand the world in much greater breadth, depth, and coherence than they actually do. In a 2002 paper, psychologists Leonid Rozenblit and Frank Keil coined a phrase to describe this: the “illusion of explanatory depth”. They found that this kind of overconfidence is greatest in explanatory knowledge about how things work, whether in natural processes or artificial devices. They were also able to rule out self-serving motives as a primary cause of the illusion. Instead, it arises mainly because our scanty knowledge-base gets us by most of the time, we are seldom called upon to explain our beliefs in depth, and even when we intend to check our beliefs, the opportunities for first-hand testing are very limited. Moreover, scanty knowledge also limits the accuracy of our assessments of our own ignorance: greater expertise brings with it greater awareness of what we don’t know.

Another important contributor is hindsight bias: the feeling, after learning about something, that we knew it all along. In the 1970s, cognitive psychologists such as Baruch Fischhoff ran experiments asking participants to estimate the likelihoods of outcomes of upcoming political events. After these events had occurred or failed to occur, participants were asked to recall the likelihoods they had originally assigned. They tended to overestimate how likely they had thought an event would be if it had actually happened.

Nevertheless, identifying false convictions and ridding ourselves of them is not difficult in principle, provided that we’re receptive to being shown to be wrong and are able to resist hindsight bias. We can self-test our convictions by checking their veracity via multiple sources, by subjecting them to more stringent standards of proof, and by assessing our ability to explain the concepts underpinning them to others. We can also prevent false convictions by being less willing to leap to conclusions and more willing to suspend judgment.

2. Unknowns we aren’t aware of at all

Finally, let’s turn to the second kind of unknown unknown: the unknowns we aren’t aware of at all. This type takes us into rather murky territory. A good example is denial, which we may contrast with the unknown unknown that is merely due to unawareness. The distinction is slightly tricky, but a good indicator is whether we’re receptive to the unknown when it is brought to our attention. A climate-change activist whose friend is adamant that the climate isn’t changing will likely think of her friend as a “climate-change denier” in two senses: he is denying that the climate is changing, and he is also in denial about his ignorance on that issue.

Can unknown unknowns be beneficial or even adaptive?

One general benefit simply arises from the fact that we don’t have the capacity to know everything. The circumstances and mechanisms that produce unknown unknowns act as filters, with both good and bad consequences. Among the good consequences is the avoidance of paralysis—if we were to suspend belief about every claim we couldn’t test first-hand we would be unable to act in many situations. Another benefit is spreading the risks and costs involved in getting first-hand knowledge by entrusting large portions of those efforts to others.

Perhaps the grandest claim for the adaptive value of denial was made by Ajit Varki and Danny Brower in their book on the topic. They argued that the human capacity for denial was selected (evolutionarily) because it enhanced the reproductive capacity of humans who had evolved to the point of realizing their own mortality. Without the capacity to be in denial about mortality, their argument goes, humans would have been too fearful and risk-averse to survive as a species. Whether convincing or not, it’s a novel take on how humans became human.

Antidotes

Having taken us on a brief tour through unknown unknowns, I’ll conclude by summarizing the “antidotes” available to us.

  1. Humility. A little over-confidence can be a good thing, but if we want to be receptive to learning more about what we don’t know that we don’t know, a humble assessment of the little that we know will pave the way.
  2. Inclusiveness. Consulting others whose backgrounds are diverse and different from our own will reveal many matters and viewpoints we would otherwise be unaware of.
  3. Rigor. Subjecting our beliefs to stricter standards of evidence and logic than everyday life requires of us can quickly reveal hidden gaps and distortions.
  4. Explication. One of the greatest tests of our knowledge is to be able to teach or explain it to a novice.
  5. Acceptance. None of us can know more than a tiny fraction of all there is to know, and none of us can attain complete awareness of our own ignorance. We are destined to sail into an unknowable future, and accepting that makes us receptive to surprises, novelty, and therefore converting unknown unknowns into known unknowns. That unknowable future is not just a source of anxiety and fear, but also the font of curiosity, hope, aspiration, adventure, and freedom.

References:
Bammer, G., Smithson, M. and the Goolabri Group. (2008). The nature of uncertainty. In: Bammer, G. and Smithson, M. (eds.), Uncertainty and Risk: Multi-Disciplinary Perspectives. Earthscan: London, United Kingdom: 289-303.

Kerwin, A. (1993). None too solid: Medical ignorance. Knowledge, 15, 2: 166-185.

Rozenblit, L. and Keil, F. (2002). The misunderstood limits of folk science: An illusion of explanatory depth. Cognitive Science, 26, 5: 521-562.

Varki, A. and Brower, D. (2013). Denial: Self-deception, false beliefs, and the origins of the human mind. Hachette: London, United Kingdom.

Biography: Michael Smithson PhD is a Professor in the Research School of Psychology at The Australian National University. His primary research interests are in judgment and decision making under ignorance and uncertainty, statistical methods for the social sciences, and applications of fuzzy set theory to the social sciences.

This blog post is part of a series on unknown unknowns as part of a collaboration between the Australian National University and Defence Science and Technology.

Published blog posts in the series:
Accountability and adapting to surprises by Patricia Hirl Longstaff
https://i2insights.org/2019/08/27/accountability-and-surprises/

Scheduled blog posts in the series:
September 24: What do you know? And how is it relevant to unknown unknowns? by Matthew Welsh
October 8: Managing innovation dilemmas: Info-gap theory by Yakov Ben-Haim

Using discomfort to prompt learning in collaborative teams

By Rebecca Freeth and Guido Caniglia

Rebecca Freeth (biography)

We know that reflecting can make a marked difference to the quality of our collective endeavour. However, in the daily busyness of inter- and transdisciplinary research collaborations, time for reflection slides away from us as more immediate tasks jostle for attention. What would help us put into regular practice what we know in theory about prioritising time to reflect and learn?

Guido Caniglia (biography)

Discomfort sometimes provides the necessary nudge in the ribs that reminds us to keep reflecting and learning. The discomfort of listening to the presentation of a colleague you like and respect, but having very little idea what they’re talking about. Or, worse, failing to see how their research will make a worthy contribution to the collective project. The discomfort when an intellectual debate with a colleague turns personal. The discomfort of watching project milestones loom, knowing you’re seriously behind schedule because others haven’t done what they said.

We draw on the work of German pedagogue Tom Senninger (2000) to explore varying degrees of discomfort as prompts for on-the-job learning to collaborate in research teams. Too much discomfort and our stress levels can block learning. But with zero discomfort, there may be little stimulus to challenge one’s own assumptions. As shown in the figure below, somewhere in-between is a sweet spot that Senninger calls the ‘learning zone’, where there is enough discomfort to prompt inquiry and learning.

The learning zone between comfort and discomfort: the spectrum runs from extreme comfort to extreme discomfort, with the learning zone in between. In any shared experience, different members of a research team may experience different degrees of dis/comfort. Adapted from Senninger (2000).
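Senninger’s model can be caricatured as a simple mapping from a discomfort level to one of three zones. The sketch below is purely illustrative: Senninger’s model is qualitative, so the numeric scale and the `low`/`high` thresholds are invented placeholders, not values from his work.

```python
def zone(discomfort, low=0.3, high=0.7):
    """Map a discomfort score in [0, 1] to one of Senninger's three zones.

    The thresholds `low` and `high` are illustrative placeholders:
    the model itself assigns no numeric cut-offs.
    """
    if discomfort < low:
        return "comfort zone"   # too little challenge to prompt learning
    if discomfort <= high:
        return "learning zone"  # enough discomfort to prompt inquiry
    return "panic zone"         # stress levels block learning

# Different team members may sit in different zones for the same event:
team = {"Ana": 0.1, "Ben": 0.5, "Chen": 0.9}
for name, score in team.items():
    print(name, zone(score))
```

The point of the caricature is the one made in the text: for a single shared experience, each team member lands in a different zone, which is why discomfort needs to be addressed at the team level rather than assumed to be uniform.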

One of the complications in a team is that different individuals have different degrees of tolerance to discomfort and are triggered into discomfort by different things. We suggest actively and intentionally addressing discomfort in research teams, calling reflexive attention to it, even if not everyone is equally affected by it. In this way, both individual researchers and the whole team can enter the learning zone. The intention is to learn about oneself and each other, to capitalize on differences and find complementarities instead of getting stuck in controversies. In this way, discomfort has potential to prompt efforts to enhance communication, mutual understanding and integration for more robust research outputs.

This suggests that alongside regular ring-fenced times for team reflection, there is also value in using moments of discomfort to slow down and inquire: What’s happening now? What is causing discomfort? Such inquiry can lead to individual contemplation that opens back into a group conversation about different experiences and what these indicate for future learning needs as a team.

This then leads to questions about how to stay engaged through discomfort, to keep learning to collaborate together. Previous blog posts by Rebecca Freeth and Liz Clarke provide clues for skilful conversations for integration as well as a rationale for embracing tension for energy and creativity in interdisciplinary research.

What have you discovered about working with your own discomfort in collaborations? How have you made visible moments of collective discomfort in a team for the purpose of strengthening collaborative capacity?

To find out more:
Freeth, R. and Caniglia, G. (2019). Learning to collaborate while collaborating: Advancing interdisciplinary sustainability research. Sustainability Science. (Online) (DOI): https://doi.org/10.1007/s11625-019-00701-z

Reference:
Senninger, T. (2000). Abenteuer Leiten – in Abenteuern lernen: Methodenset zur Planung und Leitung kooperativer Lerngemeinschaften für Training und Teamentwicklung in Schule, Jugendarbeit und Betrieb. Oekotopia Verlag: Aachen, Germany.

Biography: Rebecca Freeth is a senior fellow at the Institute for Advanced Sustainability Studies (IASS) in Potsdam, Germany. As a practitioner, teacher and researcher, she has an abiding interest in how to strengthen collaboration, especially in situations where competition, conflict or controversy are the more familiar ways of engaging. She accompanies interdisciplinary and transdisciplinary teams in support of realizing more meaningful collaboration and more satisfying project outcomes.

Biography: Guido Caniglia PhD has research interests which revolve around three main areas: (1) epistemology of inter- and transdisciplinary sustainability science, with a focus on forms of real-world experimentation; (2) higher education for sustainable development, especially internationalization of the curriculum; (3) evolutionary biology, with a focus on the evolution of complex bio-social systems, from cities to insect societies. He is the Scientific Director of the Konrad Lorenz Institute for Evolution and Cognition Research in Klosterneuburg, Austria and previously held a Marie-Curie post-doctoral fellowship at Leuphana University in Lueneburg, Germany.
