Detecting non-linear change ‘inside-the-system’ and ‘out-of-the-blue’

Susan van ‘t Klooster and Marjolijn Haasnoot

1. Susan van ‘t Klooster (biography)
2. Marjolijn Haasnoot (biography)

Change can be expected, envisioned and known, and even created, accelerated or stopped. But change does not always follow a linear and predictable path, nor is it always controllable. Novelty and surprise are inescapable features of life. Non-linear change can involve threats or opportunities.

Although non-linear change defines the world we live in, who we are, the outlooks we have and what we do, we often do not relate to it in a meaningful way. What is holding us back from engaging with it? How do we deal with non-linear change? And what are promising ways forward?

Why is thinking about and anticipating non-linear change difficult?

Generally speaking, non-linearity is difficult to define and conceptualize: multiple forces interact at the intersection of many domains, manifest on different spatial and temporal scales, and involve many different actors with (often conflicting) perspectives. As a result, the nature of change, its underlying causalities, potential chain reactions and potential effects are all uncertain and at best partially knowable.

Non-linearity is also difficult to grasp because it concerns processes and events that may or may not happen. Such processes and events are complex and uncertain and can lead to different perspectives and disagreement between stakeholders. When non-linear change does occur, it may come as a surprise or shock and may have a disruptive effect. It is no overstatement to say that we are inexperienced with respect to non-linearity.

Instead, we are much more experienced in reasoning from gradual, evolutionary development. As a result of this general tendency to focus on the logical consequences of causal patterns in the past and present, being confronted with non-linear change may generate insecurity, confusion and a general feeling of discomfort. These feelings can lead us to ignore the signals and can paralyze decision making. Refraining from thinking about non-linear change thus makes us vulnerable.

We describe two different, but potentially synergetic, approaches in which detecting and monitoring seeds of change are key: ‘inside-the-system’ and ‘out-of-the-blue’, illustrated in the figure below. Both approaches share the idea that systems can become unstable beyond a critical value. Induced by fluctuations within a system and by external disturbances, a system can change instantaneously. In his book Earth in the Balance (2000), former US Vice President Al Gore uses the analogy of a pile of sand. Dropping one grain after another onto the pile does not budge the ‘grain system’; instead, it slowly builds a stable cone. At some point, however, a critical value is reached and the cone collapses. Each approach discussed below uses a different searchlight for finding system thresholds and changes.

Different search lights: ‘Inside-the-system’ and ‘Out-of-the-blue’ (Copyright: Susan van ‘t Klooster)
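The sand-pile analogy can be made concrete with a toy model. The sketch below is a minimal, hypothetical illustration (a simplified, Bak-Tang-Wiesenfeld-style sandpile invented for this purpose, not part of the original post): every cell sits just below a critical value, so the pile looks perfectly stable, yet one more grain triggers an avalanche through the whole system.

```python
# Toy 1-D sandpile: a hypothetical sketch of how a system near a
# critical value can change instantaneously when one more grain lands.

CRITICAL = 4  # a cell topples once it holds this many grains


def topple(cells):
    """Relax the pile: every cell at or above CRITICAL sheds two grains
    to each neighbour (grains falling off the edges are lost).
    Returns the number of topplings, i.e. the avalanche size."""
    avalanche = 0
    unstable = [i for i, c in enumerate(cells) if c >= CRITICAL]
    while unstable:
        i = unstable.pop()
        if cells[i] < CRITICAL:
            continue  # already relaxed by an earlier toppling
        cells[i] -= CRITICAL
        avalanche += 1
        for j in (i - 1, i + 1):
            if 0 <= j < len(cells):
                cells[j] += CRITICAL // 2
                if cells[j] >= CRITICAL:
                    unstable.append(j)
    return avalanche


def add_grain(cells, i):
    """Drop one grain on cell i and return the resulting avalanche size."""
    cells[i] += 1
    return topple(cells)


pile = [CRITICAL - 1] * 5      # stable cone: every cell just below critical
cascade = add_grain(pile, 2)   # a single extra grain collapses the pile
```

Adding one grain to a sub-critical pile does nothing; adding it to a pile at the edge of stability topples cells across the whole system, which is the essence of the threshold behaviour both approaches look for.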

Finding seeds of non-linear change – ‘inside-the-system’

This approach searches primarily for system indicators that may be an early warning prelude to change and adaptation tipping points. Coherent and longitudinal monitoring of these indicators can help to:

  • create a deeper insight into the system’s dynamics;
  • signal, in a timely manner, changes that jeopardize (or provide opportunities for) achieving defined objectives;
  • change plans and strategies to continue to achieve the objectives under changed conditions;
  • implement actions neither too early nor too late, and avoid investing too much or too little (Haasnoot et al., 2018).

By focusing on those seeds of change that we know relatively well because they are part of the current system – so-called ‘known unknowns’ – we become more conscious of potential new conditions and situations.

An example is adaptive planning in the context of Dutch delta management, where important change indicators include observed and projected sea-level rise along the coast, global mean sea-level rise, storm surge frequency and frequency of alarms to close storm surge barriers.
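The monitoring logic behind such indicators can be sketched in a few lines. The example below is purely illustrative (the function names, the millimetre values and the simple linear-trend extrapolation are all assumptions made here, not the Delta Programme's actual method): it raises a signal when an indicator's trend implies that an adaptation tipping point will be reached within the lead time needed to implement an alternative strategy.

```python
# Hypothetical sketch: flag an adaptation signal when a monitored
# indicator (e.g. local sea-level rise in mm) is on course to cross a
# tipping value within the lead time needed to adapt.

def years_to_tipping_point(observations, tipping_value):
    """Fit an ordinary least-squares trend to annual observations and
    estimate the years until the indicator crosses tipping_value."""
    n = len(observations)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(observations) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, observations))
             / sum((x - mean_x) ** 2 for x in xs))
    if slope <= 0:
        return float("inf")  # no upward trend: tipping point not in sight
    return max(0.0, (tipping_value - observations[-1]) / slope)


def needs_signal(observations, tipping_value, lead_time_years):
    """True when adaptation must start now to be ready in time."""
    return years_to_tipping_point(observations, tipping_value) <= lead_time_years
```

For example, with observations rising 5 mm per year and a (made-up) tipping value of 100 mm, the threshold is 16 years away: a strategy needing a 20-year lead time should be triggered now, while one needing only 10 years can wait.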

Finding seeds of non-linear change – ‘out-of-the-blue’

This approach searches for so-called ‘wild cards’: low-probability/high-impact events that may be a prelude to a major, sudden and disruptive break with the status-quo.

Signals of change are found beyond the dominant frames and outside the system. The focus here is on cross-cutting trends and events that may surprise us by coming ‘out-of-the-blue’. We do not know whether they will happen, how rapidly they may unfold, or what their effects might be.

It is, therefore, explicitly not the objective to predict and control such disruptive threats (or opportunities). Instead, this approach is aimed towards:

  • reducing our blind spots about the future;
  • becoming more literate in understanding the nature of potential disruptors;
  • being better prepared once our current systems are challenged; and
  • avoiding a panic reaction by creating an information and strategic advantage.

These changes are considered ‘unknown unknowns’, but it is possible to imagine what some of these changes could be, such as energy becoming available in a limitless supply, the collapse of a major currency, average global life expectancy increasing to 120 years or a new pest that wipes out grain crops.

Strengths and weaknesses

Both approaches have their own strengths and weaknesses. A risk of the ‘inside-the-system’ approach is that signals that lie outside ‘the system’ and do not fit into a dominant mental frame remain unnoticed. To avoid tunnel vision, it is important to actively search beyond the system for signals that could change current paradigms (potential game changers).

A weakness related to the ‘out-of-the-blue’ approach is the difficulty of making a solid link with decision-making.

At the same time, both approaches can reinforce one another: the strength of the first lies in its solid systems knowledge and its link with decision-making; the second has strong imaginative potential that can be used to avoid perceptual and interpretive biases.

Four questions we should ask ourselves to detect change

How can we better integrate both perspectives and create synergies between them? Synergy starts by asking the following four questions:

  • Descriptive: What emergent trends (‘inside-the-system’) and potentially disruptive events (‘out-of-the-blue’) do we see or can we imagine? This involves using both systems knowledge and imaginative power.
  • Estimative: How (un)certain are we? This involves embracing, explicating and using multivocality, instead of assuming consensus.
  • Generative: What is the relative impact? This involves focusing on both direct and indirect effects.
  • Responsive: What can we do in response? This involves linking analysis to action, e.g., adapting a plan or strategy, taking preparatory action, installing or altering a monitoring system, communicating outcomes to raise awareness, and conducting further research.

Monitoring both in and outside the system helps to detect and anticipate change and to prepare in a timely fashion if needed.

Final remarks

Do you have suggestions for other ways to deal with non-linearity? Do you have useful examples where ‘inside-the-system’ and ‘out-of-the-blue’ approaches have been successfully used to deal with non-linearities?

Gore, A. (2000). Earth in the balance. Ecology and the human spirit. Earthscan: New York, United States of America.

Haasnoot, M., van ’t Klooster, S. and van Alphen, J. (2018). Designing a monitoring system to detect signals to adapt to uncertain climate change. Global Environmental Change, 52: 273-285.

Biography: Susan van ‘t Klooster PhD researches and advises strategic policy and decision-making processes as a freelance consultant in the Netherlands. She specializes in foresight methodology, practice and processes. Her research interests include evaluative foresight, evaluative risk assessment and anticipatory monitoring. Her research covers a wide range of areas, including adaptive water management, spatial planning, environmental policy, population health, education, (aviation) security, social security and employment and migration management.

Biography: Marjolijn Haasnoot PhD is a senior researcher/advisor at Deltares (an independent institute for applied research in the field of water and subsurface) and Associate Professor at Utrecht University in the Netherlands. She specializes in water management, climate adaptation, integrated assessment modeling and decision-making under deep uncertainty. Over the past 20 years she worked on international and national research and consultancy projects assessing impacts of climate change, sea level rise, socio-economic developments and alternative management options to develop robust and adaptive plans. She developed the Dynamic Adaptive Policy Pathways (DAPP) method to support decision making under uncertain change. She was one of the founders of the Society for Decision Making under Deep Uncertainty.

This blog post is part of a series on unknown unknowns as part of a collaboration between the Australian National University and Defence Science and Technology.

For the eight other blog posts already published in this series, see:

Scheduled blog posts in the series:
January 28, 2020: How can resilience benefit from planning? by Pedro Ferreira
February 11, 2020: Why do we protect ourselves from unknown unknowns? by Bem Le Hunte
February 25, 2020: Theory U: a promising journey to embracing unknown unknowns by Vanesa Weyrauch


11 thoughts on “Detecting non-linear change ‘inside-the-system’ and ‘out-of-the-blue’”

  1. Thanks for an enjoyable, different, and insightful article. Your analysis reminded me of deep national security analysis at the height of the Cold War with the then Soviet Union and about the fear of an “out of the blue” nuclear attack. That uncertainty was on both sides and to reduce the potential, both sides agreed to protocols for sharing information that could be misinterpreted and both sides created deterrent forces as threats to any foolishness. This example is not quite aligned with your article, but implies the need to be vigilant and to be aware of how things can go wrong. For more insights about this, a look at almost ancient history, the Cuban Missile Crisis, might be informative.

    In the late 90s, John Petersen produced Out of the Blue: Wild Cards and Other Big Future Surprises. He offers some thoughts about a framework for looking for surprises and suggests looking at almost 150 pages of wild cards (some of which appear to have happened or be happening). The idea is to use them as internal preparation exercises or what-if exercises, or to create similar ones for the organization. A way to do this is to suggest an unusual end point and work backwards to see how that would happen. In the futures community this is called backcasting.

    This approach is not always embraced by organizational leadership. One question I’ve used to stimulate thinking is: Your organization has just been put out of business by a competitor; how did that happen? Frequently, the question produces derision, then insight, then fear, then resolve to not let that dire scenario happen. That range of reactions is, however, anything but linear and looks more like the spaghetti line that defines design innovation.
    Your concept of “adaptation tipping point” is intriguing and I took it to mean how adaptations to an emerging, non-linear reality could produce a shift away from the problem state, e.g., what would be the “adaptation tipping point” for climate change when we have created enough adaptations to not have to worry about it any more?

    I do have a minor nit-pick about the use of the sand grains: they would be known, and if a situation has continual pressures of similar types, the analysis of the results would be more constrained. Consider complementing the sand grains with other conditions, e.g., wind and rain, and then traipse down a wild card path, adding a really unexpected force (earthquake, asteroid strike, etc.). Doing that stretches the thinking, which is one of the purposes of the exercise. The challenge, I think, is to define the right system boundaries and the external forces.

    Again, I found the article very useful and appreciate your insights.

    • Dear Jim, thanks for your inspiring and very useful feedback! I am very much intrigued by the Cold War example (on protocols for sharing information), I love the examples from your obviously rich experience in foresight, and your remark about the ‘sand grain’ metaphor is correct, indeed! Thanks a lot!

      We use Adaptation Tipping Points (ATP) to refer to situations in which the magnitude of change is such that current management strategies will no longer be able to meet objectives. For example, in the case of accelerated sea level rise, it may no longer be sufficient to raise dikes to guarantee safety. An ATP analysis can be used to evaluate under what conditions strategies may fail and alternative strategies are needed. I can send you a recent article on this subject, if you are interested.

      I also took the chance to reread your own contribution (Can foresight and complexity play together) :-). Your work on combining the four domains (‘clear’, ‘complicated’, ‘complex’ and ‘chaotic’) is very important. It reminds me of foresight work by the PBL Netherlands Environmental Assessment Agency. This agency has a long tradition in combining Research (in cases of ‘clear’ cause-effects), Prognoses (for understanding complicated systems), Scenarios (quantitative and/or qualitative, for complex systems) and Speculation (or ‘what if – analyses, to understand new, unprecedented developments and events and to stimulate ‘out of the box thinking’). See also where these combined methods are explained in much more detail. On their website, you may also find some recent examples of such combined approaches. I hope you find this useful.

      Thanks again for your comments! And let’s stay in touch,

  2. Thanks for your post. Mark Bonchek talks about creating an “Exponential Mindset” to help deal with non-linear change or progress. His article is within this post: As you say, being able to scan the external and internal environment is essential for picking up feedback about significant developments and changes that could affect and shape ongoing progress and innovation, etc. Central to this scanning is being able to listen to noise and manage fluctuations. I explore this and provide some examples of when this has and has not happened here:

  3. Thanks for the concise approach to the topic. During the research for my master’s thesis, ‘Effects of deep uncertainty on firm risk management’, I was astounded by the level of ignorance of, and lack of consideration for, deep uncertainty and non-linear, low-probability/high-impact effects by firms.

    The ‘inside-the-system’ change/event appears to be covered (at least from the perspective of firms) by ever-greater data collection that is then analysed with algorithms and ‘AI-based’ analytics, both of which entail built-in biases that are most often not reflected upon, leaving this approach sub-optimal but highly sellable.

    I think the questions you raised at the end of this article are excellent starting points. The ‘Estimative’ point is especially intriguing, as it appears to be the one most ready-made for quantification, and since quantification and data-driven analytics are currently the most widely accepted basis for decision making in firms, it could serve as a starting point for more widespread adoption.

    I find the whole topic-complex fascinating, but sadly its lack of adoption among private businesses offers no pathway to turn this fascination into a career.

    Thanks again for the article

    • Thanks a lot! I share your observation that many people/organisations are still inexperienced in and often reluctant to engage in dealing with non-linear change and ‘unknowns’. It asks for a different mind-set: when uncertainty is considered limited or the dimensions of uncertainty are perceived to be well understood, it makes sense to resort to mainly quantitative approaches (e.g. based on linear extrapolation, trend-based and a forecasting way of representing uncertainty). However, for dealing with non-linear change and ‘unknowns’, data cannot be gathered or analyzed in a classical sense. This also means that when you want to ‘estimate’ uncertainties regarding newly emerging phenomena, you have to think about alternative ways to construct evidence and solidity (and comfort, for that matter). Not instead of quantification and data-driven analysis, but in addition to it! I hope we have provided some useful vocabulary and a starting point for more widespread adoption.

  4. A very thoughtful piece from a very high level. I have always wondered how an organization keeps up-to-date information about what is going on at the internal level. The news about Boeing’s corporate culture makes me wonder whether people at the top could really know what was happening and how an internal crisis (like two crashes) can change the whole thing. I think Boeing tells us that many things can cause something that was an unknown (or at least an unacknowledged unknown). Can a large organization come to grips with what is really going on if managers only send up good information? Or are these organizations just bound to come to grips with unknowns sooner or later? And when the bad stuff happens, do they respond by saying, “I didn’t know”? But how could you know what is happening in a large organization? How do you look for those grains of sand? Thank you again for a very interesting piece.

    • Thanks Patricia. In our blog post we mainly focused on the ‘external level’: we tried to present an alternative approach and useful vocabulary for identifying shocks and mutations in the dynamics of systems and contextual trends and developments. I totally agree that the internal level is highly relevant here. To be able to deal with non-linear change, you need to invest in the strategic competence of your organization and the capacity to translate relevant information into policy, strategy and/or innovation. In this process, there may be many cognitive, perceptual and interpretive biases at work. Interesting in this context is, for example, the work of Ansoff (1979, 1984) and Senge (1990), who have written about the role of filters and mental models and the way they hamper our sensitivity and receptiveness to new kinds of information and information flows in organizations in general. In this blog post series, Smithson has also shared interesting viewpoints on this topic! Another piece of literature that I found very inspiring is the book ‘The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA’ by Diane Vaughan! She analyzed the work group culture and the environment in which experts (NASA engineers and managers) work: how they negotiated risk and took decisions under uncertainty. It gave me new and valuable insights into how to look for those ‘internal’ grains of sand.

  5. Thanks for sharing, it is very helpful, although I must still read it a few more times. The rate of change can have a negative or positive impact on the mental ability to react and make decisions. When immature leaders and managers, as part of a system, cannot assess and cope properly, it can paralyze the system. This is all that I can add, Susan and Marjolijn.
    (Comment copied from LinkedIn by Gabriele Bammer)

    • Thanks, Quinton, for your interesting feedback. In our work, we tend to focus on the question how to get timely and reliable signposts that are convincing for decision-makers, leaders and managers to act upon. And here, indeed, the mental ability to react and make decisions is very relevant as well! Thanks for mentioning this!

