Synthesis centers as critical research infrastructure

Community member post by Andrew Campbell


When we think of research infrastructure, it is easy to associate astronomers with telescopes, oceanographers with research vessels and physicists with particle accelerators.

But what sort of research infrastructure (if any) do we need in order to do more effective multidisciplinary, interdisciplinary and transdisciplinary research on big, complex, ‘wicked’ challenges like climate change or food security?

Some eminent colleagues and I argue in a new paper (Baron et al., 2017) that the answers include:

  • good coffee, beer, wine and food;
  • in distraction-free places that are nevertheless supported by leading-edge informatics;
  • which attract diverse groups of scientists (by discipline, gender, age, career stage, location);
  • to work on and across heterogeneous datasets; and,
  • in skilfully facilitated processes designed to foster ‘a balanced mix of rationality and adventurous association… creative unstructured thought and discussion.’

More than twenty years ago, the US National Science Foundation (NSF), the Ecological Society of America and the Association of Ecological Research Centers identified the need for a place to undertake “multidisciplinary analysis of complex environmental problems” with the core functions being seen as advancing basic science, organising complex information so as to be more useful for decision-makers, and making better use of existing data.

The NSF funded the National Center for Ecological Analysis and Synthesis (NCEAS) at the University of California, Santa Barbara from 1995, and subsequently invested in a further three centers, the most recent being the National Socio-Environmental Synthesis Center (SESYNC) at the University of Maryland. Over that period more than a dozen other synthesis centres have been established around the world, funded by a range of organisations.

The most common activity of synthesis centres is support for working groups of up to 20 people, who come together for intensive collaboration:

  • for several days at a time;
  • often across a series of meetings spanning up to three years; and,
  • supported by dedicated research staff and sophisticated informatics to assist with integration and analysis of heterogeneous data.

Teams are usually constructed with care to deliberately combine experts with different backgrounds, expertise and perspectives to explore a given topic through multiple lenses.

In terms of physical infrastructure, scientific synthesis centres may indeed look like boutique hotels in cool places with top notch WiFi, characterised more by their break-out spaces and nearby restaurants and mountain bike trails than their labs or auditoriums. But the real infrastructure is mostly not hardware but informatics software and insight about dynamic social processes of scientific discourse and inquiry.

The six critical ingredients identified in our paper (the authors of which include ten current or former directors of synthesis centres) are:

  1. active management of social dynamics and intellectual space;
  2. cutting edge informatics;
  3. organisational flexibility;
  4. support for students, postdocs and sabbatical fellows;
  5. diversity within working groups; and,
  6. offering time and space (physical and intellectual) for group associative thinking.

These are in line with factors identified by Margaret Palmer and colleagues in their blog post on eight institutional practices to support interdisciplinary research.

Parker and Hackett (2012) note that focused time away from outside distraction led to “hot spots and hot moments” of unusually high creativity, enabling potentially transformative science.

There is strong bibliometric evidence that collaborations fostered in synthesis centres (reflected in co-authorship) last well beyond the synthesis-centre activity, and that interdisciplinary collaboration and a larger number of co-authors increase research productivity and impact. Bob Costanza and colleagues (1997) produced one of the most highly-cited papers of all time through an NCEAS workshop.

Telescopes, research vessels and particle accelerators are undoubtedly important tools for enabling humans to understand more about our world. But coming up with policy and management solutions for grand societal challenges requires much more than fancy scientific ‘kit’. It requires the combined insights of talented people from multiple perspectives (not all of them scientific), using multiple, diverse and often incomplete data, to develop new ideas interactively. We are learning from experience some of the ingredients for fostering such processes, and our new paper attempts to distil these lessons.

I’d welcome readers’ comments on their experiences in synthesis centres, or other focused time away from outside distraction.

To find out more:
Baron, J. S., Specht, A., Garnier, E., Bishop, P., Campbell, A., Davis, F. W., Fady, B., Field, D., Gross, L. J., Guru, S. M., Halpern, B. S., Hampton, S. E., Leavitt, P. R., Meagher, T. R., Ometto, J., Parker, J. N., Price, R., Rawson, C. H., Rodrigo, A., Sheble, L. A., and Winter, M. (2017). Synthesis centers as critical research infrastructure. BioScience, 67(8): 750-759. Online (Free): https://www.sciencebase.gov/catalog/item/594800f9e4b062508e3442f7. Online (DOI): 10.1093/biosci/bix053

References:

Costanza, R., d’Arge, R., de Groot, R., Farber, S., Grasso, M., Hannon, B., Limburg, K., Naeem, S., O’Neill, R. V., Paruelo, J., Raskin, R. G., Sutton, P. and van den Belt, M. (1997). The value of the world’s ecosystem services and natural capital. Nature, 387: 253-260.

Parker J. N. and Hackett E. J. (2012). Hot spots and hot moments in scientific collaborations and social movements. American Sociological Review, 77: 21–44.

Biography: Andrew Campbell is the Chief Executive Officer of the Australian Centre for International Agricultural Research, in Canberra Australia. He is also a Visiting Fellow at the Australian National University’s Fenner School of Environment and Society, and a Commissioner with the International Union for Conservation of Nature (IUCN) World Commission on Protected Areas. His research interests span the interactions between climate, water, energy and agrifood systems, and the interface between knowledge, science and policy.

Managing deep uncertainty: Exploratory modeling, adaptive plans and joint sense making

Community member post by Jan Kwakkel


How can decision making on complex systems come to grips with irreducible, or deep, uncertainty? Such uncertainty has three sources:

  1. Intrinsic limits to predictability in complex systems.
  2. A variety of stakeholders with different perspectives on what the system is and what problem needs to be solved.
  3. Dynamic change in complex systems, which means they can never be completely understood.

Deep uncertainty means that the various parties to a decision do not know or cannot agree on how the system works, how likely various possible future states of the world are, and how important the various outcomes of interest are. This implies that, under deep uncertainty, it is possible to enumerate possible representations of the system, plausible futures, and relevant outcomes of interest, without being able to rank order them in terms of likelihood or importance.

There is an emerging consensus that effort needs to be devoted to making any decision regarding a complex system robust with respect to such uncertainties. A plan is robust if its expected performance is only weakly affected by deep uncertainty. Alternatively, a plan can be understood as being robust if no matter how the future turns out, there is little cause for regret (the so-called “no regrets” approach to decision making).
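
The regret notion above can be made concrete with a toy minimax-regret calculation. The plans, futures and performance scores below are hypothetical illustrations, not drawn from any of the cited papers:

```python
# Hypothetical performance scores for three plans across four plausible
# futures (higher is better). Under deep uncertainty we cannot rank the
# futures by likelihood, so we compare plans by worst-case regret.
performance = {
    "plan_a": [10, 8, 9, 2],
    "plan_b": [9, 7, 8, 6],
    "plan_c": [12, 3, 5, 6],
}

n_futures = 4

# Best achievable performance in each future, across all plans.
best = [max(performance[p][f] for p in performance) for f in range(n_futures)]

# A plan's regret in a future is the gap between what it achieves there
# and the best that could have been achieved in that future.
max_regret = {
    p: max(best[f] - scores[f] for f in range(n_futures))
    for p, scores in performance.items()
}

# The minimax-regret plan: smallest worst-case regret across futures.
robust_plan = min(max_regret, key=max_regret.get)
print(robust_plan, max_regret)
```

Here plan_b is the robust choice: although plan_a and plan_c outperform it in some futures, its worst-case regret across all futures is the smallest.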

Over the last decade a new paradigm, known as ‘decision-making under deep uncertainty’, has emerged to support the development of robust plans. This paradigm rests on three key ideas: (i) exploratory modeling; (ii) adaptive planning; and, (iii) joint sense-making.

Exploratory modeling

Exploratory modeling allows examination of the consequences of the various irreducible uncertainties for decision-making. Typically, in the case of complex systems this involves the use of computational scenario approaches (see also the blog post by Laura Schmitt-Olabisi on Dealing with deep uncertainty: Scenarios).

A set of models that is plausible or interesting in a given context is generated by the uncertainties associated with the problem of interest, and is constrained by available data and knowledge. A single model drawn from the set is not a prediction. Rather, it is a computational ‘what-if’ experiment that reveals how the real world system would behave if the various assumptions this particular model makes about the various uncertainties were correct.

A single ‘what-if’ experiment is typically not that informative, other than suggesting the plausibility of its outcomes. Instead, exploratory modeling aims to support reasoning and decision-making on the basis of the set of models. Thus exploratory modeling involves searching through the set of models using (many-objective) optimization algorithms, and sampling over the set of models using computational design of experiments and global sensitivity analysis techniques. By searching through the set of models, one can identify which (combination of) uncertainties negatively affects the outcomes of interest. In light of this, actions can be iteratively refined to be robust with respect to these uncertainties.
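
As a sketch of the idea (with an invented toy model and an invented acceptability threshold, chosen purely for illustration), an exploratory-modeling run generates an ensemble of ‘what-if’ experiments by sampling over the uncertainties and then asks which sampled futures perform unacceptably:

```python
import random

random.seed(42)

# A toy 'what-if' model: the outcome depends on two deeply uncertain
# factors. Both the model and the threshold below are illustrative
# assumptions, not taken from the post or the cited literature.
def run_experiment(growth_rate, shock_severity):
    """One computational experiment: the outcome if these assumptions held."""
    return 100 * (1 + growth_rate) - 40 * shock_severity

# Generate an ensemble of plausible cases by sampling the uncertainties.
ensemble = [
    {"growth_rate": random.uniform(-0.05, 0.05),
     "shock_severity": random.uniform(0.0, 2.0)}
    for _ in range(1000)
]

results = [run_experiment(**case) for case in ensemble]

# Simplified scenario discovery: which combinations of uncertainties
# produce outcomes below the acceptability threshold?
failures = [case for case, r in zip(ensemble, results) if r < 60]
share_failing = len(failures) / len(ensemble)
print(f"{share_failing:.0%} of sampled futures fall below the threshold")
```

Inspecting the cases collected in `failures` then reveals which (combinations of) uncertain factors drive poor outcomes, so that candidate actions can be refined against them.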

Adaptive planning

Adaptive planning means that plans are designed from the outset to be altered over time in response to how the future actually unfolds. In this way, modifications are planned for, rather than taking place in an ad hoc manner. The flexibility of adaptive plans is a key means of achieving decision robustness.

This means that a wide variety of futures has to be explored. Insight is needed into which actions are best suited to which futures, as well as what signals from the unfolding future can be monitored in order to ensure the timely implementation of the appropriate actions. Adaptive planning thus involves a paradigm shift from planning in time, to planning conditional on observed developments.
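
A minimal sketch of this conditional, trigger-based logic (the monitored signal, the trigger value and the actions are all hypothetical):

```python
# A toy adaptive plan: the contingent action is pre-planned, and the
# switch is triggered by a monitored signal from the unfolding future
# rather than by a fixed date.

def adaptive_plan(observed_signal, trigger=0.3):
    """Choose the pre-planned action conditional on the monitored signal."""
    if observed_signal < trigger:
        return "continue current policy"
    return "switch to contingency action"

# As the future unfolds, the indicator is re-checked each period so the
# appropriate pre-planned action can be implemented in time.
observations = [0.10, 0.18, 0.27, 0.41]
decisions = [adaptive_plan(s) for s in observations]
print(decisions)
```

The point of the sketch is the structure, not the numbers: the plan is written as a function of observed developments, so modification is designed in rather than ad hoc.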

Joint sense-making

Decision making on uncertain complex systems generally involves multiple actors who have to come to agreement. In such a situation, planning and decision-making require an iterative approach that facilitates learning across alternative framings of the problem, and learning about stakeholder preferences and trade-offs, in pursuit of a collaborative process of discovering what is possible.

Various decision analytic techniques can be used to enable a constructive learning process amongst the stakeholders and analysts. In this conceptualization, decision analysis must shift away from a priori agreement on (or imposition of assumptions about) the probability of alternative states of the world and the way in which competing objectives are to be aggregated, with the aim of producing a preference ranking of decision alternatives. Instead, decision analysis must shift to an a posteriori exploration of trade-offs amongst objectives and their robustness across possible futures. Decision analysis should move away from trying to dictate the right choice, and instead aim at enabling deliberation and joint sense-making amongst the various parties to a decision. (For more on sense-making, see Bethany Laursen’s blog post on Making sense of wicked problems.)
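
One way to support such a posteriori exploration is to filter the candidate plans down to the non-dominated (Pareto) set and let the parties deliberate over the remaining trade-offs, rather than aggregating the objectives up front. A minimal sketch with hypothetical plans scored on two objectives:

```python
# Hypothetical (cost_saving, reliability) scores for candidate plans;
# higher is better on both objectives.

def dominates(a, b):
    """True if alternative a is at least as good as b on every objective
    and strictly better on at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

alternatives = {
    "plan_a": (4, 9),
    "plan_b": (6, 7),
    "plan_c": (5, 6),   # dominated by plan_b on both objectives
    "plan_d": (9, 3),
}

# Keep only the alternatives no other alternative dominates.
pareto_set = {
    name for name, scores in alternatives.items()
    if not any(dominates(other, scores)
               for o_name, other in alternatives.items() if o_name != name)
}
print(sorted(pareto_set))
```

The dominated plan drops out on purely logical grounds; choosing among the plans that remain is left to deliberation over the trade-off, not to a pre-imposed weighting.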

Closing remarks

Exploratory modeling, adaptive planning, and joint sense-making are the three key ideas that underpin the emerging paradigm of decision making under deep uncertainty. Specific approaches that exemplify these ideas include (many-objective) robust decision-making, dynamic adaptive policy pathways, decision scaling, info-gap decision theory, adaptive policy making and assumption based planning. Notwithstanding the many technical differences between these approaches, there is an increasing emphasis on what they share. In practice, too, people are increasingly adopting aspects from multiple approaches in order to offer context-specific support for making decisions under deep uncertainty.

What has your experience been with decision making under deep uncertainty? What methods have you found to be useful?

Further reading:
Bankes, S. C. (1993). Exploratory Modeling for Policy Analysis. Operations Research, 41(3): 435-449.

Haasnoot, M., Kwakkel, J. H., Walker, W. E. and ter Maat, J. (2013). Dynamic adaptive policy pathways: A method for crafting robust decisions for a deeply uncertain world. Global Environmental Change, 23: 485-498. Online (DOI): 10.1016/j.gloenvcha.2012.12.006

Herman, J. D., Reed, P. M., Zeff, H. B. and Characklis, G. W. (2015). How should robustness be defined for water systems planning under change? Journal of Water Resources Planning and Management, 141(10). Online (DOI): 10.1061/(ASCE)WR.1943-5452.0000509

Kwakkel, J. H., Walker, W. E. and Haasnoot, M. (2016). Coping with the Wickedness of Public Policy Problems: Approaches for Decision Making under Deep Uncertainty. Journal of Water Resources Planning and Management. Online (DOI): 10.1061/(ASCE)WR.1943-5452.0000626

Lempert, R. J., Groves, D. G., Popper, S. W. and Bankes, S. C. (2006). A General, Analytic Method for Generating Robust Strategies and Narrative Scenarios. Management Science, 52: 514-528. Online (DOI): 10.1287/mnsc.1050.0472

Walker, W. E., Haasnoot, M. and Kwakkel, J. H. (2013). Adapt or Perish: A Review of Planning Approaches for Adaptation under Deep Uncertainty. Sustainability, 5: 955-979. Online (DOI): 10.3390/su5030955

Biography: Jan Kwakkel is an associate professor at Delft University of Technology in the faculty of Technology, Policy and Management. He has a background in systems engineering and policy analysis for transport systems. His current research focuses on supporting decision making under deep uncertainty. This involves the development of taxonomies and frameworks for uncertainty analysis and adaptive planning, as well as research on model-based scenario approaches for designing adaptive plans. He has applied his research in various domains, including transportation, energy systems, and health. His primary application domain is climate adaptation in the water sector. A secondary research interest is in text mining of science and patent databases. His research is currently funded for four years through a personal development grant of the Dutch National Science Foundation.

Four best practices for scaling up effective innovations

Community member post by Amanda Fixsen, Karen Blase and Dean Fixsen

What is involved in effective scaling up of innovations in order to achieve social impact? Here are four best practices, drawn from our experience in scaling up human services innovations and programs for children and families. We also provide definitions of the key terms used.

1. Understand the target audiences

Effectively scaling innovations first requires attention to defining the denominator, or population of interest for the scale-up effort, as well as the numerator, or the number of children and families who are receiving the innovation with fidelity and good outcomes.

2. Purposeful design leads to high-fidelity use

Human service systems are legacy systems, composed of an accumulation of fragments of past mandates, good ideas, beliefs, and ways of working that have evolved over many decades as legislators, leaders, and staff have come and gone. These legacy systems can be fragmented, siloed and inefficient.

To realize social impact, organizations and systems need to be designed, or re-designed, on purpose to produce and sustain high-fidelity use of effective innovations.

3. Focus on scaling proven programs

Attempts to scale ineffective or harmful programs are a waste of time, money and opportunity, so programs must reliably produce positive outcomes for the population of interest.

Given that we are focused on scaling interaction-based programs that require service providers to use the program within a larger systems context, there is a great deal of complexity involved in “scaling up.” It may be difficult to assess the quality of the program for the children and families who are receiving it, as good fidelity measures for programs are not common.


Productive multivocal analysis – Part 2: Achieving epistemological engagement

Community member post by Kristine Lund


In a previous blog post I described multivocality (i.e., harnessing multiple voices) in interdisciplinary research and how research I was involved in (Suthers et al., 2013) highlighted pitfalls to be avoided. This blog post examines four ways in which epistemological engagement can be achieved. Two of these are positive and two may have both positive and negative aspects, depending on how the collaboration plays out.

Once a team begins analyzing a shared corpus from different perspectives — in our case, it was a corpus of people solving problems together — it’s the comparison of researchers’ respective analyses that can be a motor for productive epistemological encounters between the researchers.

Productive multivocal analysis – Part 1: Avoiding the pitfalls of interdisciplinarity

Community member post by Kristine Lund


Many voices are expressed when researchers from different backgrounds come together to work on a new project, and it may sound like cacophony. All those voices are competing to be heard. In addition, researchers make different assumptions about people and data, and if these assumptions are not brought to light, the project can reach an impasse later on and much time can be wasted.

So how can such multivocality be positive and productive, while avoiding trouble? How can multiple voices be harnessed to not only achieve the project’s goals, but also to make scientific progress?

Toolkits for transdisciplinary research

Community member post by Gabriele Bammer


If you want to undertake transdisciplinary research, where can you find relevant concepts and methods? Are there compilations or toolkits that are helpful?

I’ve identified eight relevant toolkits, which are described briefly below and in more detail in the journal GAIA’s Toolkits for Transdisciplinarity series.

One toolkit provides concepts and methods relevant to the full range of transdisciplinary research, while the others cover four key aspects: (i) collaboration, (ii) synthesis of knowledge from relevant disciplines and stakeholders, (iii) thinking systemically, and (iv) making change happen.