By Moein Khazaei, Mohammad Ramezani, Amin Padash and Dorien DeTombe
How can services provided to citizens be overhauled so that they survive, are competitive and are fair (e.g., accessible to all)? Is there a systematic way in which shared value can be created? By shared value we mean combining social and environmental interests with corporate interests.
We have developed a methodology that we call “System redesign toward creating shared value” or SYRCS. It comprises four stages, shown in the figure below.
How does your team make decisions? Do you vote? Does the loudest voice usually win? Does everyone on the team generally feel heard? Does your team have a charter to provide guidance? Or maybe there is often just silence and the team assumes agreement?
The next time your team makes a decision, here is something new you can try! Kaner (2014) proposes using a gradients-of-agreement scale. The scale, also known as the consensus spectrum, provides an alternative to yes/no decision-making by allowing everyone to mark their response along a continuum, as shown in the figure below.
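As a minimal sketch of how responses along such a continuum might be collected and tallied (the gradient labels below are an illustrative paraphrase of Kaner's scale, whose exact wording and number of gradients vary between versions, and the `summarise` helper is invented for this example):

```python
from collections import Counter

# Illustrative gradient labels, loosely paraphrasing Kaner's scale;
# exact wording and number of gradients vary between versions.
GRADIENTS = [
    "Whole-hearted endorsement",
    "Agreement with minor reservations",
    "Support with reservations",
    "Abstain",
    "More discussion needed",
    "Don't like it, but will support",
    "Serious disagreement",
    "Veto",
]

def summarise(responses):
    """Tally each member's position (an index into GRADIENTS),
    reporting the spread of views rather than a yes/no count."""
    tally = Counter(responses)
    return {label: tally.get(i, 0) for i, label in enumerate(GRADIENTS)}

# Five team members mark their positions along the continuum.
votes = [0, 1, 1, 2, 5]
for label, n in summarise(votes).items():
    if n:
        print(f"{n} x {label}")
```

The point of reporting the full spread, rather than a majority count, is that a single "Don't like it, but will support" surfaces a concern a simple vote would hide.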
Do you get frustrated when decision-makers avoid doing their jobs? Do you wish you could identify the techniques they use to avoid making decisions so that you can better hold them to account?
Here I identify 14 aspects of the art of non-decision-making based on my experience serving in, and observing, a range of international organisations.
1. Definitional games: This is the practice of defining categories one way in one document or organisational unit, and then defining them another way elsewhere or at some later time. The art is to use this approach to obscure opportunities or to selectively advance particular strategies. At the same time, competing definitions may be used to justify apparently incompatible strategies.
Improved resilience can contribute to the ability to deal with unknown unknowns. Dealing with uncertainty is also at the core of every planning activity. The argument put forward here is that planning processes should be considered a cornerstone for any given resilience approach. An outline of planning and resilience is given, before presenting fundamental aspects of planning that should be strengthened within a resilience strategy.
From attempting to do as much as possible within a day’s work, to launching rockets into space or managing a nation, everything requires planning.
Change can be expected, envisioned and known, and even created, accelerated or stopped. But change does not always follow a linear and predictable path, nor is it always controllable. Novelty and surprise are inescapable features of life. Non-linear change can involve threats or opportunities.
Although it defines the world we live in, who we are, the outlooks we have and what we do, we often do not relate to non-linear change in a meaningful way. What is holding us back from engaging with it? How do we deal with non-linear change? And what are promising ways forward?
Sometimes, we wonder why decisions in Asia are being made at gargantuan speed. How do Asians deal with uncertainty arising from unknown unknowns? Can yin-yang thinking, which is typical of several Asian cultures, provide a useful answer?
Let’s look at differences between Asian and Western thinking first. Western people tend to prefer strategic planning with linear extrapolation of things past. The underlying mantra is risk management to buffer the organisation and to protect it from harmful consequences for the business. But juxtaposing risk and uncertainty is critical: under conditions of uncertainty, linearity breaks down and risk management is of limited use.
What’s a productive way to think about undesirable outcomes and how to avoid them, especially in an unpredictable future full of unknown unknowns? Here I describe the technique of vulnerability analysis, which essentially has three steps:
Step 1: Identify undesirable outcomes to be avoided
Step 2: Look for conditions that can lead to such outcomes, i.e., vulnerabilities
Step 3: Manage the system to mitigate or adapt to vulnerable conditions.
The power of vulnerability analysis is that, by starting from outcomes, it avoids making assumptions about what led to the vulnerabilities.
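The three steps can be sketched as simple data structures. The water-supply outcomes, conditions and responses below are all invented for illustration, not drawn from any real analysis:

```python
# Step 1: undesirable outcomes to be avoided
outcomes = ["town runs out of drinking water"]

# Step 2: conditions (vulnerabilities) that can lead to each outcome
vulnerabilities = {
    "town runs out of drinking water": [
        "single reservoir below 20% capacity",
        "no alternative supply connection",
    ],
}

# Step 3: management responses that mitigate or adapt to each condition
responses = {
    "single reservoir below 20% capacity": "trigger staged water restrictions",
    "no alternative supply connection": "build inter-basin pipeline link",
}

for outcome in outcomes:
    print(f"Avoid: {outcome}")
    for condition in vulnerabilities[outcome]:
        print(f"  if {condition} -> {responses[condition]}")
```

Note the direction of the mapping: it runs from outcome back to conditions, so no assumption is needed about which future scenario produces the vulnerability.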
In interdisciplinary research it’s common for at least some data to be analysed using statistical techniques. Have you been taught to look for ‘p < 0.05’ meaning that there is a less than 5% probability that the finding occurred by chance? Do you look askance at your statistician colleagues when they tell you it’s not so simple? Here’s why you need to believe them.
The whole focus on p < 0.05 to the exclusion of all else is a historical hiccup, based on a throwaway line in a manual for research workers. That manual was produced by none other than R.A. Fisher, giant of statistical inference and inventor of statistical methods ranging from the randomised block design to the analysis of variance. But all he said was that “[p = 0.05] is convenient to take … as a limit in judging whether a deviation is to be considered significant or not.” Convenient, nothing more!
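To see how arbitrary the 0.05 cut-off is, here is a small stdlib-only sketch computing exact two-sided p-values for coin-flip experiments; the function name and example numbers are mine, chosen purely for illustration:

```python
from math import comb

def binomial_p_value(k, n):
    """Exact two-sided p-value: the probability, under a fair coin,
    of an outcome at least as improbable as k heads in n flips."""
    probs = [comb(n, i) * 0.5**n for i in range(n + 1)]
    observed = probs[k]
    return sum(pr for pr in probs if pr <= observed)

# 15 heads in 20 flips falls just under the conventional threshold,
# while 14 heads does not, although the evidence differs by only one
# coin flip.
print(round(binomial_p_value(15, 20), 4))  # → 0.0414
print(round(binomial_p_value(14, 20), 4))  # → 0.1153
```

Nothing in the data changes character at 0.05 itself: the cut-off is, as Fisher said, merely convenient.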
By Russell Gorddard, Matthew Colloff, Russell Wise and Michael Dunlop
Adapting to climate change can require profound alterations in environmental management and policy. However, the social context of a decision process limits options and resists change, often dooming attempts to adapt to climate change before they even begin. How can decision makers in policy and management more effectively see the institutional and social water they swim in, in order to better drive change?
Values, rules and knowledge (vrk) provide a useful heuristic to help decision makers analyse how the social system shapes their decision context. Put simply, decisions require:
knowledge of options and their implications
values to assess the options
rules that enable implementation.
Viewing the decision context as an interconnected system of values, rules and knowledge can reveal limits to adaptation and suggest strategies for addressing them (Gorddard et al. 2016).
Values are the set of ethical precepts that determine the way people select actions and evaluate events.
Rules are both rules-in-use (norms, practices, habits, heuristics) and rules-in-form (regulations, laws, directives).
Knowledge is both evidence-based (scientific and technical) knowledge and experiential knowledge.
Decision context is the subset of interacting subsystems that are at play in a particular decision process. One core idea is that the decision context may exclude relevant values, knowledge or rules from being considered in decisions. Adaptation may therefore involve change in the decision context.
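A minimal sketch of this idea, representing the decision context as the admitted subset of each subsystem; the coastal-planning elements below are invented for illustration and are not from Gorddard et al. (2016):

```python
# The three interacting subsystems (hypothetical coastal-planning example).
values = {"economic efficiency", "intergenerational equity"}
rules = {"coastal planning act", "informal local practice"}
knowledge = {"sea-level projections", "residents' flood experience"}

# The decision context admits only a subset of each subsystem.
decision_context = {
    "values": {"economic efficiency"},
    "rules": {"coastal planning act"},
    "knowledge": {"sea-level projections"},
}

# Elements excluded from the context mark potential limits to
# adaptation: changing the decision context means bringing some in.
excluded = {
    "values": values - decision_context["values"],
    "rules": rules - decision_context["rules"],
    "knowledge": knowledge - decision_context["knowledge"],
}
print(excluded)
```

The set difference makes the core idea concrete: what is excluded from the decision context is exactly where change in the context, and hence adaptation, may be needed.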
By Tuomas J. Lahtinen, Joseph H. A. Guillaume, Raimo P. Hämäläinen
How can we identify and evaluate decision forks in a modelling project: those points where a different decision might lead to a better model?
Although modellers often follow so-called best practices, it is not uncommon for a project to go astray. Sometimes we become so embedded in the work that we do not take time to stop and think through our options when decision points are reached.
One way of clarifying thinking about this phenomenon is to think of the path followed. The path is the sequence of steps actually taken in developing a model or in a problem solving case. A modelling process can typically be carried out in different ways, which generate different paths that can lead to different outcomes. That is, there can be path dependence in modelling.
Recently, we have come to understand the importance of human behaviour in modelling and the fact that modellers are subject to biases. Behavioural phenomena naturally affect the problem solving path. For example, the problem solving team can become anchored to one approach and only look for refinements in the model that was initially chosen. Due to confirmation bias, modellers may selectively gather and use evidence in a way that supports their initial beliefs and assumptions. The availability heuristic is at play when modellers focus on phenomena that are easily imagined or recalled. Moreover, particularly in high-interest cases, strategic behaviour by project team members can affect the path of the process.
Imagine this scenario. You are confronted by a wicked problem, such as the obesity epidemic. You know it’s a wicked problem – many previous attempts to resolve it have failed.
Suppose that you wish to develop a plan to remedy obesity. You have identified as many relevant areas of expertise and experience as you can and approached appropriate people – researchers, health practitioners, people with political influence, and so on.
You bring them together to pool their expertise—only to find that you now have another problem. Encouraging them to work collaboratively is more difficult than you expected. They talk in jargon. Their understanding is narrow. Their commitment is to their own discipline. Some of their understanding is tacit. Some of them are argumentative. And more. What are you to do?
When to advocate and when to be an honest broker is a question that deserves serious attention by those working on collaborative and engaged research initiatives. In my role as the Integrated Assessment director at the University of Michigan’s Graham Sustainability Institute I facilitate a wide array of collaborative research efforts. For most of our initiatives we strive to work within an honest broker frame. Following the work of Pielke (2007), the honest broker engages in decision-making by clarifying and sometimes expanding the scope of choice to decision-makers. Our recent analysis of options for High Volume Hydraulic Fracturing in Michigan (fracking) and outlining sustainability goals for our Ann Arbor campus are two examples which involved teams of faculty, students, practitioners and decision-makers.
The honest broker approach was particularly important for the project on fracking given the polarized views that can sometimes be associated with this topic.