Unintended consequences of honouring what communities value and aspire to

Community member post by Melissa Robson


It seems simple enough to say that community values and aspirations should be central to informing government decisions that affect them. But simple things can turn out to be complex.

In particular, when research to inform land and water policy was guided by what the community valued and aspired to, rather than solely by technical considerations, a much broader array of desirable outcomes was considered and the limitations of what science can measure and predict were usefully exposed.

Context and process

Let me start by setting some context. In New Zealand both national regulation and regional strategies require quantitative water quality and water quantity limits to be set for each body of water – lakes, rivers, groundwater aquifers and so on.

These requirements were introduced in the last few years and I was part of a multi-disciplinary technical team which was among the first in the country to go through this ‘limit-setting’ exercise for a water catchment (a catchment is all of the land that contributes water to a water body).

A zone committee was established to decide on the limits and to support their implementation. It was made up of representatives of the community, including the Indigenous Māori community (Iwi), and government. The local government agency responsible for environmental management decided that a radical overhaul was needed in the role of the technical team to support this new committee. One of the first manifestations of this overhaul was to change how the technical team established what the project would cover and what the criteria would be for assessing future land use policy options.

A community process was used to establish what was valued locally, to determine local priority outcomes covering social, economic, cultural and environmental wellbeing, and to capture the community’s aspirations for their catchment. This information shaped the technical assessment framework, shown in the figure below, in four major ways.

First, it was used for setting the boundaries of what the research would cover. Instead of the scope of the research being constrained by what the traditional technical team of government scientists – who were mainly from biophysical disciplines – said they could measure or model, the community said what they valued and where and when they valued it. It was then the job of the technical team to see what information they needed to generate and, therefore, what disciplines and sources of knowledge they needed to include in the technical assessment.

Second, it was used to identify the stakeholder groups within the community.

Third, it was used to build a range of scenarios of possible futures for the catchment. There were already water resource conflicts in the catchment and different parts of the community had different aspirations for the future. Instead of ‘picking a winner’, a suite of scenarios was built to explore the range of aspirations articulated by the community.

Fourth, the outcomes the community valued and aspired to were used to establish the assessment criteria against which all of the future scenarios would be tested. The key elements in the process of establishing the assessment criteria were that:

  1. Zone committee members described what the catchment would look like if each of the community’s priority outcomes was realised.
  2. Based on these descriptions, the technical team determined the indicators that best described each priority outcome and which techniques (e.g., models) could be used to predict those indicators and so assess the likelihood of the outcomes being met.
  3. The indicators were used to evaluate each of the future scenarios that the zone committee wanted to test.
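The three steps above can be sketched as a small data structure. This is only an illustrative sketch: all outcome names, indicators, targets and scores below are invented, and the actual assessment relied on detailed biophysical and economic models rather than simple lookups.

```python
# Hypothetical sketch of the outcome -> indicator -> scenario-evaluation chain.
# Every name and number here is invented for illustration.

# Steps 1-2: priority outcomes mapped to measurable indicators
indicators = {
    "customary fisheries are improved": ["fish_abundance_trend"],
    "drinking water meets national standards": ["nitrate_mg_per_l"],
}

# Step 3: each future scenario is scored against every indicator
# (in practice these values would come from models)
scenarios = {
    "status_quo": {"fish_abundance_trend": -0.1, "nitrate_mg_per_l": 13.0},
    "reduced_abstraction": {"fish_abundance_trend": 0.2, "nitrate_mg_per_l": 9.5},
}

def outcome_met(indicator, value):
    """Judge an indicator against its target: either a direction of
    change or an absolute standard (thresholds here are illustrative)."""
    if indicator == "fish_abundance_trend":
        return value > 0          # direction of change: any improvement
    if indicator == "nitrate_mg_per_l":
        return value <= 11.3      # illustrative absolute standard
    raise ValueError(f"unknown indicator: {indicator}")

for name, scores in scenarios.items():
    results = {outcome: all(outcome_met(i, scores[i]) for i in inds)
               for outcome, inds in indicators.items()}
    print(name, results)
```

Note how this framing makes the gaps explicit: an outcome with no entry in `indicators`, or an indicator with no `outcome_met` rule, is one the technical team cannot assess — which is exactly the situation described under “What happened” below.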
Figure adapted from original in Wedderburn and Kelly (2010)

What happened

The technical team found that we did not have the available techniques to predict the consequences of the future scenarios across all of the priority outcomes – some outcomes we could only partially cover and others we could not cover at all.

It was generally straightforward to predict the likelihood of priority outcomes being met under different scenarios when the community aspirations:

  • were expressed as a direction of change, e.g., “customary fisheries are improved”
  • were the subject of an absolute standard, e.g., “domestic drinking water meets national standards without treatment”.

There were priority outcomes that the technical team were only able to cover partially, for example “wahi tapu (sacred places) and mahinga kai (food and material gathering) are respected, understood, protected and enhanced”. In this case the technical team were able to predict the likelihood of the ‘protection and enhancement’ element of this outcome being met across different scenarios, but were not able to predict the ‘respect and understanding’ element.

There were also priority outcomes for which the technical team could not predict the likelihood of being met under different scenarios. These were outcomes concerned with:

  • the basis of management decisions (e.g., “nutrient and water management is based on clear and agreed science including Mātauranga (Māori knowledge and understanding)”)
  • policy implementation (e.g., “land managers use optimal water and nutrient practices for their land classes, soil type and farm system”).

We found an additional problem when the priority outcome and its narrative description were based on a value judgement, e.g., “thriving and sustainable communities”, as opposed to a direction of desired change or a verifiable state. In this case, population size was chosen as one of the indicators. However, it was not clear from the narrative whether a “thriving and sustainable community” implied an increased, static or decreased population. The technical team therefore needed to make a judgement about whether population increases or decreases best delivered the outcome.

Lessons learnt

A lesson learnt for boundary setting was that starting with what the community wants leads to a much broader array of desirable outcomes, and exposes the limitations of what can be predicted. This was useful as it helped to manage expectations about the research from the start and identify where other sources of knowledge would be required. It also required that researchers from different disciplines work together to respond to the outcomes.

Another lesson learnt was the benefit of using community values and aspirations to set assessment criteria and to develop scenarios in situations where there was pressure and conflict over the use of water resources. The benefit came from systematically exploring the community’s different aspirations through scenarios, along with the likely consequences of those scenarios for community values; this allowed people to understand how their aspirations affected what was valued by others.

A third lesson learnt for establishing assessment criteria and indicators was that, where the outcomes contained a value judgement, the technical team needed to work with the zone committee to identify at least a direction of desirable change. This was better than the technical team making a value judgement on its own.

Have you found similar challenges? If so, how did you manage them?

To find out more:
Robson, M. (2014). Technical report to support water quality and quantity limit setting in Selwyn Waihora catchment: Predicting consequences of future scenarios – Overview Report. Report No. R14/15. Environment Canterbury Regional Council Kaunihera Taiao ki Waitaha: Christchurch, New Zealand. (Online): http://files.ecan.govt.nz/public/lwrp/variation1/tech-report-sw-overview.pdf (PDF 3.9MB)

Reference:
Wedderburn, E. and Kelly, S. (2010). Informing decision-making through deliberative approaches: A procedural guideline. Overview Report. Report produced for Environment Waikato on behalf of the “Creating Futures” project, Hamilton, August. (Online): http://www.creatingfutures.org.nz/assets/CF-Uploads/Publications/Deliberation-Matrix/Informing-decision-making-through-deliberative-approaches—a-procedural-guideline.pdf (PDF 899KB)

Biography: Melissa Robson is an environmental scientist in the Governance and Policy team at Landcare Research in Lincoln, New Zealand. She has spent the past five years leading complex co-developed or stakeholder focused research programmes to support and inform land use and water quality decision-making by local communities and policy makers.

Melissa Robson leads The Collaboration Lab project funded by Our Land and Water National Science Challenge.

12 thoughts on “Unintended consequences of honouring what communities value and aspire to”

  1. The process described is a radical change from the way land and water policy development had been carried out by regional/local government in New Zealand prior to about 2010. The outcomes as adopted policy are partly in place and still being implemented. It is too soon to make clear judgements about desired outcomes of water quality improvement, appropriate water allocation and associated land use, although there are promising directions of movement.

    I make these comments, not as an academic or research peer in science integration or complexity, but as a multi-decadal observer of, and participant in, land and water management in New Zealand. My comments are informed by participation as a community member in the zone committee Melissa describes and an observer of the associated community process involving a larger number of community and interest group participants.

    First, I confirm that Melissa’s description of the process from her point of view as the main facilitator and mouthpiece of the technical team, necessarily and commendably brief though it is, is in harmony with my impressions as a participant in the meetings for both processes. So my comments are ‘additional thoughts’.

    Trust by ‘lay’ participants in technical information provided was enhanced, not lessened, by the willingness of technical team members to say “I don’t know” when this was appropriate. Sometimes this was followed by “But I will find out.” The intention that the process should be ‘informed’ by science, but not ‘led’ by it, required resolve on the part of the technical team; once trust was established it was tempting for participants to believe that team members might have the ‘best’ answers.

    The overall process took a lot of ‘people-time’ compared to previous ways policy had been developed, both from people in paid roles and from people giving their time voluntarily. In the zone committee, it took time for the group dynamics [and again, trust] to become settled. Then there was a period of useful, often iterative, progress, including the community feedback process, on development of policy recommendations. Ideally, that should have led to a further period of community consultation and cooperative policy development with the regional government. Real-world political imperatives ensured that the final period was shorter than ideal.

    Notoriously, there has been a gap, sometimes a canyon, between academic ‘decision aiders’ and real-world ‘decision makers’. I see in the approach Melissa has briefly described promise of both narrowing the gap and simultaneously combining science in action with land and water management.

    • Hi David, many thanks for your great comment and useful insight. You are absolutely right that this new way of land and water policy development hasn’t been in place long enough to see whether or not it can or will help deliver improved outcomes. My main focus was on the role of the technical team and I do wonder whether the radical overhaul of the way technical teams operate would have been seen as a useful or necessary change regardless of the policy process.
      What do you think of this based on the long experience that you have had with land and water management in NZ?

      • “Useful”, yes; but “necessary”, I think, would be too strong a claim. (I acknowledge that yours was a “wonder”, not a “claim”; I chose the more definite word deliberately, just to clarify my own thinking about the substantive question.) The context is a technical team advising a community group who recommend, to local/regional government, policy on land and water management. Such a technical team can be ‘in-house’ or put together from multiple source agencies for the purpose. Your “radical overhaul of the way technical teams operate” could apply to either. One thing I consider to have been “useful” was the mix of disciplines and context experience on-team, customised for the purpose, with flexibility to add expertise as answering the questions required.
        The “lessons learnt” are also very “useful” in the context. I am aware of relevant situations where more conventional, in-house, technical teams have carried out good research and data gathering over many years, and kept up-to-date with sophisticated techniques, including state-of-the-art computer modelling, only to be completely blind-sided by questions from their colleagues or their community that they hadn’t thought of asking or whose relevance has but recently become clear.
        I consider “necessary” too strong a claim as it is “regardless of the policy process”. There are other processes for developing policy where different technical team operation might well “usefully” apply. Your “radical overhaul of the way technical teams operate” was arguably “necessary” for the community-led, science-informed, policy development of which you were part.

        • The observation that “… conventional, in-house, technical teams have carried out good research and data gathering over many years, … only to be completely blind-sided by questions from their colleagues or their community that they hadn’t thought of asking or whose relevance has but recently become clear.” is not uncommon. One can find examples even in ordinary news reports.

          It applies to all aspects of policy development, not only technical work. The systematic nature of technical analysis encourages a tacit assumption that everything we need to know is included in the models we use. However, anyone can fall into the same trap. Each time we see a carefully refined policy overtaken by events, it is quite likely that the surprise could have been avoided if the complex nature of community interests had been acknowledged and allowed to have a voice, not just a small reference group but a continuous broad-based interaction that allowed for emergent insights.

  2. Thanks Stephen, really appreciate your comments.
    The process that we used was to engage the committee in a discussion of ‘what does this outcome look like’. This is similar, I think, to what you suggest in your second point. We then tried to fit component indicators to this description, as you suggest in your first point.

    I think where we fell down was not realising, until it was too late, that we hadn’t done a good enough job on the first stage, i.e., we didn’t have a clear enough picture of what the committee meant for all of their outcomes. We had all their narrative descriptions of successful outcomes, but in the light of the component indicators that we then used, we didn’t have quite enough information. I think at this point we should have gone back and clarified.

    I just looked up the Future Backwards methodology. It looks really interesting. I can see how it could have generated really valuable information for the situation that we were in, by producing a rich description of what success looked like and also what complete failure looked like. Great – thanks!

    • It sounds as though your instincts were leading you in a useful direction. Two points come to mind concerning the way that the process interacted with the committee.

      What the participants were really concerned about might not have been clear to them at a conscious level when you started out. A process that allowed that to emerge from the interactions between them around the narrative input might have brought you closer to the answer but, even then, every time they think about the subject, read about what others propose or see something actually being put into practice, their positions can shift. Continuously refreshing a process that allows for emergent insights could help to bring the exercise closer to a stable conclusion more quickly.

      Communities are generally quite large, even in NZ (!), while steering committees, focus groups and advisory bodies are limited by logistical challenges and constraints on cost and time. Not only is there a danger of them being unrepresentative but they can become captive to their own entrained view of the world. Consultative processes that have the capacity to engage a large number of people without undue expense can help with this. The Cognitive Edge SenseMaker method and tools offer this capability. {I have no financial interest in SenseMaker but see it as a valuable resource in work of this sort that is as yet underutilised.}

      • We experienced both of those things! When you have used the Future Backwards tools, did you repeat the exercise several times in the process to capture the evolving and clarifying position?

        With the second point, we had a committee, a wider community group of about 80 people, and various groups that we kept up with and held regular workshops with. However, there was a real concern that although they may have been representative to begin with, as the committee went on their amazing journey there was a risk of them becoming less representative of the wider community. Some of the members spent a lot of additional time and effort trying to bring others in their community along, but any tools that help with this would be invaluable.

  3. This is a very interesting concise statement of the challenges with such decision analysis.

    Two possible approaches to the value judgments come to mind.

    One is some form of multi-attribute utility structure. It isn’t necessary to get into the mechanistic detail of utility functions. Simply breaking a broad topic such as “thriving and sustainable” into contributory factors or component indicators might simplify the task by allowing judgements about narrower or more straightforward elements to be made and then drawn together into a description of the outcome for the umbrella concept. Once the structure, typically a small hierarchy of one or two levels, is developed, the process can proceed with quantitative measures if you want to, qualitative measures where you combine scale levels using an agreed matrix, or simply in words as a descriptive exercise.
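A minimal sketch of such a two-level structure, in the quantitative case: the umbrella concept is decomposed into component indicators that are judged separately and then drawn together with agreed weights. All component names, weights and scores below are invented for illustration.

```python
# Hypothetical sketch of a small multi-attribute structure for an
# umbrella outcome such as "thriving and sustainable communities".
# Components, weights and scores are all invented examples.

components = {                    # component indicator -> agreed weight
    "local_employment": 0.4,
    "school_roll_stability": 0.3,
    "household_income": 0.3,
}

def umbrella_score(scores):
    """Weighted aggregate of component judgements (each scored 0..1)."""
    return sum(components[c] * scores[c] for c in components)

# A scenario judged separately on each narrow component
scenario_a = {
    "local_employment": 0.8,
    "school_roll_stability": 0.6,
    "household_income": 0.7,
}
print(round(umbrella_score(scenario_a), 2))  # 0.4*0.8 + 0.3*0.6 + 0.3*0.7 = 0.71
```

The same hierarchy works without numbers: each component can instead be rated on a qualitative scale and the ratings combined through an agreed matrix, or simply summarised in words.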

    The second is to engage the stakeholders in an exploration of a topic, such as “thriving and sustainable”, in such a way that insight and understanding of the consequences emerges from their interaction. This is possible using methods based in complexity science, typically using narrative collection as the starting point, whether via a workshop or an online, and therefore scalable, process. The Cognitive Edge Future Backwards method might be a useful starting point, probably conducted with multiple stakeholder groups. It should at least flush out the sticking points at a level where they can be addressed more easily than trying to trade off thriving against acknowledgement of traditional lore, for instance.
