By Lawrence Susskind
How can interdisciplinary teams avoid getting stuck on questions like:
- What kinds of data do we need to collect?
- What methods or techniques should we use to analyze our data?
- How should we handle gaps or incongruities in our findings?
- What are the policy implications or prescriptions that follow from our findings?
I want to share some lessons I’ve learned about handling disagreements on these four questions.
Research Design
One interdisciplinary project I worked on many years ago involved trying to assess the feasibility of burying nuclear waste in the ocean floor. The engineers assumed the problem had an engineering solution. The physical scientists wanted to concentrate on modeling the dynamics of the natural systems involved. The social scientists were sure, from the outset, that the whole idea was a mistake, and wanted to focus on all the things that could go wrong (i.e., all the risks).
The only way we could agree on a research design was to “suspend disbelief.” We each needed to set aside what we already knew, and work toward a research design that would allow each of us to pursue our highest priority concerns. That meant we had to get past our preconceptions and biases.
The engineers had to accept that the problem might not have a technical solution. The physical scientists had to agree that they might not know enough about the complex systems involved to model them effectively. The social scientists had to focus on how to minimize adverse social impacts rather than stopping the project altogether.
We eventually completed the study. While I don’t think many of the participants were eager to do something like that again, the idea of “suspending disbelief” and considering a range of possibilities (including unpopular ones) made it work. If the team members commit to operating by consensus, they will need to recast or redefine the problem they are trying to solve to make room for all the participants to pursue their favorite version of the problem.
Data Analysis Tools
When parties in an interdisciplinary team favor different methods of data analysis, it is easy for each member to fall into old patterns and assume that the way they have always done it is the right way.
When agreement is needed, though, the question isn’t “What’s the best method of data analysis?” but rather “What are some possible ways of collecting and analyzing the priority data the group has identified?” We call this joint fact finding.
The kind of negotiation that works best in these situations is to ask each team member to help the others understand what methodological approach they favor and why, given the question the team has agreed to address. Getting the parties to listen to each other without interrupting is difficult.
The team leader’s (or facilitator’s) job is to shift the discussion to possible criteria for assessing the value of any method or approach to answering the same question. If the parties can agree on key criteria for assessing the likely value of each method, they can usually reach agreement on a consolidated analytical approach.
Limitations on Inference-Drawing
No matter what analytic methods are selected, there are going to be gaps in the data or forecasts produced by the group. This is a fact of life. In addition, there are likely to be certain data points that seem like outliers, or anomalies.
Each discipline has its own approach to handling these. Some are willing to interpolate (i.e., figure out how to handle gaps or smooth over anomalies); some are not. Limits on time and money usually require teams to agree on a “second best” way of proceeding when it is not possible to start over to fill gaps or double-check outlying data points.
Usually, an agreement can be negotiated if options are not presented as either-or. That is, there needs to be a way for a “minority report” to be included in the final document; not as a footnote, but in the text, with a discussion of the group’s best effort to interpret the sensitivity of their findings to a missing bit of data or an anomaly.
Moving From “Is” to “Ought”
Most teams will want to draw conclusions or formulate recommendations or prescriptions of some kind. For fact- or data-oriented specialists, moving from “is” to “ought” requires making a “normative leap,” and it troubles them. The key thing to remember in these discussions is that recommendations reflect values not just facts, and thus they involve non-objective judgments.
Trying to negotiate agreements when values are at stake is not the same thing as trying to get agreement on “what the facts say.” What teams can do is identify the key findings from their work that convince each of them that specific recommendations will work as expected. Then, they can invite others with different views to challenge those recommendations and offer alternatives.
At the end of these interactions, the team can prepare recommendations that present:
- a summary of their findings (given the context they studied),
- an analysis of the reasons they think their findings are accurate and compelling,
- their recommendations and the reasoning behind them,
- reactions to their recommendations from others with an interest in the topic, but who were not directly involved, and
- the limitations on the relevance or usefulness of their recommendations for other contexts.
Even interdisciplinary teams with strong methodological biases are usually able to work out written agreements on these five points.
When scientists, engineers, designers, social scientists, or artists are recruited to work together on a technical study, differences are sure to emerge in how they think about the most appropriate research or study design, the choice of tools for data analysis, managing the limits on inference drawing, and the task of generating recommendations or policies.
There are well-established negotiation guidelines that can help researchers through the bramble of disagreements that typically emerge in interdisciplinary teams. Often, however, no one on the team is skilled at facilitating the search for ways of reframing the problem to accommodate conflicting points of view, engaging in joint fact finding, incorporating non-objective judgments along with science-based or fact-based analysis, and making the “normative leap” from findings to prescriptions. When this is the case, teams would do well to seek the help of neutral facilitators, who have no interest in telling them what to do, but can walk them through the negotiations that can lead to informed agreement – what I would call consensus building.
What has your experience been? Do these lessons resonate with you? Do you have others to share?
To find out more:
Movius, H., and Susskind, L. (2009). Built to Win: Creating a World-class Negotiating Organization. Harvard Business Press: Boston, Massachusetts, United States of America.
Susskind, L. and MIT xPRO. (2022). MIT’s Negotiating and Applying Influence and Power Online Course. i2Insights readers are offered a 10% discount for the course starting November 7, 2022 (use the discount code: BLGDSC). No specific prior experience or background is required.
Biography: Lawrence Susskind PhD is Ford Professor of Urban and Environmental Planning at MIT (Massachusetts Institute of Technology) and Vice-Chair of the Program on Negotiation at Harvard Law School, both in Cambridge, Massachusetts, USA. He is a teacher, trainer, mediator, and urban planner. He is one of the founders of the field of public dispute mediation and is a practicing international mediator through the Consensus Building Institute, a Cambridge-based not-for-profit, which he founded.