
Productive multivocal analysis – Part 1: Avoiding the pitfalls of interdisciplinarity

By Kristine Lund


Many voices are expressed when researchers from different backgrounds come together to work on a new project, and the result can sound like cacophony. All those voices compete to be heard. In addition, researchers make different assumptions about people and data; if these assumptions are not brought to light, the project can reach an impasse later on and much time can be wasted.

So how can such multivocality be positive and productive, while avoiding trouble? How can multiple voices be harnessed to not only achieve the project’s goals, but also to make scientific progress?

When researchers from different domains begin working on a project, they use their disciplinary knowledge and biases about the world to define their view of the project’s goal, what type of data is important to attend to, and how it should be gathered, prepared, analyzed, and visualized. Different researchers will have different views on these issues because their training and experience influence them to focus on questions that can be answered with methods they normally use.

This is one definition of a multidisciplinary project: researchers from different disciplines work in parallel in a shared context, but focus on different questions and/or objects of research that all contribute to the project, yet do so in isolation from each other. This works well for some projects, but in others it’s productive to consider how researchers’ contributions may interact.

In a 5-year project with researchers from education, psychology, language sciences, and computer science (Suthers et al., 2013), we put together some tips on avoiding the pitfalls and reaping the benefits of a type of interdisciplinarity we call multivocal analysis. This blog post deals with avoiding the pitfalls and a second blog post addresses how the benefits, specifically epistemological engagement, can be achieved.

The integration between disciplines that occurs in interdisciplinarity can happen in many different ways.

Achieving such integration helps researchers to move outside of their bubble by forcing them to question their assumptions.

So, how can we avoid the pitfalls in interdisciplinary projects? I’ll use our own experience of studying group interaction as a way to share insights. In our project, five interdisciplinary scientific teams each studied a corpus involving group interaction and compared multiple analytical perspectives around what each analyst described as a ‘pivotal moment’.

Avoiding the pitfalls of interdisciplinarity

We ran up against four pitfalls on which, with hindsight, we can offer advice. The first two are macro-level functioning pitfalls and involve team constitution and communication outside of the team. The last two are micro-level functioning pitfalls and involve data transfer and analysis sharing during team members’ work together.

  1. Team constitution: In a project where data is shared, particular team members may approach the data in a way that doesn’t align with the team leader’s expectations, thus provoking tension about project goals. In addition, the data provided may not meet a particular analytic approach’s needs, thus leading to an unsatisfactory application of the method.
  2. Communication outside the team: Presenters may poorly manage the needs of the different audiences interested in the analyses. In particular, those who provided data should be treated with sensitivity: these stakeholders have taken a risk in sharing their data, so extensive criticism of the data’s nature or quality should be avoided. In addition, a presenter may either fail to adapt her discourse or tailor it too narrowly. For example, she may not succeed in helping different kinds of non-experts understand the analyses because she isn’t speaking in general enough terms. On the other hand, she may target a particular stakeholder in her talk because of a specific disagreement with that stakeholder, thus hijacking what should have been a public presentation and alienating the other stakeholders.
  3. Data transfer pitfalls: If the data provider has unspoken assumptions about the data, disappointment is likely. Expectations about what the data are and how they should be analyzed must be mutually understood. Some methods require contextual information about the data that isn’t considered relevant for other methods. For example, sampling the data or reformatting it (including removing aspects that are irrelevant for one method) may render another method impossible to apply. If such data preparation is not communicated, some analysts may waste time doing analyses that become meaningless for them once they understand how the data were manipulated before reaching them.
  4. Analysis sharing pitfalls: Data analysts may also hold unspoken assumptions that cause problems of validity. For example, analysts may assume that the shared data are representative of the dataset when they are not. Or they may take into account only selected contextual information while leaving other aspects out, such as heterogeneity caused by experimental manipulations. In these cases, the meaning of the analyses is not what the researcher thinks.

Conclusions

We named the special type of interdisciplinarity discussed above “multivocal analysis”, given the multiple voices contributing to the analyses that were shared and compared. This analysis became productive by meeting head-on a set of pitfalls concerning how teams are constituted, how they communicate, and how they deal with data and analysis. All of the pitfalls involved unspoken assumptions of some sort that arose at different stages of the team’s collaboration. We organized our collaboration at the outset so as to render researchers’ assumptions explicit by asking them about their theoretical assumptions, the purpose of their analysis, their unit of analysis, their data representations, and their analytic manipulations.

Such self-reflection and exchange set the stage for a more general awareness of and sensitivity to the pitfalls we encountered, thus enabling us to deal openly with them.

What methods have you used for avoiding the kinds of pitfalls described here? Do you see other pitfalls? Do you think it takes a special kind of researcher to engage in interdisciplinarity?

References:
Suthers, D. D., Lund, K., Rosé, C. P., Teplovs, C. and Law, N. (Eds.). (2013). Productive Multivocality in the Analysis of Group Interactions. Springer: New York, United States of America. See https://link.springer.com/book/10.1007/978-1-4614-8960-3.

Biography: Kristine Lund works for the French CNRS (Centre National de la Recherche Scientifique) as a Senior Research Engineer in the Interactions, Corpus, Apprentissages, Représentation (Interactions, Corpora, Learning, Representations) language sciences laboratory at the University of Lyon. She leads an interdisciplinary research team in the study of human interaction and cognition. Her recent work focuses on connecting systems of different orders (linguistic, cognitive, interactional, social) in order to better understand collaborative knowledge construction. She is Chief Scientific Officer and co-founder of Cognik.net [Moderator update – In March 2023, the www[dot]cognik.net link was no longer available and so the link structure has been left in place but the active link deleted].
