Productive multivocal analysis – Part 1: Avoiding the pitfalls of interdisciplinarity

Community member post by Kristine Lund


Many voices are expressed when researchers from different backgrounds come together to work on a new project, and it may sound like cacophony, with all those voices competing to be heard. In addition, researchers make different assumptions about people and data, and if these assumptions are not brought to light, the project can reach an impasse later on and much time can be wasted.

So how can such multivocality be positive and productive, while avoiding trouble? How can multiple voices be harnessed to not only achieve the project’s goals, but also to make scientific progress?

When researchers from different domains begin working on a project, they use their disciplinary knowledge and biases about the world to define their view of the project’s goal, what type of data is important to attend to, and how it should be gathered, prepared, analyzed, and visualized. Different researchers will have different views on these issues because their training and experience influence them to focus on questions that can be answered with methods they normally use.

This is one definition of a multidisciplinary project, where researchers from different disciplines work in parallel in a shared context, but focus on different questions and/or objects of research that all contribute to the project, yet do so in isolation from each other. This works well for some projects, but in others, it’s productive to consider how researcher contributions may interact together.

In a 5-year project with researchers from education, psychology, language sciences, and computer science (Suthers et al., 2013), we put together some tips on avoiding the pitfalls and reaping the benefits of a type of interdisciplinarity we call multivocal analysis. This blog post deals with avoiding the pitfalls and a second blog post addresses how the benefits, specifically epistemological engagement, can be achieved.

The integration between disciplines which occurs in interdisciplinarity can happen in many different ways. Researchers can:

  • develop ways of asking questions that are novel for each discipline involved;
  • integrate theoretical aspects to develop a multi-theoretic view; and,
  • focus on a common object to which they can apply different analytical methods that culminate in broadening a conceptual construct.

Achieving one of these types of integration helps researchers to move outside of their bubble by forcing them to question their assumptions.

So, how can we avoid the pitfalls in interdisciplinary projects? I’ll use our own experience of studying group interaction as a way to share insights. In our project, five interdisciplinary scientific teams each studied a corpus involving group interaction and compared multiple analytical perspectives around what each analyst described as a ‘pivotal moment’.

Avoiding the pitfalls of interdisciplinarity

We ran up against four pitfalls that, with hindsight, we can offer advice on. The first two operate at the macro level and involve team constitution and communication outside the team. The last two operate at the micro level and involve data transfer and analysis sharing during team members’ work together.

  1. Team constitution: In a project where data is shared, particular team members may approach the data in a way that doesn’t align with the team leader’s expectations, thus provoking tension about project goals. In addition, the data provided may not meet a particular analytic approach’s needs, thus leading to an unsatisfactory application of the method.
  2. Communication outside the team: Presenters may poorly manage the needs of the different audiences interested in the analyses. In particular, those who provided data should be treated with sensitivity: these stakeholders have taken a risk in sharing their data, so extensive criticism of the data’s nature or quality should be avoided. In addition, a presenter can either fail to adapt her discourse or overly specify it. For example, she may not succeed in helping different varieties of non-experts to understand the analyses because she isn’t speaking in general enough terms. On the other hand, she may target a particular stakeholder in her talk because of a specific argument she has with them, thus hijacking what should have been a public presentation and alienating the other stakeholders.
  3. Data transfer pitfalls: If the data provider has unspoken assumptions regarding data, this is likely to end in disappointment. Expectations about what the data are and how they should be analyzed should be mutually understood. Some methods require contextual information about data that isn’t considered relevant for other methods. For example, performing sampling or carrying out processes of reformatting (including removing aspects that are irrelevant for one method) may render another method impossible to apply. If such data preparation is not communicated, some analysts may waste time doing analyses that become meaningless for them, once they understand how the data was manipulated before it got to them.
  4. Analysis sharing pitfalls: Data analysts may also have unspoken assumptions that cause problems of validity. For example, analysts may assume that the shared data is representative of the dataset when it is not. Or they may take into account only selected contextual information, while leaving other aspects out, such as heterogeneity caused by experimental manipulations. In these cases, the meaning of the analyses is not what the researcher thinks.


We named the special type of interdisciplinarity discussed above “multivocal analysis”, given the multiple voices contributing to the analyses that were shared and compared. This analysis became productive by meeting head-on a set of pitfalls concerning how teams are constituted, how they communicate, and how they deal with data and analysis. All of the pitfalls involved unspoken assumptions of some sort that arose at different stages of the team’s collaboration. We organized our collaboration at the outset so as to render explicit researchers’ assumptions by asking them about their:

  • theoretical assumptions;
  • purpose of analysis;
  • representations of data and analytic constructs;
  • manipulations and transformations of data used to draw conclusions.

Such self-reflection and exchange set the stage for a more general awareness and sensitivity to the pitfalls we encountered, thus enabling us to deal openly with them.

What methods have you used for avoiding the kinds of pitfalls described here? Do you see other pitfalls? Do you think it takes a special kind of researcher to engage in interdisciplinarity?

Suthers, D. D., Lund, K., Rosé, C. P., Teplovs, C. and Law, N. (Eds.). (2013). Productive Multivocality in the Analysis of Group Interactions. New York: Springer.

Biography: Kristine Lund works for the French CNRS (Centre National de la Recherche Scientifique) as a Senior Research Engineer in the Interactions, Corpus, Apprentissages, Représentation (Interactions, Corpora, Learning, Representations) language sciences laboratory at the University of Lyon. She leads an interdisciplinary research team in the study of human interaction and cognition. Her recent work focuses on connecting systems of different orders (linguistic, cognitive, interactional, social) in order to better understand collaborative knowledge construction. She is Chief Scientific Officer and co-founder of

12 thoughts on “Productive multivocal analysis – Part 1: Avoiding the pitfalls of interdisciplinarity”

  1. Very interesting post. Thank you Kristine! A welcome reminder and development of the importance of taking into consideration the multiplicity of voices (an organized polyphony and not a disorganized cacophony) in interdisciplinary work, and a salutary recognition also of the interculturality that is built in and from the diversity of disciplines. Multivocal analysis can also evoke a less competitive but more cooperative and dialogic approach to the exchanges between conceptions, modeling, epistemological postures and methods specific to each discipline. Less rivalry and more inter-understanding, tolerance and empathy between specialist researchers, actors in the good co-management of knowledge as a common good. A reasoned cocktail of principles and values likely to avoid the various pitfalls on the path of interdisciplinarity and, above all, to emerge from the impasses created by disciplinary partitioning and the de facto cognitive and cultural confinements it imposes. The analysis of interactions and discursive practices in multidisciplinary research groups aimed at constructive dialogue and integration (not fusion) between disciplinary horizons is a real added value brought by the multivocal approach. An approach to promote and further develop for the productive and reflective benefit of any researcher engaging in interdisciplinary work, who must be sensitized and trained in it.

  2. Easier said than done but we all will be more successful if we heed Ms. Lund’s cautions and advice. A mutual peek at Derek Cabrera’s Distinctions, Systems, Relationships and Perspectives framework can help.

    • Dear Jring281,
      Thank you for this pointer to Derek Cabrera’s work, which I am not familiar with. I have looked briefly at this:
      Are you then suggesting that the questions below will help multivocal analyses during interdisciplinary work?
      If so, do you have insight on how this could be operationalized during a research project?
      Distinctions (identity and other): What is __? What is not __?
      Systems (part and whole): Does __ have parts? Can you think of __ as a part?
      Relationships (inter and action): Is __ related to __? Can you think of __ as a relationship?
      Perspectives (point and view): From the perspective of __, [insert question]? Can you think about __ from a different perspective?

      Best, Kris

      • Kristine, Sorry I can’t spend more time on this. Your ideas and work are worthwhile. Suggest that one proven way to harmonize multifocal is described by John Warfield and Roxanna Cardenas in Handbook of Interactive Management. It works.

  3. The writer is not describing research but support of preconceived notions. Research implies following the data to a conclusion. Hypothesis comes after research not before.

    • Ohhh my…just finished a PhD course in statistics…one can have a working hypothesis, AND a null hypothesis…. plus an alternative hypothesis…or none at all as in an exploratory research piece. If I am doing a mixed methods study…I would state the null and the alternative…or…well, you get it. I may prove or disprove my hypothesis yet, I believe you need to know what you are looking at in order to use a hypothesis. Now qualitative research is a different horse…you can have an open ended methodology …yet, you would also have a research question. The findings come at the end…when you have analyzed the data. Hope this helps…

      • Dear Terry,
        You responded to Dennis, and if I may, I would say that you are giving an argument for what I wrote to him. In other words, whether a hypothesis comes after research or before research is a matter of how we regard the research process, in a larger context. On the one hand, we have the hypothetico-deductive method, where we do indeed have specific hypotheses and we set up experiments to confirm or disconfirm them. On the other hand, we have research where the goal is descriptive. In conversation analysis, for example, the objective is to identify and describe fundamental practices in the production and recognition of actions and sequences of action (Antaki, 2011). Now, some people think that such work is hypothesis generating and that it comes before experimental methodology in a larger research program, where the hypothesis that was generated would be taken up and tested. This is indeed one way to present a mixed methods approach (e.g. do qualitative research to generate hypotheses and then do quantitative research to test them). However, many conversation analysts are not interested in experimental validation of hypotheses. Their view is that such description is an end in and of itself, as long as the phenomenon of interest is sufficiently described in multiple naturally occurring contexts. So again, these are assumptions about doing research, and they differ according to the theoretical and methodological frameworks people have been trained in.
        Best, Kris

        Antaki, C. (2011). (Ed.), Applied Conversation Analysis: Intervention and Change in Institutional Talk. New York, NY: Palgrave Macmillan.

    • Dear Dennis,
      The argument I am making is that our research is guided by our assumptions. Problems arise when we do not make these assumptions explicit in our collaboration and in our communication with others. These assumptions have to do with many aspects of doing research. About thirty of us, from 13 countries, worked together for the duration of this project on the study of group interaction in pedagogical contexts. As time went on, it became evident that the difficulties we encountered had to do with how people thought research “should” be done. Your stating “hypothesis comes after research not before” is a good example of such an assumption. This is an arguable point, depending on one’s outlook on the research process.
      Best, Kris

  4. Very interesting article. Is there any way the pitfalls could be avoided by being up front on all levels, in terms of the personnel and leadership engaged in conducting interdisciplinary data sharing and exchange exercises within and across teams? I have found it quite fascinating to learn from and engage various stakeholders in a team-science project, with regard to personalities, in-depth subject matter expertise, and above all humility and willingness to listen, learn and inculcate varied points of view and perspectives. The demands on time, the priority of publishing together or submitting joint collaborative multi-component grant proposals to various funding agencies, patents, and taking bench-side discoveries into the clinical realm take a lot of patience, respect for each other, and critical judicious thinking: what we can accomplish as a group in one year might take us at least five years individually to figure out, in working through the nuances and providing solutions to the problem at hand.
    Thank You!
    Anil Wali

    • Dear Anil,
      I agree that this kind of interdisciplinary work will only succeed if people are willing to engage in a humble manner. We must be open to the ways that other people do research, while making sure that we can communicate about the assumptions we are basing our own decisions on.
      So I would say yes, that the pitfalls can all be avoided by being up front on all levels. These include clearly stating analytical goals and methodological requirements, and adapting discourse to stakeholders.
      It’s not easy work, but I personally think it’s worth it!
      Best, Kris
