Community member post by Kristine Lund
In a previous blog post I described multivocality — i.e., harnessing multiple voices — in interdisciplinary research and how research I was involved in (Suthers et al., 2013) highlighted pitfalls to be avoided. This blog post examines four ways in which epistemological engagement can be achieved. Two of these are positive, and two may have both positive and negative aspects, depending on how the collaboration plays out.
Once a team begins analyzing a shared corpus from different perspectives — in our case, it was a corpus of people solving problems together — it’s the comparison of researchers’ respective analyses that can be a motor for productive epistemological encounters between the researchers.
Leveraging the project’s boundary object in order to broaden epistemological views
In our project we proposed that researchers identify ‘pivotal moments’ in the group interactions they were analyzing. We considered the pivotal moment to be a boundary object in that it was a concept sufficiently stable to be generally understood — for example, it is a period of time, of a duration to be defined, where something of significance happens that separates the ‘before’ from the ‘after’.
In addition to having some definitional stability, the term had to be flexible enough for each researcher to give it meaning that was pertinent for particular analytical goals.
Comparing the pivotal moments that different researchers found in the same corpus allowed for the discovery of new hypothesized mechanisms that could change the group interactions. These mechanisms were located across case studies and could subsequently be tested experimentally. Comparisons of pivotal moments also led to seeing data in new ways.
Alternative operationalization brings out different aspects of a complex analytical construct
One of the researcher groups in our project analyzed students solving a chemistry problem together. Researchers used an ethnographic approach, a social network analysis approach, and two coding and counting approaches. All the researchers were interested in studying leadership in problem solving, but their unvoiced assumptions about leadership roles did not become apparent until they compared how they viewed individual students and the roles students took during group work.
Each operationalization of a complex construct like leadership brought out new layers of meaning without necessarily taking away from existing ones. Consensus was also possible because the data providers had already published an interesting analysis and were looking for additional insights from colleagues with different approaches.
Reaching consensus may not be so easy in other situations, but that is not necessarily a negative thing. For example, if data providers have a stake in confirming new hypotheses, or in testing the usefulness of a technology, and colleagues’ analytical approaches challenge those hypotheses or that technology, this is good science, as long as all assumptions are made explicit.
Enriching a method’s key analytic constructs with new meanings in an isolated manner
Listening to how others do analyses can tempt a researcher to modify well-known, existing analytic constructs without explicitly taking into account the epistemological assumptions that underlie those terms as originally defined. This, however, is risky: changing the definition of analytical terms already in widespread use will likely hamper researchers’ ability to communicate effectively within the larger academic community. On the other hand, if analytic constructs are new, cross-perspective integration can be positive.
Recognizing incommensurability radicalizes researcher positions but also makes researchers more aware of their constraints
Sometimes integration isn’t possible, but the tensions researchers experience in attempting to achieve it can be productive. For example, the foundations of researchers’ own views can become clearer to them when they must defend those views in discussions with colleagues who disagree. When researchers are forced to explicate their epistemological positions, they understand more clearly the constraints of their own frameworks, and this makes them more careful researchers.
Integration across theories and methods from different disciplines does not always work, but even when it doesn’t, it can be productive. In other words, even when researchers do not share assumptions about the world, or about how to do research, epistemological engagement allows them to become aware of their differences, perhaps acknowledging that they are irreconcilable for the particular goals of the current project.
Multivocal analyses are more than mixed method analyses. In a mixed methods approach, qualitative and quantitative methods are combined in various ways to answer research questions, and these methods are coordinated and conceived of in an integrated way from the beginning.
In multivocal analyses, a researcher must first share data; then an analytic construct must be defined as a boundary object that analysts can make pertinent for their own methods; and finally, analyses must be compared. Assumptions about theory, the purpose of analysis, the phenomena attended to, and data representations and manipulations should be made explicit from the outset and throughout all steps of the collaborative process.
Both mixed methods and multivocal analyses can strengthen conclusions and broaden insights, but multivocal analyses go further. In our case, research questions were answered, but in the process researchers from different domains who study groups explored their assumptions about doing research, even changed their views on gathering, preparing, analyzing, and visualizing data, and published papers together that illustrated the different ways that integration can occur between disciplines.
What has your experience been in achieving epistemological engagement? Do you know of ways to encourage researchers to be open to engaging with different epistemological positions?
I acknowledge the important contributions of all my colleagues in the Productive Multivocality Project, especially Dan Suthers and Carolyn Rosé, in addition to Michael Baker. I also wish to thank the CNRS and the ASLAN project (ANR-10-LABX-0081) of University of Lyon, for its financial support within the program “Investissements d’Avenir” (ANR-11-IDEX-0007) of the French government operated by the National Research Agency (ANR).
Suthers, D. D., Lund, K., Rosé, C. P., Teplovs, C., & Law, N. (Eds.). (2013). Productive Multivocality in the Analysis of Group Interactions. New York: Springer. See https://link.springer.com/book/10.1007/978-1-4614-8960-3.
Biography: Kristine Lund works for the French CNRS (Centre National de la Recherche Scientifique) as a Senior Research Engineer in the Interactions, Corpus, Apprentissages, Représentations (Interactions, Corpora, Learning, Representations) language sciences laboratory at the University of Lyon. She leads an interdisciplinary research team in the study of human interaction and cognition. Her recent work focuses on connecting systems of different orders (linguistic, cognitive, interactional, social) in order to better understand collaborative knowledge construction. She is Chief Scientific Officer and co-founder of www.Cognik.net.