Considerations for choosing frameworks to assess research impact

By Elena Louder, Carina Wyborn, Christopher Cvitanovic and Angela T. Bednarek

1. Elena Louder (biography)
2. Carina Wyborn (biography)
3. Christopher Cvitanovic (biography)
4. Angela Bednarek (biography)

What should you take into account in selecting among the many frameworks for evaluating research impact?

In our recent paper (Louder et al., 2021) we examined the epistemological foundations and assumptions of several frameworks and drew out their similarities and differences to help improve the evaluation of research impact. In doing so we identified four key principles or ‘rules of thumb’ to help guide the selection of an evaluation framework for application within a specific context.

Read more

Addressing societal challenges: From interdisciplinarity to research portfolios analysis

By Ismael Rafols

Ismael Rafols (biography)

How can knowledge integration for addressing societal challenges be mapped, ‘measured’ and assessed?

In this blog post I argue that measuring averages or aggregates of ‘interdisciplinarity’ is not sufficiently focused for evaluating research aimed at societal contributions. Instead, one should take a portfolio approach to analyzing knowledge integration as a systemic process across research landscapes, focusing in particular on the directions, diversity and synergies of research trajectories.
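As a concrete illustration of the ‘diversity’ dimension, the sketch below (not taken from the post) computes the Rao-Stirling diversity index that is widely used in studies of interdisciplinarity; the discipline labels, publication shares and disparity values are illustrative assumptions only.

```python
# Minimal sketch of the Rao-Stirling diversity index, which combines the
# variety, balance and disparity of the disciplines a research portfolio
# draws on. All numbers below are hypothetical placeholders.

def rao_stirling_diversity(proportions, disparity):
    """Sum of p_i * p_j * d_ij over all pairs of distinct categories."""
    total = 0.0
    for i in proportions:
        for j in proportions:
            if i != j:
                total += proportions[i] * proportions[j] * disparity[(i, j)]
    return total

# Hypothetical portfolio: share of publications per discipline.
proportions = {"ecology": 0.5, "economics": 0.3, "sociology": 0.2}

# Hypothetical disparity (cognitive distance) between discipline pairs.
disparity = {
    ("ecology", "economics"): 0.8, ("economics", "ecology"): 0.8,
    ("ecology", "sociology"): 0.6, ("sociology", "ecology"): 0.6,
    ("economics", "sociology"): 0.4, ("sociology", "economics"): 0.4,
}

print(round(rao_stirling_diversity(proportions, disparity), 3))  # 0.408
```

Higher values arise when a portfolio spreads effort across many, evenly represented and cognitively distant disciplines, which is one way of operationalising the diversity of research trajectories referred to above.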

There are two main reasons:

Read more

‘Measuring’ interdisciplinarity: from indicators to indicating

By Ismael Rafols

Ismael Rafols (biography)

Indicators of interdisciplinarity are increasingly requested. Yet efforts to make aggregate indicators have repeatedly failed due to the diversity and ambiguity of understandings of the notion of interdisciplinarity. What if, instead of universal indicators, a contextualised process of indicating interdisciplinarity was used?

In this blog post I briefly explore the failure of attempts to identify universal indicators and the importance of moving from indicatORS to indicatING. By this I mean an assessment of specific interdisciplinary projects or programs that indicates where and how interdisciplinarity develops as a process, given the particular understandings relevant to the specific policy goals.

This reflects the notion of directionality in research and innovation, which is gaining traction in policy.

Read more

Acknowledging and responding to criticisms of interdisciplinarity / Reconnaître et répondre aux critiques de l’interdisciplinarité

By Romain Sauzet

A French version of this post is available

Romain Sauzet (biography)

What are the core arguments that critics of interdisciplinarity employ? Which of these criticisms can help to clarify what interdisciplinarity is and what it isn’t?

While some of the criticisms of interdisciplinarity stem from a general misunderstanding of its purpose or from a bad experience, others seem well-founded. Thus, while some must be rejected, others should be accepted.

I outline five different types of criticisms drawn from three main sources: (1) academic writings (see reference list), (2) an empirical survey on interdisciplinarity (Sauzet 2017), and (3) informal discussions.

Read more

Providing a richer assessment of research influence and impact

By Gabriele Bammer

Gabriele Bammer (biography)

How can we affirm, value and capitalise on the unique strengths that each individual brings to interdisciplinary and transdisciplinary research? In particular, how can we capture diversity across individuals, as well as the richness and distinctness of each individual’s influence and impact?

In the course of writing ten reflective narratives (nine single-authored and one co-authored), eleven of us stumbled on a technique that we think could have broader utility in assessing influence and impact, especially in research but also in education (Bammer et al., 2019).

Read more

A framework to evaluate the impacts of research on policy and practice

By Laura Meagher and David Edwards

Laura Meagher (biography)

What is meant by impact generation and how can it be facilitated, captured and shared? How can researchers be empowered to think beyond ‘instrumental’ impact and identify other changes generated by their work? How can the cloud of complexity be dispersed so that numerous factors affecting development of impacts can be seen? How can a way be opened for researchers to step back and reflect critically on what happened and what could be improved in the future? How can research teams and stakeholders translate isolated examples of impact and causes of impact into narratives for both learning and dissemination?

Read more

Three “must have” steps to improve education for collaborative problem solving

By Stephen M. Fiore

Stephen M. Fiore (biography)

Many environmental, social, and public health problems require collaborative problem solving because they are too complex for an individual to work through alone. This requires a research and technical workforce that is better prepared for collaborative problem solving. How can this be supported by educational programs from kindergarten through college? How can we ensure that the next generation of researchers and engineers are able to effectively engage in team science?

Drawing from disciplines that study cognition, collaboration, and learning, colleagues and I (Graesser et al., 2018) make three key recommendations to improve research and education with a focus on instruction, opportunities to practice, and assessment. Across these is the need to attend to the core features of teamwork as identified in the broad research literature on groups and teams.

Read more

Assessing research contribution claims: The “what else test”

By Jess Dart

Jess Dart (biography)

In situations where multiple factors, in addition to your research, are likely to have caused an observed policy or practice change, how can you measure your contribution? How can you be sure that the changes would not have happened anyway?

In making contribution claims there are three levels of rigour, each requiring more evaluation expertise and resourcing. These are summarised in the table below. The focus in this blog post is on the basic or minimum level of evaluation and specifically on the “what else test.”

Read more

Producing evaluation and communication strategies in tandem

By Ricardo Ramírez and Dal Brodhead

1. Ricardo Ramírez (biography)
2. Dal Brodhead (biography)

How can projects produce evaluation and communication strategies in tandem? Why should they even try? A major benefit of producing evaluation and communication strategies at the same time is that it helps projects clarify their theories of change and helps teams be specific and explicit about their actions. Before returning to the benefits, let us begin with how we mentor projects to use this approach.

Read more

Institutionalising interdisciplinarity: Lessons from Latin America / Institucionalizar la interdisciplina: Lecciones desde América Latina

By Bianca Vienni Baptista, Federico Vasen and Juan Carlos Villa Soto

1. Bianca Vienni Baptista (biography)
2. Federico Vasen (biography)
3. Juan Carlos Villa Soto (biography)

A Spanish version of this post is available

What lessons and challenges about institutionalising interdisciplinarity can be systematized from experiences in Latin American universities?

We analyzed three organizational structures in three different countries to find common challenges and lessons learned that transcend national contexts and the particularities of individual universities. The three case studies are located in:

  • Universidad de Buenos Aires in Argentina. The Argentinian center (1986–2003) was created in a top-down manner without participation of the academic community, and its relative organizational novelty also contributed to its instability and later closure.
  • Universidad de la República in Uruguay. The Uruguayan case, started in 2008, shows an organizationally innovative experience based on a highly interactive and participatory process.
  • Universidad Nacional Autónoma de México. The Mexican initiative, which began in 1986, shows a center with a network-like organizational structure whose focus was redefined over time.

Read more

Doing a transdisciplinary PhD? Four tips to convince the examiners about your data

By Jane Palmer, Dena Fam, Tanzi Smith and Jenny Kent

1. Jane Palmer (biography)
2. Dena Fam (biography)
3. Tanzi Smith (biography)
4. Jenny Kent (biography)

How can research writing best be crafted to present transdisciplinarity? How can doctoral candidates effectively communicate to examiners a clear understanding of their ‘data’: what it is and how the thesis uses it convincingly?

The authors have all recently completed transdisciplinary doctorates in the field of sustainable futures and use this experience to highlight the challenges of crafting a convincing piece of research writing that also makes claims of transdisciplinarity (Palmer et al., 2018). We propose four strategies for working with data convincingly when undertaking transdisciplinary doctoral research.

Read more

Using the concept of risk for transdisciplinary assessment

By Greg Schreiner

Greg Schreiner (biography)

Global development aspirations, such as those endorsed within the Sustainable Development Goals, are complex. Sometimes the science is contested, the values are divergent, and the solutions are unclear. How can researchers help stakeholders and policy-makers use credible knowledge for decision-making that accounts for the full range of trade-off implications?

‘Assessments’ are now commonly used.

Read more