Community member post by Laura R. Meagher
I am a firm believer in looking at interdisciplinary collaboration and knowledge exchange – or impact generation – as processes. If you can see something as a process, you can learn about it. If you can learn about it, you can do it better!
I find that this approach helps people to feel enfranchised, to believe that it is possible for them to open up what might have seemed to be a static black box and achieve understanding of the dynamics of how nouns like ‘interdisciplinarity’ or ‘knowledge exchange’ or ‘research impact’ can actually come to be.
In addition to using sources such as master classes, briefing guides or articles, individuals involved in interdisciplinary work can take several (interrelated) proactive approaches to learning about interdisciplinary processes:
- formative evaluation
- facilitated reflection
- critical friends.
Deep reflection on what is or is not working, which factors act as facilitators or obstacles, whether progress is being made toward a set of objectives, and whether some of those objectives need to be altered to make the most of changing circumstances – that, in essence, is formative evaluation. It typically takes place during an initiative.
However, even when I am brought in at or shortly after the end of a programme, such that my analysis might be assumed to be ‘summative evaluation’ or a final marking, I operate under the assumption that participants – researchers, stakeholders, even funders themselves – may well have gained insights that could be useful for those involved in similar processes in the future. By gathering such insights, I am recasting my role, at least in part, into ‘formative evaluation’, evaluation of something that is ‘in progress’, if one takes a sufficiently long-term, big-picture view.
For example, gathering learning from across several of my evaluations shed light on key mechanisms and roles, such as the key roles played by knowledge intermediaries – ‘making the invisible visible’ (Meagher and Lyall, 2013). My evaluations typically offer lessons learned to funders, researchers and stakeholders in the hope that processes will become more effective over time.
Asking questions about processes and related factors and roles spurs people into reflection. Once they are given the time and space to ‘step back’, some people surprise themselves by the depth of insights they actually possess.
Some become very engaged with the searching activity of reflection; as an evaluator coming in at the end, I have had programme leaders say to me that they wish they’d had someone asking them such questions during their programme – when they had been so busy they had never stopped to think about just how things were unfolding and therefore did not make conscious efforts to shape the processes involved.
It is that sort of reflection, interwoven with the delivery of a complex programme, that I believe strengthens the warp and woof of a programme, adding to its resilience and its capacity to evolve and adapt in real-time to opportunities and challenges that arise.
Formative evaluation can take the form of structured ‘self-evaluation’ by the leadership team of an initiative, or, better yet, shared self-evaluation by a broader group of participants. Bringing stakeholders into this sort of reflection can enhance both their engagement and future pathways for the initiative, as well as, potentially, its impacts. A related mechanism would be to draw deeply upon an advisory group, in facilitated, carefully designed workshops/think tanks – at the start, mid-way and near-end of an initiative, for example.
A ‘critical friend’ is someone external who is associated with the leadership team of an initiative from the start, and who strikes a happy balance between being an advocate for the success of the initiative and maintaining a helpful level of objectivity. Sensitive to processes, a critical friend can act as a sounding board, but can also ask questions that are naive, searching, challenging or unexpected – and in so doing stimulate reflection. The utility of a critical friend is not tied to a particular subject matter, but rather to a programme’s complexity and ambition.
By virtue of having a designated critical friend, an initiative is likely to maintain a commitment to periodic reflection. My most recent such role was with an interdisciplinary team of mathematicians, medics in paediatric cardiac surgery, and specialists in operational research and in public engagement. I worked with the team’s leader to help her design several key meetings and facilitated the stage-appropriate reflection that went on at each, beginning with conceptualisation of which component group could contribute what and how the ‘pieces’ could be integrated. (When one of the exercises generated illustrations of the interrelationship among components, I was surprised but pleased to see myself portrayed, complete with large glasses, ‘CF’ on my front and … a cape! Critical friend as super-hero certainly raised my own aspirations!)
All of us involved in the challenging (but rewarding) processes of interdisciplinarity, knowledge exchange and/or impact generation can be helped by deconstructing processes, timeframes and roles in real-time in order to progress toward effective collaborations and/or a full range of impacts. Early framing of expectations and identification of what would be telling ‘indicators’ of progress will inform necessary mid-course corrections.
Ongoing reflection and self-evaluation can simultaneously:
- draw all players together
- increase understanding of processes
- facilitate integration
- enhance the probability of impacts.
All in all, this can generate learning for evolution of a particular programme or indeed for increasing capacity of participants in future programmes.
What has your experience been with reflection, including facilitated reflection? How about self-evaluation or formative evaluation? Have you ever had a critical friend? Even an imaginary one? Could you be a critical friend for someone else’s endeavour?
Meagher, L. and Lyall, C. (2013). The invisible made visible: Using impact evaluations to illuminate and inform the role of knowledge intermediaries. Evidence & Policy, 9(3): 409-418.
Biography: Laura Meagher PhD is the Senior Partner in the Technology Development Group, an Honorary Fellow at the University of Edinburgh and at the James Hutton Institute and an Associate at the Research Unit for Research Utilisation at the University of St Andrews. She has spent over 25 years working in the US and the UK with and within research and education institutions, along with industry and government, focussing on strategic change. Two key elements of her portfolio are complementary: facilitating change and evaluating results of change efforts. She has evaluated interdisciplinary capacity-building schemes, interdisciplinary research programmes, interdisciplinary evaluation mechanisms and has recently collaborated on an analysis of interdisciplinary higher education across the UK.