By Laura Meagher and David Edwards
What is meant by impact generation, and how can it be facilitated, captured and shared? How can researchers be empowered to think beyond ‘instrumental’ impact and identify other changes generated by their work? How can the cloud of complexity be dispersed so that the numerous factors affecting the development of impacts can be seen? How can researchers be enabled to step back and reflect critically on what happened and what could be improved in the future? How can research teams and stakeholders translate isolated examples of impact, and the causes of that impact, into narratives for both learning and dissemination?
We have developed a framework to evaluate research impact in a way that addresses these questions. It has been piloted on 12 case studies led by Forest Research, a government research agency in the UK (Edwards and Meagher 2019), and is likely to be useful to researchers more generally, perhaps especially, but not exclusively, those in applied fields. To date, the framework has proved user-friendly and fit for purpose.