Combining and adapting frameworks for research implementation

By Kirsty Jones and Sara Bice


How can combining frameworks help plan a research implementation process? What specific contributions can different frameworks make?

In our research with industry, we found that combining three frameworks was an effective way to get a handle on a complex implementation landscape and to design the steps needed to work through it systematically. The frameworks we found useful were: a logic model, a pathway to impact and the Consolidated Framework for Implementation Research, which we adapted to our context.

We provide four figures to show how we used each framework and briefly describe the benefits we derived from each of them. Although fully understanding the detail in the figures requires familiarity with the specifics of our research, we trust the figures provide insight into how each framework was used.

A simplified logic model

A simplified logic model, as shown in the figure below, was used as a first step in thinking through what would be required to achieve implementation, by identifying the impacts and outcomes we were working towards. The model placed these impacts and outcomes alongside the potential inputs, activities, processes and outputs for the research. This provided us with a clear structure for considering how these distinct but important aspects of our research were interrelated.

It also encouraged attention to the various levels at which we would need to achieve outcomes in order to support implementation, namely, individual, organisational and sectoral levels. The logic model provided a clear articulation of what we aimed to achieve. At the same time, it also revealed a need to consider how the identified outcomes could be achieved.

Simplified logic model to help systematically think through which inputs, activities, processes and outputs were required to achieve desired outcomes and impacts (Jones and Bice, 2021b)
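For readers who think in code, the shape of a logic model can be sketched as a simple data structure. This is purely an illustrative sketch; every entry below is a hypothetical placeholder, not the study's actual content.

```python
# Illustrative sketch only: a logic model expressed as a plain data structure.
# All entries are hypothetical placeholders, not the study's actual content.
logic_model = {
    "inputs": ["research team", "industry partners", "funding"],
    "activities": ["co-design workshops", "interviews", "drafting standards"],
    "processes": ["stakeholder engagement", "knowledge translation"],
    "outputs": ["draft quality assurance standards", "briefing papers"],
    # Outcomes are considered at three levels, as the article describes.
    "outcomes": {
        "individual": ["awareness of the standards"],
        "organisational": ["commitment to trial the standards"],
        "sectoral": ["sector-wide endorsement"],
    },
    "impacts": ["improved community engagement practice"],
}

# A logic model reads left to right: inputs enable activities and processes,
# which produce outputs, which contribute to outcomes at each level, and
# ultimately to impacts.
for level, outcomes in logic_model["outcomes"].items():
    print(f"{level}: {', '.join(outcomes)}")
```

Laying the model out this way makes the interrelationships explicit: changing an input or activity prompts the question of which outcomes, at which level, it is meant to serve.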

Pathway to impact

Our second step focused on what would be required for a research-derived intervention to be implemented successfully. What was our pathway to impact?

Traditionally, pathways to impact are depicted as flat and linear. Instead, we visualised our pathway to impact as a step-wise series of building blocks, each block providing a reinforcing foundation for progress to the next step. As shown in the figure below, our five steps were:

  • buy-in;
  • ownership;
  • adoption;
  • adaptation; and,
  • implementation.

Pathway to impact conceived as a step-wise series of building blocks (Jones and Bice, 2021b)

Paying attention to individual, organisational and sectoral levels in our pathway to impact helped us to identify specific outcomes required at each level in each of the different steps in the pathway, as shown in the figure below. Through this process we also identified preliminary, mid-stream and late-stage outcomes at each level that needed to be achieved to make progress.

Pathway to impact for individual, organisational and sectoral levels (Jones and Bice, 2021b)
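The step-wise pathway, crossed with the three levels, can also be sketched as an ordered structure. The five step names come from the article; the levels, the example outcome and the helper function below are illustrative assumptions only.

```python
# Illustrative sketch only: the step-wise pathway to impact as an ordered
# sequence, with outcomes tracked at three levels for each step.
PATHWAY_STEPS = ["buy-in", "ownership", "adoption", "adaptation", "implementation"]
LEVELS = ["individual", "organisational", "sectoral"]

# Each (step, level) cell holds the specific outcomes required before the
# next block can rest on this one -- each step reinforces the next.
pathway = {step: {level: [] for level in LEVELS} for step in PATHWAY_STEPS}

# Hypothetical example outcome, for illustration only.
pathway["buy-in"]["individual"].append("practitioners see value in standards")

def next_step(current):
    """Return the step that builds on `current`, or None at the final step."""
    i = PATHWAY_STEPS.index(current)
    return PATHWAY_STEPS[i + 1] if i + 1 < len(PATHWAY_STEPS) else None
```

Treating the pathway as an ordered sequence, rather than a flat list, captures the key design idea: outcomes at one step are preconditions for the step that follows.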

Adapting the Consolidated Framework for Implementation Research (CFIR)

The Consolidated Framework for Implementation Research outlines characteristics of an intervention which are important for implementation success. It also details research-derived “enablers” and “barriers” to research implementation. Although it was developed for healthcare interventions, we found it easy to adapt to our very different implementation context.

We interrogated the key constructs and characteristics identified in the Consolidated Framework for Implementation Research to pinpoint specific outcomes at the individual, organisational and sectoral levels necessary to achieve our implementation outcomes. Importantly, we also identified the points along our pathway to impact where it would be important for these outcomes to be achieved.

By methodically thinking through our pathway to impact we could see that our research process would require a high level of engagement and involvement with our industry partners (both organisations and individuals). It also forced us to consistently question whether and how early phase co-design and knowledge translation activities could influence the achievement of later-stage outcomes.

We had three aims: to identify the needs of industry, to encourage swift adoption and to support long-term implementation.

The Consolidated Framework for Implementation Research helped us connect each part of the research process to an implementation outcome, as shown in the figure below. Identifying and visualising these complex interconnections helped us to understand exactly how our research approach could support our pathway to impact. This also allowed us to revisit our logic model and adjust the resources needed and stakeholders necessary to engage at various stages, based on these new insights.

Using the Consolidated Framework for Implementation Research to link activity categories, activity breakdown and implementation outcomes (Jones and Bice, 2021b)
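The linking exercise just described, connecting research activities to implementation outcomes and to points on the pathway, can be sketched as a small lookup table. The activity and outcome names below are hypothetical placeholders, not the study's actual mappings.

```python
# Illustrative sketch only: linking research activities to the implementation
# outcomes they support, and the pathway step at which each outcome matters.
# Activity and outcome names are hypothetical placeholders.
links = [
    # (activity, implementation outcome, pathway step)
    ("co-design workshops", "acceptability of the standards", "buy-in"),
    ("pilot testing with partners", "feasibility in practice", "adoption"),
    ("sector briefings", "penetration across the sector", "implementation"),
]

def outcomes_for_step(step):
    """Collect the implementation outcomes to be achieved at a pathway step."""
    return [outcome for _activity, outcome, s in links if s == step]

print(outcomes_for_step("buy-in"))
```

Even this toy table shows the payoff of the exercise: once every activity is tied to an outcome and a pathway step, gaps (steps with no supporting activity) become immediately visible, which is what prompted us to revisit the logic model and adjust resources and stakeholders.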


Our implementation research involved developing Australia’s first quality assurance standards for community engagement in major infrastructure projects. Each year in Australia governments at various levels invest tens of billions of dollars in infrastructure development, including transport, energy, water, urban and regional development, health and social infrastructure. All these projects require formalised engagement with impacted communities and stakeholders, but, as yet, there are no standards for these.

Could the way we approached implementation also be useful in your area of research? What other approaches have you found useful?

To find out more:
Jones, K. and Bice, S. (2021a). Improving research impact: lessons from the infrastructure engagement excellence standards. Evidence and Policy: A Journal of Research, Debate and Practice. (Online – early release):

Jones, K. and Bice, S. (2021b). Research for impact: Three keys for research implementation. Policy Design and Practice: 1-21. (Online – open access):

Biography: Kirsty Jones PhD is a research fellow at the Institute for Infrastructure in Society, Crawford School of Public Policy at the Australian National University in Canberra. She is knowledge translation lead and a primary researcher for the ‘Next Generation Engagement Program’, Australia’s largest study of community engagement in infrastructure.

Biography: Sara Bice PhD is a professor at the Crawford School of Public Policy, The Australian National University in Canberra and director of the Institute for Infrastructure in Society. She is Vice Chancellor’s Futures Scheme Senior Fellow and leads the ‘Next Generation Engagement Program’, Australia’s largest study into community engagement in infrastructure.

5 thoughts on “Combining and adapting frameworks for research implementation”

  1. I appreciate the rich potential of this framework, which could be considered a roadmap for an engagement. I would assume that there are also feedback loops throughout the process to continually refine and adapt to the emerging realities of engagement. Your “competitive pressures” is an intriguing step/action; is that an internal or an external assessment, or both? I do wonder where resource identification, allocation, and monitoring fit into the framework.
    Again, kudos for this work, especially with the concept of beginning “with the end in mind” (per Ackoff). Jim

    • Thank you, Jim, for the very thoughtful comment. We do consider feedback loops and also have an overarching loop that we see as informing the framework. You can check it out here: (see Figure 2).

      Competitive pressures have been a really interesting and important consideration. We've found that they are both internal and external. This will likely depend on the size and type of organisation with which you're working. In our case, community engagement teams are an important component of project management within much larger teams. This means that they will experience internal competitive pressures – for everything from attention to budget.

      We considered resource identification/allocation and monitoring from the earliest design stages via our Logic Model (see Figure 1 in the link above).

      And yes, always start where you want to finish!

      • Thank you Sara. I appreciate the added details. Your comment about internal competitive pressures reminds me of times when we thought we had buy-in for an effort and didn't discover we did not until it was implementation time. Getting that out in the open as early as possible can save a lot of pain and wasted resources. Thank you again. Jim

  2. This is very interesting but a bit confusing. There seem to be two different bodies of research here. One is that to be implemented and the other the research on implementation. Also, we do not usually talk of research being implemented via an “intervention”, at least in the sciences I am familiar with, which are not health sciences. Or is something other than research being implemented, a mandate perhaps?

    • Hi David, thanks for your comment. There were two components to our work. The first component was to carry out research to determine what the quality assurance standards for community engagement in infrastructure might be. The second component was to consider how we could design this research in a way that would support later implementation. In healthcare settings, implementation research has looked at the methods to support the uptake of policies or programmes or individual practices, and these can collectively be called interventions. We viewed our quality assurance standards as an intervention to be implemented, and so drew on implementation research to inform our research design.

