Producing evaluation and communication strategies in tandem

Community member post by Ricardo Ramírez and Dal Brodhead

How can projects produce evaluation and communication strategies in tandem? Why should they even try? A major benefit of helping projects produce evaluation and communication strategies at the same time is that it helps them clarify their theories of change, pushing teams to be specific and explicit about their actions. Before returning to the benefits, let us begin with how we mentor projects to use this approach.

We co-lead DECI (Designing Evaluation and Communication for Impact), an action-research project that provides capacity building to research teams in the Global South. We mentor these projects to develop their own evaluation and communication plans, something that we refer to as a hybrid decision-making approach. We recently published an online guide (Ramírez and Brodhead 2017) that includes the diagram below, which summarizes the steps.

Figure: Steps in evaluation and communication planning

(Source: Ramirez and Brodhead 2017)

The steps on the left side of the diagram are derived from Utilization-focused Evaluation (UFE) (Patton 2008; Ramírez and Brodhead 2013), an approach to evaluation that emphasizes practical use of both the findings and the process. Those on the right come from communication planning. Both share the notions of a readiness assessment and a situational analysis, which are depicted in the middle. The rotating arrows underline the iterative nature of the process.

How to do this?

We support partners at a pace that suits their needs through a process of coaching combined with peer learning. Each step in the hybrid approach includes a set of questions, which challenge the partner team to be clear on outcomes, procedures, stakeholders, networks, assumptions and methodology. On the evaluation side, we ask a set of readiness questions:

  • How capable is the project of working with this approach?
  • Does the power relationship with funders allow the team to co-own the evaluation design?
  • Is there an organizational culture that respects learning and adaptation?
  • Is there buy-in from senior management?
  • Are there engaged staff and sufficient resources to carry out evaluation plans and implement a communication strategy?

These readiness requirements are strategic enough that we do not sign a memorandum of understanding until most of these questions have been addressed (Ramírez and Brodhead 2014).

We follow with the main Utilization-focused Evaluation steps: we ask who the primary users of the evaluation may be (internal staff, trusted partners, representatives of the funder, etc.) and engage with them to confirm their interest and availability. We also explore their expected uses or purposes. This process begins to show us the internal dynamics of an organization or project, especially the hierarchy, the level of trust, and the willingness to work in a collaborative manner.

In our experience, most evaluation users are quick to elicit evaluation questions, but they find it more difficult to step back and explore the underlying uses or purposes. The latter are important for mapping how an evaluation will actually be utilized, rather than having a report sit on a shelf collecting dust. We also find it important to link the evaluation uses to the key evaluation questions: a good key evaluation question is clear and tied to the evaluation uses, and with that clarity one can readily determine the types of data collection tools and evidence needed.

In tandem with the Utilization-focused Evaluation steps, we begin asking questions about the project's existing communication practices. Most projects have an established way of communicating, even though it may be neither explicit nor strategic; it is often based on experience and some intuition. We ask about overall communication purposes, and we help the team identify them. The following purposes are common:

  • Communication for networking
  • Communication for active-listening and engagement
  • Communication for knowledge sharing within a community of practice
  • Communication for public relations, for visibility
  • Communication for dissemination of findings and lessons learned
  • Communication for advocacy
  • Communication for policy influence.

Differentiating the purposes and the audiences is helpful as a means of setting priorities. We encourage the partners to do some ‘audience research’ to confirm each audience’s preferences, such as media channels, methods, and timing. For each communication purpose, we explore the best combination, and we recommend testing materials and methods. We also remind our partners that the most effective communication is often difficult to plan; being ready to respond to windows of opportunity is, however, both possible and desirable, especially in the policy-making arena.

The benefits

Often, we witness how a hybrid effect begins to unfold. The partner realizes that an evaluation use could focus on the effectiveness of their communication strategy. Conversely, they realize that evaluation findings can often feed into and strengthen a communication strategy. Most important, however, is that the process of clarification creates a space for organizational reflection and adaptation. What begins as a planning process creates the conditions for adaptive management. This result, in a nutshell, is the benefit we flagged earlier.

These concepts and processes can be useful to practitioners, facilitators, and researchers. Our website contains multiple tools and case studies. We are keen to hear about comparable practices and experiences that others may wish to share.

To find out more:
Ramírez, R. and Brodhead, D. (2017). Evaluation and communication decision-making: A practitioner’s guide. Developing Evaluation and Communication Capacity in Information Society Research, DECI-2 Project. Ontario, Canada. (Online).

Patton, M. Q. (2008). Utilization-focused evaluation. 4th edn. Sage: California, United States of America.

Ramírez, R. and Brodhead, D. (2013). Utilization-focused evaluation: A primer for evaluators. Southbound: Penang, Malaysia. (Online).

Ramírez, R. and Brodhead, D. (2014). Readiness and mentoring: Two touchstones for capacity development in evaluation. CDI Conference: Improving the use of M&E processes and findings, 20-21 March 2014, Wageningen, The Netherlands.

Biography: Ricardo Ramírez PhD is a researcher and consultant, based in Guelph, Ontario, Canada and is active in the fields of evaluation, communication for development, rural planning and natural resource management. He is an adjunct professor in the School of Environmental Design and Rural Development, University of Guelph. He is a Credentialed Evaluator (Canadian Evaluation Society) and co-principal investigator of DECI-3: Designing Evaluation and Communication for Impact: an action-research project in evaluation & communication funded by International Development Research Centre, Canada.

Biography: Dal Brodhead has been the CEO of the New Economy Development Group Inc., a value-based consulting firm located in Ottawa, Ontario, Canada, since 1990. He brings a strong background in community development, project management, evaluation, and applied research in Canada and internationally. Previously, he held senior posts in various Federal departments, and directed a national research project on regional development for the Economic Council of Canada. Internationally, he has led numerous evaluation and monitoring missions in Asia and Africa with an emphasis upon participatory, recipient-driven and inclusive approaches. He is co-principal investigator of DECI-3: Designing Evaluation and Communication for Impact: an action-research project in evaluation & communication funded by International Development Research Centre, Canada.

A checklist for documenting knowledge synthesis

Community member post by Gabriele Bammer

How do you write up the methods section for research synthesizing knowledge from different disciplines and stakeholders to improve understanding of a complex societal or environmental problem?

In research on complex real-world problems, the methods section is often incomplete. An agreed protocol is needed to ensure systematic recording of what was undertaken. Here I use a checklist to provide a first pass at developing such a protocol, specifically addressing how knowledge from a range of disciplines and stakeholders is brought together.


1. What did the synthesis of disciplinary and stakeholder knowledge aim to achieve, which knowledge was included and how were decisions made?

Linking learning and research through transdisciplinary competences

Community member post by BinBin Pearce

What are the objectives of transdisciplinary learning? What are the key competences and how do they relate to both educational goals and transdisciplinary research goals? At the Transdisciplinarity Lab (TdLab), our group answered these questions by observing and reflecting upon the six courses at Bachelor’s, Master’s, and PhD levels that we design and teach in the Department of Environmental Systems Science at ETH Zurich, Switzerland.

Six competence fields describe what we hope students can do with the help of our courses. A competence field contains a set of interconnected learning objectives for students. We use these competence fields as the basis for curriculum design.

Structure matters: Real-world laboratories as a new type of large-scale research infrastructure

Community member post by Franziska Stelzer, Uwe Schneidewind, Karoline Augenstein and Matthias Wanner

What are real-world laboratories? How can we best grasp their transformative potential and their relationship to transdisciplinary projects and processes? Real-world laboratories are about more than knowledge integration and temporary interventions. They establish spaces for transformation and reflexive learning and are therefore best thought of as large-scale research infrastructure. How can we best get a handle on the structural dimensions of real-world laboratories?

What are real-world laboratories?

Real-world laboratories are a targeted set-up of a research “infrastructure” or a “space” in which scientific actors and actors from civil society cooperate in the joint production of knowledge in order to support a more sustainable development of society.

Although such a laboratory establishes a structure, most discussions about real-world laboratories focus on processes of co-design, co-production and co-evaluation of knowledge, as shown in the figure below. Surprisingly, the structural dimension has received little attention in the growing body of literature.

Overcoming structure as the blind spot

We want to raise awareness of the importance of the structural dimension of real-world laboratories, including physical infrastructure as well as interpretative schemes or social norms, as also shown in the figure below. A real-world laboratory can be understood as a structure for nurturing niche development, or a space for experimentation that interacts with (and aims to change) structural conditions at the regime level.

Apart from this theoretical perspective, we want to add a concrete “infrastructural” perspective, as well as a reflexive note on the role of science and researchers. Giddens’ use of the term ‘structure’ helps to emphasize that scientific activity is always based on rules (e.g., rules of proper research and use of methods in different disciplines) and resources (e.g., funding, laboratories, libraries).

The two key challenges of real-world laboratories are that:

  1. both scientists and civil society actors are involved in the process of knowledge production; and,
  2. knowledge production takes place in real-world environments instead of scientific laboratories.

Using Ostrom’s social-ecological systems framework to set context for transdisciplinary research: A case study

Community member post by Maria Helena Guimarães

How can Elinor Ostrom’s social-ecological systems framework help transdisciplinary research? I propose that this framework can provide an understanding of the system in which the transdisciplinary research problem is being co-defined.

Understanding the system is a first step and is necessary for adequate problem framing, engagement of participants, connecting knowledge and structuring the collaboration between researchers and non-academics. It leads to a holistic understanding of the problem or question to be dealt with. It allows the problem framing to start with a fair representation of the issues, values and interests that can influence the research outcomes. It also identifies critical gaps as our case study below illustrates.

The university campus as a transdisciplinary living laboratory

Community member post by Dena Fam, Abby Mellick Lopes, Alexandra Crosby and Katie Ross

How can transdisciplinary educators help students reflexively understand their position in the field of research? Often this means giving students the opportunity to go beyond being observers of social reality to experience themselves as potential agents of change.

To enable this opportunity, we developed a model for a ‘Transdisciplinary Living Lab’ (Fam et al., forthcoming). This builds on the concept of a collaborative test bed of innovative approaches to a problem or situation occurring in a ‘living’ social environment where end-users are involved. For us, the social environment is the university campus. We involved two universities in developing this model – the University of Technology Sydney and Western Sydney University. We aimed to help students explore food waste management systems on campus and to consider where the interventions they designed were situated within global concerns, planetary boundaries and the UN Sustainable Development Goals.

The Transdisciplinary Living Lab was designed and delivered in three largely distinct, yet iterative phases, scaling from individual experiences to a global problem context. These phases of the living lab, which work to integrate personal and professional knowledge and practice, are also shown in the figure below:

1. Entering the living lab was the phase where students were introduced to collaborative teamwork processes, expectations of joint problem formulation and critical reflection on their own position within the system being explored: ‘digging where they stand’. This meant helping students consider their relationships with the food waste system as consumers of food and producers of waste, as well as their potential impact as designers of interventions in that system.

2. Transdisciplinary learning was the second phase where students were introduced to the concept of research as a process of system intervention, as well as skills for co-producing and integrating knowledge in collaboration with diverse partners in the food system. For the Transdisciplinary Living Lab at the University of Technology Sydney this meant listening to, questioning and collaborating with relevant stakeholders in the system to investigate historical and current approaches to the issue, and exploring precedents for dealing with food waste in other parts of the world. Central to this phase was ensuring that knowledge was shared among the students as it was produced. This meant organising a publicly accessible class blog, as well as weekly debriefs and discussions on insights gained.
