Producing evaluation and communication strategies in tandem

By Ricardo Ramírez and Dal Brodhead


How can projects produce evaluation and communication strategies in tandem? And why should they even try? A major benefit of developing both strategies at the same time is that it pushes projects to clarify their theories of change and to be specific and explicit about their actions. Before returning to the benefits, let us begin with how we mentor projects to use this approach.

We co-lead DECI (Designing Evaluation and Communication for Impact), an action-research project that builds the capacity of research teams in the Global South. We mentor these teams to develop their own evaluation and communication plans, an approach we refer to as hybrid decision-making. We recently published an online guide (Ramírez and Brodhead 2017) that includes the diagram below, which summarizes the steps.

Figure: Steps in evaluation and communication planning (Source: Ramírez and Brodhead 2017)

The steps on the left side of the diagram are derived from Utilization-focused Evaluation (UFE; Patton 2008; Ramírez and Brodhead 2013), an approach to evaluation that emphasizes practical use of both the findings and the process. Those on the right come from communication planning. Both share the notions of a readiness assessment and a situational analysis, which are depicted in the middle. The rotating arrows underline the iterative nature of the process.

How to do this?

We support partners at a pace that suits their needs through a process of coaching combined with peer learning. Each step in the hybrid approach includes a set of questions, which challenge the partner team to be clear on outcomes, procedures, stakeholders, networks, assumptions and methodology. On the evaluation side, we ask a set of readiness questions:

  • How capable is the project of working with this approach?
  • Is there a power relationship with funders that allows the team to co-own the evaluation design?
  • Is there an organizational culture that respects learning and adaptation?
  • Is there buy-in from senior management?
  • Are there engaged staff and sufficient resources to carry out evaluation plans and implement a communication strategy?

These readiness requirements are strategic enough that we do not sign a memorandum of understanding until most of these questions have been addressed (Ramírez and Brodhead 2014).

We follow with the main Utilization-focused Evaluation steps: we ask who the primary users of the evaluation may be (internal staff, trusted partners, representatives of the funder, etc.) and engage with them to confirm their interest and availability. We also explore their expected uses or purposes for the evaluation. This process begins to reveal the internal dynamics of an organization or project, especially its hierarchy, level of trust, and willingness to work collaboratively.

In our experience, most evaluation users are quick to come up with evaluation questions, but find it more difficult to step back and explore the underlying uses or purposes. The latter matter because they map out how an evaluation will actually be utilized, rather than leaving a report to sit on a shelf collecting dust. We also find it important to link the evaluation uses to the key evaluation questions: a good key evaluation question is clear and tied to a use, and from that clarity one can readily determine the type of data collection tools and evidence needed.
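
To make this linkage concrete, the sketch below is a hypothetical illustration in Python; the questions, uses, and tools are invented and are not taken from the DECI guide. It shows how a team might record each key evaluation question next to the use it serves and the evidence it requires:

```python
# Hypothetical sketch: recording the use -> question -> evidence linkage
# that the planning step makes explicit. All entries are invented.
from dataclasses import dataclass, field

@dataclass
class KeyEvaluationQuestion:
    text: str                # the key evaluation question itself
    use: str                 # the evaluation use (purpose) it serves
    evidence: list = field(default_factory=list)  # data collection tools

plan = [
    KeyEvaluationQuestion(
        text="How well is the project reaching its intended audiences?",
        use="Adjust outreach during the next phase",
        evidence=["audience survey", "attendance records"],
    ),
    KeyEvaluationQuestion(
        text="Which outputs do partners actually apply in their work?",
        use="",  # deliberately left blank to show the red flag below
        evidence=["partner interviews"],
    ),
]

# A question with no linked use risks producing a report that sits on
# a shelf, so the planning step surfaces those gaps early.
for q in plan:
    flag = "OK" if q.use else "NO LINKED USE"
    print(f"[{flag}] {q.text} -> evidence: {', '.join(q.evidence)}")
```

The point of the structure is simply that a blank 'use' field becomes visible during planning, rather than after the report is written.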

In tandem with the Utilization-focused Evaluation steps, we begin asking questions about the project’s existing communication practices. Most projects have an established way of communicating, even though it may be neither explicit nor strategic; it is often based on experience and some intuition. We ask about overall communication purposes and help the team identify them. The following purposes are common:

  • Communication for networking
  • Communication for active listening and engagement
  • Communication for knowledge sharing within a community of practice
  • Communication for public relations, for visibility
  • Communication for dissemination of findings and lessons learned
  • Communication for advocacy
  • Communication for policy influence.

Differentiating the purposes and the audiences is helpful as a means of setting priorities. We encourage partners to do some ‘audience research’ to confirm each audience’s preferences, such as media channels, methods, and timing. For each communication purpose, we explore the best combination and recommend testing materials and methods. We also remind our partners that the most effective communication is often difficult to plan; being ready to respond to windows of opportunity, however, is possible and desirable, especially in the policy-making arena.
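
In the same hypothetical spirit (the audiences, channels, and timing below are invented for illustration), a purpose-by-audience matrix can keep those audience research findings in one place:

```python
# Hypothetical sketch: a purpose-by-audience matrix for communication
# planning. Purposes echo the list above; audiences, channels, and
# timing are invented for illustration.
strategy = {
    "policy influence": {
        "audience": "ministry advisors",
        "channels": ["policy brief", "face-to-face briefing"],
        "timing": "aligned with the budget cycle",
    },
    "knowledge sharing": {
        "audience": "community of practice",
        "channels": ["webinar", "mailing list"],
        "timing": "quarterly",
    },
}

# 'Audience research' amounts to confirming each row of this matrix
# with the audience itself before producing materials.
for purpose, row in strategy.items():
    channels = " / ".join(row["channels"])
    print(f"{purpose}: {row['audience']} via {channels} ({row['timing']})")
```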

The benefits

Often we witness a hybrid effect begin to unfold. The partner realizes that an evaluation use could focus on the effectiveness of their communication strategy; conversely, they realize that evaluation findings can feed into and strengthen the communication strategy. Most important, however, is that this process of clarification creates a space for organizational reflection and adaptation. What begins as a planning exercise creates the conditions for adaptive management. This result, in a nutshell, is the benefit we flagged earlier.

These concepts and processes can be useful to practitioners, facilitators, and researchers. Our website (https://evaluationandcommunicationinpractice.net/) contains multiple tools and case studies. We are keen to hear about comparable practices and experiences that others may wish to share.

To find out more:
Ramírez, R. and Brodhead, D. (2017). Evaluation and communication decision-making: A practitioner’s guide. Developing Evaluation and Communication Capacity in Information Society Research, DECI-2 Project. Ontario, Canada. (Online): https://evaluationandcommunicationinpractice.net/e-primer/

References:
Patton, M. Q. (2008). Utilization-focused evaluation. 4th edn. Sage: California, United States of America.

Ramírez, R. and Brodhead, D. (2013). Utilization-focused evaluation: A primer for evaluators. Southbound: Penang, Malaysia. (Online): https://evaluationandcommunicationinpractice.net/knowledgebase/utilization-focused-evaluation-a-primer-for-evaluators/

Ramírez, R. and Brodhead, D. (2014). Readiness and mentoring: Two touchstones for capacity development in evaluation. CDI Conference: Improving the use of M&E processes and findings, 20-21 March 2014, Wageningen, The Netherlands.

Biography: Ricardo Ramírez PhD is a researcher and consultant based in Guelph, Ontario, Canada, active in the fields of evaluation, communication for development, rural planning, and natural resource management. He is an adjunct professor in the School of Environmental Design and Rural Development, University of Guelph. He is a Credentialed Evaluator (Canadian Evaluation Society) and co-principal investigator of DECI-3 (Designing Evaluation and Communication for Impact), an action-research project in evaluation and communication funded by the International Development Research Centre, Canada.

Biography: Dal Brodhead has been the CEO of the New Economy Development Group Inc., a value-based consulting firm located in Ottawa, Ontario, Canada, since 1990. He brings a strong background in community development, project management, evaluation, and applied research in Canada and internationally. Previously, he held senior posts in various federal departments and directed a national research project on regional development for the Economic Council of Canada. Internationally, he has led numerous evaluation and monitoring missions in Asia and Africa with an emphasis upon participatory, recipient-driven, and inclusive approaches. He is co-principal investigator of DECI-3 (Designing Evaluation and Communication for Impact), an action-research project in evaluation and communication funded by the International Development Research Centre, Canada.

6 thoughts on “Producing evaluation and communication strategies in tandem”

  1. The innovative approaches and the value they bring can easily be lost once a collaborative project comes to an end, especially if evaluation is done almost as an afterthought and purely for reporting purposes. Combining evaluation and communication strategies not only encourages earlier and ongoing consideration of evaluation approaches but also makes it more likely that a project will ensure that its achievements become widely known and adopted and adapted within mainstream organisational and institutional practice.

    There can be nothing worse than a great innovation gathering dust on the shelf of lost evaluation studies: all for lack of publicising.

    • Thanks for this, Charles. Indeed, we have found that some of our partners have taken up an evaluation and communication ‘way of thinking’ as a result of the mentoring, which signals capacity-building gains that will bear fruit in other initiatives they undertake.

  2. Thanks for sharing your experience.
    We have developed an evaluation model as part of an intervention we conducted as researchers to support the institutionalisation of the ‘Social Innovation Lab for Digital Government’ (LAB) within the National Agency of Electronic Government and the Information and Knowledge Society (AGESIC, by its acronym in Spanish), in Uruguay.

    Evaluating public innovation is currently a significant challenge for governments due to the emerging nature of the phenomenon whose complexity does not conform to widespread conventional models of evaluation. The understanding of the scale and nature of public innovation is still limited, and evaluation in this area does not have a theoretical framework or conclusive definition (Dayson 2016, Graddy-Reed and Feldman 2015, Bund et al. 2015). However, it can be said there is a certain agreement that the purpose of evaluating these innovations is not only to improve the management of an organization, but also to communicate the public value they generate (Jeppensen 2017, Dayson 2016, Bund et al. 2015, Preskill and Beer 2012, Bason 2010; Phills et al. 2008).

    Considering recent emergent approaches to evaluation, the international experiences of public innovation labs, and our work with the LAB’s team, we jointly decided that the evaluation proposal should contribute to their strategic learning in order to improve systemic innovation in the public sector. Besides fostering strategic learning, another purpose of this roadmap is to provide the LAB with a conceptual framework that helps them communicate the value of public innovation to their hierarchies.

    The pilot roadmap proposed by the LAB draws primarily from developmental evaluation (Westley, Patton and Zimmerman 2006; Patton 2010, 2011), organizational learning (Argyris and Schön 1978, 1996; Stringer 2007), and reflexive monitoring (Arkesteijn et al. 2015; Van Mierlo et al. 2010, 2011). It was also inspired by the confluence of other relevant approaches, from public design evaluative thinking (Bason 2010) and social innovation evaluation (Antadze and Westley 2012) to systemic evaluation of learning (Midgley et al. 2002; Miyaguchi and Uitto 2017).

    The fundamental principle underlying an evaluation of this type is to produce collective knowledge that supports the team in making decisions and improving its strategy as it unfolds. The evaluation seeks to address one of the organization’s main challenges: the evaluation of co-creation processes and transdisciplinary knowledge generation, which are the basis of the LAB’s strategy (Bammer 2005; Pohl et al. 2008; Lang et al. 2012; Polk 2014). We therefore assumed that there is a reciprocal relationship between strategy and evaluation because, as observed by Preskill and Beer (2012: 4), when both elements are comprehended and carried out in this way, the organization ‘is better prepared to learn, grow, adapt and continuously change in meaningful and effective ways, and can communicate internally and externally the value it generates’.

