Communicating Amidst Complexity: Principles to Guide Developmental Evaluators

Funders that have ambitious goals to change large systems often create partnerships with intermediaries and evaluators to realize their visions. But what does it take to make these partnerships work in practice? This blog is the first in a series that shares lessons learned from Linked Learning Regional Hubs of Excellence, a three-year initiative to accelerate the demand for and number of graduates from Linked Learning pathways that blend high school and college success with career readiness. The initiative also seeks to elevate the scale and sustainability of quality pathways systems in each of the four participating regions. We offer the blog series in a learning-focused, transparent spirit. This first post addresses setting expectations for how information from a developmental evaluation is shared among partners.

When The James Irvine Foundation first embarked on this work, program leaders anticipated that this initiative would require two types of expertise. First, it needed a skilled national intermediary with deep content expertise who could bring varied learning opportunities to support the development of regional infrastructure and capacity—Jobs for the Future (JFF) was selected for this role. Irvine also saw the importance of incorporating a developmental evaluation that could inform overall management of the initiative as well as document progress and impact over time. That led to our firms, Equal Measure and Harder+Company Community Research, forming a partnership to provide developmental evaluation support to Irvine and JFF.

While all four organizations (Irvine, JFF, Equal Measure, and Harder+Company) were equally committed to the goals of the investment and brought deep expertise to advance the work, this unique mix of players had never before collaborated in such a capacity. To coordinate our activities, we created an “initiative management team” composed of colleagues from our four organizations. Over the past 18 months, members of this team have reflected on the evolution of the initiative and on our working relationships as a group. Learning about one another’s work styles has helped us continually adjust our expectations of the initiative and of one another. Along the way, we have had some fun times together and some difficult conversations.

An important part of our growth as a team started with a conversation about how we share information with each other and, in particular, how we communicate about evaluation findings. Developmental evaluators walk a fine line when it comes to exchanging information with key partners in a complex initiative, relying on both intuition and experience to identify the best mechanisms for sharing observations and feedback. In this initiative, Equal Measure and Harder+Company recognized the need to clarify roles and agreements across partners, and to become more explicit about how and when we communicate evaluation findings. Insights learned from our interactions in this initiative include:

  • A transparent approach. Developmental evaluators should be transparent when conveying information and explain what, how, and with whom information will be shared. Because developmental evaluation emphasizes collective sense-making, it is important to build in opportunities to exchange insights from this work.
  • Clarity of roles. Developmental evaluators often work directly with grantees to share insights about an initiative as the work develops. In this case, JFF, as the intermediary organization, plays a primary role in providing technical assistance to grantees. However, having two partners at the table supporting grantee learning and improvement can complicate lines of communication. We find it helpful to coordinate our efforts to share information with grantees and to consistently reinforce the primary role of the intermediary in supporting grantee success.
  • Connecting theory and practice. Evaluators of complex initiatives can play a useful role by developing frameworks that clarify and codify emergent strategies for achieving progress. By bringing strategies into clearer focus, intermediaries can compare progress across approaches and act with intentionality in different grantee contexts. Tensions can emerge, however, when frameworks appear too definitive or static and are unable to reflect the ongoing evolution of adaptive work.
  • Relationships and trust. Sharing observations and findings can be challenging in the early stages of an initiative when partners are getting to know one another and establishing ways of working effectively together. It is worthwhile to dedicate time outside of formal meetings to socialize, connect, and build the trust needed to exchange information openly. In this initiative, we established a practice of scheduling an informal dinner the night before our day-long management team meetings.

Once trust has been established, the intermediary and funder can hunger for feedback from the evaluation team. Responding to this demand can be challenging because developmental evaluators must balance a partner’s desire for additional perspective with the need for time and space to identify important insights and the best mechanisms for sharing them. We find it helpful to articulate the principles that guide our team’s information-sharing approach:

  • We honor confidentiality and avoid sharing information that might compromise the trust that grantee sites, the intermediary, or the Foundation have placed in us.
  • We strive to share information that adds value—that is useful, well-supported, reflective of themes, timely, and has the potential to positively influence thinking and practice.
  • We follow a Hippocratic Oath. While honoring confidentiality, all of us have committed to raising topics or elevating circumstances that we believe could “do harm” to the success of the initiative or the stakeholders and partners involved. We have an agreement that if and when such concerns arise, they will be brought up for discussion by any one of the partners—the Foundation, intermediary, or evaluation team—with an intent to assess the issue and its implications and to chart a corrective and productive course.
  • We attempt to avoid coming to premature conclusions and recognize the value of stepping back and looking across grantee sites with our evaluation team before making broader evaluative judgments. When we share data, we pose questions that prompt reflection and consideration of multiple perspectives other than our own.
  • We are humble as researchers—we only see “slices” of what’s happening and may not have the entire context needed to make a strong assessment or recommendation. We see “hard” data collection methods, such as interviews and surveys, as more reliable sources of themes and findings than “soft” methods, such as meeting observations. The level of rigor in our data collection and analysis influences our readiness and willingness to share conclusions beyond the evaluation team.

We invite you to view these “Ways of Working Principles,” which we created to clarify the developmental evaluation role vis-à-vis the role of other initiative management partners and the grantee organizations. Developing and discussing this tool has helped all partners better understand the role of evaluation, and has resulted in a stronger synergy between the Foundation’s vision for this work, JFF’s technical assistance, and Harder+Company and Equal Measure’s evaluation. This has in turn improved the quality of guidance and support for grantees—ultimately enhancing this initiative’s impact for participating communities.
In the second blog post in this series, Building the Plane as You Fly: 5 Ways to Increase Learning During Education Innovation, Marty Alvarado of Jobs for the Future will reflect on the role of the intermediary in this collaboration.

In the third blog post in this series, Learning about Learning: The Funder’s Role in Evaluation, Elizabeth Gonzales of the James Irvine Foundation reflects on the role of the funder in this collaboration.

Meg Long is President of Equal Measure

Clare Nolan (formerly Chief Strategy Officer with Harder+Company) is an evaluation and strategy consultant