Helping California Arts Nonprofits Collect Meaningful Data on Participation

One size doesn’t fit all when it comes to supporting arts nonprofits as they collect data for learning and evaluation. This was especially true in our work with the Exploring Engagement Fund, an innovative grantmaking program that provided risk capital for California nonprofit organizations to experiment with arts engagement strategies. What began as a one-year contract grew into a seven-year partnership among Harder+Company (working with strategy consultant Diane Espaldon), The James Irvine Foundation, and more than 100 nonprofit arts organizations across California.

As the initiative expanded through several rounds of grantmaking, ultimately involving 113 grantees, we regularly adapted our evaluation approach to keep pace with that growth. As part of their grant requirements, participating nonprofits had to provide data about grant-funded activities and events, the characteristics of participants, and the types of participant engagement. Early on, Irvine recognized that many grantees were new to evaluation and needed support to meet these data collection requirements, so it asked Harder+Company to provide data collection technical assistance to each grantee to help ensure data accuracy and consistency.

As we helped the grantees strengthen their data collection capacity, we generated our own insights about providing effective technical assistance to organizations that are less familiar with research and evaluation. We think these lessons can inform other funders and evaluators doing similar work.

    • Clearly explain the scope of the technical assistance. At the start of every grant period, our team held an introductory phone call with each grantee to get to know the organization. During these calls, we learned more about the grantee’s funded project, reviewed the evaluation methods for the initiative, and introduced the data collection technical assistance process. These introductory calls allowed us to identify technical assistance needs right away and plan the level of follow-up and support each grantee would need.
    • Meet grantees where they are. Exploring Engagement Fund grantees included organizations across California with a wide range of staff sizes, budgets, and levels of data collection experience and capacity. We made sure that our technical assistance was flexible and fit each grantee’s individual needs. Among other things, this meant talking openly about what was realistic for each organization to implement.
    • Create resources to address common questions and needs. We developed general resources for the grantees based on the Exploring Engagement Fund’s evaluation approach. These documents covered different data collection options and offered specific advice on collecting information about participants’ race/ethnicity and low-income status. After providing technical assistance to the first three grantee cohorts, we created an additional toolkit that addressed the questions we most often heard across grantees.
    • Be flexible with participant demographic data collection methods. From the beginning, Irvine knew that some grantees might find it challenging to collect and share participants’ demographic information. We made it clear to grantees that informed estimates of participant demographics were acceptable, but that relying on visual observation to determine participants’ race or economic status is a flawed approach rooted in stereotypes. Instead, we showed grantees how to use reliable primary or secondary data, such as census data, to make these estimates.
    • Assess technical assistance efforts as you go and refine as needed. We closely tracked how the first few grantee cohorts used our technical assistance to understand what was most needed and valued. We also surveyed grantees to gauge their satisfaction and gather comments on the technical assistance. This feedback helped us refine our ongoing support and shaped the toolkit noted above.

By meeting grantees where they were, we could offer technical assistance that best fit their needs and capacity. This also yielded more accurate data for our team to analyze and use in answering broader Exploring Engagement Fund evaluation questions. Ultimately, our insights about technical assistance were part of a broader set of lessons learned through this work: embrace the relationship between evaluation and program design, allow evaluation approaches and deliverables to evolve, and consider how shifts in a foundation’s overall focus may affect the ongoing program evaluation.

To learn more about the Exploring Engagement Fund, please read the concluding evaluation report. We offer our sincere gratitude to past and present Irvine staff and to all of the Exploring Engagement Fund grantees who made this work so meaningful over so many years.