Partnering for the Long Haul: Building Evaluation from Implementation to Impact

Our five-year partnership with Genesys Works Bay Area (GWBA), a program that provides professional job skills training and technology-sector internships to low-income high school students, allowed us to witness its growth from the pilot stage and adjust our work (and theirs) as the program scaled up. Evaluating a program from inception through scale-up provides evaluators with opportunities to see and effect positive changes in program implementation that ultimately increase the chances of better outcomes for participants.

Genesys Works is a national program that expanded to the Bay Area as a pilot in 2013. By starting our engagement at the outset of the pilot, we had the opportunity to help establish a true baseline for participants and to consider different evaluation approaches for documenting feedback and progress at key moments. Over the course of the study, we learned and adapted alongside program staff, and we had the benefit of time to hear how evaluation findings were contributing to changes in program implementation.

A Phase-by-Phase Study Approach

Our evaluation included three components: (1) an implementation study to document program structure and effectiveness, and examine fidelity to the program model; (2) an outcome study to measure how participants’ circumstances, attitudes, and outcomes change over time; and (3) an impact study to assess the program’s effect on high school graduation, and college enrollment and persistence.

During each of these phases, we used a variety of methods, aligning data collection as much as possible with key touchpoints already built into the GWBA program. Organizing the evaluation into distinct phases that responded to the growing number of participants over time, and that incorporated tools to measure both short- and long-term impact, gave us a wide lens for assessing the program's effectiveness. This approach also enabled us to provide real-time feedback to program staff and leadership, which strengthened both the study and the program overall.

Paving Paths of Improvement

During the implementation study, we gathered data to determine the extent to which GWBA was serving its target population—that is, low-income youth experiencing hardships individually or in their household. Based on feedback from youth participants, school champions, and corporate partners, GWBA made program improvements throughout the course of the study. Some of these improvements aimed to make GWBA's participant pool more representative of students within its target population.

After we identified gaps between the participant pool and the target population in the data, GWBA improved its recruitment strategies to better target low-income and first-generation students, along with other demographic groups (e.g., women, African Americans, Latinos) underrepresented in IT/STEM professions. The following year, GWBA increased the numbers of women and first-generation students it served.

Using Findings to Strengthen the Program

The main purpose of the outcome study was to track whether changes in student attitudes, experiences, and skills from “Point A” to “Point B” of their participation were in line with the program’s theory of change. Our multi-year partnership also allowed us to work with GWBA staff to use the outcome study more holistically, identifying and addressing program-related issues as they arose.

For example, young women involved in GWBA were more likely than young men to drop out of the program. Because of this finding, the evaluation team conducted an all-female focus group to learn more about why this might be occurring and about the challenges young women face while participating in the program. We learned that many had more family-related responsibilities (e.g., caring for siblings and other family members), which made it difficult for them to meet all of their GWBA-related commitments. In addition, many of GWBA’s corporate partners were tech-sector companies with relatively few female employees, so young women often felt less comfortable navigating the social structure of their internships than their male counterparts did.

Knowing this, we recommended that program coordinators provide more career-exploration support for participants during training, as well as additional training for the coordinators themselves to identify these issues and provide appropriate support to women participants who may be struggling.

Understanding the Effects on High School Participants

A rigorous impact study works best when program implementation is solid, and with GWBA we had the time to develop and reinforce that strong foundation. The program demonstrated statistically significant, positive impacts on all three of its key outcomes: high school graduation, college enrollment, and college persistence. GWBA also scaled up considerably, from 29 participants and five schools in the first year to 162 participants and 32 schools in the fifth year. This growth in participation is due in part to the program’s ability to learn and adapt based on the study’s early findings.

The Benefits of Long-Term Partnership

The long-term partnership between GWBA and Harder+Company provided the space for the program and the evaluation to evolve together. Open communication and collaborative thought partnership were fundamental to our ability to work so closely and effectively over time. As evaluators, we particularly valued the time and space to revise methods as needed in order to answer important programmatic questions as they arose—and the trust from GWBA staff that we would keep evolving alongside them. The long-term nature of this engagement also helped ensure that program implementation was as strong as possible in order to reach the desired outcomes.

To learn more about the Genesys Works Bay Area program, read our final report. Also, make sure to check out the 2018 Annual Impact report for more information on the Genesys Works national program.