Yale Curriculum: Program Evaluation
The first thing students in Beth Daponte’s Program Evaluation course saw on the first day of class was a quote from Judith Rodin, the former Yale University provost who is currently the president of the Rockefeller Foundation. “We aren’t going to be remembered for how big we are, how smart we are, how hard we tried, or even how much we cared,” she said. “We’re going to be remembered for what we accomplished.” It might not seem like a cutting-edge statement, but for decades many foundations and charities were satisfied to believe they were accomplishing a lot of good without proving it.
Finding ways to be more scientific in measuring the impact of programs is a hot topic in the nonprofit world. With money tighter than in previous decades, foundations now demand results for their grants. “Competition for money has become fierce, and society wants accountability from nonprofits,” said Daponte, a senior research scholar and lecturer who specializes in demography and program evaluation. “When programs fail, they do so because the assumptions made were off. Evaluation helps programs get it right, whether it’s because they become more efficient or change their focus in order to have greater impact.”
The course, now in its fourth year, explores both theoretical and practical approaches to program evaluation, which as a relatively young field hasn’t yet matured to the point where there are a small number of accepted methods. The centerpiece of the semester is a series of consulting assignments with local organizations, in which groups of students evaluate current programs and help build new ones. This year students are working with the Connecticut Dance Conservancy to determine whether it should become a nonprofit; with the Stratford School District on an initiative to integrate mentally disabled special education students into general classrooms; and with the United Nations Development Program to draw up terms of reference for contracts with outside organizations. In the past, students in the course have consulted for Bayer, the Bridgeport Dental Clinic, and ESPN.
Program Evaluation, an elective, is for second-year SOM students, plus the occasional student from elsewhere in the university. The first half focuses on the basics of how to design and evaluate programs, using case studies to illuminate different trends and points. During the second half, Daponte brings in a handful of speakers to take students deeper into what can cause a program to fail and how to spot warning signs.
On Nov. 29, Sherry Cleary spoke to the class. Cleary is the executive director of the NYC Early Childhood Professional Development Institute, but her role that day was to walk students through the collapse of the Early Childhood Initiative (ECI), a program in Allegheny County, Pa., that had a well-publicized implosion in the late 1990s. Cleary ran the University of Pittsburgh’s Child Development Center until last year and worked with the United Way to assess ECI after problems began to surface.
ECI launched in 1997 with nearly $60 million in funds, primarily from the Heinz Foundation and United Way. The plan was to provide childcare in economically disadvantaged areas throughout the county. But the people in charge of the initiative had little experience in building an organization and posed a series of goals that immediately struck Cleary as unrealistic. “I’d meet with the funders and they’d show me yet another iteration of the business plan and it just didn’t make sense,” she said. “They made a huge number of poor assumptions. They believed that as soon as they picked up steam the state government would come in and take them over. But Pennsylvania was the only state then that didn’t put any of its own money into Head Start. Suddenly the governor was going to have an epiphany?”
Cleary said the administrators of ECI were resistant to advice from the outside and discounted all other childcare programs in the region rather than working with them to develop a stronger system for children. But she said they didn’t base their dim view of other programs on data. “They just said they stink,” she said. As she spoke, she wrote the words “ignorance” and “arrogance” on the board. “Always run as fast as you can when you come upon ignorance and arrogance,” she said. “They’re a deadly combination.”
The story of ECI, which went under in 1999, is not just a cautionary tale; it also points to how the program evaluation movement has emerged. In the aftermath of the collapse, funding for childcare initiatives in the region dried up. But rather than simply eschew anything similar to ECI, the big foundations began taking a harder look at applicants. “All of a sudden accountability became a watchword,” Cleary said. “People wanted to evaluate what you’d do. At Heinz, if you want money, the project doesn’t even get to the desk of the person who decides without being evaluated. All parties learned that a lot is possible when you are careful, diligent, and honest.”
Read about other opportunities for students to acquire real-world nonprofit consulting skills at SOM:
SOM Outreach club
Yale SOM Management Clinic
Global Social Enterprise club