Title: Solicitation for Evaluations of Office of Juvenile Justice and Delinquency Prevention Discretionary Funds Projects
Evaluability Assessment: "I Have a Dream" Foundation
Series: Solicitation
Author: National Institute of Justice
Published: June 2003
Subject: Funding resources

-------------------------------

Preface: "I Have A Dream" Foundation

Staff Contact:
Lai-Wan Wong
Director of Programs
212-293-5480, ext. 14

NIJ has identified some key outcome variables and other parameters of interest for this project and has also provided some guidance on possible evaluation designs. Applicants may depart from this guidance by providing an appropriate rationale. NIJ believes that evaluating the program's outcomes through a national sampling of "I Have a Dream" (IHAD) sites is imprudent given the information available at this time. Consistent site-specific information, such as student characteristics, dreamer attrition from the host school, and school achievement records, is extremely limited. Therefore, NIJ is soliciting a first-step gathering of baseline information that would inform a subsequent strategy for a broad outcome evaluation. The grantee would work with IHAD staff to (1) define a variety of descriptive, process, and outcome statistics that could be gathered economically by participating sites and (2) assist IHAD in developing a database of this information. The grantee would also analyze these data to identify possible clusters of similar sites (i.e., grades, activities, student populations) that could serve as bases for multi-site evaluations in the future and evaluation clusters that could highlight important theoretical issues. NIJ suggests a maximum project length of 1 year.
NIJ expects the cost of this preparatory study to be no less than $100,000. The total funds available for all six evaluations covered by this solicitation are approximately $5 million.

-------------------------------

Evaluability Assessment: "I Have a Dream" Foundation

Grantee: "I Have a Dream" Foundation (2000-JG-FX-0001)
Grant Period: July 1, 2002-June 30, 2003
Grant Award: $1,350,000

Funding History: This is the first time that the "I Have a Dream" (IHAD) program has received Federal funding. A grant manager's memorandum indicates a grant period from May 1, 2000, to June 30, 2003; the date of the award was July 25, 2002. This information is inconsistent with the award sheet.

Project Summary: IHAD provides support to an adopted class of students (typically a fourth- to fifth-grade class) in a single school. Support includes tutoring and mentoring during the elementary, middle, and high school years as well as partial tuition reimbursement for students who eventually enroll in higher education. IHAD has several goals:

o To enhance the school experiences of participants.

o To reduce the rate at which children engage in negative behaviors (delinquency) and experience undesirable consequences (drug use, pregnancy, and so on).

o To retain children in school.

o To encourage participation in higher education.

There is considerable variation across IHAD programs because sponsors, who provide the funding, play the central role in determining how programs operate. Some sponsors are individuals or couples, others are corporations, and still others are foundations. Although programs vary, all IHAD programs have seven principal components. The following is adapted from program documentation and includes comments based on the evaluability assessment:

o The sponsor provides program funds, oversight, and direction. By design, the sponsor is expected to provide mentoring to program participants, who are known as "dreamers."
In practice, some sponsors take an active role, some pay for surrogates, and others have little direct contact with the dreamers.

o Dreamers are the recipients of program services. Although IHAD attempts to serve the families of dreamers, or at least engage them in supportive activities, most evaluations indicate that family members (i.e., parents, guardians, siblings) are not active IHAD participants.

o A project coordinator is a full-time paid staff member who coordinates program activities and serves as a mentor for program participants. Most evaluators consider the project coordinator to be the key contributor to an IHAD program, although some programs also get substantial mentoring input from sponsors and volunteers.

o The community is expected to provide "support, meeting space, recreational space, resources, volunteers, and expertise . . . making it possible to offer an extraordinary range of services without incurring prohibitive costs."[1] Extant evaluations do not stress community contribution, with the exception of schools and housing projects that often provide space for the project coordinator and program activities. According to some evaluations, the community has shunned the project, seeing it as already receiving extensive resources from its benefactors.[2]

o The program is described as providing ". . . a diverse array of educational, mentoring, counseling, employment, cultural, community, and recreational activities that must keep evolving to keep pace with the changing needs of dreamers as they mature."[3]

o The resources are the financial commitments of the sponsor. Sponsor commitments vary from program to program, with some programs seeming to have endless funding and others having definite upper limits. Minimum commitments are extensive. Currently, a sponsor is expected to make an immediate commitment of $300,000 and must make an incremental commitment of an additional $150,000 over time.
In practice, it appears that programs typically cost more than $450,000.

o The program guarantees at least partial funding of education at a community college, State-funded university, or accredited technical school. Some programs are more generous than others at subsidizing postsecondary education.

IHAD has two project models. Prior to 1993, all IHAD projects were school based. Thereafter, some IHAD projects were public housing based. As described by IHAD documentation:

A good school partner is supportive of the program's mission and is cooperative in allowing "I Have a Dream" to work in the school and with school personnel. Typically, the project coordinator works out of the school, and dreamers may participate in "I Have a Dream" activities there both during and after school. The elementary school often provides the project's primary access to the dreamers and becomes a key player in the project's work.[4]

In describing the public housing-based project model, program documentation says:

The public housing-based model includes the traditional partners--sponsors, project coordinator, school, a community-based organization, dreamers, and families--and adds the local housing authority and a local college or university as primary partners. The local housing authority helps identify eligible residents and may provide assistance such as rent-free office space, a cash grant, and program guidance. The college partner provides volunteers, program expertise, facilities, staff support, and access to special programs for dreamers and parents.[5]

The evaluations that we reviewed were all for school-based programs. Possibly the parent participation component is greater for public housing-based programs. Participation by college and university partners seems to be unique to the public housing-based programs.
Scope of Evaluation: Limiting an evaluation to school-based programs is recommended, principally because comparison groups are readily identified; identifying comparison groups for public housing-based programs may be problematic. Furthermore, more is known about school-based programs from extant evaluations. Finally, if school-based programs are effective, it seems reasonable to assume that public housing-based programs would have similar effects.

Summary of Evaluability Assessment Activity: The IHAD Foundation evaluability assessment data were collected from the national foundation. It was unnecessary to visit individual programs: given the long timeframe of an IHAD program, little could have been observed that would have informed this assessment. It might have been useful to visit one or more local foundations, but the importance of the local foundations to this evaluation was not identified until late in the review process. Fortunately, extant evaluations provided good descriptions of how programs operated, and professional evaluators provided very useful debriefings on their personal experiences.

The assessment team was unable to speak with the OJJDP project officer, but it learned a great deal about the program prior to scheduling a site visit, thanks partly to evaluations provided by Lai-Wan Wong (IHAD's Director of Programs) and discussions with researchers who had done evaluations or had considered evaluation designs for the IHAD program. The team concluded that the program is evaluable, although any evaluation would be difficult to accomplish, for reasons discussed below. Bill Rhodes (Abt Associates, Inc.) and Ed Zedlewski (NIJ) visited the national headquarters in New York City. They met with Marina Winston (President and CEO), Kristina Berger (Director of Planning and Grants), and Lai-Wan Wong.

ANALYSIS

What is already known about projects like these?
Several evaluations of local IHAD programs were available (see "For Further Reading" at the end of this review). The evaluations were of mixed quality, for understandable reasons. First, an outcome evaluation of IHAD requires years of observation, given that the program begins in elementary school and its final outcomes cannot be fully judged until the class graduates from high school, enters college, and (arguably) graduates from college. Second, relative to the demands of a longitudinal design, the evaluations appear to have been low-budget operations.

Evaluations have provided some credible, if not compelling, evidence that IHAD has been successful in some places and less successful in others. Two ingredients appear to be key. The first is the quality and perseverance of the project coordinator. The second is the level of funding by the sponsors and what can be leveraged from that funding. These judgments, however, are impressionistic.

Although its guarantee of a college subsidy differentiates IHAD from other mentoring programs, the IHAD program is, at its heart, a mentoring program. In that respect, then, what is known about the effectiveness of mentoring programs?[6] Programmatic mentoring--the practice of matching at-risk or disadvantaged youth with an adult who is expected to develop a caring, supportive relationship with that youth--has grown dramatically over the past 20 years. The most widely cited study of program impacts was the random assignment study of Big Brothers/Big Sisters conducted by Public/Private Ventures in the early to mid-1990s.[7] This evaluation found that youth in the treatment group were less likely to have initiated drug and alcohol use; had better school attendance and grades and felt more competent in school; were less likely to hit someone; and had better relationships with their peers and parents than youths in the comparison group.
Although youths in both groups reported increases in problems over time, these problems increased at a slower rate for the mentored group than for those on the waiting list. The findings held for both boys and girls and across race/ethnicity. Evaluations that examined the effect of mentoring in conjunction with other, primarily academic, services found that youth improved on a number of academic outcomes. Students with mentors tended to have better grades, were more likely to participate in college preparatory activities, and were more likely to enroll in college and remain in college longer than students not receiving the program's services.[8]

Most recently, DuBois and colleagues conducted a meta-analysis of 55 mentoring program evaluations to examine the documented benefits of mentoring.[9] The results of this meta-analysis support the effectiveness of mentoring programs for youths. The researchers concluded that benefits occur both for stand-alone mentoring programs and for programs in which mentoring is embedded among other services. Further, benefits were found for youths of different ages, both genders, various racial/ethnic groups, and different family structures. DuBois and his colleagues also concluded that mentoring holds the greatest potential benefit for youths who might be considered at risk, particularly if at risk is defined in environmental terms. The Office of Juvenile Justice and Delinquency Prevention is currently supporting a national evaluation of the Juvenile Mentoring Program (JUMP).

What could an evaluation of this project add to what is known?

IHAD evaluations are challenging for reasons that will be explained later. Whether extant evaluations have met that challenge is a matter of judgment, but there is evidence that IHAD can be successful. A large-scale evaluation could better identify how and why IHAD is successful.
Although IHAD has a putative goal of enabling impoverished children to enter and succeed at postsecondary education, many sponsors have expressed other goals. Qualitatively, these are expressed as making the educational experience more worthwhile even if it does not result in postsecondary education or even graduation from high school. Quantitatively, the goals might be measured in terms of educational achievement, students' judgments of their self-worth, avoidance of negative behaviors and outcomes (e.g., delinquency, substance abuse, pregnancy), and promotion of positive behaviors (e.g., staying in school). The IHAD National Foundation has endorsed these additional goals.

Which audiences would benefit from this evaluation?

Having a firmer grasp of the measurable outcomes would be valuable for the national and local IHAD foundations as they restructure the IHAD program and attempt to attract financial support and sponsorship. A solid evaluation would be valuable to potential sponsors as they decide whether their investments are worthwhile and how those investments should be channeled.

There is another audience. Researchers have considerable interest in understanding the factors that inhibit the ability of the poor to receive postsecondary education.[10] An IHAD evaluation would provide additional information about this process, with the prospect of informing future programs geared toward increasing postsecondary educational participation by underprivileged children.

What could they do with the findings?

The IHAD national and local foundations could improve their programs and better target their recruitment efforts.

Is the grantee interested in being evaluated?

Yes. The national foundation expressed great interest.

Are local sites interested in being evaluated?

The national organization indicated that local foundations would be interested in evaluation. This willingness to participate would be instrumental to an evaluation plan.
Experienced evaluators, however, have said that many other sponsors would not be interested in evaluation; many, in fact, would be hostile to it. According to evaluators who have discussed the potential for evaluation with sponsors, some sponsors have had bad experiences with IHAD. Those problems stem from the sponsors' (1) realization that IHAD costs more than they had anticipated and that they were expected to provide more than a financial commitment, (2) naivety about what the program could accomplish, and (3) judgment that dreamers were not suitably thankful for the sponsorship. According to the same evaluators, some other sponsors see no need for outside evaluation because they already know that the program is effective. Possibly, after making a considerable investment in program activities, these sponsors would not want to find that the specific program they financed was ineffective.

What is the background/history of this program?

The following was adapted from the program's Web site (http://www.ihad.org):

In 1981, businessman Gene Lang returned to the elementary school he had attended 50 years earlier, to give a speech to a class of graduating sixth graders. He intended to tell them, "Work hard and you'll succeed." But on the way to the podium, P.S. [Public School] 121's principal told Lang that three-quarters of the school's students would probably never finish high school, prompting Lang to make an extraordinary impromptu change to his speech: He promised college tuition to every sixth grader who stayed in high school and graduated . . . .

. . . Lang realized (students) would need more help than he could give all 61 of them on his own.
So he hired a full-time social worker as project coordinator and enlisted the support of a local community-based organization to provide services and support to the children throughout the years that separated them from high school graduation, while at the same time maintaining close personal relationships with each of them.

Thousands of inquiries began to pour in, and in early 1986 Lang organized the national "I Have a Dream" Foundation to help launch a new generation of "I Have a Dream" projects. Since then, the program has grown . . . . There are now nearly 180 "I Have a Dream" projects in 27 States and 64 cities, serving more than 13,000 dreamers, and the program continues to grow. At the same time, "I Have a Dream" has freely shared its experience with others (i.e., individuals, corporations, church and community groups, and government agencies) in creating other educational support programs that help many more thousands of disadvantaged children at the primary and secondary school levels.

In 1998, the success of "I Have a Dream" prompted the U.S. Congress to adopt the "I Have a Dream" model as the basis for a new program. GEAR UP (Gaining Early Awareness and Readiness for Undergraduate Programs) is administered by the U.S. Department of Education and has already awarded more than 250 grants to State agencies, universities, school districts, and other consortia to help thousands of low-income students prepare to succeed in college.

Program documentation provides the following timeline:

1981--Eugene Lang begins the first "I Have a Dream" project for students at P.S. 121 in Harlem.

1985--Bill and Elaine Farrell of Dallas, Texas, start the first "I Have a Dream" project outside of New York.

1986--The national "I Have a Dream" Foundation is created to handle the growing number of requests for technical assistance in starting local projects.
1993--With the support of HUD Secretary Jack Kemp and the cooperation of the New York City Housing Authority, Jeff Gural and Gloria Jarecki begin the first "I Have a Dream" project based in a public housing development.

1998--"I Have a Dream" assists Congress and the U.S. Department of Education in crafting the GEAR UP program, a Federal replication of the "I Have a Dream" model.[11]

GEAR UP is funded by the U.S. Department of Education. Although it has components similar to those of IHAD, program funding comes from the government (50 percent matching funds), and the program targets older students. While IHAD currently attempts to begin its program when youths are in third grade, GEAR UP begins no later than seventh grade. It requires partnerships among colleges and universities, schools, and outside organizations.

Does headquarters monitor fidelity of implementation?

No, not formally. IHAD programs have considerable discretion, which rests in the hands of the sponsor. The national foundation is developing a management information system (MIS).

What are the headquarters' roles in the program?

As adapted from the national Web site, the national foundation ". . . assists in the launching of new local projects and provides information and support to existing projects. The national office currently provides an array of support services, including conferences and professional development workshops, the guidance of project specialists, technical assistance, identification and acquisition of resources, a national Data Management System, evaluation services, and an online messaging and information system."

The national "I Have a Dream" Foundation is also the outreach arm of the program. It builds national partnerships with corporations, State and Federal agencies, foundations, and other organizations committed to education and youth development. It appears that the national foundation does not have an active role in ongoing programs.
Some evaluations were critical of the national foundation for its failure to provide ongoing support.

At what stage of implementation are sites?

The program is mature: some projects have ended, others are ongoing, and still others are being planned.

What are the project's outcome goals in the view of the project director?

This question was not posed to the CEO, but there is no reason to presume that the CEO would have given a different answer than was expressed above. As noted previously, however, sponsors' goals vary.

Does the proposal/project director describe key project elements?

This question was not posed to the CEO, but there is no reason to presume that the CEO would have given a different answer than was expressed above.

Do they describe how the project's primary activities contribute to goals?

This question was not posed to the CEO, but the IHAD Web site provided a thorough description of how the program operates. Several evaluations commissioned by the IHAD National Foundation provide additional documentation.[12]

Can you sketch the logic by which activities should affect goals?

The program has a formal logic model, although formality seems unnecessary. Poor youths are seen by the program as facing barriers to postsecondary education that result from the cost of education but, more importantly, from the lack of role models, educational assistance, and motivation. IHAD works to overcome all these barriers by providing mentoring, tutoring, community activities, and financial assistance. Again, however, some sponsors do not necessarily see postsecondary education as the goal or, at least, not the only goal. Fostering self-worth, achievement, and the avoidance of negative behaviors is seen as both instrumental and valuable in its own right.

Are there other local projects providing similar services that could be used for comparisons?

This is inapplicable. There are suitable comparison groups, which will be discussed later.
Will samples that figure in outcome measurement be large enough to generate statistically significant findings for modest effect sizes?

A typical dreamer class has 60-80 students, according to program documentation.[13] There are several ways to develop a comparison group. If the class of 2010 is the dreamer group, then the classes of 2009 and 2011 from the same school might serve as the comparison group. Or, if the dreamers make up one classroom at an elementary school, then the comparison group might be members of a different classroom. Housing projects might pose a different problem for assembling comparison groups. Thus, a single project would have 60-80 participants and could potentially have 120-160 comparison subjects. Selecting one class before and one class after the dreamer class has advantages because it allows an evaluator to control for trends. Selecting a comparison group much larger than 1.5 times the participant group would have only a marginal effect on statistical power, so larger comparison samples are probably wasteful unless they further allow the researcher to control for trends.

Given the expense of an IHAD program, effect sizes would have to be sizable for the program to be judged effective, so an evaluation of a single program might be seen as informative of program effectiveness despite relatively small sample sizes. There are two reasons for rejecting the funding of a single-site evaluation, however:

o One problem is that dropouts are numerous. Students frequently leave one school and reenter another. IHAD evaluators have found it very difficult to track students who leave a school district because schools do not maintain records of transfers. Consequently, the effective sample size is smaller than 60-80 participants.

o Moreover, if extant evaluations are to be believed, there is considerable variation in the success of IHAD programs.
This means that a useful evaluation should include several IHAD programs, enough that the evaluator is able to estimate the variation across IHAD programs and the factors to which that variation is attributable. Fortunately, there are many IHAD programs. If each serves 60-80 students, there appear to be more than enough participants to support an evaluation.

Is the grantee planning an evaluation?

The IHAD National Foundation has supported evaluations in the past, and it is designing and implementing an information system for monitoring current program operations. The past evaluations have been small in scale and do not provide compelling evidence of program effectiveness or of the reasons why some programs are more successful than others. By itself, the MIS will not provide the basis for an evaluation because it will not track comparison subjects. However, the MIS could be the nucleus of an evaluation if it were modified and/or expanded to include other areas.

What data systems exist that would facilitate evaluation?

At the national level, the MIS--with changes--could be important to an evaluation. At the school level, school records will be important.

What are the key data elements contained in these systems?

The national foundation's MIS has major sections for the following information:

o Administration.
o Alumni (apparently ex-dreamers).
o Contributors.
o Dreamers.
  -- Demographics.
  -- Address and contacts.
  -- Academic performance (an extensive battery of questions).
  -- Activities.
  -- Health care.
  -- Household.
o Foundation.
o Mentors.
o Project.
o School.
o Sponsors.
o Staff.
o Volunteers.

One problem to be faced by an evaluator is that the MIS pertains only to dreamers. The MIS could be adapted to include comparison group members, but using it for this purpose would require additional expense for entering their data.
Another problem is that the MIS would not be useful for collecting retrospective data unless it were front loaded with such data. The national foundation also shared a synopsis of data sources and data elements used by program coordinators. A great deal of information is available about students, provided a researcher could gain access. One approach might be to hire school teachers to serve as data collectors during summer recess. Because there is no need for real-time data collection, this might be a practical way to assemble data on comparison group members.

Are data available to estimate unit costs of services or activities?

Bookkeeping at individual programs was not examined, but it seems reasonable to suppose that expenditure data would be available from sponsors.

Are there data about possible comparison samples?

Yes. As noted, there are natural comparison groups for school-based programs. Also as noted, data for those comparison group members could be collected, in theory, using an adaptation of the MIS. This would not be a trivial cost, but it would surely be less expensive than creating a new tracking system.

Is there routine reporting of specific data from local sites?

There will be when the MIS is in place.

In general, how useful are the data systems to an impact evaluation?

When extended to comparison groups, they would support a rigorous evaluation.

Is the project being implemented as advertised?

Yes.

What is the intervention to be evaluated?

Program components are not separable. The intervention is the guarantee coupled with support services for the dreamers. The support given to dreamers varies across programs, and this should be a subject of investigation.

What outcomes could be assessed? By what measures?

The MIS provides measures of outcomes that would form the basis for an evaluation. These include

o Grade point average.
o Grade.
o Standardized test scores.
o Absences and tardiness.
o Graduation status.
o Indication of having left school.

There is also an alumni section that in principle would allow dreamers to be tracked beyond high school. An evaluation might also include attitudinal responses on questionnaires. Although valuable for evaluation, these would be expensive to administer to a large number of dreamers and comparison group members. Costs would be especially high given that students disperse across multiple schools, making them difficult to track. Furthermore, access to students is likely to be inhibited by both schools and parents.

Some data items are not available from the MIS. It does not seem to record reports of misconduct or delinquency. For purposes of evaluation, it may be necessary to access school and police records.

Are there valid comparison groups?

Yes. Provided the evaluation is conducted with school-based programs, other classes would serve as acceptable comparison groups.

Is random assignment possible?

No. Given the way the program operates, random assignment would not add a great deal to the strength of this evaluation because matching classes is a strong design.

What threats to a sound evaluation are most likely to occur?

Discussions with Professor Robert McGrath at Fairleigh Dickinson University and with Professor Adriana Lleras-Muney at Princeton University were beneficial, as was reading evaluation reports of the IHAD program. Based on both the discussions and the readings, the following problems are very likely to arise:

o Most evaluators would seek to identify the dreamers and comparison group and then track their records retrospectively back to when the dreamer group was formed. In practice, however, many schools are unwilling to release student names, and access will be a major problem for evaluators. Furthermore, schools do not maintain records indefinitely; it appears that most schools keep student records for a maximum of 5-6 years.
Consequently, retrospective data collection may be problematic, restricting the evaluators' ability to assess mature ongoing programs.

o There is considerable variation across sites. This requires that multiple schools participate in an evaluation. It also means that an evaluator would probably decide to eliminate some IHAD programs from consideration because they are atypical of the IHAD model. Eliminating aberrational programs would allow the evaluation to focus on the effectiveness of the basic IHAD formulation. As an illustration, a program in Chicago was funded at such a level that it could afford to send dreamers to private schools. Including such a program would confound the contribution of the IHAD program per se with the contribution of the private school. This is one reason why an evaluation that focuses exclusively on school-based programs and/or programs sponsored by umbrella foundations is recommended.

o Some sponsors will refuse to participate in an evaluation. This is another reason to prefer IHAD programs sponsored by umbrella organizations.

o Apparently, evaluations cannot start tracking dreamers and comparison group members late in the children's educational careers. For example, if the dreamer program started in fifth grade, an evaluator cannot start tracking children in the tenth grade, because a suitable comparison group cannot be readily identified. As an illustration of the problem, McGrath and Hayman formed their comparison groups well after the dreamer group had been formed.[14] They found that high school dreamers were more likely to get pregnant than were members of a comparison group, although the differences were not significant because the sample size was small. But they also speculated that IHAD may have been effective at keeping problem children in school, so by the time children had reached high school, the comparison between dreamers and comparison group members was uninformative.

o Followup tracking is extremely difficult.
Most children are lost to followup, and it is extremely difficult to determine whether they in fact dropped out of school or whether they transferred to other schools and completed their school years successfully. Schools do not keep good transfer records; indeed, schools typically do not even know that a transfer has occurred. This results in a severe selection bias problem. It is one reason for restricting an evaluation to a few school districts that have multiple IHAD programs.

Are there hidden strengths in the project? The intervention is straightforward. Hidden strengths do not seem to exist.

What are the sizes and characteristics of the target populations? There are 60-80 children per dreamer cohort. They are selected because they are disadvantaged. By program design the entire class must be selected, so at the class level the program is not selective. There are some exceptions to this rule. For example, Colorado dreamers do not constitute entire classes; rather, they are underprivileged children selected from classes that contain a mix of underprivileged and other children. Because comparison groups in Colorado would be difficult to assemble, it is recommended that Colorado programs be avoided as evaluation sites.

How is the target population identified (i.e., what are the eligibility criteria)? Who/what gets excluded as a target? See above.

Have the characteristics of the target population changed over time? Perhaps, but this has not been documented, and there do not appear to have been fundamental shifts.

How large would target and comparison samples be after 1 year of observation? No significant differences between the samples would be expected to emerge after only 1 year. Therefore, a 1-year followup would not be a useful timeframe for an evaluation.

What would the target population receive in a comparison sample? A sample of students who receive a standard public school education would serve as the comparison group.
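The transfer-versus-dropout ambiguity described above can be made concrete with a small sketch. Assuming, hypothetically, that an evaluator obtains enrollment rosters from every school in a study district, a student who leaves the study school can be classified as a transfer only if he or she reappears on another district roster; everyone else remains an ambiguous "possible dropout." The record layout, student IDs, and school names below are invented for illustration and do not reflect the actual IHAD MIS.

```python
# Hypothetical sketch: within a single district, classify students who left
# the study school as "transferred" (re-enrolled at another district school)
# or "possible dropout". IDs and school names are invented; the real IHAD
# MIS schema is not assumed here.
def classify_leavers(left_school, district_enrollment):
    """left_school: set of IDs of students who left the study school.
    district_enrollment: dict mapping school name -> set of enrolled IDs.
    Returns a dict mapping each leaver's ID to a status string."""
    status = {}
    for sid in sorted(left_school):
        # A leaver counts as a transfer only if found on some district roster.
        found = any(sid in roster for roster in district_enrollment.values())
        status[sid] = "transferred" if found else "possible dropout"
    return status

leavers = {"S1", "S2", "S3"}
district = {"Lincoln MS": {"S2", "S9"}, "Roosevelt HS": {"S7"}}
result = classify_leavers(leavers, district)
# S2 reappears on a district roster; S1 and S3 cannot be distinguished
# from out-of-district transfers without wider records.
print(result)
```

Students classified as "possible dropout" here may in fact have transferred outside the district, which is precisely the selection bias problem: restricting the evaluation to large districts with multiple IHAD programs shrinks, but does not eliminate, that ambiguous category.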
What are the shortcomings/gaps in delivering the intervention? Programs are funded at different levels depending on the means and willingness of the donors. Programs appear to vary in the degree to which the donors or others provide mentoring and tutoring. Evaluators have seen the program coordinator as key, and the abilities of coordinators appear to vary materially from program to program. As noted earlier, the use of community resources also seems to vary among programs.

What do recipients of the intervention think the project does? The assessment team did not interview participants, but other evaluators have interviewed dreamers. Responses are curiously mixed. Some respondents found the program valuable and others did not. Many respondents were critical of sponsors, whom they found distant and uninformed. Many were critical that the guarantee did not cover all costs of postsecondary education. It would be a mistake to think that IHAD materially benefits all children who were selected to be participants. But as noted earlier, evaluation findings indicate that IHAD provides material benefits for an appreciable number of program participants. Many evaluations provide testimonials from children who found IHAD to be very helpful--especially with respect to the project coordinator, who seems to have become a surrogate parent for many children.[15] College funds have undoubtedly helped high school graduates attend college and technical schools.

How do they assess the services received? See above.

What kinds of data elements are available from existing data sources? See above comments on the MIS.

What specific input, process, and outcome measures would they support? See above comments on the MIS.

How complete are data records? Can you get samples? The MIS is under development and is scheduled to be completed during summer 2003. The national foundation provided a useful summary of data elements and their sources.

What routine reports are produced?
No reports have been routinely produced, and reporting screens were not observed.

Can target populations be followed over time? Followup has been the principal problem of IHAD evaluation. In theory, children can be followed as long as they are in school, and juvenile records can be located outside of school. In practice, access is a problem, and school transfers are frequent, so an appreciable number of children are lost to followup. When a child is lost to followup because he or she drops out of school, the attrition is not a problem, because dropping out is a clear failure. IHAD participants would be expected to have lower dropout rates, and to the extent that is true, they would be judged IHAD successes. The problem is distinguishing between students who drop out and those who transfer. The need to draw this distinction is one reason why the recommended evaluation design focuses on a selection of large school districts, so that students can at least be tracked within a district.

Can services delivered be identified? Yes. Services can be tracked through the MIS, but probably not retrospectively.

Can systems help diagnose implementation problems? Presumably, yes. When the MIS is available, it can help diagnose implementation problems. Of note, however, monitoring is the responsibility of the individual site.

Do staff tell consistent stories about the project? IHAD program sites were not visited, so there was no opportunity to talk directly with site staff to hear their stories. However, headquarters' staff members were consistent in their descriptions of project activities.

Are staff's backgrounds appropriate for the project's activities? Based on evaluation reports, the backgrounds of staff members seemed appropriate for the project's activities.

What do partners provide/receive? Sponsors are altruistic. They provide funding and some form of support.

How integral to project success are the partners?
Sponsors are central to the program. Schools are instrumental. Information is limited, however, so it is less clear whether community organizations play a large supporting role.

What changes is the director willing to make to support the evaluation? The national foundation would be willing to change the MIS, provided NIJ support paid for the changes. The foundation would be supportive of the evaluation.

CONCLUSIONS

Would you recommend that the project be evaluated? Why or why not? Yes. Evaluation is recommended. The program is important. The intervention is intensive and has the potential to make a great impact on the lives of its clients. Evaluation would promote continued program development. The program has the potential to be evaluated because comparison groups are easily constructed. This does not mean that the evaluation itself would be straightforward.

What kinds of evaluation designs would you propose? A cross-section, time-series design that is focused on school-based programs is recommended. For the cross section, the evaluator would include multiple dreamer cohorts in the study. For the time series, these cohorts would be followed both retrospectively (for dreamer cohorts that exist at the time the evaluation begins) and prospectively. NIJ funding for a long-term evaluation is recommended. The study would last several years but would provide meaningful interim reports. Retrospective data collection would facilitate the development of useful interim reports on the relative progress of dreamers and comparison students. Prospective data collection would allow interim reports to be updated. It is recommended that dreamer cohorts be selected to participate in the study from a limited number of places. Discussions with the national foundation suggested that four Los Angeles programs, four District of Columbia programs, four New York programs, and four Portland, Oregon, programs would be likely candidates.
Each of these four locations will likely introduce new programs over time, so somewhat more than 16 programs would enter into the evaluation. However, it may be impossible to assemble adequate retrospective data for some of the older programs, so on balance the study sites are likely to comprise a combination of about 16 old and new programs.

There are advantages to this selection of programs. First, each cluster of four programs is managed by a local "I Have a Dream" Foundation. As a result, these programs are expected to be reasonably uniform and to represent quality IHAD endeavors. It is also expected that these local foundations would be willing to support an evaluation. Second, each cluster of four programs is within the same school district or, at least, within adjacent school districts. This should facilitate the process of tracking children. Third, it is expected that the local foundations will have better records than those maintained by independent sponsors. Consequently, a research design that studies outcomes in these places is recommended.

Within each of these programs, there is one dreamer class. (Some schools may have more than one dreamer class, but each dreamer class would constitute a program.) As noted, it is uncertain whether all these dreamer classes could enter into the evaluation. Some active dreamer classes would have graduated by the time this contemplated evaluation would begin. Retrospective data may be so poor for other dreamer classes as to preclude using those classes in the evaluation.

Using the classes 1 year before the dreamer class and 1 year after the dreamer class as comparison groups is recommended. As noted earlier, by including two classes an evaluator could test for trends. Furthermore, the increase in the size of the comparison group would add some modest additional power to hypothesis testing.
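The power gain from adding a second comparison class can be sketched with a back-of-the-envelope calculation. The dropout rates below (15 percent for dreamers, 30 percent for comparison students) and the class size of 70 are assumptions chosen for illustration only; they are not drawn from IHAD data.

```python
# Illustrative power calculation for a two-sided, two-proportion z-test
# comparing one dreamer cohort with one vs. two comparison classes.
# All rates and sample sizes below are assumptions, not IHAD data.
from math import sqrt, erf

def normal_cdf(x: float) -> float:
    # Standard normal cumulative distribution function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def power_two_proportions(p1: float, p2: float, n1: int, n2: int) -> float:
    # Approximate power via the normal approximation, ignoring the
    # negligible far tail of the two-sided test.
    z_crit = 1.96  # two-sided critical value at alpha = 0.05
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return normal_cdf(abs(p1 - p2) / se - z_crit)

# Assumed: 15% dropout among dreamers, 30% among comparison students.
one_class = power_two_proportions(0.15, 0.30, 70, 70)     # one comparison class
two_classes = power_two_proportions(0.15, 0.30, 70, 140)  # two comparison classes
print(f"one comparison class:   power = {one_class:.2f}")
print(f"two comparison classes: power = {two_classes:.2f}")
```

Under these assumed rates, doubling the comparison group raises power noticeably but not dramatically, consistent with the "modest additional power" characterization above: most of the remaining uncertainty comes from the fixed size of the dreamer cohort itself.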
Although the new MIS would be useful for the prospective part of this evaluation, it would not be helpful for retrospective data collection. For that purpose the evaluator would have to access school records for both the dreamers and the comparison group members. Gaining access will be complicated. There is an advantage to dealing with a relatively small number of school districts, because negotiations are likely to take a great deal of evaluator time per district. An evaluator would probably seek to access juvenile criminal records as well, and that would further complicate data assembly. With respect to prospective data collection, the evaluator might provide funds for further development of the national foundation's MIS. The objective is to increase the utility of that system for evaluation without burdening the foundation. The evaluation would have to pay for a person to extract records and enter them into the system.

It seems doubtful that NIJ and OJJDP would be willing to wait several years for an evaluation. It seems equally doubtful that the national foundation would be satisfied with an evaluation that required such a long time. Therefore, the evaluation should provide interim results on a yearly basis. One way to accomplish this is to report findings current as of the end of each year and then update those results annually.

NIJ also should consider incremental funding. A first phase would be for project design and, especially, for gaining access to the requisite records. Were that not accomplished, an evaluation would be impractical. The design phase might last 1 year. The second phase--perhaps years 2 and 3--would be for data collection, analysis, and reporting. The third phase would continue the evaluation, conditional on satisfactory performance during the first three years demonstrating that a longer evaluation would be informative.

For Further Reading

Fitzgerald, D., and T.
Russell, From High School to College: Helping First-Generation College Students Lead the Way to Their Own Academic Successes, The Pennsylvania State University, The National Academic Advising Association, National Conference, Salt Lake City, September 30, 2002.

Hall, M., Policy Evaluation of "I Have a Dream" Foundation, paper submitted to Professor Deanna Pagnini, Princeton University, May 12, 1998.

Herman, J., and J. Munoz, Review and Evaluation of I Have a Dream--Los Angeles Group 8, 1994-95, UCLA Center for Research on Evaluation, Standards and Student Testing, Center for the Study of Evaluation, Graduate School of Education & Information Studies, University of California, Los Angeles, July 19, 1995.

Higgins, C., K. Furano, C. Toso, and A. Branch, I Have a Dream in Washington, D.C., Philadelphia: Public/Private Ventures, winter 1991.

Kahne, J., Personalized Philanthropy: Limits, Challenges, and Possibilities (unpublished paper), The University of Illinois at Chicago, February 10, 1998.

Kahne, J., and K. Bailey, The Role of Social Capital in Youth Development: The Case of "I Have a Dream" (unpublished paper), The University of Illinois at Chicago, September 10, 1997.

Kahne, J., and K. Bailey, "The Role of Social Capital in Youth Development: The Case of 'I Have a Dream' Programs," Educational Evaluation and Policy Analysis, 21(3) (Fall 1999): 321-343.

Kubayama, E., East Palo Alto "I Have a Dream" Program Evaluation Report, Stanford, CA: School of Education, Stanford University, 2000.

Manke, B., and S. Haggard, I Have a Dream Houston: Program Evaluation 1999-2000 (unpublished paper), Houston: Psychology Department, University of Houston.

Maria, K., The "I Have a Dream" Project Mount Vernon, New York: 1987-1997, paper provided by the "I Have a Dream" Foundation (New York), 2001.

NOTES

1. See http://www.ihad.org/projects.php.

2. Aaron, L., and B.
Barnow, Evaluation of New York City's Class of 1992 "I Have a Dream" Program, prepared for the New York City "I Have a Dream" Foundation under contract with Lewin-VHI, 1994.

3. See http://www.ihad.org/projects.php.

4. See http://www.ihad.org/pubs/Info_ProjectModels.pdf.

5. Ibid.

6. This short review was abridged from McDonough, H., J. Jastrzab, C. Sipe, and C. Rappapport, Mentoring, E-mentoring, and At-risk Youth Literature Review: Final Report, prepared by Abt Associates, Inc., for The Foundation for Civility, 2002.

7. Tierney, J.P., J.B. Grossman, and N.L. Resch, Making a Difference: An Impact Study of Big Brothers/Big Sisters, Philadelphia: Public/Private Ventures, 1995.

8. Sipe, C.L., and A.E. Roder, Mentoring School-Age Children: A Classification of Programs, Philadelphia: Public/Private Ventures, 1999.

9. DuBois, D.L., B.E. Holloway, J.C. Valentine, and C. Harris, "Effectiveness of Mentoring Programs for Youth: A Meta-Analytic Review," American Journal of Community Psychology, 30(2) (2002, Special Issue): 157-197.

10. Levine, A., and J. Nidiffer, Beating the Odds: How the Poor Get to College, San Francisco: Jossey-Bass, Inc., 1996.

11. See http://www.ihad.org/history.php.

12. See http://www.ihad.org.

13. See http://www.ihad.org/pubs/Info_FAQs.pdf.

14. McGrath, R., and J. Hayman, The Paterson, New Jersey "I Have a Dream" Program: Academic Performance and Outcomes, Teaneck, New Jersey: Fairleigh Dickinson University, 1997.

15. See Shoemaker, M., and M. Sims, Delivering on a Promise: An Evaluation of the "I Have a Dream" Foundation, Pasadena, California, New York: "I Have a Dream" Foundation, 1997; and Davis, A.E., G. Hyatt, and D. Arrasmith, "I Have a Dream" Program--Class One Evaluation Report, Portland, Oregon: Northwest Regional Educational Laboratory, February 1998. (This report was updated with an addendum in June 1998 and October 1998.)