Title: Solicitation for Evaluations of Office of Juvenile Justice and Delinquency Prevention Discretionary Funds Projects Evaluability Assessment: Youth Crime Watch of America
Series: Solicitation
Author: National Institute of Justice
Published: June 2003
Subject: Funding resources
25 pages
51,000 bytes
-------------------------------
Figures, charts, forms, and tables are not included in this ASCII plain-text file. To view this document in its entirety, download the Adobe Acrobat graphic file available from this Web site.
-------------------------------
Preface: Youth Crime Watch of America
Staff Contact: Terrence W. Modglin, Executive Director, 305-670-2409
NIJ has identified some key outcome variables and other parameters of interest for this project, and has also provided some guidance on possible evaluation designs. Applicants may depart from this guidance by providing appropriate rationale. NIJ believes that evaluating the program's outcomes through a national sampling of Youth Crime Watch of America (YCWA) sites is cost-prohibitive. Because of some of the data advantages mentioned in the evaluability assessment, NIJ asks applicants to consider a multi-site evaluation of YCWA programs implemented in various Florida schools; Florida has the largest number of sites with some data commonalities. Also for cost reasons, NIJ asks applicants to limit their evaluation designs to middle school and early high school participation. NIJ suggests a maximum project length of 4 years. NIJ is interested in two broad questions regarding program outcomes:
o Does participation in YCW activities influence students in terms of academic performance, involvement in community affairs, and involvement in risk-seeking behaviors? Length of participation in YCW activities and intensity of participation may be important explanatory variables.
o Does a strong YCW program produce lower levels of school disorder?
The evaluation should take into account that the school level (i.e., middle school versus high school) and the size of the YCW program may affect the mechanisms by which YCW students engage in school order maintenance and safety. NIJ expects the cost of this evaluation to be no less than $1 million. Total funds available for all six evaluations covered by this solicitation are approximately $5 million.
-------------------------------
Evaluability Assessment: Youth Crime Watch of America
SYNOPSIS
Grantee: Youth Crime Watch of America (2000-JP-FX-K001)
Grant Period: March 1, 2000-February 28, 2003, but awarded May 31, 2002
Grant Award: $900,000
Funding History: Unknown
Project Summary: Youth Crime Watch (YCW), modeled after neighborhood crime watch programs, began in 1979. Instituted as a result of the rape of a 12-year-old girl, the first YCW program was formed at Miami Beach Senior High School. According to the Youth Crime Watch of America (YCWA),[1] which was formed in 1986 to help schools and communities across the country implement student-led crime prevention programs, the Miami Beach program was successful at reducing the school's drug problem by half and "chain snatchings" by three-fourths. More than 1,000 YCW programs currently operate in the United States and overseas. The highest concentration of YCW programs is in Florida, as more than half of the State's school districts have these programs. A separate organization, Youth Crime Watch of Florida (YCWF), helps coordinate the programs in Florida. Funding for YCWA comes from several sources. The Office of Juvenile Justice and Delinquency Prevention (OJJDP) has provided funds for expanding the program throughout the United States. These funds support the development of regional expansion centers that serve local schools. They also pay partly for the continued development of program materials, YCWA participation in the National Youth Network, and an annual international conference.
Additional funds for continued development come from the U.S. Department of Education (DOE). As is true of the OJJDP funds, the DOE funds come from congressional earmarks. Funding from the State of Florida is channeled through the YCWF program. Although YCWA and YCWF are distinct, the two programs share staff and a common mission statement. For evaluation purposes there seems to be no reason for maintaining any distinction.
Logic model
According to the YCWA Web site (http://www.ycwa.org), YCW programs have three goals:
1. Provide crime- and drug-free environments through a youth-led movement.
2. Instill positive values, foster good citizenship, and build self-confidence in children.
3. Enable youth to become resources for preventing crime, drug use, and violence in schools.
The model that the YCWA formulated has nine components, not all of which are required for a program to be considered a youth crime watch program. Table 1 lists program components and the frequency with which YCW programs implement those components. The frequency comes from an evaluation by SPEC Associates.[2]
Table 1. YCW Project Components
Program Component--Frequency of Component in YCW Programs (%)
Youth patrols--68
Crime reporting--68
Drug, crime, and violence education--56
Action projects--57
Conflict resolution--32
Peer mediation--29
Mentoring--20
Bus safety--17
Peer and cross-age teaching--12
Although SPEC documentation is unclear, it appears that about 20 children directly participate in a typical YCWA program.[3] A panel of experts[4] specified the YCW logic model as:
o Students have a sense of belonging; and therefore,
--They are more involved in school, and
--They have a better sense of self-worth. Therefore,
----They feel safe at school, and
----They achieve more at school; therefore,
------Schools experience less crime.
o Students understand the system of crime reporting; and therefore,
--They feel more comfortable reporting crime.
Therefore,
----They are proactive in crime reporting; and therefore,
------Schools experience less crime.
Given this logic model, it is apparent that YCW operates at two levels. At the first level, YCW focuses on individual children who participate directly in the YCW program. The goal is to improve the school environment and educational experiences for those specific children. (See goal 2 of the goal statements.) At the second level, YCW attempts to affect the school environment for all children. (See goals 1 and 3 of the goal statements.) An outcome evaluation might focus its attention on either or both of those levels.
Categorizing and evaluating YCW programs
It's useful to think of YCW as being three different programs that operate at the elementary school, middle school, and high school levels, respectively. At the time that this review was being prepared, YCWA was attempting to expand into colleges and universities, so a fourth program may emerge in the future. Programming at the three levels is age-appropriate (e.g., elementary school YCW programs are less student-led than middle or high school programs). This complicates the evaluation, of course. From an evaluation standpoint, then, it seems possible that the YCW program might prove effective at the elementary school level but not at the high school level. An additional complication arises from the fact that programming varies markedly among programs. After controlling for educational level, diversity might be less than what is shown in table 1. Nevertheless, there is likely to be considerable residual variation after controlling for educational level. An evaluation might focus on elementary school, middle school, or high school programs, or on some mix of the three. An evaluation might be restricted to programs that have some minimal mix of program components.
It might concentrate on outcomes that are specific to student participants (goal 2), to the overall school environment and climate (goals 1 and 3), or to both.
ANALYSIS
What is already known about projects like these?
There are many school-based programs that have goals similar to YCW's. For example, schools have implemented a wide variety of programs that attempt to change students' behaviors and attitudes toward crime, violence, and drugs. Some are well-known "packaged" programs, such as Drug Abuse Resistance Education [registered trademark] (D.A.R.E.) and Gang Resistance Education and Training (G.R.E.A.T.). Evaluation findings on these and other school-based crime prevention programs are mixed. Evaluations of D.A.R.E., for example, have failed to demonstrate any significant short- or long-term reductions in drug use among program participants.[5] On the other hand, an NIJ-funded evaluation of the G.R.E.A.T. program found some positive effects on program participants.[6] Other studies have found that many schools do a poor job of implementing what otherwise might be potentially effective programs.[7] Schools also have implemented a variety of strategies for improving the overall school climate (e.g., increased use of law enforcement and security personnel and technologies, such as video surveillance and metal detectors). Can improvements to school climate be linked to implementation of these strategies? The widespread use of many of these strategies suggests that, at a minimum, schools believe they are effective. There are other existing programs that, like YCW, feature students in leading or prominent roles:
o Project S.A.V.E. (Students Against Violence Everywhere), for example, is a student-run organization--supported by a national headquarters--in which students learn about alternatives to violence and practice what they learn through school and community service projects.
o S.A.D.D.
(Students Against Drunk Driving) is also primarily student run, although its focus is narrower than that of YCW or S.A.V.E.
o Teens, Crime, and the Community (TCC) seeks to engage teenagers in community crime prevention through a variety of neighborhood action projects.
Extant evaluations of these projects have largely focused on student-level, rather than schoolwide, effects. A recent evaluation of S.A.V.E. involved interviews and surveys with student participants and school administrators.[8] An evaluation of TCC compared participants against a control group and observed some positive effects in the TCC participants, including an increased level of social bonding and a reduction in the frequency of delinquent acts.[9] The evaluations have also focused on only one or two schools.
What could an evaluation of this project add to what is known?
YCWA starts with a basic, coherent model and has promoted that model, with variations, across diverse settings. At least potentially, that diversity allows evaluators to inquire about which aspects of YCWA are successful, with considerable assurance that findings are not idiosyncratic to specific implementations of that program. Such an evaluation would be valuable because many of YCW's components are traditional in the sense that they are found in other prevention programs. As noted above, evaluations of student-run crime and violence prevention programs have focused on student-level effects and have been limited to one or two schools. Consequently, evaluations of a few programs may be misleading when attempting to generalize to a larger constellation of programs. A broad evaluation--with a large number of schools and one that focuses on school-level effects--has not been undertaken. A focus on a large number of schools is important, because success or failure of a local program could arise from specific program configurations, special skills and charisma of project directors, or both.
In many ways, it should be expected that student participants in YCW, S.A.V.E., and other school-based violence and crime prevention programs that actively engage students (as opposed to using a traditional classroom lecture format) would react positively to the program; and extant evaluations have demonstrated this. Whether meaningful differences in schoolwide indicators can be detected for these student-run organizations is largely unknown. Multi-year longitudinal studies have been conducted on participants in D.A.R.E. and G.R.E.A.T. A similar evaluation of YCW would contrast nicely with evaluations of these and other adult-led crime prevention models.
Which audiences would benefit from this evaluation? What could they do with the findings?
YCWA itself would benefit from an evaluation. First, the organization is sincere in wanting to improve services, and an evaluation provides a partial basis for that improvement. Second, the YCWA's ability to secure future funding from various sources would benefit from an independent evaluation that demonstrates the program's efficacy. Third, an evaluation of efficacy would assist YCWA in dissemination of its program, and it would give the YCWA program a basis for program evolution, expansion, and promotion.
Schools that use the YCW model would benefit from an evaluation. First, of course, schools would like to judge whether or not YCWA would be useful for them. Second, presuming that a school determines that a YCW program is valuable, the school could examine the evaluation to consider what specific configuration to implement. Schools not currently using the YCW model also could use the evaluation results to help them determine whether to implement it. Other potential users of the YCWA program would have a basis for deciding what provisions of YCWA would best work for them.
An evaluation would provide funders--both current and prospective--with a basis for determining whether YCWA and other similar programs are worthwhile investments. An evaluation of YCWA would also contribute to the current evaluation literature dealing with school-based prevention programs. This contribution would benefit funding sources, program developers, and researchers, particularly schools implementing Project S.A.V.E., S.A.D.D., or TCC.
Is the grantee interested in being evaluated?
Yes. YCWA has sought evaluations in the past and has expressed a strong interest in a future evaluation.
Are local sites interested in being evaluated?
This is a national program with individual schools that serve as local sites. The experience of the SPEC evaluators indicates that selecting a random sample of schools from across the country would lead to a high initial refusal rate and a high drop-out rate from school participants. In the SPEC evaluation, 185 YCW sites were selected to participate in the evaluation. Of these, only 66 initially agreed to participate and, of the 66, only 31 completed both the pre- and poststudy surveys. These results suggest that local sites are interested, to some degree, in being evaluated. A more practical approach might be to obtain the cooperation of the superintendent in a school district that has a large number of YCW programs, such as Dade and Broward Counties in Florida.
What is the background/history of this program?
See project summary (discussed earlier) and the YCWA Web site (http://www.ycwa.org).
Does headquarters monitor fidelity of implementation?
Yes. YCWA encourages diversity and considers it to be a program strength. The national YCWA headquarters conducts an annual "Census Survey" of YCW programs.
The census asks for participant information (e.g., number of youths in the core group, the estimated number of hours spent per month on YCW activities, and the ethnicity of participants), program information (e.g., the types of problems addressed and program components implemented), and whether the YCW program has attended any of the YCWA-sponsored training events. After a site submits the census survey, the YCWA registers the site as an official YCW site. Registered sites receive a certificate from the YCWA and mailings about YCWA trainings, retreats, conferences, and other events.
What are headquarters' roles in the program?
YCWA has a Board of Directors, a student advisory board, other advisors, and eight national staff. The national program has inaugurated nine expansion centers (with a tenth planned) to recruit new school participants and provide technical assistance. YCWA provides training and technical assistance. This includes
o Preparing and revising training materials.
o Regional training of experts at the expansion centers.
o Direct training for States that lack expansion centers.
o Presenting findings at national meetings.
YCWA's dissemination plan includes
o Newsletters.
o Manuals.
o Web site (e.g., profiles of YCW programs and chat room).
o International Youth Leader Crime Prevention Conference.
YCWA also supports and is a participant in the National Youth Network.
At what stage of implementation are sites?
This varies. Some sites, especially those in Florida where the program began, have mature programs that have operated for several years. Others are start-up programs (the minimum configuration for a school participant is a YCW advisor and a core group of at least six youths). For evaluation purposes, there appears to be a solid base of mature programs upon which to base an evaluation.
What are the project's outcome goals in the view of the project director?
YCW programs have three goals:
o To provide crime- and drug-free environments through a youth-led movement.
o To instill positive values, foster good citizenship, and build self-confidence in children.
o To enable youth to become resources for preventing crime, drug use, and violence in schools.
Does the proposal/project director describe key project elements?
At minimum, a school must have a YCW advisor (e.g., a school administrator or a school resource officer) and a core group of at least six youths to participate in the program. There are nine program components (see table 1) but an individual YCW program does not have to implement all of them. It can pick and choose, based on what seems most appropriate for the school and the interests of the participants.
Do they describe how the project's primary activities contribute to goals?
The SPEC evaluation provides a logic model that has been endorsed by the YCW program. From this perspective, the program is seen (by YCWA) to operate on two tracks:
o Students have a sense of belonging; and therefore,
--They are more involved in school, and
--They have a better sense of self-worth. Therefore,
----They feel safe at school, and
----They achieve more at school; therefore,
------Schools experience less crime.
o Students understand the system of crime reporting; and therefore,
--They feel more comfortable reporting crime. Therefore,
----They are proactive in crime reporting; and therefore,
------Schools experience less crime.
YCW operates at two levels. At the first level, YCW focuses on individual children who participate directly in the YCW program. The goal is to improve the school environment and educational experiences for those specific children. At the second level, YCW attempts to affect the school environment for all children. An outcome evaluation might focus its attention on either or both of those levels.
Can you sketch the logic by which activities should affect goals?
See the previous question.
Are there other local projects providing similar services that could be used for comparisons?
Although other school-based programs (e.g., D.A.R.E. and G.R.E.A.T.) have similar goals, there appears to be no other viable possibility.
Will samples that figure in outcome measurement be large enough to generate statistically significant findings for modest effect sizes?
To answer this question, one has to consider the sample at two levels. At the individual school level, the number of participants would probably be too small to provide acceptable power at that level of analysis. However, given the diversity among YCW programs and the likely sensitivity of program outcomes to the skills of specific advisors and police participants, site-specific evaluations are probably of little value. Multiple programs need to enter into the evaluation, and consequently the most important question about sample size is how many programs would participate in an evaluation. Assuming that elementary, middle, and high schools have distinct programs, an evaluation would need a sufficient sample within each of those three types of schools. Other stratification might be advisable. Impoverished schools might be distinguished from other schools. The evaluation might be limited to schools that meet the minimum configurations for being designated as a YCW program.
Is the grantee planning an evaluation?
YCWA has sponsored an evaluation. Although informative, it is based on a weak design and does not answer the questions of interest to NIJ and OJJDP. This is not intended as a criticism of YCWA or the evaluators, as the resources for an evaluation were insufficient to support a rigorous outcome evaluation.
What data systems exist that would facilitate evaluation?
Recall that YCW goal two pertains to children who participate in the crime watch program and goals one and three pertain to the school environment.
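Returning to the program-level sample-size question above: a rough sense of how many schools a comparison would need can be obtained from a standard two-sample power calculation. The sketch below is illustrative only, assuming a simple comparison of school-level mean outcomes between YCW and non-YCW schools with a normal approximation; the effect size and error rates are hypothetical, and a real design would also have to account for clustering and stratification by school type.

```python
import math
from statistics import NormalDist

def programs_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate number of schools needed per group for a two-sample
    comparison of school-level mean outcomes (normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value, two-sided test
    z_beta = NormalDist().inv_cdf(power)           # quantile for the desired power
    n = 2 * ((z_alpha + z_beta) / effect_size) ** 2
    return math.ceil(n)

# A "modest" standardized effect of 0.3 requires roughly 175 schools per group;
# stratifying by elementary/middle/high school multiplies that requirement.
needed = programs_per_group(0.3)
```

Even this back-of-the-envelope figure makes clear why site-specific evaluations would be underpowered and why the number of participating programs, not the enrollment within any one program, drives the design.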
If an evaluator were to judge outcomes on the basis of these three goals, then s/he would require data at the student level and at the school level. At the student level, an evaluation design would presumably track standardized achievement scores, grade-point average (GPA), disciplinary reports, graduation/advancements, and other student-level indicators. At the school level, an evaluation design would presumably track schoolwide indicators, such as the number of reported crimes and the level and type of disciplinary actions. In general, the data systems that would facilitate an evaluation potentially exist at the school level, the school district level, and the State level. Most school districts maintain "student information systems" that house a variety of student-based records, such as schedules, attendance, grades, standardized test scores, and some disciplinary actions. Many schools track "referrals"--instances in which a student is sent to an administrator's office for behavioral or other school rule violations. Some schools have automated systems to track such referrals. Generally, most of these referrals stay at the school level and are never reported to the central district office. More serious offenses, particularly those that could result in an out-of-school suspension, are reported to the central district office, which may document the offense in its student information system. Finally, some States require schools to report certain offenses to the State DOE or other State-level agencies. In Florida, for example, 21 different offenses must be reported to the Florida DOE and the local law enforcement agency. It is not known whether such records would be available across schools that participate in YCW. Furthermore, all schools in Florida are required to file a school safety plan with the State DOE. This would seem to be a source for learning about safety-oriented programs that exist in each school.
Any evaluation would need to know about other safety-oriented programs to avoid confusing the effect from YCW with the effect from some other program. Florida also has mandatory reporting for school crimes. It's recommended that any evaluation be limited to Florida and, more specifically, to Dade and Broward Counties (or perhaps to one of the two). This reduction in scope would result in significant cost savings because consistent automated school data systems could provide much of the needed information.
What are the key data elements contained in these systems?
For student-based records, the key elements are achievement scores and disciplinary actions. For school-based records, the key elements are the safe school strategies being implemented, reported crimes, and aggregate disciplinary actions.
Are data available to estimate unit costs of services or activities?
YCWA presumably maintains records of training and development costs. At the individual YCW sites, labor costs are presumably donated. In some cases, schools and other organizations (e.g., the local police department) donate funds to support the program, such as paying for trips to the national conference and YCW T-shirts and vests.
Are there data about possible comparison samples?
Identifying a comparison group will be the most serious problem facing any evaluator. All YCW programs are voluntary and at least some are exclusionary. That is, children choose to participate. Programs also screen the applicants. For example, in one of the schools, students had to apply and write an essay explaining why they wanted to be in the program. Selection bias is an unavoidable problem when evaluating how the YCW program affects participants. Some approaches to structuring an evaluation are suggested later in this document.
Is there routine reporting of specific data from local sites?
As noted earlier, programs submit an annual census survey to YCWA.
The survey contains participant information (e.g., the number of youths in the core group) and program information (e.g., which of the nine components are being implemented).
In general, how useful are the data systems to an impact evaluation?
Assuming the evaluator can gain access to the above-described information, the data systems seem suitable for supporting an evaluation.
Site Visit Evaluability Assessment
Three sites were visited in Florida:
o Hallandale Elementary School.
o Northeastern High School.
o Pioneer Middle School.
Hallandale and Northeastern serve transient and lower-income populations. Pioneer serves a middle-to-upper income population. For example, at Hallandale 85 percent of students qualify for free or reduced price lunches, compared with only 7 percent at Pioneer.
How much of the program model has the site implemented?
As already noted, a YCW program has to provide a minimal configuration, and all three programs complied. Beyond that minimum, a program can choose program elements, and these programs selected their own mixes. Also as noted, programs differ at the elementary, middle, and high school levels. For the three sites in the Florida evaluation, the elementary school program had 96 kids, the middle school had 50, and the high school had 5. Activities were age-appropriate and varied across the programs. All three schools had implemented the safety patrol component, although in different ways--the middle school, in particular, had YCW members assigned to hallways each period of the day, whereas the high school patrols occurred at after-school or evening events (e.g., athletic events). In addition, all three schools conducted a variety of action projects (e.g., such community service activities as toy drives and food drives). The elementary school YCW members (mostly fourth and fifth graders) had a buddy system program with younger kids at the school.
In the elementary and high school, the YCW programs are "clubs," meaning that activities generally take place before or after school or on weekends. By contrast, YCW is an elective at the middle school. That is, participants conduct project activities during an assigned period of the school day, and they earn academic credits for their participation. According to the YCWA, only about 2 percent of participating schools offer YCW as an elective.
What is the intervention to be evaluated?
From an evaluation standpoint, the programs and their target audiences are so different at these three age/educational levels that the evaluation should focus on three separate types of interventions. At the school level, a first step in evaluation would be to find certain clusters of activities that predominate and constitute one or more YCW models. This could be done readily with the participant and program data that each YCW provides to the YCWA. If the evaluator felt that some additional information should be collected in the annual survey, additional questions could be incorporated into that survey, presumably with minimal additional costs. For example, at the elementary school level, a standard cluster of activities might be in-school sessions on violence and drug prevention, age-appropriate after-school activities, and cross-age teaching. The standard cluster might be found to incorporate other programs such as D.A.R.E. There might be more than one standard cluster of YCW activities, and if they are sufficiently different, the evaluation might contrast them.
What outcomes could be assessed? By what measures?
At the student level, an evaluator could assess outcomes in terms of standardized test achievements, GPA, attendance, disciplinary records, and graduation rates. At the school level, an evaluator could assess outcomes in terms of perceptions of safety (if student or staff survey data are available), crime reports, and disciplinary actions.
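The first analytic step described above--identifying the clusters of activities that predominate across sites--could begin with a simple tabulation of component combinations drawn from the annual census survey. The sketch below is a hypothetical illustration; the component names and the data format are assumptions for demonstration, not the YCWA's actual survey schema.

```python
from collections import Counter

def dominant_component_mixes(programs, top=3):
    """Tally which combinations of YCW components occur most often.
    `programs` is a list of sets of component names, one set per site."""
    counts = Counter(frozenset(p) for p in programs)
    return counts.most_common(top)

# Hypothetical census data: each set lists the components a site reports.
sites = [
    {"youth patrols", "crime reporting"},
    {"youth patrols", "crime reporting"},
    {"youth patrols", "crime reporting", "action projects"},
    {"conflict resolution"},
]
top_mixes = dominant_component_mixes(sites, top=2)
```

A tabulation like this would show whether a small number of component mixes account for most sites, in which case the evaluation could treat each dominant mix as a distinct "YCW model" and contrast them.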
Evaluators often use interviews, but they are typically costly, and their costs increase when the interviewees are children and parental permission becomes an issue. Costs are even greater when the evaluation involves multiple schools. It is recommended that an approach maximize the use of school and State records, as they seem to provide the requisite information for an evaluation.
Are there valid comparison groups?
Because students choose to participate in YCW and, in some settings, YCW selects participants from a surplus of applicants, identifying a comparison group is complicated. Consider the following design. Suppose that an evaluation were to focus on elementary school programs, and that an elementary school program involves fifth-grade students. Suppose also that the evaluation could collect data from 10 years of fifth graders. Because Florida schools keep records for 5 years, 5 of the 10 years could be collected retrospectively; the other 5 years might be collected prospectively. Let T1, T2, ..., T10 denote the 10 years. YCW participants who were fifth graders at T1 would be matched with other fifth graders at T1 who did not participate in YCW. The matching would be based on the value of their outcome measures (standardized achievement scores, grades, attendance, and so on) predating the fifth grade. Matching would be repeated for fifth graders at T2, T3, and so on. From an evaluator's viewpoint, it would be desirable if YCW were not introduced until after T1. If that were the case, then fifth graders who were in school before the introduction of YCW would be matched with fifth graders who participated in YCW after the program had been implemented. The evaluator might choose to drop the first year of YCW, treating that year as a start-up year.
Is random assignment possible?
Presumably, random assignment is impractical. This would be a school-by-school decision, and it seems unlikely that schools would comply.
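The cohort-matching design outlined above--pairing each YCW fifth grader with a non-participant on measures predating the fifth grade--can be sketched as a greedy nearest-neighbor match. This is a minimal illustration under stated assumptions: the field names and squared-distance metric are hypothetical, and a real evaluation would standardize the measures to a common scale (or use propensity scores) before matching.

```python
def match_comparisons(participants, pool, keys=("test_score", "gpa", "attendance")):
    """Greedy 1:1 nearest-neighbor matching on pre-program measures.
    Each student is a dict; pool members are removed as they are matched."""
    available = list(pool)
    matches = {}
    for p in participants:
        # pick the unmatched non-participant closest on the pre-program measures
        best = min(available, key=lambda c: sum((p[k] - c[k]) ** 2 for k in keys))
        matches[p["id"]] = best["id"]
        available.remove(best)
    return matches

# Hypothetical cohort: one YCW participant matched against two non-participants.
ycw = [{"id": "A", "test_score": 80, "gpa": 3.0, "attendance": 0.95}]
others = [
    {"id": "B", "test_score": 79, "gpa": 3.1, "attendance": 0.94},
    {"id": "C", "test_score": 60, "gpa": 2.0, "attendance": 0.80},
]
pairs = match_comparisons(ycw, others)
```

The matching would be run separately for each cohort T1 through T10, and, as noted above, matching on pre-program measures cannot remove selection bias arising from self-selection and program screening.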
What threats to a sound evaluation are most likely to occur?
Matching cannot eliminate selection bias when children select programs and YCW staff screen participants.
Are there hidden strengths in the project?
No hidden strengths have been identified at this stage.
What are the sizes and characteristics of the target populations?
The size of the target population varies. YCWA requires a minimum of six core students, although the high school had only five at the time it was visited. The middle school and the elementary school had 50 and 96, respectively. Exact turnover is unknown; however, middle school and high school children typically spend more than a single year in the program.
How is the target population identified (i.e., what are eligibility criteria)? Who/what gets excluded as a target?
This varies across programs. The high school seemed to have trouble recruiting, and the advisor expressed no special selection criteria. In the middle school, participants were said to represent a cross-section of the student body. Applicants must write an essay expressing their interest. Children are not supervised constantly, so applicants have to be credible. However, the program selects some kids who "have potential" but who need "special help." The elementary school program had a waiting list.
Have the characteristics of the target population changed over time?
Although the program has evolved, major design changes do not seem to have occurred.
How large would target and comparison samples be after 1 year of observation?
Because the program varies greatly across the Nation and within Florida, sample sizes are unpredictable. As noted, sample sizes in Florida ranged from 5 (high school) to 96 (elementary school). The sample size is really based on the number of programs that participate in the evaluation. That is, the sample size from a single program is relevant because it provides a more or less accurate measure of program outcome for that school program.
The number of programs that participate in the study is relevant because it provides a more or less accurate measure of effectiveness for the YCW program.[10]

What would the target population receive in a comparison sample?

The comparison group would receive whatever else the school offered. The comparison sample would almost certainly participate in other prevention efforts (e.g., D.A.R.E.) and other activities (e.g., sports, clubs, etc.), or perhaps some of the comparison group members would choose to participate in no activities at all.

What are the shortcomings/gaps in delivering the intervention?

The support from the school principal may vary considerably. In the middle school visited during this assessment, it was clear that the principal strongly supported the project. The degree of support at the other two schools was unknown because the principals were not available for meetings. Additionally, the intervention may be very modest relative to the problems faced by children.

What do recipients of the intervention think the project does?

In interviews at each of the three schools, students said that they had accomplished a lot and had made a positive contribution to the school and their community. On a personal level, some of the middle school YCW participants said that, through their patrolling of the hallways, they had gained self-confidence and had overcome their shyness. At the school level, these same students felt that the school was safer ("there's less horseplay"). The participants at the elementary school commented that the program had made them work harder in school because of the requirement that YCW members always do their homework.

How do they assess the services received?

Students were enthusiastic about the program. They felt that it had contributed to their personal growth and improved the overall school climate.

What kinds of data elements are available from existing data sources?
As noted earlier, potential data sources include the YCWA's annual census survey, school-level referral databases, district-level student information systems, and State-level school crime databases.

School-level referral databases typically contain, for each referral, the date of the referral, the reason for the referral (e.g., defiance), and the action taken (e.g., after-school detention). Administrators at the three Florida schools indicated that they had a referral database; however, documentation on the database was not obtained, nor was the quality of the data assessed.

Student information systems contain a substantial amount of data on each student, including attendance, schedule, grades, standardized test scores, and some disciplinary actions (particularly those involving out-of-school suspensions or expulsions). Again, documentation on the student information systems used in the school districts was not obtained.

State-level agencies that collect school crime data collect either aggregate data (e.g., the total number of serious incidents that occurred during a school year) or incident-level data (e.g., characteristics of each serious incident that occurred during a school year). In Florida, schools report crime data via the School Environmental Safety Incident Reporting (SESIR) system. SESIR collects data on incidents that occur on school grounds; on school transportation; and at off-campus, school-sponsored events, during any 24-hour period, 365 days per year. Incidents are reported to SESIR even if individuals other than students are involved or if the offender is unknown.
Specific data elements in SESIR include the type of incident (one of 21 predefined categories); whether the incident is gang-, alcohol-, drug-, hate crime-, or weapon-related; whether students or non-students are involved; whether the incident occurred during or after school hours; and whether the incident occurred on school grounds, at a school-sponsored activity, or on school transportation.

What specific input, process, and outcome measures would they support?

The YCWA census survey database contains a few potential input and process measures, including the number of participants, the program components implemented, the level of training obtained, and the number of hours spent on YCW activities. Referral databases, student information systems, and State-level school crime databases could potentially provide outcome measures for both individual students (e.g., changes in academic achievement or disciplinary incidents) and the entire school (e.g., changes in the level of reported crime or the total number of disciplinary actions).

How complete are data records? Can you get samples?

Samples of these records were not examined. In general, student information systems accurately tracked out-of-school suspensions and expulsions (since funding is often tied to attendance levels). The quality of crime reporting by schools, on the other hand, is thought to be highly variable.

What routine reports are produced?

The Florida Department of Education (and, in general, State education agencies in other States) publishes a variety of district-level reports on its Web site (e.g., graduation rates, dropout rates, test scores). It also publishes an annual school crime report on its Web site. Other reports are presumably available, but not via the Web site. At the school level, the routine reports available from referral databases depend on the capabilities of the particular system in use.

Can target populations be followed over time?
Following students over time is instrumental for evaluating how YCW affects students, but any evaluator will confront major problems when attempting to follow these children. Schools keep records for 5 years, so to the extent that the evaluation design requires retrospective data, 5 years would seem to be the time limit. When following children prospectively (as well as retrospectively), the evaluator will confront the problem of confidentiality. An additional problem is that the populations in two of the Florida schools (Hallandale and Northeastern) are transient. At the elementary school, the principal said that the school population had a 50-percent turnover rate per year. Although Florida might be extreme, experience investigating other school-based programs indicates that Florida is not unique in one other regard. Specifically, if children are to be tracked beginning in elementary school or in middle school, then evaluators will discover that children typically disperse from a single elementary school to multiple middle schools, and from a single middle school to multiple high schools. If an evaluation could be localized to Dade County, or perhaps to a combination of Dade County and Broward County, the Board of Education might grant access to school records, overcoming the difficulty of negotiating for records across multiple school districts. Including all schools within Dade County (or Dade and Broward combined) in the evaluation would also mitigate the problem of students dispersing across multiple middle schools and high schools.

Can services delivered be identified?

As noted earlier, the YCWA census survey contains basic service delivery information. YCWA coordinators at each school could serve as additional reporting sources.

Can systems help diagnose implementation problems?
YCWA seems aware of implementation problems because it serves a monitoring function, but the extent to which systems can diagnose such problems is not known. Moreover, if the evaluation were limited to Dade (and perhaps Broward) County, the evaluator could choose among some mature programs.

Do staff tell consistent stories about the project?

Both the executive director and the president of the Board of Directors of YCWA had clear visions of the program, but YCWA expects schools to be innovative within the framework of the basic YCW model. In general, stories differed across the three settings. However, they were consistent among YCW coordinators at each school and among other school staff at the elementary and middle schools (including a police officer at the middle school).

Are the backgrounds of YCW coordinators appropriate for the project's activities?

It was apparent that the YCW coordinators were dedicated and knowledgeable.

What do partners provide/receive?

N/A

How integral to project success are the partners?

N/A

What changes is the director willing to make to support the evaluation?

The YCWA executive director is firmly committed to improving the quality of YCW programs and sees an independent evaluation as an important way of achieving this. Possible specific changes were not discussed. Presumably he would have no problem with minor changes to the census survey. Major changes to the program for the sake of the evaluation seem unreasonable.

CONCLUSIONS

Would you recommend that the program be evaluated? Why or why not? Provide a table that highlights commonalities and differences across the sites visited. What kinds of evaluation designs would you propose? Consider the following possibilities: 1. There is sufficient consistency in program implementation across sites to warrant a national evaluation (i.e., one that could sensibly aggregate information across sites). 2.
Sites vary so greatly that each is rather unique; multi-site comparisons would be largely descriptive, and syntheses of results would require considerable professional judgment. 3. Certain program elements are consistent across sites while others vary; a hybrid of a national evaluation for some elements and multi-site comparisons for others seems feasible. 4. There are one or more sites that offer good opportunities to evaluate an important intervention, regardless of whether they generalize to the program. If you were to evaluate the program under some multi-site strategy, what criteria would you use to sample the pool of sites, and how easily could this sampling be accomplished?

What should OJJDP's grant manager know about this program?

It is recommended that NIJ fund an evaluation of the YCW program. Clearly, YCWA has been successful at disseminating this model across the Nation, and such expansion continues. Both YCWA and participating schools seek to learn the extent to which the program is successful, which elements of the program are the most important, and how the program should be changed to improve success rates. An evaluation would pose major challenges for an evaluator. As noted already, the evaluation should focus separately on elementary schools, middle schools, and high schools. Because of the diversity across YCW programs, a successful evaluation would require a large number of each kind of school. Some outcome measures are school specific while other outcome measures are student-participant specific. School-specific data are less expensive to collect, although effect sizes are likely to be small. This observation suggests that an evaluator might attempt to collect school-specific data from a large number of schools; thus a study of schools in Florida is recommended. Student participant-specific data are more expensive to collect, but effect sizes will probably be easier to detect than effects from school-level data.
For student participant-level data, a study across a smaller number of Florida schools is recommended for the following reasons. Limiting the evaluation to Florida has distinct advantages. A principal advantage is that the evaluator would face a common data source, because Florida has a statewide reporting requirement, a statewide requirement to provide a safety plan, and statewide achievement testing. Having common data sources is a distinct advantage for a cross-site evaluation. Another advantage is that YCW is widely but not universally used across Florida. Some schools have used YCW for many years; some have recently begun their programs; still others have not yet implemented a YCW program. This variation in implementation is the best that an evaluator could hope to face because it provides the prospect for a cross-sectional, time-series design of program impact. Furthermore, a true national evaluation is especially costly because of the need to travel to multiple sites and deal with many independent gatekeeper authorities. A disadvantage is that the evaluation would be unique to Florida: whatever makes Florida special may account for what YCW does or does not accomplish. Because YCWA conducts a yearly survey of all YCW programs, it would at least be practical to compare Florida programs with other programs across the country, thereby determining whether Florida programs operate similarly to non-Florida programs. Negotiating for student and school records is likely to be time-consuming, resource intensive, and uncertain. That is a principal reason why limiting the evaluation of student-specific outcomes to Dade and Broward Counties is recommended. Especially in Dade County, YCWA has close contacts with the Board of Education, so there is the prospect of cooperative behavior. The evaluator could possibly seek and receive cooperation from other Florida counties. A cross-sectional time-series design would make for a strong evaluation.
One approach would be to select a 10-year period during which cohorts who enter the study in each school can be identified. Five of those 10 years could be based on retrospective data; the other 5 years would be collected prospectively. Let T1 through T10 represent the 10 years. Suppose that the evaluation were based on elementary schools. (The plan would be the same for middle schools and high schools.) During year T1, the evaluation would identify all children who joined YCW, whether or not they were still in YCW at the end of the year. It would also identify a matched set of non-participants. Matching would be done on the basis of grade, gender, achievement tests, GPA, disciplinary infractions, and other factors. The evaluation would use records current at the time the child joined YCW. If YCW did not exist before year Tx, then the comparison sample would be selected to match the characteristics of children who participated in year Tx. If a school never had a YCW program, then matching would be based on the characteristics of children who participate in YCW at neighboring schools. There are variations on this approach, but the basic point is that comparison subjects are drawn contemporaneously or historically (or both) to match program participants. Members of participant cohorts and comparison cohorts would be followed until their high school careers end, they drop out of school, or the study ends. NIJ will need to decide how long it intends to fund this evaluation. By using retrospective data, the evaluation could provide meaningful interim results; the scientific validity of the findings would increase with the length of the evaluation period. At the aggregate level, a long-term follow-up is less important. The cross-sectional time-series design would measure the extent to which school-based crime and disciplinary problems decreased with implementation of a YCW program.
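To make the aggregate-level logic concrete, the following sketch computes a naive pre/post contrast in annual incident counts across schools whose YCW start years vary. This is illustrative only: the data layout is assumed, not drawn from any actual Florida system, and a real cross-sectional time-series analysis would fit a panel model with school and year effects rather than this simple average.

```python
def program_effect(schools):
    """Naive pre/post contrast for a cross-sectional time-series design.

    schools: list of dicts with a hypothetical structure:
      {"start_year": year YCW began (None if never implemented),
       "incidents": {year: reported incident count}}
    Returns the change in mean annual incidents after implementation,
    averaged over implementing schools (None if no school qualifies).
    Schools that never implemented YCW are skipped here; in a full
    analysis they would anchor the secular (non-program) trend.
    """
    changes = []
    for s in schools:
        start = s["start_year"]
        if start is None:
            continue  # never implemented; excluded from this contrast
        pre = [n for y, n in s["incidents"].items() if y < start]
        post = [n for y, n in s["incidents"].items() if y >= start]
        if pre and post:
            changes.append(sum(post) / len(post) - sum(pre) / len(pre))
    return sum(changes) / len(changes) if changes else None
```

Because implementation dates vary across schools, each school's pre-period overlaps other schools' post-periods, which is what lets the design separate program effects from a general downward trend in school incidents.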
Having a cross-sectional time-series design and varying implementation dates for YCW programs provides some assurance that meaningful changes in school environments attributable to YCW can be detected. Without the benefit of a formal power analysis, however, many uncertainties remain:

o The many variations across YCW programs, both in program elements and in students served.
o The precision with which outcomes can be measured.
o The size of the effect that would be deemed cost beneficial.

NOTES

1. Please note the difference between the two acronyms: YCWA refers to the national organization, whereas YCW refers to the Youth Crime Watch model that is implemented in schools.

2. Lynch, C.O., V.L. Essenmacher, and M. Hwalek, Youth Crime Watch of America 2001-2002 National Outcome Study, Detroit: SPEC Associates, 2003.

3. Ibid.

4. Ibid. The SPEC report says that the panel is comprised of ". . . national YCW experts who included YCWA Board members and staff, YCW site advisors and students."

5. See, for example, Thombs, D.L., "Retrospective Study of DARE: Substantive Effects not Detected in Undergraduates," Journal of Alcohol and Drug Education 46 (1) (2002): 27-40.

6. Esbensen, F.A., D.W. Osgood, T.J. Taylor, D. Peterson, and A. Freng, How Great Is G.R.E.A.T.? Results From a Longitudinal Quasi-Experimental Design, Washington, DC: National Institute of Justice, 2001, NCJ 191891.

7. See, for example, Crosse, S., D. Cantor, M. Burr, C.A. Hagen, I. Hantman, M.J. Mason, A.J. Siler, A. von Glatz, and M.M. Wright, Wide Scope, Questionable Quality: Three Reports from the Study on School Violence and Prevention (Executive Summary), Washington, DC: U.S. Department of Education, Planning and Evaluation Service, 2002. See also, Gottfredson, G., D. Gottfredson, E. Czeh, D. Cantor, S. Crosse, and I. Hantman, National Study of Delinquency Prevention in Schools, Ellicott City, Maryland: Gottfredson Associates, 2000 (unpublished paper) (also see http://www.gottfredson.com/national.htm).

8.
See http://www.nationalsave.org/main/ETIEVAL.php.

9. See Teens, Crime, and the Community: 1995 Student Impact Assessment Report (http://ericae.net/ericdb/ED395078.htm).

10. One might suggest an evaluation based on just a few programs as a test of concept. There are two problems. The first is that, if these few programs were deemed successful, an evaluator would not know whether all YCW programs are successful; nor, if these few were deemed failures, would s/he know whether all are failures; nor, of course, would the success of some and the failure of others provide much firm guidance. The second problem is that YCW programs are diverse. A study of multiple programs seems necessary to understand the importance of that diversity.