Title: Solicitation for Evaluations of Office of Juvenile Justice and Delinquency Prevention Discretionary Funds Projects Evaluability Assessment: I-SAFE America, Inc.
Series: Solicitation
Author: National Institute of Justice
Published: June 2003
Subject: Funding resources
21 pages
40,000 bytes

-------------------------------

Preface: I-SAFE America, Inc.

Staff Contact: Teri Schroeder, President and Project Director, 760-603-7911, ext. 12

NIJ has identified some key outcome variables and other parameters of interest for this project, and has also provided some guidance on possible evaluation designs. Applicants may depart from this guidance by providing appropriate rationale. NIJ believes that the evaluation should focus on I-SAFE's fifth through eighth grade curriculum. Because of the broad implementation of this curriculum, NIJ recommends evaluating the program's outcomes through either a national sample of project sites or a multi-site strategy. NIJ suggests a maximum project length of 2 years. NIJ is interested in three broad questions regarding program outcomes:

o Do students retain the knowledge received during the I-SAFE lessons?
o Do they use this knowledge?
o Suppose that a strong, well-implemented I-SAFE curriculum produces measurable short- and long-term outcomes. At what reduced levels or intensities of implementation are program benefits no longer measurable?

NIJ expects the cost of this evaluation to be no less than $450,000. The total amount available for all evaluations in this solicitation is approximately $5 million.

-------------------------------

Evaluability Assessment: I-SAFE America, Inc.

SYNOPSIS

Grantee: I-SAFE America, Inc. 
(2002-MU-MU-K003)

Grant Period: April 1, 2002-March 31, 2003

Grant Award: $3,198,600

Funding History: Recipient of Federal earmark through OJJDP beginning in April 2002. In 2003, the U.S. Congress increased the funding to $5 million.

Project Summary: The initial mission of I-SAFE America was "to reduce the risk of children being victimized as a result of online Internet activity." Today its mission is to educate and empower youths to safely and responsibly take control of their Internet experiences. To this end, the grantee is dedicated to the development and implementation of a nationwide I-SAFE Safe School Education Curriculum and a Community Outreach Plan. The I-SAFE program provides kids and teens with essential tools that, if adopted, reduce their risk of being victimized through activities in which they engage via the Internet. I-SAFE's goal is to provide students with the awareness and knowledge they need to recognize and avoid dangerous, destructive, or unlawful online behavior and to respond appropriately. As of May 2003, the curriculum for grades 5 through 8 has been implemented in some classrooms in 18 States. It is unclear to what degree the outreach component has been implemented nationwide.

Scope of Evaluation: The two major activities of I-SAFE are the provision of the I-SAFE school education curriculum to schools nationwide and community outreach. To date, two I-SAFE education curriculums have been developed. The first curriculum was created for fifth through eighth grade students and has been implemented in school classrooms in 18 States. The second, more recently released curriculum is online and designed to reach students in grades 9 through 12. I-SAFE is also working on expanding the curriculum to include kindergarten through grade 4. 
The community outreach component includes events for the community at large and school-based assemblies for the student population at which Internet safety issues are discussed. The community outreach component has not been implemented in all of the 18 States that are currently implementing the I-SAFE school education curriculum. On the basis of grant reviews and site visits, the assessment team proposes that the I-SAFE education curriculum for fifth through eighth grade students is appropriate for a full-scale evaluation.

Summary of Evaluability Assessment Activity: Four data collection methods were used. They included phone interviews with the OJJDP grant manager and in-depth reviews of program materials, including the grant application, categorical reports, budget information, e-mail exchanges between the OJJDP program manager and the grantee, relevant literature on Internet safety, and I-SAFE curriculums. Phone calls were also made to the I-SAFE project director and to school staff at each site. Finally, headquarters and selected sites were visited to gather information on program implementation. Site visits were conducted at Appomattox, Virginia; Hastings, Nebraska; and two school districts in Tahlequah and Keys, Oklahoma. For a summary of the site visits, see appendix A.

Findings: An outcome evaluation of the I-SAFE education component, specifically the curriculum for grades 5 through 8, is feasible. This component has a clear logical link among goals, activities, and outcomes. Sample sizes are large enough to yield statistically significant results, and the program is being implemented at a level to support an intensive evaluation. Information from an evaluation of this curriculum would inform policymakers, practitioners, and the population at large regarding the effectiveness of curriculums on Internet safety in improving outcomes for families and children.

ANALYSIS

What is already known about projects like these? 
According to the National Center for Education Statistics, in fall 2001, 99 percent of public schools in the United States had access to the Internet. A growing concern is that, as Internet use continues to grow, cybercriminals will grow in number as well. These criminals take the form of sexual predators, pornographers, hackers, and thieves. All too often, their victims are youths.

To better understand the risks to adolescent girls online, a baseline Web-based study was conducted by Seventeen Magazine Online in conjunction with various partners. An online survey was offered on the Seventeen Magazine site to assess the level of Internet use, at-risk online behavior, negative interactions online, and perceived measures promoting safety. Most of the participants used a home computer for their online activities. However, computers at school were used as well. Twenty percent of the participants said they spent their time surfing for new things on the Web, and 16 percent said they were spending it in chat rooms. Respondents were asked to choose activities from a list of online risks in which they had participated. Sixty percent said they had filled out a form or questionnaire giving out personal information, 45 percent admitted to disclosing similar information to an individual they met online, and 23 percent said they had given out pictures of themselves to someone they had met on the Internet.

In response to this growing concern, it is important to know how much children understand about the potential threats they may be subject to while spending time in cyberspace. In the same Seventeen online survey, 70 percent of participants reported that their parents had discussed online safety with them, and 35 percent reported that teachers addressed cybersafety. However, engaging youths in discussions about their actual online activities was not as common. 
Only 30 percent said that a parent or teacher had spoken with them about their online activities.[1]

Two curriculums have been identified. The United Kingdom's Internet Proficiency Scheme was developed to help teachers educate children on staying safe on the Internet. The model was developed by the British Educational Communications and Technology Agency (Becta), the Department for Education and Skills (DfES), and the Qualifications and Curriculum Authority (QCA). It aims to create a set of safe and discriminating behaviors for students to adopt when using the Internet and other technologies.[2] The curriculum includes a detailed teacher's guide and a set of lesson plans on such topics as managing e-mail, responding in chat rooms, and evaluating Web sites.

Becta and the DfES piloted the Internet Proficiency Scheme during the summer of 2002. The pilot involved a cross-section of 50 schools. The teachers were given a trial proficiency pack that included teaching activities, learning objectives, teachers' notes, activity sheets, and sample certificates. The pack was used in conjunction with the Cybercafe, an interactive Web site that helps teach students the curriculum. The feedback from the teachers who participated was used to refine the teachers' pack and Cybercafe Web site. One teacher reported, "The Internet Proficiency Scheme has had a huge impact on pupils, as we hadn't broached the subject before. Pupils are now discussing safety with teachers, who encourage them in that discussion without being judgmental."[3]

CyberSmart! is another new and similar curriculum designed to help students (K-8), parents, and teachers work together to boost security on the Internet. 
The curriculum is copublished by Macmillan/McGraw-Hill and includes standards-based lesson plans, student activity sheets, and other materials that promote the responsible and effective use of a school's technology investment.[4] Issues addressed by the curriculum include Internet safety, manners, and advertising. Although there has been no formal evaluation of the program, according to the executive director of CyberSmart!, it is the only "responsible use" curriculum supported by the White House to date, and it has been integrated into both small and large school districts around the country. Links to CyberSmart!'s Web site are provided in millions of McGraw-Hill elementary school social studies textbooks. This curriculum is promoted at all major educational trade shows and touted by Terry McGraw, CEO of McGraw-Hill. Several Internet sites (e.g., http://www.worldkids.net, http://www.SafeKids.com, http://www.Getnetwise.org, http://www.Yahooligans.com) contain some type of guidance for children on how to safely use the Internet, but CyberSmart! is the only comparable classroom curriculum available in the United States for school-age children regarding Internet safety.

What could an evaluation of this project add to what is known?

It would be beneficial to learn how Internet safety rules and curriculum-based programs affect children's use of the Internet. Except for anecdotal feedback from teachers regarding the United Kingdom's Internet Proficiency Scheme, no other research studies or literature were available regarding the effectiveness of these types of programs.

Which audiences would benefit from this evaluation?

An evaluation would inform stakeholders from the education community, school systems, policymakers, funding organizations, children, parents, and the general public.

What could they do with the findings? 
The findings could be used to lobby for mandatory Internet safety programs in the schools, and to raise awareness in the general community and among parents and children regarding the importance of Internet safety. Findings may also be used to deter predators from committing cybercrimes by making them aware that the public knows of their existence.

Is the grantee interested in being evaluated?

Yes. I-SAFE headquarters selected the sites and made the initial contact. It was also learned during a visit to headquarters that I-SAFE had hired an evaluator the prior month, but no specific evaluation plan has been developed or implemented to date.

Are local sites interested in being evaluated?

Yes. The four sites visited have expressed an interest in being part of the evaluation.

What is the background/history of this program?

I-SAFE America is a nonprofit educational foundation founded in 1998. Its mission is to educate and empower youths to safely and responsibly take control of their Internet experiences. The goal of the program is to provide students with the awareness and knowledge they need to recognize and avoid dangerous, destructive, and unlawful online behavior and to respond appropriately. I-SAFE America is dedicated to implementing a standardized Internet safety education program throughout the Nation that provides children and teenagers with essential tools to reduce their risk of being victimized while engaged in activities via the Internet. To this end, I-SAFE America is launching an Outreach Campaign to empower students to take control of their online experiences and make educated, informed, and knowledgeable decisions as they actively engage in cyberactivities. In 2002, I-SAFE's Safe School Education Initiative and Outreach Campaign received bipartisan support from both the U.S. Senate and the U.S. House of Representatives. Subsequently, the U.S. Congress awarded I-SAFE America $3.554 million to begin to fulfill its mission and goal. 
In 2003, Congress increased its support of I-SAFE and awarded it $5 million to continue its vital work. The Safe School Education Initiative and Outreach Campaign were scheduled for implementation in 24 States during 2002-2003. Per the project plan established under the grant, the curriculum will be expanded to be age-appropriate for students in kindergarten through fourth grade.

Does headquarters monitor fidelity of implementation?

I-SAFE encourages fidelity of implementation by providing standardized training activities and encouraging sites to complete two documents at the beginning of each year. The two documents are the I-SAFE Education and Outreach Implementation Plan (appendix B) and a Cooperative Agreement (appendix C). A preliminary assessment of the I-SAFE implementation plan form indicates that the document was written without taking into consideration the principles of program fidelity. In addition, the national headquarters does not conduct periodic site visits to the numerous local sites to monitor consistency and implementation. During one of the site visits, a teacher mentioned that she was planning to make changes to some of the lessons for next year. Headquarters is, however, paying close attention to the administration of the pre- and posttests for each of the I-SAFE lessons.

What are the headquarters' roles in the program?

After a school district's administration has been contacted, a district coordinator has been identified, and an initial school has been chosen to implement the I-SAFE America curriculum, an I-SAFE staff member is assigned to the school to help develop the implementation plan. The I-SAFE staff person trains instructors and provides all classroom materials, including manuals, handouts, promotional materials, course evaluation forms, and tracking materials. The staff person also familiarizes school personnel with the procedures necessary to fully implement the program. 
Additionally, representatives of the I-SAFE outreach department are assigned to coordinate community outreach events and work with students as mentors and ambassadors. Throughout implementation, I-SAFE America staff remain available to school personnel and can provide additional training at any time. In addition to providing implementation support, I-SAFE staff are responsible for monitoring the lessons to ensure that they are in line with the goals identified in the implementation plans. Staff also monitor implementation by reviewing the test results from each lesson (pre- and posttests are required for every lesson from every student; these tests are sent to headquarters for analysis).

At what stage of implementation are sites?

All sites are in full implementation mode and have either completed the curriculum once or are in the middle of completing it.

What are the project's outcome goals in the view of the project director?

The I-SAFE Safe School Education Curriculum provides students with the knowledge to--

o Independently recognize and avoid dangerous online situations.
o Recognize techniques used by predators to deceive them.
o Critically appraise online situations in which they find themselves.
o Independently recognize inappropriate materials, Web sites, and online behavior.

Does the proposal/project director describe key project elements?

According to the program director, the I-SAFE America project is dedicated to the following two key elements:

o Implementing a standardized Internet safety education program throughout the Nation that provides children and teens with essential tools to reduce their risk of being victimized while engaged in activities via the Internet.
o Launching its Outreach Campaign to empower students to take control of their online experiences and make educated, informed, and knowledgeable decisions as they actively engage in cyberactivities. 
Do they describe how the project's primary activities contribute to goals?

The primary activity of this project is the delivery of the I-SAFE curriculum to youth. This activity contributes to I-SAFE goals by providing students with the knowledge and skills to recognize and respond to dangerous situations and inappropriate behavior displayed on the Internet. Each lesson of the curriculum has been mapped against curriculum standards/skills, learning modalities, and academic areas. For instance, lesson 2 ("Personal Safety") is constructed around the FBI's "Internet Safety Tips for Kids."[5] (I-SAFE has been authorized by the FBI to use its logo and safety tips within the curriculum.)

Can you sketch the logic by which activities should affect goals?

The logic model presented in appendix D is based on information from the grant application.

Are there other local projects providing similar services that could be used for comparisons?

No. Across sites, no other project similar to I-SAFE could be identified as a possible comparison group.

Will samples that figure in outcome measurement be large enough to generate statistically significant findings for modest effect sizes?

Yes. More than 18 sites across the Nation use the I-SAFE curriculum. Additionally, the Safe School Education Initiative and Outreach Campaign were scheduled for implementation in 24 additional States during 2002-2003. Finally, a status report on the number of participants in each implementing State is included in appendix E. It indicates that more than 10,335 students in 28 school districts (including more than 70 schools) in 20 States have participated in the program.

Is the grantee planning an evaluation?

Personnel at the I-SAFE headquarters recently hired an evaluator to analyze the pre- and posttest data that are submitted by the sites. Although an evaluation plan has not yet been developed or implemented, the hiring of an evaluator suggests that the grantee is planning an evaluation. 
Also, personnel at the headquarters reported that they plan to evaluate the program.

What data systems exist that would facilitate evaluation? What are the key data elements contained in these systems?

Data from headquarters. Below is a summary of the data elements being collected by the national headquarters:

o Contacts (names and numbers).
-- Educational leaders/contacts.
-- Community leaders.
-- Mentors and ambassadors (students).
o Events.
-- Outreach events.
-- PTA introductory meetings.
-- Community leader meetings.
o Train the trainer.
-- Locations.
-- Names and numbers of people trained.
o Number of students entering the system.
o Results from pre- and posttests.

Some key data elements measured by this instrument are knowledge of the cyber community, personal safety, computer viruses, intellectual property, and law enforcement and the Internet. The national headquarters reports that it is planning to collect data that will track behavioral and/or attitudinal changes after a student's exposure to the program.

Are data available to estimate unit costs of services or activities?

There is virtually no cost to the sites for the implementation or maintenance of the I-SAFE program, except in relation to teachers' time to implement the program. One site mentioned the cost of paper and copying; these costs were minimal, however.

Are there data about possible comparison samples?

Yes. Across sites, middle schools with similar demographics were identified as possible comparison schools for an evaluation.

Is there routine reporting of specific data from local sites?

The only routine reporting is the submission of the pre- and posttests to headquarters.

In general, how useful are the data systems to an impact evaluation?

The I-SAFE data system located at headquarters could prove useful to an impact evaluation. The results of the pre- and posttests could speak to changes in knowledge.

Is the project being implemented as advertised?

No. 
Program delivery varies in terms of dosage (e.g., all five lessons covered versus fewer than five lessons), frequency (e.g., all curriculum taught in 1 week versus one class per month), and modality (e.g., curriculum taught as part of a computer class versus replacement of a core subject like English or math). During the site visit, it also was noted that some school facilitators plan to use the results of the pre- and posttest to modify the curriculum (e.g., place more emphasis on areas that the students do not understand after the program's completion).

What is the intervention to be evaluated?

The evaluation would focus on implementation and outcomes related to the I-SAFE curriculum for the fifth through eighth grades. The curriculum consists of five lessons.

What outcomes could be assessed? By what measures?

The I-SAFE Safe School Education Curriculum provides students with the knowledge to:

o Independently recognize and avoid dangerous online situations.
o Recognize techniques used by predators to deceive them.
o Critically appraise online situations in which they find themselves.
o Independently recognize inappropriate materials, Web sites, and online behavior.

Results from pre- and posttests for each lesson of the curriculum are available. Other measures, such as Internet traffic to inappropriate sites and the amount of Internet use, can be tracked through school records and self-reported instruments. Some schools have a system in place to monitor traffic to inappropriate sites. For instance, one Oklahoma site has an Internet support system that provides the school librarian with a snapshot, every 8 seconds, of the sites students are accessing.

Are there valid comparison groups?

Yes. Each I-SAFE program visited was able to suggest one or more demographically similar comparison schools or districts that were not implementing the I-SAFE curriculum at the time of the evaluability assessment. 
The school in Appomattox, Virginia, can potentially be compared with Prince Edward Middle School (Farmville, Virginia) and Charlotte Middle School (Charlotte Courthouse, Virginia). The school in Hastings, Nebraska, can potentially be compared with one in Columbus, Nebraska; and the school district in Tahlequah, Oklahoma, can potentially be compared with the Holbert School District (Holbert, Oklahoma).

Is random assignment possible?

No. However, schools and students can be matched on such variables as socioeconomic status and grade point average.

What threats to a sound evaluation are most likely to occur?

There are two major challenges to an evaluation: issues with program fidelity (e.g., the program is not being implemented consistently across the sites) and agreement across sites on I-SAFE outcomes (e.g., knowledge versus attitudes versus behavior).

Are there hidden strengths in the project?

No hidden strengths have been identified at this stage.

What are the sizes and characteristics of the target populations?

o The Appomattox, Virginia, site has served 190 sixth, seventh, and eighth grade students to date. It expects that 287 students will have been served by year's end.
o The Hastings, Nebraska, site will deliver the program to 500 seventh and eighth grade students.
o The Tahlequah, Oklahoma, site will deliver the program to 250 sixth grade students who participated in lessons 1, 2, and 5.
o The Keys, Oklahoma, site will deliver the curriculum to 100 students in the fifth, sixth, seventh, and eighth grades. Additionally, 100 community members received 1 lesson, and 150 teachers have received the full curriculum.

How is the target population identified (i.e., what are eligibility criteria)? Who/what gets excluded as a target?

The eligibility requirement is established by the grade the curriculum was designed to target. At the Appomattox site this year, all of the students who have taken an elective computer class have received the curriculum. 
Because the curriculum will be taught as part of a course that is required for all students next year, all students will receive at least part of the curriculum. In the Hastings site, all students receive the curriculum. In the Tahlequah site, sixth, seventh, and eighth grade students who have participated in certain elective classes or are members of targeted community groups (e.g., Boy Scouts) are selected for participation.

Have the characteristics of the target population changed over time?

The characteristics of the target population have remained consistent since the inception of the program. However, I-SAFE is working on a curriculum for kindergarten through fourth grade. Once this curriculum is available, the target population will expand to include youths in kindergarten through 12th grade.

How large would target and comparison samples be after 1 year of observation?

The sample will vary depending on the size of the school. Small nonurban schools tend to have a stable population.

What would the target population receive in a comparison sample?

The comparison sample would not receive the I-SAFE curriculum, although it could potentially be involved in other school safety and prevention activities.

What are the shortcomings/gaps in delivering the intervention?

Two shortcomings are noted: (1) inconsistent completion of the pre- and posttests as designed by I-SAFE and (2) implementation of the curriculum by law enforcement officers who have not yet been trained.

What do recipients of the intervention think the project does? How do they assess the services received?

Students were not interviewed.

What kinds of data elements are available from existing data sources?

Below is a summary of the data elements being collected by the national headquarters:

o Contacts (names and numbers).
-- Educational leaders/contacts.
-- Community leaders.
-- Mentors and ambassadors (students).
o Events.
-- Outreach events.
-- PTA introductory meetings. 
-- Community leader meetings.
o Train the trainer.
-- Locations.
-- Names and numbers of people trained.
o Number of students entering the system.
o Results from pre- and posttests.

Data available across sites include disciplinary records, attendance records, records of inappropriate use of the Internet (length of time and type of site visited), and pre- and posttests for each of the sessions taught. Some sites have a more sophisticated system to track students' Internet use. For instance, Tahlequah has an Internet support system that provides the librarian with a snapshot, every 8 seconds, of which sites students are accessing.

What specific input, process, and outcome measures would they support?

I-SAFE headquarters and sites can support all of them.

How complete are data records? Can you get samples?

Samples of pre- and posttests and other data collection forms are included in appendix F.

What routine reports are produced?

Attendance records and disciplinary records.

Can target populations be followed over time?

If the curriculum is being taught in schools located in small, nonurban communities with limited migration, the target population can be followed over time. If the target population is reached after school hours (for example, in Boy Scout troops), it would be difficult to track participants.

Can services delivered be identified?

Yes. The curriculum and related materials are available and are used at all sites where the program is implemented.

Can systems help diagnose implementation problems?

I-SAFE personnel who are assigned to assist each site with implementation are a potential resource for helping to diagnose implementation problems. This was not discussed with I-SAFE headquarters, however.

Do staff tell consistent stories about the project?

Yes. Across sites, staff report that I-SAFE is a valuable program.

Are staff's backgrounds appropriate for the project's activities?

Yes. All staff have either a school-based or law enforcement background. 
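Because the pre- and posttests are the only data routinely reported to headquarters, the core knowledge-change analysis in an evaluation would likely rest on paired pre/post scores. The following is a minimal illustrative sketch only; all scores and variable names are invented and are not I-SAFE data.

```python
# Illustrative sketch (not I-SAFE's actual analysis): paired pre/posttest
# scores for one lesson, summarized with a paired t statistic.
from statistics import mean, stdev
from math import sqrt

pre  = [4, 5, 3, 6, 4, 5, 2, 5, 4, 3]   # hypothetical pretest scores
post = [6, 7, 5, 7, 6, 6, 4, 7, 5, 5]   # hypothetical posttest scores

# Per-student knowledge gains
diffs = [b - a for a, b in zip(pre, post)]
mean_gain = mean(diffs)
se = stdev(diffs) / sqrt(len(diffs))     # standard error of the mean gain
t_stat = mean_gain / se                  # paired t statistic, df = n - 1

print(f"mean knowledge gain: {mean_gain:.2f}")
print(f"paired t = {t_stat:.2f} (df = {len(diffs) - 1})")
```

A full evaluation would of course use the actual instruments and would need to account for the clustering of students within classrooms and sites rather than relying on a simple paired test.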
What do partners provide/receive?

Law enforcement officers provide time to teach lesson 5 (Law Enforcement and Internet Safety). At some of the sites visited, law enforcement officers also add teaching material to the lesson. For their contribution to I-SAFE, officers receive gratitude and respect from students, teachers, and administrators. In Oklahoma, the school librarian is also involved in the program.

How integral to project success are the partners?

Across sites, those interviewed reported that having law enforcement teach lesson 5 was very important for communicating the message and the seriousness of the law when applied to Internet use and abuse.

What changes is the director willing to make to support the evaluation?

Staff from I-SAFE headquarters and program sites reported that they would support the evaluation effort. However, the national headquarters and one site would like to have more information about the scope of the evaluation before committing to participating in a full-scale evaluation.

CONCLUSIONS

Would you recommend that the project be evaluated? Why or why not?

Yes. The assessment team recommends that the program be evaluated for the following reasons:

o The program is currently being implemented in 18 States and thus provides opportunities for a sample large enough to produce statistically significant results.
o The effectiveness of the curriculum is unknown, but the curriculum is innovative and well received by both staff and participants.
o A significant amount of money is being invested in the program without an empirical understanding of its effectiveness or outcomes.
o Internet safety is a popular issue in Congress, and so an evaluation would likely receive attention and fiscal support.
o The OJJDP grant manager has strongly recommended that the program be evaluated.
o The program headquarters and the sites (those that were visited) support an evaluation effort. 
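As noted earlier, random assignment is not possible, but schools can be matched on such variables as socioeconomic status and grade point average. The sketch below illustrates one simple way such matching could work: nearest-neighbor matching on standardized school-level covariates. All school names and figures are hypothetical.

```python
# Hypothetical sketch of matching a treatment school to a comparison
# school on two invented covariates: percent free/reduced lunch (a
# socioeconomic proxy) and mean GPA. Names and numbers are illustrative.
treated = {"School A": (0.42, 2.9)}      # (pct free/reduced lunch, mean GPA)
pool = {
    "Comparison 1": (0.45, 2.8),
    "Comparison 2": (0.20, 3.4),
    "Comparison 3": (0.60, 2.5),
}

def standardize(values):
    """Return a z-score function fitted to the given values."""
    m = sum(values) / len(values)
    s = (sum((v - m) ** 2 for v in values) / len(values)) ** 0.5
    return lambda v: (v - m) / s if s else 0.0

# Standardize each covariate over all schools so the two are comparable
all_covs = list(treated.values()) + list(pool.values())
z_ses = standardize([c[0] for c in all_covs])
z_gpa = standardize([c[1] for c in all_covs])

def distance(a, b):
    """Euclidean distance on standardized covariates."""
    return ((z_ses(a[0]) - z_ses(b[0])) ** 2 +
            (z_gpa(a[1]) - z_gpa(b[1])) ** 2) ** 0.5

target = treated["School A"]
best = min(pool, key=lambda name: distance(target, pool[name]))
print("nearest comparison school:", best)
```

In practice, an evaluator might instead match on propensity scores estimated from a richer set of covariates, but the underlying idea is the same: pair each treatment school with the most demographically similar untreated school.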
What kinds of evaluation designs would you propose?

A cross-site evaluation design is recommended, in which observations are nested in two ways. First, observations would be nested in individual participants, who would provide information at baseline and follow-up. Second, observations would be nested in sites with differing characteristics, including socioeconomic, demographic, and other differences. Students receiving the I-SAFE curriculum would be matched with students from another school where the curriculum has not been implemented. The purpose of this design is to determine the degree to which I-SAFE increases students' knowledge of identified outcomes compared with similar students who do not receive the intervention. Statistical analyses such as hierarchical linear modeling will allow the evaluation to control for differences between sites on such characteristics as socioeconomic status and race and ethnicity while explaining the variance among variables of interest.

The assessment team recommends that the evaluation consist of a process (including dosage) and an outcome component:

o Process component. This component will focus on information related to program activities, target population, and structural factors of the sites. Because program outcomes depend in part on "context effects," process data will be crucial in understanding variation in outcomes. The process component also will assess the duration and intensity of the intervention (i.e., the dosage) to determine its influence on program outcomes.
o Outcome component. The outcome component will assess the effectiveness of the I-SAFE curriculum in achieving the desired outcomes.

Is there sufficient consistency in program implementation across sites to warrant a national evaluation?

Although variation in implementation exists across sites, each site is nonetheless implementing the same intervention (the I-SAFE curriculum). 
However, requiring evaluation sites to deliver the program as intended would help control the variation that currently exists across sites. It is unclear, however, under what mandate sites could be required to implement the program in a specific manner; support from the national headquarters would be imperative. If the evaluation team could not physically control variation, differences in program delivery could potentially be controlled statistically. This could prove difficult, however, and may reduce the level of comparison possible across program sites. To understand the outcome of the program across sites, it would be important to require sites to implement the same curriculum in the same manner.

Do sites vary so greatly that each is rather unique, making multi-site comparisons largely descriptive and lending syntheses of results to considerable professional judgment? This is not the case. Although there is some variation in implementation across sites, most sites are implementing the curriculum with some degree of fidelity. That is, most of them deliver the five lessons as intended; however, some sites add material or spend more time on one lesson than other sites do. Sites do not vary so greatly that each could be considered unique; there is enough similarity in implementation that comparisons could be made.

Hierarchical linear modeling techniques have been developed to deal with the kinds of nested data typically generated by multi-site studies. These techniques are also known as random regression or multilevel modeling. Observations are nested within both individual participants and sites. In essence, although each site's participants will receive the curriculum under the I-SAFE program, they are members of a community and share characteristics and experiences that make them more similar to one another than to participants in other communities.
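As a rough, hypothetical sketch of why this nesting matters, the simulation below (all site counts, sample sizes, and variance figures are invented for illustration, not drawn from I-SAFE data) generates student scores clustered within sites and computes the intraclass correlation, the share of total variance attributable to site membership:

```python
import random
import statistics

random.seed(42)

# Hypothetical illustration: student test scores nested within program
# sites. Each site has its own baseline level (between-site variance);
# students vary around that baseline (within-site variance).
N_SITES, N_STUDENTS = 20, 30
BETWEEN_SD, WITHIN_SD = 4.0, 8.0

site_means = [random.gauss(50, BETWEEN_SD) for _ in range(N_SITES)]
scores = [[random.gauss(m, WITHIN_SD) for _ in range(N_STUDENTS)]
          for m in site_means]

# Decompose observed variance into between-site and within-site components.
observed_site_means = [statistics.mean(site) for site in scores]
between_var = statistics.pvariance(observed_site_means)
within_var = statistics.mean(statistics.pvariance(site) for site in scores)

# Intraclass correlation: the proportion of variance attributable to site
# membership. A nonzero ICC means students in the same site are more alike
# than students in different sites, violating the independence assumption
# of ordinary regression -- the motivation for hierarchical modeling.
icc = between_var / (between_var + within_var)
print(f"between-site variance:  {between_var:.1f}")
print(f"within-site variance:   {within_var:.1f}")
print(f"intraclass correlation: {icc:.2f}")
```

When the intraclass correlation is above zero, an analysis that ignores site clustering understates standard errors; hierarchical linear models instead estimate the two variance components explicitly and adjust program-effect estimates accordingly.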
Hierarchical linear modeling takes these correlations into account before estimating program effects.

Are certain program elements consistent across sites while others vary, making a hybrid of national evaluation for some elements and multi-site comparisons for others seem feasible? There are two potential options here. First, evaluation sites could be chosen on the basis of the fidelity with which they implement the I-SAFE curriculum; that is, the sites implementing the curriculum with the highest degree of fidelity would be chosen, while others would be screened out. Alternatively, a cross-site component (selecting sites that implement with fidelity) could be combined with a within-site component in which two or three sites would be chosen on the basis of the unique changes they have made to the program. Outcomes could then be assessed for both the standard and the modified delivery of the curriculum and compared for effectiveness. It may be that enhancing the program actually enhances its effects or, conversely, that the program produces the intended effects only if it is delivered with fidelity. It might be worthwhile to tease this out by including both a cross-site and a within-site comparison (e.g., case study) component in the evaluation.

Are there one or more sites that offer good opportunities to evaluate an important intervention regardless of whether they generalize to the program? N/A

If a multi-site strategy were used for the evaluation, what criteria would be used to sample the pool of sites, and how easily could this sampling be accomplished? Sample sites could be selected on several criteria to ensure adequate representation of the program sites currently implementing the I-SAFE curriculum. These criteria might include:

o Willingness to participate.

o Community demographic characteristics (e.g., rural versus urban schools; geographic region).

o Size of school (i.e., number of students).
o Degree of implementation (e.g., all grades receiving the curriculum versus only certain classes or grades).

o Number of years the curriculum has been implemented in the school (e.g., newly adopting versus established sites).

Once the pool of sites has been identified, a mix of probability and nonprobability sampling techniques could be used. For instance, the criteria listed above could first be used to select a large pool of qualifying sites (a convenience sample). The number of sites needed for a rigorous evaluation could then be randomly selected from that larger pool.

Should the evaluation include a preliminary power analysis? The assessment team recommends conducting a preliminary power analysis to determine the sample size required for a rigorous evaluation. The power to detect differences in outcomes of the I-SAFE program will depend on the usual factors (alpha, expected effect sizes, achieved sample sizes, and follow-up rates). Power will also depend on several features of the evaluation design that will not be determined until the initial phase of the evaluation's implementation, including the number of treatment and comparison groups and the number of follow-up data collection efforts required. The method by which outcome measures are assessed will also affect power. These issues will need to be examined within the context of a full-scale evaluation.

What should OJJDP's grant manager know about this program? The grantee has benefited from communication with the grant monitor, especially with regard to its outcome measures. I-SAFE had put a great deal of emphasis on outputs that generally reflected process measures but, as a result of the grant monitor's input, has since focused more attention on individual-level outcomes. The process measures remain important, however, because the grant manager should be aware of varying implementation across I-SAFE sites.
In particular, the curriculum is not always fully implemented or taught in the order and over the time period expected. Some sites have also been unable to implement the community outreach activity in addition to the core I-SAFE curriculum. Furthermore, although all sites indicated ongoing and responsive communication with I-SAFE headquarters, it does not appear that headquarters is monitoring the fidelity of program implementation at the sites. It is, however, paying close attention to the administration of the pre- and posttests for each of the I-SAFE lessons.