Lessons Learned

Do arts-based programs prevent or reduce delinquent behavior among youth? This key question remains only partially answered by this evaluation. Although the evaluation produced considerable evidence to support the hypothesis that such programs can contribute to the avoidance or reduction of delinquent behavior, it suffered several problems that limited its ability to answer the question definitively. However, the lessons learned from this evaluation, about both the programs themselves and the process of evaluating them, can be immensely valuable to the design, implementation, and evaluation of other youth-focused arts-based efforts.

Evaluation Lessons

Evaluating any social service program is challenging, especially when the program serves populations who are at risk of becoming delinquent or are already in trouble. Programs that address juvenile delinquency pose special challenges because they serve adolescents, a population that requires special attention to data collection protocols, such as obtaining parental permission for their children to participate in interviews or surveys and gaining the cooperation of system-wary youth. Moreover, these programs are often modestly funded, with staff stretched in many directions, working with too many youth in too little time. For an evaluation component to succeed, careful attention must be paid to design, timing, and resources. Following are the lessons learned regarding several interrelated issues: control groups, sample size, data collection, and program stability.

Control groups. Answering questions about the effectiveness of delinquency prevention and intervention programs requires the most rigorous evaluation design possible. An experimental design with random assignment of youth into participant and control groups, the design that best allows for testing whether the program "causes" changes in participants, was originally planned for this evaluation. Because of limited recruitment (i.e., not enough youth to use random assignment) and concerns among some of the program partners about the ethics of not providing services to all eligible youth, a quasi-experimental design with matched comparison groups was adopted. To enable evaluators to attribute changes in participants' attitudes and behaviors to the program, the comparison group had to be similar to the participant group at the start of the evaluation on several key characteristics (e.g., age, race, grade level, history of delinquency, attitudes, and behaviors). Although attempts were made to match youth on key characteristics in Atlanta and Portland, the participant and comparison groups had some important differences (e.g., history of delinquency). These differences, combined with the small sample sizes, made it difficult to attribute the observed outcomes directly to program participation with any certainty. Implementing rigorous evaluation designs requires considerable planning and, equally important, commitment of time and resources to ensure that the integrity of the design is maintained throughout the course of the evaluation.

Sample size. To some extent, the small sample sizes for the evaluation of the three YouthARTS programs were due to program design and were thus unavoidable. An important feature of many arts programs, like Art-at-Work in Atlanta and Youth Arts Public Art in Portland, is the small youth-to-artist ratio. Although this allows for quality one-on-one instruction of participants, it creates a challenge for the evaluator. Small sample sizes severely limit the ability to test for statistical significance (to determine whether observed changes occurred by chance or because of the intervention). Additionally, small sample sizes make it difficult to generalize the findings to the broader population of at-risk youth. Although an evaluator cannot control the size of a program, steps can be taken to avoid small sample sizes resulting from missing data. Because data were not uniformly collected for all youth at all data collection points, many youth had to be excluded from the analyses. Every effort must be made to obtain complete data for all youth in the program and comparison groups.

Data collection. One of the most critical aspects of any evaluation is data collection. Even the most rigorous design will not provide meaningful results without the necessary data. Every evaluation faces resource-driven tradeoffs, however, and data collection is often an area where compromises are made. The YouthARTS evaluation relied on program staff, supported by local data collectors hired by each site, to collect the evaluation data.

Even with detailed training and how-to manuals provided to assist local staff with their data collection tasks, the data collection effort suffered. Program staff were very busy, focused primarily on program operations and service delivery. Many of the local data collectors were inexperienced and, despite guidance and manuals, were often unable to manage the challenges of collecting the necessary academic and court data. Frequent technical assistance throughout the data collection process could help similar efforts avoid such problems in the future.

Program stability. Outcome evaluations seek to answer the question, "Did this program work?" Implicit in answering the question is an understanding of what "this program" is. If the program itself is young and still undergoing developmental changes (as the YouthARTS programs were, especially in Atlanta and Portland), it is difficult to determine which version of the program worked or did not work. Moreover, young programs face enormous challenges with hiring and training staff, program logistics, recruitment, and day-to-day operations. This continuing evolution created two key problems for the evaluation:

  • First, it was difficult for project managers to oversee the evaluation, ensure compliance with data collection requirements, and deal with the day-to-day demands of program operations.

  • Second, because of the first problem, it was difficult to identify the intervention itself and to ensure that the data collection instruments measured the appropriate outcomes (those that reasonably could be expected to occur as a result of the intervention).

Future evaluations of arts-based programs should focus on programs that have reached a level of stability at which the intervention and its desired outcomes are well defined. Also, if program staff are to be responsible for data collection, they must have the capacity and training to carry it out.

Program Lessons

Many important lessons were learned about planning and implementing arts-based juvenile delinquency prevention/intervention programs. The experiences of these three YouthARTS programs provide useful insights for future program operation and evaluation. Information from both the process and outcome evaluation components helped program staff identify common factors that led to successful program implementation.

Collaboration. All of the programs incorporated some form of collaborative process. Participants noted that having a voice in establishing program rules, regulations, and workshop activities and having frequent opportunities to produce and exhibit their art were factors that helped them engage in the program. Program staff recognized the need for improved collaboration among program partners (e.g., school district, juvenile justice agency, art agency) and the importance of identifying the roles and responsibilities of each partner. A lack of "buy-in" and understanding of the program and the evaluation from the beginning resulted in unexpected complications (e.g., lack of access to data, difficulty accessing facilities, overextension of staff). Additionally, program staff believed that the partnership needed to be broadened to include other youth-serving agencies. They quickly learned that many of the youth they were serving had problems beyond the scope of the arts program and that any impact the program may have had on improving behavior was reduced by other circumstances that affected the lives of the participants. Widening the circle of collaborative partners would have improved the program's ability to refer youth and their families to other services in the community.

Skilled, qualified artists. Artists who had both experience in the arts and the ability to work with at-risk youth tended to excel at engaging youth in art activities, adapting activities to meet participants' specific needs, handling problem behaviors, bonding with participants, and establishing mutual respect with the youth. YouthARTS participants gravitated to those artists who exhibited expertise in their field and communicated easily and respectfully with them.

Onsite caseworkers and probation officers or counselors. Onsite caseworkers and probation officers or counselors were invaluable assets to the programs. These individuals served as positive role models for the participants, and by collaborating closely with the artists, they helped the artists provide individualized art instruction to smaller groups of participants. Involvement of these key players also provided youth with an opportunity to develop positive relationships with their probation officers or counselors. Participants were more involved in the instructional activities when the probation officers tried to learn the material with them. The youth seemed more likely to take risks (e.g., reveal personal information, try new things that might embarrass them) when they saw their probation officers taking the same risks. Finally, close involvement of the caseworkers and probation officers or counselors allowed them to handle behavioral problems during the workshops, provide referrals, and work with the youth and their families to ensure that the participants received any additional support needed to develop or maintain positive attitudes and behaviors.

Comprehensive training for all program staff. Comprehensive training covering topics such as at-risk youth, risk factors and problem behaviors, classroom management, conflict resolution education, and collaboration with other youth-serving agencies was essential for all program staff. The training not only served to orient staff to the program but also provided them with an opportunity to discuss expectations, roles, and responsibilities. These discussions ensured that program staff understood the importance of collaboration to achieve common goals and that everyone shared realistic, appropriate expectations about what the program could accomplish.

Range of arts programs and services. Program staff and youth participants from all three programs recognized the need for expanded arts services in the community. Many of the youth expressed a continued interest in the arts and identified the desire to be connected with other services in the community that would allow them to pursue their art interests (e.g., additional art classes, jobs, internships, scholarships). Getting connected to the community was important for these youth, and, although each of the programs had made some progress in connecting youth to other services, most youth felt nothing was available for them once the program ended. Program staff in all three cities are working to identify ways to keep youth connected to the arts and to their communities in the future.

Transportation for participants. One of the most common reasons for poor program attendance cited by participants and staff was lack of transportation. Youth attendance and participation are essential to a program's success. Providing reliable transportation (e.g., school buses, city transportation, volunteer-driven vehicles) makes higher attendance rates possible.

