MENU TITLE: Trial Court Performance Standards and Measurement System. Series: BJA Program Brief Published: July 1997 43 pages 67,513 bytes

------------------------------

Figures, charts, forms, and tables are not included in this ASCII plain-text file. To view this document in its entirety, download the Adobe Acrobat graphic file available from this Web site or order a print copy from BJA at 800-688-4252.

------------------------------

U.S. Department of Justice
Office of Justice Programs
Bureau of Justice Assistance

Trial Court Performance Standards and Measurement System

Program Brief

------------------------------

U.S. Department of Justice
Office of Justice Programs
810 Seventh Street NW.
Washington, DC 20531

Janet Reno
Attorney General
U.S. Department of Justice

John C. Dwyer
Acting Associate Attorney General

Laurie Robinson
Assistant Attorney General

Nancy E. Gist
Director, Bureau of Justice Assistance

U.S. Department of Justice Response Center
1-800-421-6770

Bureau of Justice Assistance Clearinghouse
1-800-688-4252

Bureau of Justice Assistance World Wide Web Home Page
http://www.ojp.usdoj.gov/BJA

This document was prepared by the National Center for State Courts, supported by grant number 91-DD-CX-0013(S-1) awarded by the Bureau of Justice Assistance, Office of Justice Programs, U.S. Department of Justice. The opinions, findings, and conclusions or recommendations expressed in this document are those of the authors and do not necessarily represent the official position or policies of the U.S. Department of Justice or the National Center for State Courts.

The Bureau of Justice Assistance is a component of the Office of Justice Programs, which also includes the Bureau of Justice Statistics, the National Institute of Justice, the Office of Juvenile Justice and Delinquency Prevention, and the Office for Victims of Crime.

------------------------------

Bureau of Justice Assistance

Trial Court Performance Standards and Measurement System

Program Brief

July 1997
NCJ 161569

------------------------------

Foreword

The Bureau of Justice Assistance (BJA) is pleased to present this brief overview of the Trial Court Performance Standards and Measurement System. It is intended for judges, court managers, lawyers, policymakers, and others in a community interested in improving their trial court's responsiveness and effectiveness.

The Trial Court Performance Standards and Measurement System was created to serve as a common language for describing, classifying, and measuring the performance of trial courts. Developing such a framework was the goal of an 8-year effort, the Trial Court Performance Standards Project, initiated in 1987 by the National Center for State Courts and BJA. Demonstrated successfully in trial courts across the Nation, the measurement system is a valuable resource for helping courts provide fair and efficient adjudication and disposition of cases.

It is our hope that every community in the United States will consult this Program Brief and the companion publications of the Standards Project to begin the process of improving access to justice and its administration with equality, integrity, and timeliness.

Nancy E. Gist
Director

------------------------------

Acknowledgments

The Bureau of Justice Assistance would like to thank the Commission on Trial Court Performance Standards for its dedication and vision in guiding the Trial Court Performance Standards to their fruition. Commission members include the following individuals:

Honorable Robert C.
Murphy, Chair Chief Judge (Retired) Court of Appeals of Maryland Towson, Maryland Honorable Rebecca A. Albrecht Associate Presiding Judge Superior Court of Arizona, Maricopa County Phoenix, Arizona Robert N. Baldwin State Court Administrator Supreme Court of Virginia Richmond, Virginia Carl F. Bianchi Director of Legislative Services Legislative Services Office Boise, Idaho Honorable Robert C. Broomfield Chief Judge, U.S. District Court District of Arizona Phoenix, Arizona John A. Clarke Executive Officer/Clerk Los Angeles Superior Court Los Angeles, California Judith A. Cramer Manager, Weed & Seed Neighborhood Revitalization Program Tampa, Florida Honorable Anne L. Ellington Assistant Presiding Judge King County Superior Court Seattle, Washington Howard Hanson County Clerk/Superior Court Administrator Marin County San Rafael, California Robert D. Lipscher Administrative Director (Retired) New Jersey Administrative Office of the Courts Trenton, New Jersey Edward B. McConnell (ex-officio) President Emeritus National Center for State Courts Williamsburg, Virginia Doris Marie Provine Chair, Department of Political Science Syracuse University Syracuse, New York Honorable Henry Ramsey, Jr. Dean, Howard University School of Law Washington, D.C. Honorable Leo M. Spellacy Judge, Ohio Court of Appeals Eighth District Cleveland, Ohio Whitfield Smith (1987-1991) Clerk of Court (Former) Superior Court for DeKalb County, Georgia Decatur, Georgia Honorable Fred B. Ugast Chief Judge (Retired) Superior Court of the District of Columbia Washington, D.C. Bureau of Justice Assistance, U.S. Department of Justice Marilyn Nejelski Program Manager Charles Hollis Chief, Adjudication Branch National Center for State Courts Sally T. Hillsman Vice President (1992-1995) Geoff Gallas Vice President (1987-1991) Trial Court Performance Standards Project Staff Pamela Casey, Director Ingo Keilitz,* Director Hillery Efkeman Margaret Fonner John Goerdt Thomas Hafemeister* Roger Hanson William Hewitt Brenda Jones* Susan Keilitz Fred Miller* Beatrice Monahan* Pamela Petrakis* David Rottman* *denotes former project staff In addition to those listed above, numerous individuals at the National Center for State Courts and elsewhere gave generously of their time to assist the development and initial testing of the Trial Court Performance Standards and Measurement System, including Stevalynn Adams, David Aday, Carl Baar, Kent Batty, Richard Berk, Chuck Campbell, Joy Chapper, George Cole, Hank Daley, Michael Dann, Tom Dibble, Chris Duncan, Bill Fishback, Gene Flango, Sandy Garcia, Debbie Gause, George Gish, Gordon Griller, Mary Hogan, Cindy Huffman, Michael Jeans, Lynn Jordaans, Carl Kessler, Kay Knapp, Gerald Kuban, Monica Lee, Chris Lomvardias, Kay Loveland, Jennifer Rae Lovko, Robert Lowe, James Lynch, Barry Mahoney, Mary McCall, Craig McEwen, Jan Michaels, Barbara Meierhoefer, Tom Munsterman, Raymond Nimmer, Jessica Pearson, Mike Planet, Maryann Rondeau, Jane Raynes, Teresa Risi, Dalton Roberson, Ronald Rosenberg, Jeffrey Roth, Fred Rusillo, Hisako Sayers, Bob Tobin, Anne Walker, Patricia Wall, Steven Wasby, Joan White, Matt Williams, and Robert Williams. We also are indebted to the many individuals in the 4 pilot States and the 12 courts for the countless hours and invaluable insights they contributed during the project's 4-year demonstration phase: in New Jersey, Robert D. 
Lipscher and Theodore Fetter, Administrative Office of the Courts--Atlantic County Superior Court, Burlington County Superior Court, Morris County Superior Court, Ocean County Superior Court, and Somerset County Superior Court; in Ohio, Stephan W. Stover and Ruth Ann Elmer, Supreme Court of Ohio--Meigs County Court of Common Pleas, Stark County Court of Common Pleas, and Wayne County Court of Common Pleas; in Virginia, Robert N. Baldwin and Beatrice P. Monahan, Office of the Administrator for the Courts--Fairfax County Circuit Court; and in Washington, Mary Campbell McQueen and Yvonne Pettus, Office of the Administrator for the Courts--Spokane County Superior Court, Thurston County Superior Court, and Whatcom County Superior Court. We also gratefully acknowledge Richard Van Duizend, Deputy Director of the State Justice Institute (SJI), for his commitment to the project and SJI's financial support of many demonstration phase activities. The Commission on Trial Court Performance Standards, the National Center for State Courts, and the Bureau of Justice Assistance thank these individuals and the many individuals not named here who assisted in the Trial Court Performance Standards Project in its 8-year duration. ------------------------------ Contents Chapter 1 Program Focus: The Trial Court Performance Standards and Measurement System: A Blueprint for Improved Judicial Administration Overview Chapter 2 Defining a Court's Mission: Access to Justice Expedition and Timeliness Equality, Fairness, and Integrity Independence and Accountability Public Trust and Confidence Chapter 3 Measuring Court Performance: Demonstration of the Measurement System Summary of Measures Chapter 4 Implementation Issues: Step 1: Create an Impetus for Change Step 2: Form a Guiding Coalition Step 3: Create and Communicate a Vision and Mission Step 4: Select Standards Relevant to a Vision and Mission Step 5: Conduct Measures Step 6: Make Improvements Step 7: Communicate Progress Chapter 5 Conclusion Appendix A Bibliography Appendix B Sources for Further Information Appendix C Overview of the Trial Court Performance Standards and Measurement System Appendix D Court Performance Inventory ------------------------------ Chapter 1 Program Focus In 1987, in recognition that State court systems were being stretched beyond their capacities and that court personnel were experiencing fatigue and burnout in attempting to deal with the significant increase in drug-related cases, the Bureau of Justice Assistance (BJA), U.S. Department of Justice, and the National Center for State Courts (NCSC), a nonprofit organization providing leadership and service to State courts, initiated an ambitious program, the Trial Court Performance Standards Project (Standards Project). The program's objective was to increase the capacity of the Nation's trial courts to provide fair and efficient adjudication and disposition of cases. The program's goals included the development of a set of standards and an accompanying measurement system that would define and measure effective trial court performance. Implicit in the establishment of the Standards Project was recognition by State court leadership that existing judicial and support resources could handle increased caseloads only by pursuing a more focused application and use of those resources. Accordingly, through systematic research, development, and demonstration, the Standards Project resulted in the creation of the Trial Court Performance Standards and Measurement System. 
Program publications for courts that wish to use the system include Planning Guide for Using the Trial Court Performance Standards and Measurement System; Trial Court Performance Standards With Commentary; Trial Court Performance Standards and Measurement System Implementation Manual; and this document, Trial Court Performance Standards and Measurement System (Program Brief). The Trial Court Performance Standards and Measurement System: A Blueprint for Improved Judicial Administration The foundation of the Trial Court Performance Standards and Measurement System is the theme of the court as an organization accountable for its performance. The system defines a philosophy of optimum trial court performance. Endorsed by the Conference of Chief Justices, the Conference of State Court Administrators, and the National Association for Court Management and incorporated into the standards of the National College of Probate Judges, the system's performance standards are widely viewed as a blueprint for improving the administration of justice in State trial courts. The measurement system includes: o A common language for the description, classification, and communication of court activities. o A conceptual framework for understanding and improving court performance. o A means for self-assessment, self-improvement, and accountability to the public. A hallmark of the measurement system is its dual emphasis on the systematic assessment of a trial court's performance as an organization that serves those who use the court and on the use of the assessment findings to improve that performance. The system is not intended to evaluate the performance of individuals; instead, it views the court as a system of closely interlinked processes and tasks. The collective work of the court involves all individuals who perform administrative court functions--including judges, clerks of court, administrators, probation officers, and other court staff, as well as private lawyers, public defenders, prosecutors, and social service providers. Overview This program brief is divided into three main sections, followed by a conclusion and four appendixes. The first main section discusses the goals of the court system in the 5 performance areas into which the 22 performance standards are grouped. These areas are: (1) Access to Justice; (2) Expedition and Timeliness; (3) Equality, Fairness, and Integrity; (4) Independence and Accountability; and (5) Public Trust and Confidence. The second section summarizes the specific performance measures that are associated with the performance standards, including methods of data collection and measurement, sources of data, and primary evaluators. The third main section discusses a seven-step model that provides guidance to courts in implementing the performance standards and measurement system. A brief conclusion summarizes the benefits of systematic implementation of the Trial Court Performance Standards and Measurement System. The four appendixes to this program brief provide additional information for those who wish to explore trial court performance issues and the implementation of performance standards in greater depth. Appendix A is a bibliography applicable to all four trial court performance documents. Appendix B lists sources for further information about the Trial Court Performance Standards and Measurement System. Appendix C presents a tabular overview of the performance measurement system. 
Appendix D presents the Court Performance Inventory, a survey instrument that is designed to collect individual perceptions about trial court performance. In addition to the survey questions, Appendix D presents the Court Performance Inventory Response Form and the Court Performance Profile, tools for tabulating survey responses and presenting those responses graphically. ------------------------------ Chapter 2 Defining a Court's Mission The measurement system developed by the 14-member Commission on Trial Court Performance Standards (Commission), which consists of State and local judges, court administrators, scholars in the area of judicial administration, and an elected clerk of the court, sets forth the goals of general jurisdiction trial courts in five areas: 1. Access to Justice 2. Expedition and Timeliness 3. Equality, Fairness, and Integrity 4. Independence and Accountability 5. Public Trust and Confidence These five performance areas encompass the fundamental purposes and responsibilities of courts and may be considered a court's mission. Within each area several standards articulate specific goals for court performance. A total of 22 standards extend across all 5 performance areas. A discussion of these standards follows. Access to Justice The standards defining the performance area of Access to Justice require a trial court to eliminate any unnecessary geographic, economic, procedural, language, or psychological barriers to court services. Specifically, the five standards in this area require a trial court to do the following: o Conduct its proceedings and other public business openly. o Maintain facilities that are safe, accessible, and convenient to use. o Provide an opportunity for all who appear before the court to participate effectively, without undue hardship or inconvenience. o Ensure that judges and other trial court personnel are courteous and responsive to the public, according respect to all with whom they come in contact. o Maintain reasonable, fair, and affordable costs of access to trial court proceedings and records-- whether the costs are measured in money, time, or the procedures that must be followed. These five standards encourage a court to think in terms of the court user rather than focusing on the needs of those who work within the system. Taken as a whole, the standards focus on the ability of citizens to maneuver through different aspects of the justice system, despite the public's general unfamiliarity with court facilities and procedures. Expedition and Timeliness During the last 10 years, court reform has focused considerable attention on reducing delays in case processing. The standards in the second performance area of Expedition and Timeliness expand the concept of timely case processing to encompass all court activities. The standards recognize that the manner in which a trial court fulfills its responsibilities affects not only litigants but all individuals and organizations involved with the judicial system--including jurors, attorneys, witnesses, criminal justice and social service agencies, and the public. This emphasis on the timely handling of all court services is articulated in three performance standards, which require a court to do the following: o Establish and comply with recognized guidelines for timely case processing while, at the same time, remaining current with its incoming caseload. 
o Disburse its funds promptly, provide reports and information according to required schedules, and respond to requests for information and other services on an established schedule that ensures their effective use. o Promptly implement changes in law and procedure. Equality, Fairness, and Integrity The standards in the third area of court performance--Equality, Fairness, and Integrity-- address the constitutional guarantees of due process and equal protection under the law. The standards emphasize fidelity to established laws and procedures, and they require a court not only to be explicit in its orders but also to ensure enforcement of the orders. Specifically, the six standards in this area direct a trial court to do the following: o Faithfully adhere to relevant laws, procedural rules, and established policies. o Maintain jury lists that are representative of the jurisdiction from which they are drawn. o Give individual attention to cases, deciding them without undue disparity among like cases and upon legally relevant factors. o Render decisions that unambiguously address the issues presented and clearly indicate how compliance can be achieved. o Take appropriate responsibility for the enforcement of its orders. o Monitor records of all relevant court decisions and actions for accuracy and proper preservation. Independence and Accountability The standards in the fourth performance area-- Independence and Accountability--recognize the importance of judicial independence and the separation of powers. At the same time, however, the standards require trial courts to maintain effective working relationships with other branches of government and other components of the justice system. This performance area's standards also focus on the court's status as a public institution. The standards suggest that, as such, a court take responsibility for developing action plans, obtaining resources to implement the plans, monitoring its operations, and accounting publicly for its performance. Specifically, the five standards instruct that a court do the following: o Maintain its institutional integrity and observe the principle of comity in its governmental relations. o Responsibly seek, use, and account for its public resources. o Use fair employment practices. o Inform the community about its programs. o Anticipate new conditions and emergent events and adjust its operations as necessary. Public Trust and Confidence The judicial system derives its power and legitimacy from those it serves. A court must attend to several constituencies--the general public, community opinion leaders, citizens who have business before the court, and court employees. These constituencies vary by the type and extent of contact they have with the justice system. The standards in this fifth and final performance area, Public Trust and Confidence, ask a court to consider its performance in the other four areas through the eyes of these various constituencies. The three standards in this area urge a trial court to instill public trust and confidence to achieve the following results: o The public perceives the trial court and the justice it delivers as accessible. o The public has trust and confidence that basic trial court functions are conducted expeditiously and fairly and that court decisions have integrity. o The public perceives the trial court as independent, accountable, and not unduly influenced by other components of government. 
------------------------------

Chapter 3 Measuring Court Performance

Each performance standard in the measurement system is linked to a set of measures. The measures provide court officials with the tools for assessing how well their respective courts are performing with regard to their mission and goals as articulated by the 22 standards. The measures help courts develop a strategic plan based on systematic data rather than intuition and guesswork.

The measures are designed to gather information that the court can use readily in a variety of ways, including budgeting, case management, implementing court improvement projects, and strategic planning. The initial application of the measures aids the court in identifying areas that require immediate attention and those that potentially need improvement. The measures also may be used to establish benchmarks regarding court performance on each standard the court wishes to address. Subsequently, the court can use the measures to determine whether its performance with respect to a particular standard is better, about the same, or worse than when the measures were taken originally. The information gathered through the measures can help to determine whether the improvement efforts the court has implemented are succeeding or need to be altered in some way.

The measures use a variety of data collection methods, including:

o Case and administrative record reviews and searches.
o Group techniques.
o Interviews.
o Observations and simulations.
o Surveys of various reference groups (e.g., the general public, court employees, and attorneys).

Of these data collection methods, courts are most likely to be familiar with case and administrative record reviews. Just as the measurement techniques vary, different types of evaluators are employed depending on the object of the measure. For example, volunteers conduct structured observations of court proceedings and simulations of public access to information, whereas court staff conduct many of the measures involving record reviews. A few measures, such as data analysis, are best carried out by consultants or court staff with particular expertise.

Demonstration of the Measurement System

As the measurement system evolved, 75 measures were developed, tested, and refined by the Commission and Standards Project staff. Trial courts in Arizona, Michigan, and Ohio cooperated in and contributed to this process by serving as test sites for the draft measures. Following the research and development phase, a 4-year demonstration phase commenced in New Jersey, Ohio, Virginia, and Washington. A total of 12 trial courts in the 4 States participated in the demonstration, and each of the 75 measures was tested in at least 2 of the courts.[1]

The 12 trial courts varied on a number of factors, including size, organization, jurisdiction, funding source, demographic and economic context, and, of course, State law and court rules. This variation across the courts provided the opportunity to test the measures under diverse conditions and produced a rich body of information relevant to the application of the measurement system in other trial courts throughout the country. As the demonstration proceeded, the Commission and Standards Project staff reviewed and revised the measures to reflect the experiences of these trial courts in implementing the measurement system. As a result, the original 75 measures were refined to a set of 68.
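To make the benchmarking idea above concrete, the following minimal sketch (in Python) shows one way a court might record a baseline and a follow-up value for each measure and flag whether performance is better, about the same, or worse. The measure names, sample values, and 5-percent tolerance are hypothetical illustrations, not part of the measurement system itself.

from dataclasses import dataclass

@dataclass
class MeasureResult:
    standard: str           # performance area or standard the measure is keyed to
    measure: str            # hypothetical measure label
    baseline: float         # value when the measure was first taken
    followup: float         # value from the most recent application
    higher_is_better: bool  # direction in which improvement runs

def direction(r: MeasureResult, tolerance: float = 0.05) -> str:
    """Classify the follow-up value as better, about the same, or worse."""
    change = r.followup - r.baseline
    if not r.higher_is_better:
        change = -change
    if abs(change) <= tolerance * abs(r.baseline):
        return "about the same"
    return "better" if change > 0 else "worse"

# Hypothetical results for two caseflow-related measures.
results = [
    MeasureResult("Expedition and Timeliness", "clearance rate", 0.94, 1.02, True),
    MeasureResult("Expedition and Timeliness", "median days to disposition", 210, 185, False),
]

for r in results:
    print(f"{r.standard} | {r.measure}: {direction(r)}")

The tolerance simply keeps small fluctuations from being reported as real change; a court would set its own threshold for each measure it tracks.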
A tabular overview of the entire performance measurement system is presented in Appendix C to this program brief. A summary of the measures by performance area follows. Summary of Measures Access to Justice The five standards in the first performance area, Access to Justice, require a trial court to eliminate all unnecessary or inappropriate barriers to its services. Twenty-one specific measures are associated with these five standards. Prescribed methods of measurement include structured observations, interviews, surveys, and record searches and reviews. Many of the measures associated with this and other performance areas can be taken simultaneously. The method most often prescribed for measuring access to justice is structured observation of court proceedings, operations, and facilities. The information collected includes records of what people see and hear, structured so that it can be examined quantitatively and qualitatively by court personnel. Although observations can be carried out by almost anyone, the recommended approach is to use citizen volunteers who are relatively naive about the court system and who will yield information and experiences likely to reflect those of ordinary citizens who have infrequent business with the court. Expedition and Timeliness Ten specific measures are associated with the three standards in the performance area of Expedition and Timeliness, which highlights the timely performance of all trial court functions. Unlike the measures associated with most of the other standards, several of the measures associated with this performance area (e.g., length of time to disposition, ratio of case dispositions to case filings, and age of pending caseload) will be familiar to judges and court managers. Four other measures draw on State and local sources of information to determine whether the court also is performing its noncase-related functions (e.g., distribution of funds and provision of reports, information, and services) in a timely manner. Satisfactory performance requires not only that provision of reports and services be timely, but also that they be completed in such a manner as to make them useful to the person or agency requesting the information or report. Finally, two other measures relate to the promptness with which a trial court implements externally mandated changes. Equality, Fairness, and Integrity A total of 23 measures are associated with the 6 standards in the performance area of Equality, Fairness, and Integrity. For the most part, the measures require similar data elements, data collection procedures, and methods of analysis. For example, five of the six measures associated with the standard on production and preservation of records use some portion of the same pool of cases to examine the extent to which court records are adequately stored. A court that measures a given standard can apply all the measures associated with that standard in a relatively efficient manner. The most common method of measurement in this performance area is the review and analysis of case-related information. Case files are used as a basic source of data for many of the measures. Mail surveys also are used to assess the views of key reference groups, such as attorneys, jurors, and witnesses. 
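For the familiar caseflow measures named above under Expedition and Timeliness (length of time to disposition, ratio of case dispositions to case filings, and age of pending caseload), a rough Python sketch of the arithmetic follows. The field names and sample case records are hypothetical, and the calculation is a simplification of the system's actual measures.

from datetime import date
from statistics import median

# Hypothetical case register entries: filing date and, if disposed, disposition date.
cases = [
    {"filed": date(1996, 1, 10), "disposed": date(1996, 7, 2)},
    {"filed": date(1996, 3, 5),  "disposed": date(1996, 11, 20)},
    {"filed": date(1996, 9, 15), "disposed": None},   # still pending
    {"filed": date(1997, 2, 1),  "disposed": None},   # still pending
]
as_of = date(1997, 6, 30)  # reporting date for the pending-caseload measure

# Length of time to disposition (in days) for disposed cases.
days_to_disposition = [(c["disposed"] - c["filed"]).days for c in cases if c["disposed"]]
print("median days to disposition:", median(days_to_disposition))

# Ratio of dispositions to filings among the sampled cases
# (a simplification of a true clearance rate).
print("dispositions per filing:", round(len(days_to_disposition) / len(cases), 2))

# Age of the pending caseload as of the reporting date.
pending_ages = [(as_of - c["filed"]).days for c in cases if c["disposed"] is None]
print("median age of pending cases (days):", median(pending_ages))

In practice these figures would be computed from the court's case management records for a defined reporting period rather than from a handful of sample cases.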
Independence and Accountability

In contrast to the measurement approach taken in the other four performance areas--which is largely prescriptive, detailing specific measures and indices--the measurement approach in the performance area of Independence and Accountability is largely heuristic. That is, rather than defining specific measures of performance, it describes methods by which a court proceeds along empirical lines to identify the people, events, and activities needed to develop valid and workable measures to assess the court's independence and accountability. Rather than initially taking a specific measurement, the court engages in a process that will allow it to make inferences (including preliminary plans for improvement) about its independence and accountability, based on empirical results.

To engage in this process, a court should undertake measures in this area only after forming a steering committee composed of judges and court managers. The committee will be involved in planning data collection, discussing the significance of the results, and integrating the findings from all of the measures into an overall view of court performance in this area. Structured group techniques for decisionmaking, such as the Nominal Group Technique and Ideawriting, led by a skilled facilitator, are recommended to save time during steering committee meetings and to maximize objectivity in the committee's work. The use of the steering committee, in conjunction with research efforts that may be undertaken by court staff or consultants, constitutes a process that combines fact gathering, value clarification, decisionmaking, and action.

Public Trust and Confidence

Performance in the area of Public Trust and Confidence is dependent, in large part, on the court's performance in the other four areas. Thus, several of the measures in the other areas that rely on informed opinions (i.e., opinions of individuals who have had contact with the court for various reasons) are appropriate to consider for this performance area as well. Three additional measures are included to address all three standards for this area by gauging the perceptions of different groups about court performance. These measures involve a mail survey of court employees, a modified focus group discussion with representatives of the various components of the justice system, and a telephone survey of the general public.

The first two measures are likely to provide the court with the most useful information for developing an action plan to improve performance in this area. The third measure will provide a benchmark of general public perception of the court's overall performance. This benchmark then can serve as a gauge for comparing the results of future surveys of the general public. It is important to note that the measures in this area are intended to examine public perception of court performance with regard to the court's overall administration and operation. The measures do not examine the extent of public agreement with individual case decisions made by the court.

1. In New Jersey the five demonstration courts were the Superior Courts of Atlantic, Burlington, Morris, Ocean, and Somerset Counties. In Ohio the three demonstration courts were the Common Pleas Courts of Meigs, Stark, and Wayne Counties. The demonstration court in Virginia was the Fairfax County Circuit Court. In Washington the three demonstration courts were the Superior Courts of Spokane, Thurston, and Whatcom Counties.
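Before turning to implementation, a brief illustration of the benchmark described for the general-public telephone survey: the Python sketch below tabulates the percentage of respondents agreeing with statements keyed to the three Public Trust and Confidence standards, producing a baseline that later surveys scored the same way can be compared against. The question keys, rating scale, and responses are hypothetical; this is an illustration only, not the survey instrument prescribed by the measurement system.

# Each record is one respondent's 1-to-5 agreement ratings with statements
# keyed (hypothetically) to the three Public Trust and Confidence standards.
responses = [
    {"accessible": 4, "expeditious_and_fair": 3, "independent_and_accountable": 5},
    {"accessible": 2, "expeditious_and_fair": 4, "independent_and_accountable": 3},
    {"accessible": 5, "expeditious_and_fair": 5, "independent_and_accountable": 4},
]

def percent_agreeing(key: str, threshold: int = 4) -> float:
    """Share of respondents rating the statement at or above the agreement threshold."""
    agreeing = sum(1 for r in responses if r[key] >= threshold)
    return round(100.0 * agreeing / len(responses), 1)

baseline = {key: percent_agreeing(key) for key in responses[0]}
print("Baseline perception benchmark (% agreeing):", baseline)
# A later survey scored the same way can be compared against this baseline
# to gauge change in public perception over time.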
------------------------------ Chapter 4 Implementation Issues The Trial Court Performance Standards and Measurement System is crafted for the "generic" general jurisdiction trial court. How the system is applied in an actual court depends on both the needs of the court and the environment in which it operates. For one court the application of the system might involve selecting and conducting one or two measures that address a particular area of concern for the court. For another court the application might involve articulating a strategic plan for the court in which the measurement system plays a central role. An implementation model, consisting of seven steps, provides guidance to courts that wish to use the measurement system in their jurisdiction. Based on the experiences of the courts demonstrating the measures, the model helps courts translate the philosophy of the measurement system into practical application. The model's seven steps are outlined below. Step 1: Create an Impetus for Change This first step involves identifying the reasons for undertaking a self-improvement effort and generating the momentum to get the effort started and sustained. It does not matter whether the court uses a national or local issue to drive its self-improvement process. What matters is that the identified issue has meaning for, and elicits a reaction from, the individuals who will be involved in the process. Step 2: Form a Guiding Coalition A self-improvement effort that remains effective over time should be directed by a Guiding Coalition that can command respect and resources and maintain the energy to keep the project going when resistance is encountered.[2] The coalition's core should consist of senior court officials who are willing to commit their time and energy to the project. Although these senior persons need not always be the officials who occupy the court's positions of highest authority or power, the court's top management should at least be in agreement philosophically with the effort. The coalition should not necessarily be a task force in which a representative from every potentially relevant group is invited to participate. Involving too many individuals can cause the planning process to supersede the purpose, which is to embark on a self-improvement process.[3] Representatives from various groups needed for or affected by the self-improvement process are mentioned in specific performance measures, and their cooperation will be solicited along the way. However, these representatives do not all have to be permanent members of the coalition. Step 3: Create and Communicate a Vision and Mission The Guiding Coalition should draft a vision statement that builds on the shared values of the individuals who will be involved in the court's self-improvement efforts and clarifies the direction of these efforts. If a court does not already have a vision statement, the measurement system is a good resource for developing an initial version. Several themes are interwoven throughout the standards component of the system, such as the court as service provider (customer orientation), problem solver, institution builder, and even as active governing agent in the community. The foundation of the measurement system is the theme of the court as an organization accountable for its performance. Whereas a vision statement describes the court's desired state or position in the future, a mission statement describes its fundamental purpose. 
As noted earlier the five performance areas--Access to Justice; Expedition and Timeliness; Equality, Fairness, and Integrity; Independence and Accountability; and Public Trust and Confidence-- offer a starting point for developing and refining the court's mission statement. Any or all of the fundamental responsibilities articulated in the measurement system could become the focus of a court's mission statement. A clear vision and mission communicated to the individuals participating in the court's self-improvement plan will sustain the effort during the more mundane and less inspiring days of data collection and analysis. Step 4: Select Standards Relevant to a Vision and Mission The next step for the Guiding Coalition is to select one or more standards to focus on. In some instances the selection of certain standards may be obvious. The selection may have been implied in the coalition's prior discussions regarding the court's vision and mission, or a court may be reacting to a particular accusation, such as excessive case processing time or unequal treatment of individuals. In other instances, however, the coalition may not be prepared to set priorities for applying measures without further discussion. In this case the Court Performance Inventory (CPI) may be helpful in identifying areas on which to focus first. The CPI (presented in Appendix D) is a 50-question instrument designed to document quickly the first impressions of sources who are knowledgeable about court operations. The inventory asks respondents to make judgments about the court with regard to the various performance areas. The CPI can be used as a point of discussion for coalition members. What patterns emerge from responses to the questions? Is there general agreement about the areas in which the court is performing well and about those needing improvement? Do the responses vary by position? For example, do clerical staff rate access to justice lower than judges do? If so, why? The answers to these questions will help the coalition pinpoint the areas to focus on initially. Step 5: Conduct Measures The measurement system describes the actual measures in detail. The measures involve several different methods for collecting data and range broadly in difficulty and resources necessary for their application. Courts participating in the demonstration of the system reported the importance of starting off small and focusing on one or two measures to become familiar with the process before tackling some of the more complex and lengthy measures. The demonstration courts also suggested undertaking both short- and long-term data collection efforts to maintain interest in the project during the longer data collection periods of some measures. The results from the short-term measures provide court staff with "small wins" that will help clarify the benefits of the process as well as motivate continuing long-range efforts. In addition, regular status reports of the data collection effort help those involved to stay focused and remind them of the importance of their efforts. Step 6: Make Improvements Strategic change--getting from here to there-- requires good thinking, planning, and, more important, doing. Doing means actually making improvements based on the results of the first five steps. This sixth step involves performing tasks to accomplish the court's objectives, goals, mission, and vision. At best, the tasks necessary for improvement become self-evident as a result of performance measurement. 
(For example, problems associated with the integrity of the court's document and file management systems may come to light as a result of undertaking the measurements for Standard 3.6, Production and Preservation of Records.) If so, the problems are likely to be amenable to relatively mechanical fixes. These improvements need not be controversial nor take a long time to implement. More often than not, however, formulating an improvement strategy is an exercise in entrepreneurship, requiring the coordinated effort of the Guiding Coalition. During its strategy formulation, the coalition should consider the following questions:[4] o Is it suitable? Does the strategy make sense in light of the court's vision, mission, and current environment? o Is it valid? Is the strategy based on realistic assumptions? o Is it feasible? Does the court have the authority, resources, and confidence required to implement the strategy? o What are the strategy's vulnerabilities? What are the risks of implementing the strategy? How likely is the strategy to work as intended? Will anyone in the organization be harmed by the strategy? If so, what compensatory steps will be taken? o What are the timing requirements? When must the court see tangible benefits? Are there "windows of opportunity" to consider in implementing the strategy? o How adaptable is the strategy? Does the strategy significantly limit or enhance the court's flexibility to fulfill its mission? o Is the strategy usable? Can the court readily implement the strategy? Step 7: Communicate Progress Making meaningful and sustainable changes in an organization is not a simple task. Courts engaged in implementation of the measurement system are analogous to companies trying to "remake themselves into significantly better competitors . . . . The change process goes through a series of phases that, in total, usually require a considerable length of time."[5] When it becomes clear to people that major change will take a long time, organizational inertia reasserts itself and urgency levels are likely to drop. New energy in the form of regular feedback about the process and preliminary results will help overcome the inertia. The planning process also should include some relatively short measures that confirm hypotheses about positive court performance, as well as hypotheses or suspicions about problem areas. The Guiding Coalition must nurture the conviction that new information itself is a win, regardless of whether the information confirms positive hypotheses or reveals areas in need of improvement. Reviewing and communicating the results of the measurement process will help sustain the urgency level and spur analytical thinking to clarify the court's vision. Conclusion "Local level changes in programs, processes, and recording of data will be made as a result of the knowledge gained." "Early on it became evident that the standards should not be used on a `start and stop' basis, but rather need to be incorporated into the routine fabric of all work performed. Through continued utilization, emerging work trends could be tracked and impact of change could be measured." "Participation in this project has provided us with valuable insight as to how trial courts can more effectively improve judicial service to the public." "The court benefited from the project and encourages other courts to become involved. The project is not an easy task, but the results can be beneficial to the community and county the court system serves." 
These comments, offered by participants in the demonstration phase of the Standards Project, underscore the benefits of implementing the measurement system for both the court and the community. As a result of their participation in the program, demonstration courts have already initiated improvement efforts in a variety of areas such as court security, court interpretation, personnel practices and decisions, jury systems, public education, and court access. These improvement efforts directly respond to current demands for increased accountability throughout government and will help ensure a more responsive justice system overall. Until recently, court reform focused on the structures and machinery of the courts, rather than on their performance, and on the needs of judges and court personnel, rather than on the needs of those served by the courts. The measurement system shifts the focus of reform from resources (e.g., the number of judges and trained staff to efficiently operate a court) and processes (e.g., alternatives to formal dispute resolution and master versus individual calendaring) to performance and its measurement. The measurement system provides a new framework for understanding the unique and important role and responsibility of the courts in State and local governance. 2. T.J. Kiely, "Managing Change: Why Reengineering Projects Fail," Harvard Business Review 73(2)(1995), p. 15. 3. P. Ellickson and J. Petersilia, Implementing New Ideas in Criminal Justice, R-2929-NIJ (Santa Monica, CA: RAND Corporation, 1983), p. 72. 4. Adapted from B.J. Wagenknecht-Ivey, An Approach to Long-Range Strategic Planning for the Courts: Training Guide (Denver, CO: Center for Public Policy Studies, 1992), p. 10. 5. J.P. Kotter, "Why Transformation Efforts Fail," Harvard Business Review, 73(2)(1995), p. 59. ------------------------------ Bibliography American Bar Association. Standards Relating to Juror Use and Management. Chicago, IL: American Bar Association. 1983. Belasco, J.A., and R.C. Stayer. Flight of the Buffalo: Soaring to Excellence, Learning To Let Employees Lead. New York: Warner Books. 1993. Blankenship, M.B., J.B. Spargar, and W.R. Janikowski. "Accountability v. Independence: Myths of Judicial Selection." Criminal Justice Policy Review 6(1)(1992), pp. 69-79. Bureau of the Census. Statistical Abstract of the United States, 1988. Washington, DC: U.S. Department of Commerce. 1989. Bureau of Justice Assistance. Planning Guide for Using the Trial Court Performance Standards and Measurement System. Washington, DC: U.S. Department of Justice. 1997. Bureau of Justice Assistance. Trial Court Performance Standards With Commentary. Washington, DC: U.S. Department of Justice. 1997. Bureau of Justice Assistance. Trial Court Performance Standards and Measurement System (Program Brief). Washington, DC: U.S. Department of Justice. 1997. Bureau of Justice Assistance. Trial Court Performance Standards and Measurement System Implementation Manual. Washington, DC: U.S. Department of Justice. 1997. Chapper, J., and R. Hanson. Three Papers on Understanding Reversible Error in Criminal Appeals. Williamsburg, VA: National Center for State Courts. 1979. Citizens' Commission to Improve Michigan Courts. Final Report and Recommendations to Improve the Efficiency and Responsiveness of Michigan Courts. Lansing, MI: Michigan Supreme Court. 1986. Clynch, E., and D.W. Neubauer. "Trial Courts as Organizations: A Critique and Synthesis." 
In Administration and Management of Criminal Justice Organizations: A Book of Readings, Stan Stokjovic et al. (eds.) 2nd ed. Prospect Heights, IL: Waveland Press, Inc. 1994. Cooper, C.S. Expedited Drug Case Management. Washington, DC: U.S. Department of Justice. Office of Justice Programs. Bureau of Justice Assistance. 1994. Covey, S.R. Principle-Centered Leadership. New York: Summit Books. 1991. Duren v. Missouri, 439 U.S. 357 (1979). Ellickson, P., and J. Petersilia. Implementing New Ideas in Criminal Justice (R-2929-NIJ). Santa Monica, CA: RAND Corporation. 1983. Flemming, R., P. Nardulli, and J. Eisenstein. "The Timing of Justice in Felony Trial Courts." Law and Policy 9(2)(April 1987), pp. 179-206. Gallas, G., and E.C. Gallas. "Court Management Past, Present, and Future: A Comment on Lawson and Howard." Justice System Journal 15(2)(1991), pp. 605-616. GMA Research Corporation. Washington State Judicial Survey. Olympia, WA: Office of the Administrator for the Courts, State of Washington. 1988. Goerdt, J.A., et al. Examining Court Delay: The Pace of Litigation in 26 Urban Trial Courts, 1987. Williamsburg, VA: National Center for State Courts. 1989. Goerdt, J.A., C. Lomvardias, and G. Gallas. Reexamining the Pace of Litigation in 39 Urban Trial Courts. Washington, DC: U.S. Department of Justice. Bureau of Justice Assistance. 1991. Goldkamp, J., and M. Gottfredson. Guidelines for the Pretrial Release Decision: Superior Court of Arizona, Maricopa County; Circuit and County Courts, Dade County; Boston Municipal Court; and Suffolk County Superior Court. Bail Guidelines Project. Philadelphia, PA: Temple University. 1985. Goodman, M.L. "Effective Case Monitoring and Timely Dispositions: The Experience of One California Court." Judicature 76(5)(February-March 1993), pp. 254-257. Gray, E.B. "Day in the Life of a Multi-Door Courthouse." Negotiation Journal 9(3)(July 1993), pp. 215-221. Hardenbergh, D. "Planning and Design Considerations for Trial Courtrooms." State Court Journal 14(4)(Fall 1990), pp. 32-38. Headley-Edwards, N., and D.A. Ryan. Comprehensive Adjudication of Drug Arrestees (CADA) Project, 1988-1990. San Jose, CA: Santa Clara County Office of the County Executive. 1990. Herbert, A., and R. Colton. Tables for Statisticians. New York: Barnes and Noble. 1963. Hewitt, W. Court Interpretation: Model Guides for Policy and Practice in the State Courts. Williamsburg, VA: National Center for State Courts. 1995. Jacoby, J.E. "Expedited Drug Case Management Programs: Some Lessons in Case Management Reform." Justice System Journal 17(1)(1994), pp. 19-40. Jacoby, J.E., E.C. Ratledge, and H.P. Gramckow. Expedited Drug Case Management Programs: Issues for Program Development, Executive Summary. Washington, DC: U.S. Department of Justice. National Institute of Justice. 1992. Johnson, S.S., and P. Yerawadekar. "Courthouse Security." Court Management Journal 3(1981), pp. 8-12. Kairys, D., J.B. Kadan, and J.P. Lehoczky. "Jury Representation, A Mandate for Multiple Source Lists." California Law Review 65(1977), pp. 776-827. Kiely, T.J. "Managing Change: Why Reengineering Projects Fail." Harvard Business Review 73(2)(1995), p. 15. King County Department of Public Safety. King County Department of Public Safety 1989 Annual Report. Seattle, WA. 1990. Knuth, D.J. The Art of Computer Programming, Vol. 2, Semi-Numerical Algorithms. Reading, MA: Addison- Wesley Publishing Company. 1969. Kotter, J.P. "Why Transformation Efforts Fail." Harvard Business Review 73(2)(1995), pp. 59-67. Krueger, R.A. 
Focus Groups: A Practical Guide for Applied Research. Beverly Hills, CA: Sage Publications. 1988. Luskin, M., and R. Luskin. "Why So Fast, Why So Slow: Explaining Case Processing Time." Journal of Criminal Law and Criminology 77(1)(Spring 1986), pp. 190-214. MacCoun, R.J., and T.R. Tyler. "Basis of Citizens' Perceptions of the Criminal Jury: Procedural Fairness, Accuracy, and Efficiency." Law and Human Behavior 12(3)(September 1988), pp. 333-352. Maddi, D. Judicial Performance Polls. Chicago: American Bar Foundation. 1977. Mahoney, B., et al. Changing Times in Trial Courts: Caseflow Management and Delay Reduction in Urban Trial Courts. Williamsburg, VA: National Center for State Courts. 1988. Martin, J.A. Approach to Long-Range Strategic Planning for the Courts. Alexandria, VA: State Justice Institute. 1992. Menaster, Spooner, and Greenberg. "Getting a Fair Cross-Section of the Community." Forum (1989), pp. 14-21. Moore, C. Group Techniques for Idea Building. Applied Social Research Methods Series, Vol. 9. Beverly Hills, CA: Sage Publications. 1987. Morgan, D. Focus Groups as Qualitative Research. Beverly Hills, CA: Sage Publications. 1988. Munsterman, G.T., and J.T. Munsterman. "The Search for Jury Representativeness." Justice System Journal 11(1986), pp. 59-78. Nagel, I. "The Legal/Extra-Legal Controversy: Judicial Decisions in Pretrial Release." Law and Society Review 17(1983), pp. 481-515. National Center for State Courts. The Americans with Disabilities Act: Title II Self-Evaluation. Williamsburg, VA. 1992. National Center for State Courts. Methodology Manual for Jury Systems. Williamsburg, VA. 1981. National Center for State Courts. A Supplement to the Methodology Manual for Jury Systems: Relationships to the Standards Relating to Juror Use and Management. Williamsburg, VA. 1987. National Institute of Law Enforcement and Criminal Justice. Multiple Lists for Juror Selection: A Case Study for San Diego Superior Court. Washington, DC: U.S. Department of Justice. Law Enforcement Assistance Administration. 1978. National Sheriffs' Association. Court Security: A Manual of Guidelines and Procedures. Washington, DC: U.S. Department of Justice. Law Enforcement Assistance Administration. 1978. Osborne, D., and T. Gaebler. Reinventing Government: How the Entrepreneurial Spirit Is Transforming the Public Sector. Reading, MA: Addison-Wesley Publishing Company. 1992. Philip, C. How Bar Associations Evaluate Sitting Judges. New York: Institute for Judicial Administration. 1976. Press-Enterprise Co. v. Superior Court of California, 464 U.S. 501 (1984). Press-Enterprise Co. v. Superior Court of California for Riverside, 478 U.S. 106 (1986). Schultz, W.L., C. Bezold, and B.P. Monahan. Reinventing Courts for the 21st Century: Designing a Vision Process. Williamsburg, VA: National Center for State Courts. 1993. Sponzo, M.J. "Independence vs. Accountability: Connecticut's Judicial Evaluation Program." Judge's Journal 26(2)(Spring 1987), pp. 13-17. Task Force on Principles for Assessing the Adequacy of Judicial Resources. Assessing the Need for Judicial Resources: Guidelines for a New Process. Williamsburg, VA: National Center for State Courts. 1983. Taylor v. Louisiana, 419 U.S. 526 (1975). Tyler, T. "What Is Procedural Justice? Criteria Used by Citizens to Assess the Fairness of Legal Procedures." Law and Society Review 22(1988), pp. 103-139. U.S. Congress. Senate. Committee on the Judiciary. Juvenile Courts: Access to Justice: Hearing Before the Subcommittee on Juvenile Justice. 
102d Cong., 2d sess., March 4, 1992. U.S. Department of Health and Human Services. Final Report on the Validation and Effectiveness Study of Legal Representation Through Guardian Ad Litem. Washington, DC. 1994. Wagenknecht-Ivey, B.J. An Approach to Long-Range Strategic Planning for the Courts: Training Guide. Denver, CO: Center for Public Policy Studies. 1992. Williams, R.J. "Envisioning the Courts: Old Myths or New Realities?" The Court Manager 9(4)(1994), p. 45. Yankelovich, Skelly, and White, Inc. The Public Image of Courts: Highlights of a National Survey of the General Public, Judges, Lawyers, and Community Leaders. Williamsburg, VA: National Center for State Courts. 1978. Yin, R. Case Study Research Design and Methods. Beverly Hills, CA: Sage Publications. 1984.

------------------------------

Sources for Further Information

For further information about the Trial Court Performance Standards and Measurement System, contact:

Bureau of Justice Assistance
Adjudication Branch
810 Seventh Street NW.
Washington, DC 20531
202-514-5943
World Wide Web: http://www.ojp.usdoj.gov/BJA

Bureau of Justice Assistance Clearinghouse
P.O. Box 6000
Rockville, MD 20849-6000
Tel: 1-800-688-4252
Fax: 301-519-5212
World Wide Web: http://www.ncjrs.org

Department of Justice Response Center
Tel: 1-800-421-6770

National Center for State Courts
300 Newport Avenue
Williamsburg, VA 23185
Tel: 757-253-2000
Fax: 757-220-0449
World Wide Web: http://www.ncsc.dni.us

------------------------------

Appendix D
Court Performance Inventory

The Court Performance Inventory is intended to familiarize individuals with the Trial Court Performance Standards and Measurement System developed by the Commission on Trial Court Performance Standards, the National Center for State Courts, and the Bureau of Justice Assistance.* The inventory is designed to collect individual perceptions (that may be pooled with the perceptions of others) about specific trial court performance. Results of the inventory graphed onto the Court Performance Profile at the end of this appendix can be used to target further diagnostic efforts and management strategies.

DIRECTIONS: Read each of the attached 50 statements. Based on your experiences with your court or another court of interest to you, determine whether you believe the statement is BROADLY TRUE or BROADLY FALSE about the way the court performs. If you believe the statement is BROADLY TRUE, place an "X" in the corresponding numbered box on the Court Performance Inventory Response Form that follows the last statement. For example, if you believe the first statement to be generally true about the court you are rating, place an "X" in the box with the "1" on the response form. Once you have completed the response form, add the number of boxes marked as BROADLY TRUE for each of the five columns. Each column refers to one of the performance areas. Use the five sums to complete the Court Performance Profile.

1. An observer sitting in any courtroom's public seating area will have no difficulty hearing judges, attorneys, litigants, witnesses, and other participants in the proceedings. 2. The total elapsed time it takes the court to dispose of cases once they are filed complies with national standards. 3. The final juror pools used by the court are representative of the demographic characteristics of the jurisdiction's population. 4. The allocation of personnel resources among case types is defensible and based upon logic and reason. 5.
The general public (a) perceives the trial court and the justice it delivers as accessible; (b) has trust and confidence that basic trial functions are conducted expeditiously and fairly and that the court's decisions have integrity; and (c) knows that the trial court is independent, accountable, and not unduly influenced by other government components. 6. An undercover law enforcement official, dressed in plain clothes, will not be able to breach either the court's security systems that protect the public or confidential court files and records. 7. An examination of court financial records will reveal that the various types of funds for which the court is responsible are disbursed in a timely manner. 8. An examination of relevant case file documents and court records will reveal that the court closely adheres to key legal requirements. 9. An analysis of the court as an equal opportunity employer would reveal that race and gender distribution for each job category is generally reflective of the available labor pool for each category. 10. Justice system representatives (a) perceive the trial court and the justice it delivers as accessible; (b) have trust and confidence that basic trial functions are conducted expeditiously and fairly and that court decisions have integrity; and (c) know that the trial court is independent, accountable, and not unduly influenced by other components of government. 11. Interpreter services provided by the court are performed by individuals with language proficiency, interpreting skill, and knowledge of professional conduct. 12. The court promptly implements changes in substantive and procedural laws that are a result of Federal and State legislation and new Federal regulations. 13. A statistical analysis conducted of data collected from closed files for bail, bond, and release on recognizance decisions will reveal that these decisions are not based on extralegal factors such as the defendant's race or gender, the judge assigned to the case, or the geographic location of the court. 14. The court does a good job in disseminating information to the public about its programs and operations. 15. Court employees (a) perceive the trial court and the justice it delivers as accessible; (b) have trust and confidence that basic trial functions are conducted expeditiously and fairly and that court decisions have integrity; and (c) know that the trial court is independent, accountable, and not unduly influenced by other components of government. 16. Observers of court proceedings are likely to find all court personnel courteous and responsive. 17. The court keeps up with its incoming caseload by disposing of as many cases as are filed each year. 18. Record examinations and statistical analyses are likely to reveal that sentencing decisions of the court are based mostly on legally relevant factors and not on extralegal factors such as the defendant's race or gender, the judge assigned to the case, or the geographic location of the court. 19. A group of knowledgeable persons both within and outside the court would conclude that the court has acted responsibly in responding to public policy issues of concern within the jurisdiction such as domestic violence, discrimination, substance abuse, or others that could have affected the fair and effective administration of justice by the court. 20. 
20. Individuals who have had contact with the court, such as litigants, jurors, witnesses, victims, or persons conducting other business with the court, consider the court's decisions and treatment of individuals to be fair and equitable.

21. The court takes measures to reduce costs and to facilitate affordable access to the judicial system for financially disadvantaged persons.

22. The court responds promptly to requests for information from the public.

23. A broad examination of appeal outcomes reflects that the trial court adheres to substantive laws and procedural requirements.

24. Selected knowledgeable individuals are likely to conclude that the trial court maintains its independence and institutional integrity while still enjoying good relations with other units of government.

25. Regular users of the court (i.e., court employees, attorneys, probation officers, and jurors) are likely to say that they are able to conduct their business with the court with relative ease and convenience.

26. A person relatively unfamiliar with the court will have no difficulty locating and entering the courtroom in which a particular hearing is taking place.

27. The court promptly implements changes in administrative procedures required by the State supreme court and the administrative office of the courts.

28. The court does a good job of communicating clearly the terms and conditions of criminal sentences.

29. The court's responses to requests for information from the media are accurate and timely.

30. A survey of regular court users, court employees, attorneys, probation officers, and jurors is likely to show that judges and other trial court personnel are courteous and responsive to the public and all others with whom they come into contact.

31. A person who attempts to determine the specific time and location of a particular court event will have no difficulty getting this information from the court by telephone.

32. Cases scheduled for trial are heard on the first scheduled trial date.

33. The court is well positioned and organized to enforce, or facilitate the enforcement of, its orders and judgments.

34. A group of knowledgeable individuals is likely to conclude that court personnel practices and decisions are fair.

35. Citizens are likely to report that access to court services is generally not hindered by the cost or complexity of procedures.

36. Persons with physical disabilities are able to conduct transactions in the court with relative ease.

37. The various services available from the court (such as indigent defense services, interpreter services, and mental health evaluations) are provided promptly.

38. A test of the timely retrieval of individual case files will reveal that the court's file control system is reliable and efficient.

39. The court has adequate statistical reporting capacity to make useful assessments of the relationship between the court's workload and the distribution of court resources.

40. Members of the bar who have appeared regularly in the court in the past year would assess the court's actions and decisions as fair and equitable.

41. Court observers are likely to find that litigants are treated with a high degree of courtesy and individual respect by judges of the court.

42. The number of pending cases exceeding national or State time standards for case processing is low.

43. The court clearly states the terms and conditions of obligations imposed as a result of the adjudication of a civil dispute.
44. Court employees and media representatives are likely to be satisfied with policies and practices for responding to media inquiries.

45. Court staff will rate highly the degree of independent control that the court exercises over its fiscal operations, personnel, and services related to case flow.

46. Indigent persons who have never tried to obtain legal assistance are likely to be able to obtain affordable legal assistance for relatively routine legal problems.

47. The court complies with established schedules for routine court reports, such as statistical reports required by the State administrative office of the courts and the Equal Employment Opportunity Commission.

48. An examination of a sample of case file data will show that most files are complete and accurate.

49. The court conducts periodic internal and external audits of its financial practices and responds to auditors' suggestions for improvement.

50. Court employee responses to structured questions about fairness in personnel practices related to employee morale and competence are likely to reflect general satisfaction.

* This instrument was devised initially by Daniel H. Straub and further developed and revised by Ingo Keilitz, former director, and the staff of the Trial Court Performance Standards Project.

------------------------------

Bureau of Justice Assistance Information

General Information

Callers may contact the U.S. Department of Justice Response Center for general information or specific needs, such as assistance in submitting grant applications and information on training. To contact the Response Center, call 1-800-421-6770 or write to 1100 Vermont Avenue NW., Washington, DC 20005.

In-Depth Information

For more in-depth information about BJA, its programs, and its funding opportunities, requesters can call the BJA Clearinghouse. The BJA Clearinghouse, a component of the National Criminal Justice Reference Service (NCJRS), shares BJA program information with State and local agencies and community groups across the country. Information specialists are available to provide reference and referral services, publication distribution, participation in and support for conferences, and other networking and outreach activities. The Clearinghouse can be reached by:

o Mail: P.O. Box 6000, Rockville, MD 20849-6000

o Visit: 2277 Research Boulevard, Rockville, MD 20850

o Telephone: 1-800-688-4252, Monday through Friday, 8:30 a.m. to 7:00 p.m. eastern time

o Fax: 301-519-5212

o Fax on Demand: 1-800-688-4252

o BJA Home Page: http://www.ojp.usdoj.gov/BJA

o NCJRS World Wide Web: http://www.ncjrs.org

o E-mail: askncjrs@ncjrs.org

o JUSTINFO Newsletter: E-mail to listproc@ncjrs.org. Leave the subject line blank. In the body of the message, type: subscribe justinfo [your name]

BJA World Wide Web Address

For a copy of this document online, as well as more information on BJA, check the BJA Home Page at http://www.ojp.usdoj.gov/BJA
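------------------------------

A Note on Tallying Inventory Responses

The directions in appendix D call for counting the BROADLY TRUE marks in each of the five columns of the response form and carrying the five sums to the Court Performance Profile. The short Python sketch below illustrates that counting step only. Because the Court Performance Inventory Response Form and the Court Performance Profile are printed forms not reproduced in this plain-text file, the statement-to-column assignment shown here is a hypothetical placeholder, as are the function and variable names; substitute the assignments printed on your own response form.

# Illustrative sketch only: tallies BROADLY TRUE marks by performance area.
# The statement-to-column mapping below is a placeholder; replace it with
# the assignments shown on the Court Performance Inventory Response Form.

# The five performance areas of the Trial Court Performance Standards.
PERFORMANCE_AREAS = [
    "Access to Justice",
    "Expedition and Timeliness",
    "Equality, Fairness, and Integrity",
    "Independence and Accountability",
    "Public Trust and Confidence",
]

# Hypothetical column assignment: statement number (1-50) -> area index (0-4).
COLUMN_OF_STATEMENT = {n: (n - 1) % 5 for n in range(1, 51)}


def tally(broadly_true):
    """Count the BROADLY TRUE marks falling in each of the five columns.

    broadly_true: iterable of statement numbers (1-50) the rater marked
    with an "X". Returns a dict mapping each performance area to its sum.
    """
    sums = {area: 0 for area in PERFORMANCE_AREAS}
    for statement in broadly_true:
        area = PERFORMANCE_AREAS[COLUMN_OF_STATEMENT[statement]]
        sums[area] += 1
    return sums


if __name__ == "__main__":
    # Example: statements a rater judged BROADLY TRUE for the court rated.
    example_marks = [1, 2, 5, 14, 17, 22, 26, 31, 42]
    for area, total in tally(example_marks).items():
        print(f"{area}: {total}")

Run as shown, the sketch prints one sum per performance area; those five sums are what the directions ask you to transfer to the Court Performance Profile.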