MENU TITLE: Planning Guide for Using the Trial Court Performance Standards and Measurement System.
Series: BJA Monograph
Published: July 1997
51 pages
96,724 bytes

Figures, charts, forms, and tables are not included in this ASCII plain-text file. To view this document in its entirety, download the Adobe Acrobat graphic file available from this Web site or order a print copy from BJA at 800-688-4252.

U.S. Department of Justice
Office of Justice Programs
Bureau of Justice Assistance

Planning Guide for Using the Trial Court Performance Standards and Measurement System

Monograph
July 1997
NCJ 161568

------------------------------

U.S. Department of Justice
Office of Justice Programs
810 Seventh Street NW.
Washington, DC 20531

Janet Reno
Attorney General, U.S. Department of Justice

John C. Dwyer
Acting Associate Attorney General

Laurie Robinson
Assistant Attorney General

Nancy E. Gist
Director, Bureau of Justice Assistance

------------------------------

U.S. Department of Justice Response Center
1-800-421-6770

------------------------------

Bureau of Justice Assistance Clearinghouse
1-800-688-4252

------------------------------

Bureau of Justice Assistance World Wide Web Home Page
http://www.ojp.usdoj.gov/BJA

------------------------------

This document was prepared by the National Center for State Courts, supported by grant number 91-DCX-0013(S-1), awarded by the Bureau of Justice Assistance, Office of Justice Programs, U.S. Department of Justice. The work of the courts participating in the demonstration of the Trial Court Performance Standards and Measurement System was supported by grants from the State Justice Institute. The opinions, findings, and conclusions or recommendations expressed in this document are those of the authors and do not necessarily represent the official position or policies of the U.S. Department of Justice.
------------------------------

The Bureau of Justice Assistance is a component of the Office of Justice Programs, which also includes the Bureau of Justice Statistics, the National Institute of Justice, the Office of Juvenile Justice and Delinquency Prevention, and the Office for Victims of Crime.

------------------------------

Foreword

This Nation's trial courts face an extraordinary challenge as we near the beginning of a new century. They must meet the needs of America's increasingly diverse people with justice systems that in too many instances have become inefficient, unaccountable, and unresponsive to the individuals they serve. The reasons for this are familiar to observers of many institutions in modern society: trial courts are overburdened and lack the means and resources to implement a process of self-evaluation and improvement based on universally accepted performance standards.

Developing a common language for describing, classifying, and measuring the performance of trial courts was the goal of an 8-year effort, the Trial Court Performance Standards Project, initiated in 1987 by the National Center for State Courts and the Bureau of Justice Assistance. The Trial Court Performance Standards and Measurement System is the result of that effort. Crafted by a commission of leading trial judges, court managers, and scholars and demonstrated successfully in trial courts across the Nation, the measurement system is a valuable resource for helping courts provide fair and efficient adjudication and disposition of cases.

The Planning Guide for Using the Trial Court Performance Standards and Measurement System describes in detail this innovative and comprehensive system. To help courts implement the system, this guide includes a discussion of key considerations before beginning the process, a prototype educational program for court staff, and an inventory to evaluate areas of performance that need improvement.
This guide is essential reading for judges, court managers, lawyers, policymakers, and others preparing to use the system in their trial courts. It is also useful to members of the public and citizen groups who would like to see their community's trial courts set and meet higher standards of responsiveness and effectiveness. It is our hope that every trial court in the Nation will use this planning guide and its companion publications to begin improving access to justice and its administration with equality, integrity, and timeliness.

Nancy E. Gist
Director

------------------------------

Preface

This planning guide is intended to be used in conjunction with the Trial Court Performance Standards and Measurement System, the product of an 8-year initiative designed to define and provide indicators of effective trial court performance. Endorsed by the Conference of Chief Justices, the Conference of State Court Administrators, and the National Association for Court Management, and incorporated into the standards of the National College of Probate Judges, the Trial Court Performance Standards are widely viewed as a blueprint for improving the administration of justice in State trial courts.

The Trial Court Performance Standards and Measurement System includes:

o A common language for the description, classification, and communication of court activities.
o A conceptual framework for understanding and improving court performance.
o A means for self-assessment, self-improvement, and accountability to the public.

The measurement system is designed for the "generic" general jurisdiction trial court. This manual helps court officials think through the issues of implementation in their respective jurisdictions.
Chapter 1 describes a model for implementing the system, Chapter 2 provides an overview of the measures and discusses key points to consider before starting the measurement process, and Chapter 3 outlines the components of a program for educating individuals in the successful implementation of the system.

Five appendixes provide additional information for those who wish to study the implementation of trial court performance standards further. Appendix A is a bibliography applicable to all four trial court performance documents. Appendix B lists sources for further information about the Trial Court Performance Standards and Measurement System. Appendix C presents a tabular overview of the Trial Court Performance Standards and Measurement System. Appendix D presents a prototype educational program on the Trial Court Performance Standards and Measurement System. Appendix E presents the Court Performance Inventory, a survey instrument designed to collect individual perceptions about trial court performance. Complementing the survey instrument are the Court Performance Inventory Response Form and the Court Performance Profile, tools for tabulating survey responses and presenting those responses graphically.

The three major products of this project are this planning guide; the Trial Court Performance Standards With Commentary; and the Trial Court Performance Standards and Measurement System Implementation Manual. In addition, the Bureau of Justice Assistance has developed a brief overview of the system: Trial Court Performance Standards and Measurement System (Program Brief).

------------------------------

Acknowledgments

The Bureau of Justice Assistance would like to thank the Commission on Trial Court Performance Standards for its dedication and vision in guiding the Trial Court Performance Standards to their fruition. Commission members include the following individuals:

Honorable Robert C. Murphy, Chair; Chief Judge (Retired), Court of Appeals of Maryland, Towson, Maryland
Honorable Rebecca A. Albrecht, Associate Presiding Judge, Superior Court of Arizona, Maricopa County, Phoenix, Arizona
Robert N. Baldwin, State Court Administrator, Supreme Court of Virginia, Richmond, Virginia
Carl F. Bianchi, Director of Legislative Services, Legislative Services Office, Boise, Idaho
Honorable Robert C. Broomfield, Chief Judge, U.S. District Court, District of Arizona, Phoenix, Arizona
John A. Clarke, Executive Officer/Clerk, Los Angeles Superior Court, Los Angeles, California
Judith A. Cramer, Manager, Weed & Seed Neighborhood Revitalization Program, Tampa, Florida
Honorable Anne L. Ellington, Assistant Presiding Judge, King County Superior Court, Seattle, Washington
Howard Hanson, County Clerk/Superior Court Administrator, Marin County, San Rafael, California
Robert D. Lipscher, Administrative Director (Retired), New Jersey Administrative Office of the Courts, Trenton, New Jersey
Edward B. McConnell (ex officio), President Emeritus, National Center for State Courts, Williamsburg, Virginia
Doris Marie Provine, Chair, Department of Political Science, Syracuse University, Syracuse, New York
Honorable Henry Ramsey, Jr., Dean, Howard University School of Law, Washington, D.C.
Honorable Leo M. Spellacy, Judge, Ohio Court of Appeals, Eighth District, Cleveland, Ohio
Whitfield Smith (1987-1991), Clerk of Court (Former), Superior Court for DeKalb County, Georgia, Decatur, Georgia
Honorable Fred B. Ugast, Chief Judge (Retired), Superior Court of the District of Columbia, Washington, D.C.

Bureau of Justice Assistance, U.S. Department of Justice
Marilyn Nejelski, Program Manager
Charles Hollis, Chief, Adjudication Branch

National Center for State Courts
Sally T. Hillsman, Vice President (1992-1995)
Geoff Gallas, Vice President (1987-1991)

Trial Court Performance Standards Project Staff
Pamela Casey, Director
Ingo Keilitz,* Director
Hillery Efkeman
Margaret Fonner
John Goerdt
Thomas Hafemeister*
Roger Hanson
William Hewitt
Brenda Jones*
Susan Keilitz
Fred Miller*
Beatrice Monahan*
Pamela Petrakis*
David Rottman*

*denotes former project staff

Numerous individuals at the National Center for State Courts and elsewhere gave generously of their time to assist the development and initial testing of the Trial Court Performance Standards and Measurement System, including Stevalynn Adams, David Aday, Carl Baar, Kent Batty, Richard Berk, Chuck Campbell, Joy Chapper, George Cole, Hank Daley, Michael Dann, Tom Dibble, Chris Duncan, Bill Fishback, Gene Flango, Sandy Garcia, Debbie Gause, George Gish, Gordon Griller, Mary Hogan, Cindy Huffman, Michael Jeans, Lynn Jordaans, Carl Kessler, Kay Knapp, Gerald Kuban, Monica Lee, Chris Lomvardias, Kay Loveland, Jennifer Rae Lovko, Robert Lowe, James Lynch, Barry Mahoney, Mary McCall, Craig McEwen, Jan Michaels, Barbara Meierhoefer, Tom Munsterman, Raymond Nimmer, Jessica Pearson, Mike Planet, Maryann Rondeau, Jane Raynes, Teresa Risi, Dalton Roberson, Ronald Rosenberg, Jeffrey Roth, Fred Rusillo, Hisako Sayers, Bob Tobin, Anne Walker, Patricia Wall, Steven Wasby, Joan White, Matt Williams, and Robert Williams.

We also are indebted to the many individuals in the 4 pilot States and the 12 courts for the countless hours and invaluable insights they contributed during the project's 4-year demonstration phase. Their activities were supported, in part, by funding from the State Justice Institute (SJI). BJA, the Commission, and the NCSC gratefully acknowledge the SJI's commitment to the project and the feedback and insights the "demonstration" courts offered regarding the measurement process. Their help has been invaluable.
State Justice Institute
Richard Van Duizend, Deputy Director

New Jersey
Administrative Office of the Courts
Robert D. Lipscher, Director (Retired)
Theodore Fetter, Deputy Director and Project Manager
Demonstration Courts:
Atlantic County Superior Court
Burlington County Superior Court
Morris County Superior Court
Ocean County Superior Court
Somerset County Superior Court

Ohio
Supreme Court of Ohio
Stephan W. Stover, Administrative Director
Ruth Ann Elmer, Former Associate Director for Court Services and Project Manager
Demonstration Courts:
Meigs County Court of Common Pleas
Stark County Court of Common Pleas
Wayne County Court of Common Pleas

Virginia
Office of the Administrator for the Courts
Robert N. Baldwin, Executive Secretary
Beatrice P. Monahan, Senior Planning Analyst and Project Manager
Demonstration Court:
Fairfax County Circuit Court

Washington
Office of the Administrator for the Courts
Mary Campbell McQueen, Administrator for the Courts
Yvonne Pettus, Manager of Court Services and Project Manager
Demonstration Courts:
Spokane County Superior Court
Thurston County Superior Court
Whatcom County Superior Court

------------------------------

Contents

Chapter 1 An Overview of the Implementation Process
Step 1: Create an Impetus for Change
Step 2: Form a Guiding Coalition
Step 3: Create and Communicate a Vision and a Mission
Step 4: Select Standards Relevant to the Vision and Mission
Step 5: Conduct Measures
Step 6: Make Improvements
Step 7: Communicate Progress

Chapter 2 Key Considerations Before Starting the Measurement Process
Overview of Measures
Who Will Oversee the Measurement Process?
How Should the Measures Be Adapted for a Specific Court?
How Will the Results Be Used?
Chapter 3 Educational Program on Trial Court Performance Standards
Component 1: Reviewing the Trial Court Performance Standards and Measurement System
Component 2: Using the Trial Court Performance Standards and Measurement System To Plan, Organize, Manage, and Lead
Component 3: Using the Measurement System To Measure Performance
Component 4: Using the Trial Court Performance Standards and Measurement System To Formulate, Implement, and Evaluate Strategies

Appendix A Bibliography
Appendix B Sources for Further Information
Appendix C Overview of the Trial Court Performance Standards and Measurement System
Appendix D Prototype Educational Program on the Trial Court Performance Standards and Measurement System
Appendix E Court Performance Inventory

------------------------------

Chapter 1 An Overview of the Implementation Process

As noted in the Preface, the Trial Court Performance Standards and Measurement System is crafted for the "generic" general jurisdiction trial court. But the term "generic" rarely applies to real institutions. Therefore, how the system is applied in an actual court depends on both the needs of the court and the environment in which the court operates. For one court, the application of the system might involve selecting and conducting one or two measures that address a particular area of concern. For another court, the application might involve articulating an overall strategic plan in which the system plays a central role. Both of these approaches are incorporated into the implementation model depicted in Figure 1. One court might begin at the fourth step and conclude with the sixth step. A second court might begin at the first step, follow the process through completion of the seventh step, and then repeat the process using information gained during the first cycle to refine the court's strategic plan further.
These differing approaches, both of which are legitimate uses of the system, should be kept in mind as the implementation model is discussed in the next sections. Although the model in Figure 1 is presented in a linear manner, it is inherently flexible. The steps at which a court begins and ends the implementation process are left entirely to its discretion.

Step 1: Create an Impetus for Change

The standards and measurement system provides a court with the tools for identifying areas in need of improvement, but having the tools and using them often are two different things. The first step involves identifying the reasons for undertaking self-improvement and generating the momentum to initiate and sustain the effort. J.P. Kotter refers to this process as establishing a "sense of urgency."[1] This sense of urgency is achieved by identifying great opportunities, or potential or actual crises, and then communicating them to the individuals who will direct and participate in the improvement effort.

Although some challenges facing courts vary by jurisdiction (for example, changes in funding base or legislative mandates), many are common to all courts. For example, Judge R.J. Williams notes that the growing use of private alternatives to dispute resolution could be viewed as a major threat to the current operation of the court system. "If we care about equality and justice," he writes, "then we cannot afford a two-tiered justice system (private dispute resolution for those who can afford it and public for those who cannot)."[2] On the other hand, Judge Williams also notes a potential opportunity for courts: "As we face a future of rapid change and increasing societal diversity, public institutions like the courts are a source through which we can experience a sense of national community. . . . 
But to foster this sense of national community requires that our institutions be relevant and hold the trust and confidence of all in our society."[3]

It does not matter whether the court uses a local or national issue to drive its self-improvement process. What matters is that the identified issue has meaning for and elicits a reaction from the individuals who will be involved in the process.

Step 2: Form a Guiding Coalition

Sustaining a self-improvement effort requires an implementation team that can "command respect and resources" and maintain the energy to keep the project going when resistance is encountered.[4] Typically, individuals possessing these characteristics are easy to identify within any organization. J.P. Kotter refers to this implementation team as a "Guiding Coalition," a phrase adopted in this guide.[5]

The Guiding Coalition's core should consist of committed, senior court officials who are willing to invest a substantial amount of time and energy in the project. Although these senior persons need not always be the individuals who occupy the positions of highest authority or power, the court's top management should be in agreement at least philosophically with the effort. Team size will depend on the size of the court. The coalition should not necessarily be a task force in which a representative from every potentially relevant group is invited to participate. Involving too many individuals can cause the planning process to supersede the purpose, which is to embark on a self-improvement process.[6] Representatives from various groups needed for or affected by the self-improvement process are mentioned in specific performance measures and their cooperation is solicited along the way, but these representatives need not all be permanent members of the Guiding Coalition.
In fact, representatives from internal departments and divisions and key constituent groups (for example, the bar or the general public) should be included only if they are willing to make a commitment to the project. Persons directly involved in implementing the measurement system should be assembled after the measures are chosen so that the correct expertise can be tapped for each measure.

Step 3: Create and Communicate a Vision and a Mission

Successful organizations have a clear sense of direction, a unity of purpose, and a workable plan of action. People must have a guiding reason for acting. A vision provides that reason, because it grows out of the shared values of the organization's members. In a work prepared specifically for court organizations, Reinventing Courts for the 21st Century: Designing a Vision Process, Schultz, Bezold, and Monahan describe what a vision is:

A vision is a compelling statement of the preferred future that those who develop and subscribe to it want to create. Visions are "futures for the heart"--they touch and move us.[7]

In describing a vision as a "future for the heart," the authors are contrasting it with the analytical elements of planning, which are called "futures of the head." Planning includes assessing current reality, which is what measuring trial court performance accomplishes, and strategizing, the process of rationally contemplating how current reality can be transformed into a preferred future. A clear vision communicated to the individuals participating in the court's self-improvement plan will sustain the effort during the mundane and less inspiring days of data collection and analysis.

J.A. Belasco and R.C. Stayer emphasize a customer orientation in their description of an organizational vision. A customer orientation is familiar in the context of private enterprise, and it is emerging as similarly important to government organizations.

Vision is the focus of action. It is a critical leadership task. 
It doesn't take a 'special person.' Vision is a statement of what your customers tell you your organization must be. It is a simple-to-understand, inspirational, focusing statement. It establishes the framework for all other leadership activities. Then lots of actions to live and support the vision must follow. Vision is much more 'down and dirty doing' than fancy plans and words.[8]

If a court does not already have a vision statement, the standards serve as a good resource for the development of an initial version. Several themes are interwoven throughout the standards: the court as service provider (customer orientation), problem solver, institution builder, and proactive governing agent in the community. The foundation of the standards and measurement system is the theme of the court as an organization that is accountable for its performance.

The Guiding Coalition should draft a vision statement that builds on the shared values of those who will be involved in the court's self-improvement efforts and that clarifies the direction of these efforts. The vision may be a bit blurry at first and will probably evolve over several iterations before the coalition is satisfied with it. In fact, as the implementation model suggests, feedback from court users as a result of the measurement process also may help the court clarify its vision (see Figure 1).

Whereas a vision statement describes the court's desired state or position in the future, a mission statement describes a court's fundamental purpose. Stephen Covey refers to an organization's mission statement as a "corporate constitution" by which everything else in the organization is governed.[9] The five performance areas identified by the Commission--Access to Justice; Expedition and Timeliness; Equality, Fairness, and Integrity; Independence and Accountability; and Public Trust and Confidence--offer a starting point for developing and refining the court's mission statement.
The standards, crafted by a diverse and sophisticated group of members of the national court community, in fact represent a version of a "national vision" for courts. Any one or all of the fundamental responsibilities articulated in the standards could become the focus of a court's mission statement, as illustrated by the mission statement developed by Judge R.J. Williams:

I believe that the mission of the courts is to serve justice by providing a comprehensive public dispute resolution service, accessible to all citizens, capable of resolving disputes fairly, efficiently, and enforceably.[10]

Once vision and mission statements are drafted, they should be communicated by word and deed to everyone who comes in contact with the court--especially to those who will be involved in the court's self-improvement effort. According to Kotter, "Without credible communication, and a lot of it, the hearts and minds of the troops are never captured."[11] The coalition needs to communicate the vision and mission often and in a variety of forms--in presentations, in employee newsletters, through repeated references as part of training sessions and meetings, and in the language and behavior of court leaders.

Step 4: Select Standards Relevant to the Vision and Mission

The standards and measurement system allows a court to ground its vision and mission in measurable goals and objectives. A goal is a selected area of primary emphasis for which activity is to be completed within 2 to 5 years. As goals are achieved, they may diminish in relative significance or be replaced by more urgent goals. An objective is a key result that can be attained within a year. Generally, objectives are stated in quantifiable terms that permit ready verification of achievement. The system, consisting of 22 standards--or, in this context, goal statements--and 68 measures or objectives, should be implemented in stages.
The pursuit of a vision is a journey toward the future, and, regardless of the journey's length, it should begin with small steps. Courts participating in the demonstration of the measures, in fact, emphasized the importance of tackling the measurement system in small "chunks." Application of the measures involves staff time, money, attention to detail, and, in some instances, the use of measurement methods with which court staff are unfamiliar. Therefore, the Guiding Coalition's fourth step is to select one or more standards to focus on.

In some instances, the selection of certain standards may be obvious: the selection may have been implied in the coalition's prior discussions regarding the court's vision and mission, or a court may be reacting to a particular accusation such as excessive case processing time or unequal treatment of individuals. In other instances, however, the coalition may not be prepared to set priorities for applying measures without further discussion. In this case, the Court Performance Inventory (CPI) included in Appendix E may be helpful in identifying areas to focus on first.

The CPI is a 50-question instrument designed to document first impressions quickly from sources who are knowledgeable about court operations. The inventory asks respondents to make judgments about the court with regard to the various performance areas. The CPI can be used as a point of discussion for coalition members. What patterns emerge from responses to the questions? Is there general agreement about the areas in which the court is performing well and about those that need improvement? Do the responses vary by position--for example, do clerical staff rate access to justice lower than judges do? If so, why? The answers to these questions will help the coalition pinpoint the areas in which it should initially focus its efforts.
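The kind of tabulation described above can be sketched in a few lines of code. The sketch below is illustrative only: the roles, ratings, and 1-to-5 response scale are invented for this example, and the performance areas are the five named in the standards; the actual CPI instrument, Response Form, and Court Performance Profile in Appendix E define the real questions and tabulation format.

```python
# Hypothetical tabulation of CPI-style responses: mean rating per
# performance area, overall and broken out by respondent role, so a
# coalition can spot low-rated areas and role-to-role disagreements.
from collections import defaultdict

# Each response: (respondent role, performance area, rating 1-5,
# where 1 = "needs improvement" and 5 = "performing well").
# All values below are invented for illustration.
responses = [
    ("judge", "Access to Justice", 4),
    ("judge", "Expedition and Timeliness", 3),
    ("clerical staff", "Access to Justice", 2),
    ("clerical staff", "Expedition and Timeliness", 3),
    ("administrator", "Access to Justice", 3),
    ("administrator", "Expedition and Timeliness", 2),
]

def profile(responses):
    """Return mean rating per (area, role) pair and per area overall."""
    by_area_role = defaultdict(list)
    by_area = defaultdict(list)
    for role, area, rating in responses:
        by_area_role[(area, role)].append(rating)
        by_area[area].append(rating)
    mean = lambda xs: sum(xs) / len(xs)
    return (
        {k: mean(v) for k, v in by_area_role.items()},
        {k: mean(v) for k, v in by_area.items()},
    )

role_means, area_means = profile(responses)

# Areas with the lowest overall mean are candidates for the coalition's
# first measurement efforts; large gaps between roles (here, judges and
# clerical staff on Access to Justice) are worth discussing before choosing.
for area, m in sorted(area_means.items(), key=lambda kv: kv[1]):
    print(f"{area}: overall mean {m:.2f}")
```

In this invented data, clerical staff rate Access to Justice well below the judges, which is exactly the kind of divergence the coalition's discussion questions are meant to surface.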
Step 5: Conduct Measures

Measuring performance is what sets the Guiding Coalition's efforts apart from those of previous task forces considering the court's future. Performance is measured for exploratory reasons, to find out how the court is performing with regard to a particular standard, as well as for confirmatory reasons, to substantiate hypotheses about how it is performing. Sometimes a court may know how it is doing in a particular area, but needs systematic, empirical data to convince others of the urgency of a situation. In other instances, a court may already have data that address one or more of the performance standards of interest to the coalition. For example, if several complaints already have been lodged regarding the poor acoustics in a courtroom, there is little reason to apply Measure 1.1.3, Audibility of Participants During Open Court Proceedings, unless the court needs facts and figures to convince a reluctant funding agency. Therefore, data also may be beneficial for motivating individuals to action (step 1 in Figure 1).

Finally, a court already may have adopted a strategy for improving its performance relative to one of the standards selected during step 4 in Figure 1, and now wants to measure performance to see how well the strategy is working. This scenario is depicted in Figure 1 by the dotted line between step 4 and step 6, where measurement is skipped initially in favor of getting immediately to the task of making improvements with respect to a known problem area.

The critical importance of measuring performance is articulated by Osborne and Gaebler in their book Reinventing Government: How the Entrepreneurial Spirit Is Transforming the Public Sector:[12]

o What gets measured gets done.
o If you don't measure results, you can't tell success from failure.
o If you can't see success, you can't reward it.
o If you can't reward success, you're probably rewarding failure.
o If you can't see success, you can't learn from it.
o If you can't recognize failure, you can't correct it.
o If you can demonstrate results, you can win public support.

To lessen the uneasiness some judges and court staff may have about the measurement process, the Guiding Coalition should reinforce the importance of collecting data in helping a court reach its goals. After all, this is a self-improvement effort that should help the court become a more responsive public institution and a place in which court staff want to work. The coalition should stress that the measures focus on the court as an organization and not on particular individuals within the court. In addition, the coalition should inform those involved in the measurement process that the measures were developed and tested by individuals who work in and with courts and are not being imposed by an outside policy or funding agency.

The measurement system describes the actual measures in detail. The measures involve several different methods for collecting data and range broadly in difficulty and in resources necessary for their application. Courts participating in the demonstration of the measurement system reported the importance of starting off small by focusing on one or two measures to "get their feet wet" before tackling some of the more complex and lengthy measures. The demonstration courts also suggested undertaking both short-term and long-term data collection efforts to maintain interest in the project during the longer data collection periods of some measures. Results from the short-term measures will provide court staff with small wins that help clarify the benefits of the process and provide motivation for continuing long-range efforts. Regular status reports of the data collection effort also will help those involved to remain focused and remind them of the importance of their efforts.

Step 6: Make Improvements

Strategic change--getting from here to there--requires good thinking, planning, and, more important, doing.
"Doing" means actually making improvements based on the results of the first five steps. This sixth step involves tasks directed at accomplishing court objectives, goals, mission, or vision. For example, a court may discover through the use of measures for Standard 4.3, Personnel Practices and Decisions, that employee morale is low--much lower than the judges had realized. The court may learn that employees feel cut off from the judges and know very little about how the work they do fits into the heart of the organization's purposes (for example, the just and timely resolution of disputes and the protection of individuals from the arbitrary exercise of government power). The action step or task in response to this problem simply may be for the chief judge, or all of the judges, to "walk around" more regularly in the clerk's office and visit with the staff. This action or decision responds directly to a particular issue or problem and needs little preparation.

The action may be more complex, however, and require articulation of a new or revised policy or program. To continue the example, the policy might call for judges to meet periodically with line staff in question-and-answer sessions. The action also could include an outline of a series of interconnected policies, processes, and procedures that need to be established and followed to accomplish the court's organizational goals or objectives, thereby positioning the court for sustained success.

At best, the actions necessary for improvement become self-evident as a result of performance measurement. For example, problems associated with the integrity of the court's document and file management systems may come to light as a result of undertaking the measurements for Standard 3.6, Production and Preservation of Records. If so, the problems are likely to be amenable to relatively mechanical fixes. These improvements need not be controversial nor take a long time to implement.
More often than not, however, formulating an improvement strategy is an exercise in entrepreneurship, requiring the coordinated effort of the Guiding Coalition. The challenge for the coalition is to evaluate and select a preferred strategy from among a host of possible ones. Several criteria should be considered before the preferred improvement strategy is selected:[13]

o Is it suitable? Does the strategy make sense in light of the court's vision, mission, and current environment?
o Is it valid? Is the strategy based on realistic assumptions?
o Is it feasible? Does the court have the authority, resources, and confidence required to implement the strategy?
o What are the strategy's vulnerabilities? What are the risks of implementing the strategy? How likely is the strategy to work as intended? Will anyone in the organization be harmed by the strategy? If so, what compensatory steps will be taken?
o What are the timing requirements? When must the court see tangible benefits? Are there "windows of opportunity" to consider in implementing the strategy?
o How adaptable is the strategy? Does the strategy significantly limit or enhance the court's flexibility to fulfill its mission?
o Is the strategy usable? Can and will the court readily implement the strategy?

The Guiding Coalition will need to observe all or most of the following tenets in selecting and implementing the chosen strategy:

o Position others to act on the court's vision.
o Encourage risk taking and new ways of doing things.
o Anticipate resistance and obstacles and develop strategies for overcoming them.
o Avoid blaming and criticizing past policies and actions.
o Meet threats head-on.
o Link what the court does well with what needs improvement.
o Plan for significant short-term needs.
o Explain the link between new behaviors and success.
o Direct resources toward areas of high impact.
o Leverage gains (wins) to achieve more improvement.
o Model desired behaviors and actions.
o Reinvigorate the implementation process continually with new ideas and projects.

o Develop the means to ensure leadership development and succession.

Step 7: Communicate Progress

Making meaningful and sustainable changes in an organization is not a simple task. Courts engaged in the measurement system endeavor are analogous to companies trying to reengineer themselves for better performance. Studies of such companies have shown that "the most general lesson to be learned from the more successful cases is that the change process goes through a series of phases that, in total, usually require a considerable length of time."[14] When it becomes clear to people that major change will take a long time, organizational inertia reasserts itself, and urgency levels are likely to drop. Familiar problems easily recede into the organizational landscape and reassume the character of perennial "realities"--the problems we all acknowledge but relegate to the realm of "beyond our control." Inertia is predictable if no compensatory mechanisms are built into the process. Compensating for inertia requires new energy, which can be found in "small wins" and in feedback that ensures these wins are visible. Therefore, the planning process for the system implementation, as noted earlier, needs to incorporate some measures that are relatively short, simple, and easy to complete. The planning process also should include some relatively short measures that confirm hypotheses about good court performance as well as suspicions about problem areas. The Guiding Coalition also must nurture the conviction that new information itself is a win, regardless of whether the information confirms positive hypotheses or reveals areas in which improvement is needed. Undetected problems invariably persist. Problems that are detected through a conscious self-assessment rarely go unattended.
Commitments to produce short-term wins help keep the urgency level up and force detailed analytical thinking that can clarify or revise visions. Instead of declaring victory, leaders of successful efforts use the credibility afforded by short-term wins to tackle even bigger problems. . . . They understand that renewal efforts take not months but years.[15]

------------------------------

1. J.P. Kotter, "Why Transformation Efforts Fail," Harvard Business Review 73(2)(1995), p. 60.

2. R.J. Williams, "Envisioning the Courts: Old Myths or New Realities?" The Court Manager 9(4)(1994), p. 46.

3. R.J. Williams, "Envisioning the Courts: Old Myths or New Realities?" The Court Manager 9(4)(1994), p. 46.

4. T.J. Kiely, "Managing Change: Why Reengineering Projects Fail," Harvard Business Review 73(2)(1995), p. 15.

5. J.P. Kotter, "Why Transformation Efforts Fail," Harvard Business Review 73(2)(1995), p. 62.

6. P. Ellickson and J. Petersilia, Implementing New Ideas in Criminal Justice, R-2929-NIJ (Santa Monica, CA: RAND Corporation, 1983), p. 72.

7. W.L. Schultz, C. Bezold, and B.P. Monahan, Reinventing Courts for the 21st Century: Designing a Vision Process (Williamsburg, VA: National Center for State Courts, 1993), p. 16.

8. J.A. Belasco and R.C. Stayer, Flight of the Buffalo: Soaring to Excellence, Learning To Let Employees Lead (New York: Warner Books, 1993), p. 96.

9. S.R. Covey, Principle-Centered Leadership (New York: Summit Books, 1991), p. 165.

10. R.J. Williams, "Envisioning the Courts: Old Myths or New Realities?" The Court Manager 9(4)(1994), p. 47.

11. J.P. Kotter, "Why Transformation Efforts Fail," Harvard Business Review 73(2)(1995), p. 63.

12. D. Osborne and T. Gaebler, Reinventing Government: How the Entrepreneurial Spirit Is Transforming the Public Sector (Reading, MA: Addison-Wesley Publishing Company, 1992), pp. 146-155.

13. Adapted from B.J. Wagenknecht-Ivey, An Approach to Long-Range Strategic Planning for the Courts: Training Guide (Denver, CO: Center for Public Policy Studies, 1992), p. 10, Unit 8.

14. J.P. Kotter, "Why Transformation Efforts Fail," Harvard Business Review 73(2)(1995), p. 59.

15. J.P. Kotter, "Why Transformation Efforts Fail," Harvard Business Review 73(2)(1995), pp. 66-67.

------------------------------

Chapter 2
Key Considerations Before Starting the Measurement Process

They say that figures rule the world. I do not know if this is true, but I do know that figures tell us if it is well or poorly ruled.
- Goethe, 1814

Overview of Measures

The measurement system provides a trial court with concrete indicators of how well it is performing with regard to a particular standard. For example, three measures are associated with Standard 1.1, Public Proceedings. The first measure verifies that court proceedings that should be open to the public are in fact accessible to the public; the second examines whether an observer can identify which proceeding is under way in a courtroom; and the third determines whether an observer actually can hear what is going on in the courtroom. These three measures tell court officials whether individuals can gain entry to, identify, and hear a particular court proceeding--all of which are components of maintaining open and accessible proceedings. The measures associated with each performance standard are presented in Appendix C.

Performance in the area of Public Trust and Confidence is dependent, in large part, on the court's performance in the other four areas. Thus, as noted in Appendix C, several of the measures from other areas are applicable to this final performance area. Each of the 68 measures is described in detail in the BJA monograph Trial Court Performance Standards and Measurement System Implementation Manual. Descriptions of most of the measures are divided into three stages: planning/preparation, data collection, and data analysis and report preparation.
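Several of the measures rest on reviewing a representative sample of case files rather than the entire caseload. As an illustrative sketch only (the function, the case-numbering scheme, and the sample size of 50 are assumptions for this example, not part of the measurement system), a measurement coordinator might draw a simple random sample of cases like this:

```python
import random

def sample_case_files(case_numbers, sample_size, seed=None):
    """Draw a simple random sample of cases for a record-review measure.

    Sampling without replacement gives every case an equal chance of
    selection, which supports generalizing findings to the full caseload.
    """
    if sample_size >= len(case_numbers):
        return list(case_numbers)  # small caseload: review every file
    rng = random.Random(seed)  # a fixed seed makes the sample reproducible
    return rng.sample(case_numbers, sample_size)

# Hypothetical example: review 50 of 1,200 civil filings.
civil_cases = ["CV-1996-%05d" % n for n in range(1, 1201)]
review_sample = sample_case_files(civil_cases, 50, seed=1)
```

Recording the seed alongside the results lets the court document exactly which files were selected if the measurement is later questioned.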
Before court staff select a measure and begin to follow the steps in the measure's description, the court should consider broader issues related to the measurement process. This chapter discusses some of those issues within the context of three questions: (1) Who will oversee the measurement process? (2) How should the measures be adapted for a specific court? and (3) How will the results be used? Specific considerations will vary by the size, scope, political climate, and resources of the court jurisdiction.

Who Will Oversee the Measurement Process?

The experiences of courts demonstrating the application of the measurement system indicate that court officials should designate a measurement coordinator before beginning the measurement process. The advantages of having a measurement coordinator include the following:

o One person is responsible for ensuring that the measurement process moves forward. Due to the immediacy of regular business, court employees may be tempted to place the measures on hold until a less hectic time arrives. Unfortunately, that time rarely, if ever, arrives. The measurement coordinator ensures that the measures maintain a high profile among court staff and that problems are addressed quickly to keep the process on track.

o One person is familiar with all of the measures the court has elected to apply and may see opportunities for using court resources more efficiently and effectively during the assessment process. For example, for some measures, the preparation work (such as recruiting volunteer observers or selecting case file samples) may overlap. Depending on other considerations such as staff availability, the coordinator may decide to apply these measures simultaneously rather than sequentially to maximize court resources. Someone with a "bird's-eye view"--that is, knowledge of the entire measurement process as well as court resources--needs to consider these kinds of issues before data collection begins.
o One person is the "repository" of information regarding the application of measures. This centralization ensures that what is learned during one measure is communicated to individuals who are preparing to conduct similar measures.

o One person is familiar with the results of all the measures undertaken and will play an important role when interpreting the results of a particular measure. The coordinator may have knowledge of findings from one measure that complement or clarify findings from another.

The measurement coordinator can be a court employee, a staff person from the State's administrative office of the courts, or a consultant from a university or private firm. Each of these categories of individuals served as the coordinator for one or more courts participating in the demonstration phase of the Trial Court Performance Standards Project. Based on the combined experiences of the demonstration courts, the best candidate for the coordinator role is someone who is committed to the court's goals for improvement, has time to devote to the effort, and has the perceived authority to ensure that the process stays on track. For some courts, this individual will be found internally; for others, outside resources may be needed. Whether the court's coordinator is selected from an internal or external source, he or she will have to work closely with court staff to successfully apply the measures selected by the Guiding Coalition (see Chapter 1). As the coordinator in one demonstration court noted:

The trial court needs to do the performance measurements, not have them done. It isn't enough for a court to cooperate with the Supreme Court or administrative office and allow a measurement to be taken. The court staff needs to be involved in data collection and analysis, and the court must have an idea of what to do with results once they are generated.

How Should the Measures Be Adapted for a Specific Court?
The measurement system describes performance measures for general jurisdiction trial courts. Obviously, trial courts vary considerably in their structure and organization, jurisdiction, magnitude of caseloads, and political environment. In many instances, the generic measures will need to be modified to fit the characteristics of a specific court. For example, use of a survey of court personnel may not be appropriate for a court with a small number of employees. In such a small setting, individual responses could be identifiable in the presentation of results, and survey administrators would not be able to guarantee the confidentiality of responses. Without guaranteed confidentiality, some employees may choose not to participate, not to answer certain questions, or to give less than candid responses. In such cases, the court needs to consider alternative approaches that address the same issues articulated in the measure. Returning to the previous example, the coordinator for the measure could explore the option of in-person interviews conducted by an individual not connected with the court (for example, a university professor). The consultant could conduct the interviews, preferably in a neutral setting, and report the results to the court within the context of general themes. Innovative uses of technology, such as group decisionmaking software, also could be explored. During the demonstration of the measurement system, courts modified data collection methods and instruments for a number of reasons. Some courts did not have jurisdiction over particular issues addressed in a measure, some courts wanted to obtain more specific information on a particular issue, some courts combined measures and modified data collection instruments accordingly, and some courts substituted one measurement method for another (for example, a mailed survey as opposed to a focus group) for ease and economy. 
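The confidentiality concern in small courts can be made concrete with a simple reporting rule: suppress any breakdown of survey results for a group too small to keep individual respondents anonymous. The sketch below is illustrative only; the threshold of five respondents, the function, and the field names are assumptions for this example, not part of the measurement system.

```python
from collections import Counter

MINIMUM_CELL_SIZE = 5  # assumed threshold; a court would set its own policy


def summarize_by_group(responses, group_key, answer_key):
    """Tabulate survey answers by group, suppressing any group so small
    that publishing its results could identify individual respondents."""
    by_group = {}
    for response in responses:
        by_group.setdefault(response[group_key], []).append(response[answer_key])
    summary = {}
    for group, answers in by_group.items():
        if len(answers) < MINIMUM_CELL_SIZE:
            summary[group] = "suppressed (fewer than %d respondents)" % MINIMUM_CELL_SIZE
        else:
            summary[group] = dict(Counter(answers))  # answer -> count
    return summary


# Hypothetical data: only the clerks' office is large enough to report.
survey = ([{"unit": "clerks", "morale": "low"}] * 4
          + [{"unit": "clerks", "morale": "high"}] * 3
          + [{"unit": "probation", "morale": "low"}] * 2)
results = summarize_by_group(survey, "unit", "morale")
```

A rule like this protects respondents, but it also means a very small court may be unable to report any breakdowns at all, which is precisely when the alternatives discussed above (outside interviewers, neutral settings, thematic reporting) become attractive.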
Although the measures are somewhat flexible, they should not be modified extensively without careful consideration of the impact of the changes on the goal of the measure. For example, a data element should not be eliminated simply because it is hard to obtain. A court should consider how difficulties in obtaining the data could be overcome, as well as the consequences of not including the data in the measurement process. In other words, what reasoning would the court use to explain its measurement modifications to an independent observer? At the same time, innovative approaches to measurement should not be stifled as long as an effort has been made to balance a measure's feasibility and utility with its reliability and validity.

How Will the Results Be Used?

It is important for the Guiding Coalition to have a plan for interpreting, disseminating, and using the information from the measures once it is obtained (see Chapter 1). To some extent, the court has "a window of opportunity" during the measurement process. If court staff see the data from the measurement process being used to inform and guide the operations of the court, staff will be more likely to invest themselves in the process. If data seem to end up in a "black hole" somewhere in management, the measurement process will be compromised. Staff will lose interest, and the validity of the measures will be threatened. The coalition should inform staff ahead of time how the process will unfold and how the data will be used. For example: Will the information from several measures be collected before results are reviewed and action plans developed? Or will the results of each measure be reviewed as they become available to inform further data collection efforts?

The Guiding Coalition also should consider how it will respond if the data indicate that problems exist. The mere act of collecting data on a particular activity will raise expectations that actions will follow to remedy problems.
Therefore, if no actions are taken, the court may find its position worse than before the measurement was taken. For example, an employee survey may raise expectations that the court is interested in employee opinions. If the court subsequently fails to acknowledge any of the issues identified as a result of the data collection effort, employee morale is likely to suffer. In other words, a plan for responding should be in place before measurement begins. The actual response will depend on the nature of the findings. The plan should specify when the findings will be made available, which individuals will be told about the findings and the level of detail that will be provided to them, and whether problems will be addressed immediately or incorporated into the court's long-range planning. Once the coalition begins to discuss these questions, the answers may not be as straightforward as they first seemed. There may be political considerations regarding who gets what information when. These issues should be resolved as thoroughly as possible before proceeding with the measurement process. Although it will not be possible to resolve all problems in advance, it is important to have a vehicle for solving issues as they arise. The experiences of trial courts demonstrating the application of the measurement system also clearly suggest that the coalition should be prepared for print and broadcast media requests for court performance data. Some of the demonstration courts alerted the media that they had been selected to test the system. The advantage of this strategy is that the court begins the measurement process by giving the public and the news media a positive message: "We are interested in improving our services." The disadvantage, however, is that the court will be obliged to follow through with candor and, when problems are identified, with action. 
In some of the demonstration sites, for example, local media requested that they be kept informed regarding the results of measures. The coalition, therefore, should consider carefully the impact of involving the media in a process of self-assessment and internal management. Given, however, that individuals from the local media may hear about the measurement effort--either formally from court officials announcing the program or informally from someone participating in one of the measures--the coalition should develop a policy for responding to media requests.

------------------------------

Chapter 3
Educational Program on Trial Court Performance Standards

At some point in the process of implementing the Trial Court Performance Standards and Measurement System, court officials (or individuals at the State administrative office of the courts) will find themselves in a position of having to educate others in the court system about the benefits and uses of the system. In some court systems, particularly those that have struggled with some of the concepts associated with the system, the educational process may be accomplished primarily by "word of mouth." Telephone calls and short meetings with relevant individuals to explain the purpose of incorporating the system into the court's operation and management (perhaps by using the implementation model shown in Figure 1) may be all that is needed. In larger courts or courts that have not spent much time thinking about the issues raised by the system, a more formal approach may be needed to reach the individuals who are key to the program's successful implementation.
Accordingly, staff of the Institute for Court Management, the National Center for State Courts' education and information arm, in conjunction with staff of the Trial Court Performance Standards Project, developed a 2 1/2-day educational program on the standards and measurement system.[16] The program is divided into four basic components: (1) reviewing the Trial Court Performance Standards and Measurement System; (2) using the system to plan, organize, manage, and lead; (3) using the system to measure performance; and (4) using the system to formulate, implement, and evaluate strategies. These components, which are described in greater detail below, are related to the steps of the implementation model shown in Figure 1. The program's agenda, objectives, and visual aids are presented in Appendix D. The visual aids are suitable for adaptation as overheads.

The program is intended to help participants use the system as a strategic planning tool and to organize and mobilize their courts and court organizations to achieve their missions and strategic goals. The program tracks the steps of strategic planning and management and links those steps to the system (see Appendix D). Descriptions of the four program components are provided as a resource for courts planning their own educational activities on various aspects of the implementation model. The summary materials provided in this chapter and in Appendix D should be sufficient to allow an instructor familiar with the system to design and deliver an educational program. It is assumed, however, that the individuals conducting the program know how to organize, plan, and conduct a basic educational program. It is also assumed that they understand basic adult education principles and are familiar with group dynamics. A court may choose to focus on one or all four components, depending on how it uses the system.
For example, a court more interested in developing an overall mission (step 3 of the implementation model) before undertaking the measurement process would build its educational activities around the first two components--reviewing the system and using the system to plan, organize, manage, and lead. A court more interested in the actual measurement process would focus on information covered in the third component, using the system to measure performance. Another court might want to focus on all four components over a yearlong period, rather than all at once. Before educational activities are developed, the needs of the court must be determined. To make this point, it is helpful to draw an analogy between the measurement system, strategic management, and organizational transformation on the one hand and the tools needed to renovate a house on the other. To rebuild an entire house, remodel one room, or even add on a simple deck, a builder must first calculate the project's needs, create a blueprint, and have appropriate equipment such as hammers and saws.

Component 1: Reviewing the Trial Court Performance Standards and Measurement System

This first component includes (1) background information on the development of the standards and measurement system, (2) an overview of its philosophy and purpose, and (3) a discussion of the specific performance areas and standards. With discussion and a short break, an instructor needs approximately 3 hours to cover the material. If participants have read Trial Court Performance Standards With Commentary prior to the session, the discussion can be shortened by highlighting those standards of primary interest to the court. This component also could be used as an introduction to a discussion of performance areas the court should focus on (see step 4 of the implementation model). The instructor should use the Court Performance Inventory (see Component 2 and Appendix E) to facilitate discussion.
The introductory chapters of the documents Trial Court Performance Standards With Commentary and Trial Court Performance Standards and Measurement System Implementation Manual provide the basic information included in this component by describing the development and the underlying philosophy of the system. The major teaching points of the first 1-hour unit, "Development, Philosophy, and Purpose of the Trial Court Performance Standards and Measurement System," supported by the visual aids in Appendix D, are:

o The standards and measurement system has a legitimate claim to the program participants' attention because of its development by court leaders, its demonstration in court environments, and its endorsement by court organizations.

o The standards were developed by professionals and scholars who work in and with courts. They were not developed by an external body with only passing familiarity with the issues and constraints of courts.

o The system has proven useful through an 8-year development and demonstration process.

o The system represents a philosophy of court administration, a common language, tools for self-assessment and continuous improvement, and a conceptual framework for organizational transformation.

o The system represents a major shift in the roles and responsibilities of trial courts, including a shift in emphasis from structures and processes to performance, from the needs of those who run the courts to the needs of those served by the courts, and from a view of courts as independent courtrooms to courts as organizations.

Standard 3.5, Responsibility for Enforcement, and Standard 4.5, Response to Change, are highlighted in this introductory unit to underscore that the measurement system may suggest a significant transformation of the courts from isolated, inward-looking units--concerned largely with capacity building--to active, outward-looking organizations that endeavor to use their authority and resources for the greatest public good.
The remainder of the material for this component covers the actual performance areas and standards described in Chapters 2 and 3 of Trial Court Performance Standards With Commentary.

Component 2: Using the Trial Court Performance Standards and Measurement System To Plan, Organize, Manage, and Lead

The second program component discusses how the standards and measurement system can be used as a tool for strategic planning and organizational transformation. An individual familiar with the concepts of strategic planning, vision, mission, goals, objectives, and evaluation should lead the discussion. With discussion and breaks, an instructor would need most of a day to cover the material in detail. As noted in the Program Agenda in Appendix D, this full day spans the afternoon of Day 1 and the morning of Day 2 and includes the following units: using the Trial Court Performance Standards to plan, organize, manage, and lead; using the standards to create a vision, values, and shared meaning; recap and brief review; using the standards to determine a mission; and using the standards and measurement system to set goals and objectives. Teaching points for these units are suggested in Appendix D.

The purpose of this component is to show how the various elements of the standards and measurement system are linked to good court management and strategic planning. For example, a court's mission could be developed from one or more of the five performance areas, the court's goals from specific standards, and the court's objectives from various measures related to each of the standards (see step 3 of the implementation model). This component also includes a description of the Court Performance Inventory (CPI), which is included as Appendix E. The CPI is a self-administered instrument for collecting individual perceptions about specific aspects of trial court performance.
It is intended to familiarize individuals with the principles enunciated in the system and the empirically based approach to measurement of trial court performance, and to initiate the process of setting goals and objectives based on the system. As a prelude to establishing the general direction of a court's improvement efforts, for example, aggregated court performance profiles of groups of court employees (for example, judges, court administrators, and court clerks) could serve as the basis for discussions about court improvement efforts linked to specific standards.

Component 3: Using the Measurement System To Measure Performance

The third component focuses on the application of specific measures (see step 5 of the implementation model). It includes an overview of the importance of measuring results and provides examples of the specific measures that highlight various methods (for example, interviews, observations, record reviews, and surveys), instruments (for example, checklists and questionnaires), and individuals (for example, judges, court employees, attorneys, and jurors) involved in conducting the measures.

The purpose of this component is to give individuals who will be involved in the measurement process hands-on experience with the measures. Because the measures are difficult to comprehend in the abstract, this component includes an individual and group exercise that provides participants an opportunity to read a specific measure and consider its application in the court. What issues will need to be addressed, and what resources will the court require to conduct the measure? Because of this exercise, Component 3 could take 5 hours or longer to complete, depending on whether groups prepare and present reports of their deliberations and the number of groups participating (as an example, the educational prototype outlined in Appendix D uses five groups to address the system's measurement methods).
Three separate units or modules are included in this component, two in the afternoon of Day 2 and one in the morning of Day 3 (see Program Agenda, Appendix D). The component begins with a brief presentation and discussion of the measurement system, including its development, key measurement questions, types of measures, advantages of performance measurement, and matters to consider before using the system. The instructor then asks participants to study the specific measures assigned to them (see Group Exercise in Appendix D) and to answer specific questions about the measures. Following an afternoon break, participants reconvene in five separate groups to discuss the measures (see Group Discussion of Measurement Exercise and Preparation of Group Reports in Appendix D). The following morning, a spokesperson for each group explains the measure, its benefits, resources needed for its application, and problems that may be encountered during its application. As noted on the Group Exercise form in Appendix D, the only way to become familiar with the measures is to try them. The individual and group exercises accomplish this objective.

Component 4: Using the Trial Court Performance Standards and Measurement System To Formulate, Implement, and Evaluate Strategies

What does a court do with the data it gathers as a result of the measurement process? This component focuses on step 6 of the implementation model. It includes an overview of the considerations involved in formulating strategies to address specific goals and objectives. Courts are encouraged to use the visual aids in Appendix D in conjunction with this component. Major teaching points of Component 4 are as follows:

o Although presented in sequential fashion, the steps of strategic planning in practice overlap--that is, measurement helps to identify objectives and strategies and determine success or failure in meeting those objectives.
o The scientific method or approach should be considered a part of good management, not an academic exercise.

This component is relatively short--it could be presented in an hour or less. However, once data have been collected on one or more measures, it also could be used as the introduction to a strategy-building meeting of the Guiding Coalition. Used in this way, the information could be part of a much larger module in which the coalition reviews the results of the measures, determines whether court performance is acceptable with regard to a particular standard, determines how much improvement is needed in a particular area, and suggests possible approaches or strategies for achieving the desired level of performance.

------------------------------

16. The development and delivery of this prototype educational program and materials was supported, in part, by funds from both the Bureau of Justice Assistance and the State Justice Institute. Some of the materials for the program were based on a training guide prepared for the State Justice Institute by the Center for Public Policy Studies. See B.J. Wagenknecht-Ivey, An Approach to Long-Range Strategic Planning for the Courts: Training Guide (Denver, CO: Center for Public Policy Studies, 1992).

------------------------------

Bibliography

American Bar Association. Standards Relating to Juror Use and Management. Chicago, IL: American Bar Association. 1983.

Belasco, J.A., and R.C. Stayer. Flight of the Buffalo: Soaring to Excellence, Learning To Let Employees Lead. New York: Warner Books. 1993.

Blankenship, M.B., J.B. Spargar, and W.R. Janikowski. "Accountability v. Independence: Myths of Judicial Selection." Criminal Justice Policy Review 6(1)(1992), pp. 69-79.

Bureau of the Census. Statistical Abstract of the United States, 1988. Washington, DC: U.S. Department of Commerce. 1989.

Bureau of Justice Assistance. Planning Guide for Using the Trial Court Performance Standards and Measurement System. Washington, DC: U.S.
Department of Justice. 1997.

Bureau of Justice Assistance. Trial Court Performance Standards With Commentary. Washington, DC: U.S. Department of Justice. 1997.

Bureau of Justice Assistance. Trial Court Performance Standards and Measurement System (Program Brief). Washington, DC: U.S. Department of Justice. 1997.

Bureau of Justice Assistance. Trial Court Performance Standards and Measurement System Implementation Manual. Washington, DC: U.S. Department of Justice. 1997.

Chapper, J., and R. Hanson. Three Papers on Understanding Reversible Error in Criminal Appeals. Williamsburg, VA: National Center for State Courts. 1979.

Citizens' Commission to Improve Michigan Courts. Final Report and Recommendations to Improve the Efficiency and Responsiveness of Michigan Courts. Lansing, MI: Michigan Supreme Court. 1986.

Clynch, E., and D.W. Neubauer. "Trial Courts as Organizations: A Critique and Synthesis." In Administration and Management of Criminal Justice Organizations: A Book of Readings, Stan Stojkovic et al. (eds.), 2nd ed. Prospect Heights, IL: Waveland Press, Inc. 1994.

Cooper, C.S. Expedited Drug Case Management. Washington, DC: U.S. Department of Justice, Office of Justice Programs, Bureau of Justice Assistance. 1994.

Covey, S.R. Principle-Centered Leadership. New York: Summit Books. 1991.

Duren v. Missouri, 439 U.S. 357 (1979).

Ellickson, P., and J. Petersilia. Implementing New Ideas in Criminal Justice (R-2929-NIJ). Santa Monica, CA: RAND Corporation. 1983.

Flemming, R., P. Nardulli, and J. Eisenstein. "The Timing of Justice in Felony Trial Courts." Law and Policy 9(2)(April 1987), pp. 179-206.

Gallas, G., and E.C. Gallas. "Court Management Past, Present, and Future: A Comment on Lawson and Howard." Justice System Journal 15(2)(1991), pp. 605-616.

GMA Research Corporation. Washington State Judicial Survey. Olympia, WA: Office of the Administrator for the Courts, State of Washington. 1988.

Goerdt, J.A., et al.
Examining Court Delay: The Pace of Litigation in 26 Urban Trial Courts, 1987. Williamsburg, VA: National Center for State Courts. 1989.

Goerdt, J.A., C. Lomvardias, and G. Gallas. Reexamining the Pace of Litigation in 39 Urban Trial Courts. Washington, DC: U.S. Department of Justice. Bureau of Justice Assistance. 1991.

Goldkamp, J., and M. Gottfredson. Guidelines for the Pretrial Release Decision: Superior Court of Arizona, Maricopa County; Circuit and County Courts, Dade County; Boston Municipal Court; and Suffolk County Superior Court. Bail Guidelines Project. Philadelphia, PA: Temple University. 1985.

Goodman, M.L. "Effective Case Monitoring and Timely Dispositions: The Experience of One California Court." Judicature 76(5)(February-March 1993), pp. 254-257.

Gray, E.B. "Day in the Life of a Multi-Door Courthouse." Negotiation Journal 9(3)(July 1993), pp. 215-221.

Hardenbergh, D. "Planning and Design Considerations for Trial Courtrooms." State Court Journal 14(4)(Fall 1990), pp. 32-38.

Headley-Edwards, N., and D.A. Ryan. Comprehensive Adjudication of Drug Arrestees (CADA) Project, 1988-1990. San Jose, CA: Santa Clara County Office of the County Executive. 1990.

Herbert, A., and R. Colton. Tables for Statisticians. New York: Barnes and Noble. 1963.

Hewitt, W. Court Interpretation: Model Guides for Policy and Practice in the State Courts. Williamsburg, VA: National Center for State Courts. 1995.

Jacoby, J.E. "Expedited Drug Case Management Programs: Some Lessons in Case Management Reform." Justice System Journal 17(1)(1994), pp. 19-40.

Jacoby, J.E., E.C. Ratledge, and H.P. Gramckow. Expedited Drug Case Management Programs: Issues for Program Development, Executive Summary. Washington, DC: U.S. Department of Justice. National Institute of Justice. 1992.

Johnson, S.S., and P. Yerawadekar. "Courthouse Security." Court Management Journal 3(1981), pp. 8-12.

Kairys, D., J.B. Kadan, and J.P. Lehoczky. "Jury Representation, A Mandate for Multiple Source Lists."
California Law Review 65(1977), pp. 776-827.

Kiely, T.J. "Managing Change: Why Reengineering Projects Fail." Harvard Business Review 73(2)(1995), p. 15.

King County Department of Public Safety. King County Department of Public Safety 1989 Annual Report. Seattle, WA. 1990.

Knuth, D.E. The Art of Computer Programming, Vol. 2, Seminumerical Algorithms. Reading, MA: Addison-Wesley Publishing Company. 1969.

Kotter, J.P. "Why Transformation Efforts Fail." Harvard Business Review 73(2)(1995), pp. 59-67.

Krueger, R.A. Focus Groups: A Practical Guide for Applied Research. Beverly Hills, CA: Sage Publications. 1988.

Luskin, M., and R. Luskin. "Why So Fast, Why So Slow: Explaining Case Processing Time." Journal of Criminal Law and Criminology 77(1)(Spring 1986), pp. 190-214.

MacCoun, R.J., and T.R. Tyler. "Basis of Citizens' Perceptions of the Criminal Jury: Procedural Fairness, Accuracy, and Efficiency." Law and Human Behavior 12(3)(September 1988), pp. 333-352.

Maddi, D. Judicial Performance Polls. Chicago: American Bar Foundation. 1977.

Mahoney, B., et al. Changing Times in Trial Courts: Caseflow Management and Delay Reduction in Urban Trial Courts. Williamsburg, VA: National Center for State Courts. 1988.

Martin, J.A. Approach to Long-Range Strategic Planning for the Courts. Alexandria, VA: State Justice Institute. 1992.

Menaster, Spooner, and Greenberg. "Getting a Fair Cross-Section of the Community." Forum (1989), pp. 14-21.

Moore, C. Group Techniques for Idea Building. Applied Social Research Methods Series, Vol. 9. Beverly Hills, CA: Sage Publications. 1987.

Morgan, D. Focus Groups as Qualitative Research. Beverly Hills, CA: Sage Publications. 1988.

Munsterman, G.T., and J.T. Munsterman. "The Search for Jury Representativeness." Justice System Journal 11(1986), pp. 59-78.

Nagel, I. "The Legal/Extra-Legal Controversy: Judicial Decisions in Pretrial Release." Law and Society Review 17(1983), pp. 481-515.

National Center for State Courts.
The Americans with Disabilities Act: Title II Self-Evaluation. Williamsburg, VA. 1992.

National Center for State Courts. Methodology Manual for Jury Systems. Williamsburg, VA. 1981.

National Center for State Courts. A Supplement to the Methodology Manual for Jury Systems: Relationships to the Standards Relating to Juror Use and Management. Williamsburg, VA. 1987.

National Institute of Law Enforcement and Criminal Justice. Multiple Lists for Juror Selection: A Case Study for San Diego Superior Court. Washington, DC: U.S. Department of Justice. Law Enforcement Assistance Administration. 1978.

National Sheriffs' Association. Court Security: A Manual of Guidelines and Procedures. Washington, DC: U.S. Department of Justice. Law Enforcement Assistance Administration. 1978.

Osborne, D., and T. Gaebler. Reinventing Government: How the Entrepreneurial Spirit Is Transforming the Public Sector. Reading, MA: Addison-Wesley Publishing Company. 1992.

Philip, C. How Bar Associations Evaluate Sitting Judges. New York: Institute for Judicial Administration. 1976.

Press-Enterprise Co. v. Superior Court of California, 464 U.S. 501 (1984).

Press-Enterprise Co. v. Superior Court of California for Riverside, 478 U.S. 106 (1986).

Schultz, W.L., C. Bezold, and B.P. Monahan. Reinventing Courts for the 21st Century: Designing a Vision Process. Williamsburg, VA: National Center for State Courts. 1993.

Sponzo, M.J. "Independence vs. Accountability: Connecticut's Judicial Evaluation Program." Judge's Journal 26(2)(Spring 1987), pp. 13-17.

Task Force on Principles for Assessing the Adequacy of Judicial Resources. Assessing the Need for Judicial Resources: Guidelines for a New Process. Williamsburg, VA: National Center for State Courts. 1983.

Taylor v. Louisiana, 419 U.S. 526 (1975).

Tyler, T. "What Is Procedural Justice? Criteria Used by Citizens to Assess the Fairness of Legal Procedures." Law and Society Review 22(1988), pp. 103-139.

U.S. Congress. Senate. Committee on the Judiciary.
Juvenile Courts: Access to Justice: Hearing Before the Subcommittee on Juvenile Justice. 102d Cong., 2d sess., March 4, 1992.

U.S. Department of Health and Human Services. Final Report on the Validation and Effectiveness Study of Legal Representation Through Guardian Ad Litem. Washington, DC. 1994.

Wagenknecht-Ivey, B.J. An Approach to Long-Range Strategic Planning for the Courts: Training Guide. Denver, CO: Center for Public Policy Studies. 1992.

Williams, R.J. "Envisioning the Courts: Old Myths or New Realities?" The Court Manager 9(4)(1994), p. 45.

Yankelovich, Skelly, and White, Inc. The Public Image of Courts: Highlights of a National Survey of the General Public, Judges, Lawyers, and Community Leaders. Williamsburg, VA: National Center for State Courts. 1978.

Yin, R. Case Study Research: Design and Methods. Beverly Hills, CA: Sage Publications. 1984.

------------------------------

Sources for Further Information

For further information about the Trial Court Performance Standards and Measurement System, contact:

Bureau of Justice Assistance
Adjudication Branch
810 Seventh Street NW.
Washington, DC 20531
Tel: 202-514-5943
World Wide Web: http://www.ojp.usdoj.gov/BJA

Bureau of Justice Assistance Clearinghouse
P.O. Box 6000
Rockville, MD 20849-6000
Tel: 1-800-688-4252
Fax: 301-519-5212
World Wide Web: http://www.ncjrs.org

Department of Justice Response Center
Tel: 1-800-421-6770

National Center for State Courts
300 Newport Avenue
Williamsburg, VA 23185
Tel: 757-253-2000
Fax: 757-220-0449
World Wide Web: http://www.ncsc.dni.us

------------------------------

Appendix D
Prototype Educational Program on the Trial Court Performance Standards and Measurement System

Trial Court Performance Standards and Measurement System
Guiding the Courts Into the Future

Program Agenda

Day 1
Time               Topic/Activity
8:30-9 a.m.        Registration
9-9:30 a.m.        Welcome, Introductions, and Course Overview
9:30-10:30 a.m.
Development, Philosophy, and Purpose of the Trial Court Performance Standards and Measurement System (Component 1)
10:30-10:45 a.m.   Break
10:45 a.m.-12 noon Discussion of 5 Performance Areas and 22 Standards (Component 1)
12 noon-1:30 p.m.  Lunch on Own
1:30-3 p.m.        Using the Trial Court Performance Standards and Measurement System To Plan, Organize, Manage, and Lead (Component 2)
3-3:15 p.m.        Break
3:15-5 p.m.        Using the Trial Court Performance Standards and Measurement System To Create a Vision, Values, and Shared Meaning (Component 2)
5 p.m.             Adjourn for Day 1

Day 2
Time               Topic/Activity
8:30-8:45 a.m.     Recap and Brief Review
8:45-10:30 a.m.    Using the Standards To Determine a Mission (Component 2)
10:30-10:45 a.m.   Break
10:45 a.m.-12 noon Using the Standards and Measurement System To Set Goals and Objectives (Component 2)
12-1:30 p.m.       Lunch on Own
1:30-3 p.m.        Using the Measurement System To Measure Performance--Individual Measurement Exercise (Component 3)
3-3:15 p.m.        Break
3:15-5 p.m.        Using the Measurement System To Measure Performance--Group Discussion of Measurement Exercise and Preparation for Group Reports on Morning of Day 3* (Component 3)
5 p.m.             Adjourn for Day 2

* Groups are encouraged to complete their reports on the evening of Day 2, if necessary.

Day 3
Time               Topic/Activity
8:30-8:45 a.m.     Recap and Brief Review
8:45-10:30 a.m.    Using the Standards and Measurement System To Measure Performance--Group Reports on Measurement Exercise (Component 3)
10:30-10:45 a.m.   Break
10:45-11:30 a.m.   Using the Standards and Measurement System To Formulate, Implement, and Evaluate Strategies (Component 4)
11:30 a.m.         Adjourn

------------------------------

Learning Objectives

After this course, participants will be able to use the Trial Court Performance Standards and Measurement System as a conceptual framework and as a tool for courts to accomplish the following tasks:

o Creating a mission.
o Defining court values and vision.
o Establishing goals.
o Identifying objectives and determining what results are wanted.
o Devising strategies.
o Measuring performance.

------------------------------

Group Exercise (Day 2, Component 3: Using the Measurement System To Measure Performance)

The best way to understand the measures is to use them. There are 68 measures--too many to discuss in any depth. Several types of measurement methods (e.g., simulations, surveys, interviews, record searches, and small group processes) are employed across the measures. Participants are assigned to groups, each of which discusses one of the data collection methods. Participants first review their assigned measures and then meet in their groups to discuss their initial reactions. The purpose of the group report is to describe the measurement method to the other participants.

Tomorrow afternoon, you will break into groups to discuss one of the data collection methods used in the measurement system. Each group will review a different method. In preparation for these group discussions, we ask that you read the measure(s) assigned to your group. The measures are listed below.

As you read your group's assigned measure(s), consider how you would implement the measurement method in your court system. What would you need to do at each step of the system's measurement process (planning/preparation, data collection, and data analysis and report preparation)? What resources would you need, what difficulties would you encounter, and what strategies would you use in conducting the measure(s)? Please note your initial responses to these questions in the space provided on the following pages. Tomorrow, you will have an opportunity to review your responses individually before discussing them with other members of your group.
Group 1: Record Reviews
Measure 2.1.1 Time to Disposition
Measure 2.2.4 Compliance With Reporting Schedules

Group 2: Surveys
Measure 1.2.3 Perceptions of Courthouse Security
Measure 1.2.6 Evaluation of Accessibility and Convenience by Court Users

Group 3: Observers
Measure 1.1.1 Access to Open Hearings
Measure 1.3.5 Participation by Persons With Disabilities

Group 4: Interviews
Measure 3.4.3 Experience in Interpreting Orders and Judgments
Measure 4.2.3 Evaluation of the Court's Financial Auditing Practices

Group 5: Group Techniques
Measure 4.2.1 Adequacy of Statistical Reporting Categories for Resource Allocation

1. Provide a brief description of your court (e.g., types of cases heard, size of jurisdiction, and number of judges).

2. List issues to address for the planning/preparation stage--who needs to be contacted, what resources are needed, what problems might come up, and how you would address them.

Measure Name:

Measure Name:

3. List issues to address for the data collection stage--who needs to be involved, what expertise is needed, what resources are needed, what problems might come up, and how you would address them.

Measure Name:

Measure Name:

4. List issues to address for the data analysis and report preparation stage--who can do the data analysis, what should be included in the report, who should receive the report, and how you would address potential problems.

Measure Name:

Measure Name:

Group Discussion of Measurement Exercise and Preparation of Group Reports (Day 2, Component 3: Using the Measurement System To Measure Performance)

Each group will prepare a 10- to 15-minute report for tomorrow morning addressing the four questions below. Each group should develop its own approach to presenting its report. Be as creative as you wish. Assume that your group is "selling" the measurement approach to a skeptical audience.
Your group will have 1 hour and 45 minutes (and more time this evening if your group chooses) to discuss and prepare your group report. To facilitate the report preparation, please jot down your initial responses to the following questions (based both on your own thinking and your group's deliberations), and bring this sheet with you to tomorrow's session.

1. Briefly describe the major elements of the measurement system.

2. In general, what do you see as the benefits of using this method?

3. What resources are needed to apply the method as specified in the measure(s)?

4. What problems might you encounter in applying the method as specified in the measure(s), and what strategies might you use to overcome the problems?

Group Reports on Measurement Exercise (Day 3, Component 3: Using the Measurement System To Measure Performance)

Group 1: Record Reviews
Measure 2.1.1 Time to Disposition
Measure 2.2.4 Compliance With Reporting Schedules

Group 2: Surveys
Measure 1.2.3 Perceptions of Courthouse Security
Measure 1.2.6 Evaluation of Accessibility and Convenience by Court Users

Group 3: Observers
Measure 1.1.1 Access to Open Hearings
Measure 1.3.5 Participation by Persons With Disabilities

Group 4: Interviews
Measure 3.4.3 Experience in Interpreting Orders and Judgments
Measure 4.2.3 Evaluation of the Court's Financial Auditing Practices

Group 5: Group Techniques
Measure 4.3.3 Equal Employment Opportunity

* The prototype is based on an educational program developed by the staff of the Institute for Court Management of the National Center for State Courts. The development and delivery of this prototype educational program and materials were supported, in part, by funds from both the Bureau of Justice Assistance and the State Justice Institute. Some of the materials for the program were based on a training guide prepared for the State Justice Institute by the Center for Public Policy Studies. See B.J.
Wagenknecht-Ivey, An Approach to Long-Range Strategic Planning for the Courts: Training Guide (Denver, CO: Center for Public Policy Studies, 1992).

------------------------------

Overhead Projections for Prototype Educational Program

Key to Overhead Projections for Prototype Educational Program

Exhibits 1-51 on the following pages can be converted to overhead projections for use during the educational program described on the preceding pages.*

Day 1
Development, Philosophy, and Purpose of the Trial Court Performance Standards and Measurement System (Component 1): Exhibits 1-5
Discussion of 5 Performance Areas and 22 Standards (Component 1): Exhibits 6-11
Using the Measurement System To Plan, Organize, Manage, and Lead (Component 2): Exhibits 12-21
Using the Measurement System To Create a Vision, Values, and Shared Meaning (Component 2): Exhibits 22-28

Day 2
Using the Measurement System To Determine a Mission (Component 2): Exhibits 29-32
Using the Measurement System To Set Goals and Objectives (Component 2): Exhibits 33-42

Day 3
Using the Measurement System To Formulate, Implement, and Evaluate Strategies (Component 4): Exhibits 43-51

* There are no overhead projections for Component 3 of the agenda. Component 3 consists of a measurement exercise and group reports by the participants.

------------------------------

Appendix E
Court Performance Inventory

The Court Performance Inventory is intended to familiarize individuals with the Trial Court Performance Standards and Measurement System developed by the Commission on Trial Court Performance Standards, the National Center for State Courts, and the Bureau of Justice Assistance.* The inventory is designed to collect individual perceptions (which may be pooled with the perceptions of others) about the performance of a specific trial court. Results of the inventory, graphed onto the Court Performance Profile at the end of this appendix, can be used to target further diagnostic efforts and management strategies.
DIRECTIONS: Read each of the attached 50 statements. Based on your experiences with your court or another court of interest to you, determine whether you believe the statement is BROADLY TRUE or BROADLY FALSE about the way the court performs. If you believe the statement is BROADLY TRUE, place an "X" in the corresponding numbered box on the Court Performance Inventory Response Form that follows the last statement. For example, if you believe the first statement to be broadly true about the court you are rating, place an "X" in the box with the "1" on the response form.

Once you have completed the response form, add up the number of boxes marked as BROADLY TRUE in each of the five columns. Each column corresponds to one of the performance areas. Use the five sums to complete the Court Performance Profile.

1. An observer sitting in any courtroom's public seating area will have no difficulty hearing judges, attorneys, litigants, witnesses, and other participants in the proceedings.

2. The total elapsed time it takes the court to dispose of cases once they are filed complies with national standards.

3. The final juror pools used by the court are representative of the demographic characteristics of the jurisdiction's population.

4. The allocation of personnel resources among case types is defensible and based upon logic and reason.

5. The general public (a) perceives the trial court and the justice it delivers as accessible; (b) has trust and confidence that basic trial functions are conducted expeditiously and fairly and that the court's decisions have integrity; and (c) knows that the trial court is independent, accountable, and not unduly influenced by other components of government.

6. An undercover law enforcement official, dressed in plain clothes, will not be able to breach either the court's security systems that protect the public or confidential court files and records.
7. An examination of court financial records will reveal that the various types of funds for which the court is responsible are disbursed in a timely manner.

8. An examination of relevant case file documents and court records will reveal that the court closely adheres to key legal requirements.

9. An analysis of the court as an equal opportunity employer would reveal that race and gender distribution for each job category is generally reflective of the available labor pool for each category.

10. Justice system representatives (a) perceive the trial court and the justice it delivers as accessible; (b) have trust and confidence that basic trial functions are conducted expeditiously and fairly and that court decisions have integrity; and (c) know that the trial court is independent, accountable, and not unduly influenced by other components of government.

11. Interpreter services provided by the court are performed by individuals with language proficiency, interpreting skill, and knowledge of professional conduct.

12. The court promptly implements changes in substantive and procedural laws that are a result of Federal and State legislation and new Federal regulations.

13. A statistical analysis conducted of data collected from closed files for bail, bond, and release on recognizance decisions will reveal that these decisions are not based on extralegal factors such as the defendant's race or gender, the judge assigned to the case, or the geographic location of the court.

14. The court does a good job in disseminating information to the public about its programs and operations.

15. Court employees (a) perceive the trial court and the justice it delivers as accessible; (b) have trust and confidence that basic trial functions are conducted expeditiously and fairly and that court decisions have integrity; and (c) know that the trial court is independent, accountable, and not unduly influenced by other components of government.
16. Observers of court proceedings are likely to find all court personnel courteous and responsive.

17. The court keeps up with its incoming caseload by disposing of as many cases as are filed each year.

18. Record examinations and statistical analyses are likely to reveal that sentencing decisions of the court are based mostly on legally relevant factors and not on extralegal factors such as the defendant's race or gender, the judge assigned to the case, or the geographic location of the court.

19. A group of knowledgeable persons both within and outside the court would conclude that the court has acted responsibly in responding to public policy issues of concern within the jurisdiction, such as domestic violence, discrimination, substance abuse, or others that could have affected the fair and effective administration of justice by the court.

20. Individuals who have had contact with the court, such as litigants, jurors, witnesses, victims, or those conducting other business with the court, consider the court's decisions and treatment of individuals as fair and equitable.

21. The court takes measures to reduce costs and facilitates affordable access to the judicial system for financially disadvantaged persons.

22. The court responds promptly to requests for information from the public.

23. A broad examination of appeal outcomes reflects that the trial court adheres to substantive laws and procedural requirements.

24. Selected knowledgeable individuals are likely to conclude that the trial court maintains its independence and institutional integrity, but that it still has good relations with other units of government.

25. Regular users of the court (i.e., court employees, attorneys, probation officers, and jurors) are likely to say that they are able to conduct their business with the court with relative ease and convenience.
26. A person relatively unfamiliar with the court will have no difficulty in locating and actually entering the courtroom in which a particular hearing is taking place.

27. The court promptly implements changes in administrative procedures required by the State supreme court and the administrative office of the courts.

28. The court does a good job of communicating clearly the terms and conditions of criminal sentences.

29. The court's responses to requests for information from the media are accurate and timely.

30. The results of a survey of regular court users, court employees, attorneys, probation officers, and jurors are likely to show that judges and other trial court personnel are courteous and responsive to the public and all others with whom they come into contact.

31. A person who attempts to determine the specific time and location of a particular court event will have no difficulty getting this information from the court by telephone.

32. Cases scheduled for trial are heard on the first scheduled trial date.

33. The court is well positioned and organized to enforce or facilitate the enforcement of its orders and judgments.

34. A group of knowledgeable individuals is likely to conclude that court personnel practices and decisions are fair.

35. Citizens are likely to report that access to court services is generally not hindered because of costs or complexity of procedures.

36. Persons with physical disabilities are able to conduct transactions in the court with relative ease.

37. The various services available from the court (such as indigent defense services, interpreter services, and mental health evaluations) are provided promptly.

38. A test of the timely retrieval of individual case files will reveal that the court's file control system is reliable and efficient.

39. The court has adequate statistical reporting capacity to make useful assessments of the relationship between the court's workload and the distribution of court resources.
40. Members of the bar who have appeared regularly in the court in the past year would assess the court's actions and decisions as fair and equitable.

41. Court observers are likely to find that litigants are treated with a high degree of courtesy and individual respect by judges of the court.

42. The number of pending cases exceeding national or State time standards for case processing is low.

43. The court clearly states the terms and conditions of obligations imposed as a result of adjudication of a civil dispute.

44. Court employees and media representatives are likely to be satisfied with policies and practices for responding to media inquiries.

45. Court staff will rate highly the degree of independent control that the court exercises over its fiscal operations, personnel, and services related to case flow.

46. Indigent persons who have never tried to obtain legal assistance are likely to be able to obtain affordable legal assistance for relatively routine legal problems.

47. The court complies with established schedules for routine court reports such as statistical reports required by the State administrative office of the courts and the Equal Employment Opportunity Commission.

48. An examination of a sample of case file data will show that most files are complete and accurate.

49. The court conducts periodic internal and external audits of its financial practices and responds to auditors' suggestions for improvements.

50. Court employee responses to structured questions about fairness in personnel practices related to employee morale and competence are likely to reflect general satisfaction.
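The column sums called for in the directions are simple tallies, but when many respondents' forms are pooled the bookkeeping is easy to get wrong. The sketch below illustrates the arithmetic only. The statement-to-column mapping it uses is hypothetical; the actual column assignments appear on the Court Performance Inventory Response Form, which is not reproduced in this ASCII file, and the five column labels are the five performance areas of the Standards.

```python
# Sketch of the Court Performance Inventory tally.
# ASSUMPTION: the statement-to-area mapping below is illustrative only;
# substitute the column assignments printed on the actual Response Form.
AREAS = [
    "Access to Justice",
    "Expedition and Timeliness",
    "Equality, Fairness, and Integrity",
    "Independence and Accountability",
    "Public Trust and Confidence",
]

def area_of(statement_number):
    # Hypothetical placement: statement n falls in column (n - 1) % 5.
    return AREAS[(statement_number - 1) % 5]

def profile(broadly_true):
    """Count BROADLY TRUE marks per performance area for one respondent.

    broadly_true: set of statement numbers (1-50) marked BROADLY TRUE.
    """
    counts = {area: 0 for area in AREAS}
    for n in broadly_true:
        counts[area_of(n)] += 1
    return counts

def pooled_profile(respondents):
    """Average the per-area counts across several respondents' forms."""
    totals = {area: 0 for area in AREAS}
    for marks in respondents:
        for area, count in profile(marks).items():
            totals[area] += count
    k = len(respondents)
    return {area: totals[area] / k for area in totals}
```

Each respondent's five counts (0-10 per column on the real form) would then be graphed onto the Court Performance Profile; the pooled averages give a group-level version of the same profile.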