
NCJRS Abstract

The document referenced below is part of the NCJRS Virtual Library collection. To conduct further searches of the collection, visit the Virtual Library. See the Obtain Documents page for directions on how to access resources online, by mail, through interlibrary loan, or in a local library.


NCJ Number: 89665
Title: On Evaluating Evaluations
Author(s): R C Larson; L Berliner
Date Published: 1983
Page Count: 34
Sponsoring Agency:
National Institute of Justice (NIJ), Washington, DC 20531
National Institute of Justice, Rockville, MD 20849
NCJRS Photocopy Services, Rockville, MD 20849-6000
US Dept of Justice NIJ Pub, Washington, DC 20531
Grant Number: 80-IJ-CX-0048
Sale Source:
National Institute of Justice/NCJRS paper reproduction, Box 6000, Dept F, Rockville, MD 20849, United States of America
NCJRS Photocopy Services, Box 6000, Rockville, MD 20849-6000, United States of America
Document: PDF
Language: English
Country: United States of America
Annotation: This paper proposes two simple constructs to assist in evaluating evaluations: a decision-tree approach focused on one or more proposed evaluation designs, and a model addressing the practical issues of evaluation inputs, processes, and outcomes.
Abstract: The analogy of a prospective used-car buyer illustrates the decision-tree approach to the question, 'Should the decisionmaker hire an evaluator?' The example demonstrates that the decision to evaluate is an allocation of resources that can be justified only if the expected benefits of the evaluation outweigh its expected costs. The paper notes that imperfect evaluations reduce the anticipated benefits but still provide useful information to the decisionmaker. The same analytic framework can also address the costs of evaluation error, the cost of the evaluator, and the biasing of the evaluation design. When reviewing past decisions to conduct evaluations, however, one must attempt to replicate the state of information available to the decisionmaker at the time. The second construct provides lists of evaluation inputs, processes, and outcomes that must be examined. Inputs include the budget, the duration and timing of the evaluation, attributes of the program being evaluated, methodology, audience, and programmatic purpose. Examples of process components are interactions between evaluators and program staff, responses to the presence of evaluators, and the use of information acquired by an evaluation. Outcomes include decisions by the funding agency regarding the program, staff revisions to the program, and decisions by clients to modify their participation. Tables, charts, 2 footnotes, and 17 references are provided.
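As an illustration of the expected-benefit-versus-cost logic the abstract describes (this code is not from the paper; the probabilities, payoffs, and function names below are hypothetical), a minimal Python sketch of the decision-tree calculation might look like this:

# Hypothetical sketch of the decision-tree logic in the abstract: hiring
# an evaluator is justified only when the expected value of the (possibly
# imperfect) information exceeds the evaluator's cost. All numbers and
# payoffs below are invented for the example.

def value_without_evaluation(p_good, payoff_good, payoff_bad):
    """Decisionmaker acts on prior beliefs alone: keep the program
    (uncertain payoff) or terminate it (payoff 0)."""
    keep = p_good * payoff_good + (1 - p_good) * payoff_bad
    return max(keep, 0.0)

def value_with_evaluation(p_good, payoff_good, payoff_bad, accuracy):
    """Decisionmaker follows an imperfect evaluator's verdict.

    `accuracy` is P(correct verdict | true program state); accuracy < 1
    models the abstract's point that imperfect evaluations reduce, but
    do not eliminate, the expected benefit of evaluating."""
    # P(evaluator reports "good"), by the law of total probability.
    p_say_good = p_good * accuracy + (1 - p_good) * (1 - accuracy)
    p_say_bad = 1 - p_say_good

    # Posterior P(program is good | verdict), by Bayes' rule.
    p_good_if_say_good = p_good * accuracy / p_say_good
    p_good_if_say_bad = p_good * (1 - accuracy) / p_say_bad

    def best_action(p):
        keep = p * payoff_good + (1 - p) * payoff_bad
        return max(keep, 0.0)  # keep the program or terminate it

    return (p_say_good * best_action(p_good_if_say_good)
            + p_say_bad * best_action(p_good_if_say_bad))

# Invented numbers: a program with a 50% chance of being effective.
p_good, payoff_good, payoff_bad = 0.5, 100.0, -80.0
baseline = value_without_evaluation(p_good, payoff_good, payoff_bad)
for accuracy in (1.0, 0.8, 0.6):
    informed = value_with_evaluation(p_good, payoff_good, payoff_bad, accuracy)
    value_of_info = informed - baseline
    print(f"accuracy={accuracy:.1f}: hire the evaluator only if cost < {value_of_info:.1f}")

With these invented numbers, a perfectly accurate evaluation is worth 40 payoff units, an 80-percent-accurate one 22, and a 60-percent-accurate one only 4, mirroring the abstract's claim that imperfect evaluations shrink, but do not eliminate, the expected benefit of evaluating.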
Index Term(s): Evaluation of evaluation
To cite this abstract, use the following link:
http://www.ncjrs.gov/App/publications/abstract.aspx?ID=89665

*A link to the full-text document is provided whenever possible. For documents not available online, a link to the publisher's website is provided.