NCJRS Abstract

NCJ Number: 65336
Title: Evaluation of Program Implementation (From Evaluation Studies Review Annual, Vol. 4, 1979, by Lee Sechrest et al. - See NCJ-65332)
Author(s): M. Q. Patton
Corporate Author: Sage Publications, Inc., United States of America
Date Published: 1979
Page Count: 29
Sponsoring Agency: Sage Publications, Inc., Thousand Oaks, CA 91320
Type: Program/Project Evaluation
Format: Document
Language: English
Country: United States of America
Annotation: The meaning and purpose of program implementation evaluation are discussed, and this form of evaluation is contrasted with outcomes evaluation.
Abstract: The need for implementation evaluation can be illustrated by a program evaluation that yielded negative results and led to the program's termination before the program had been implemented. The first step in program evaluation is to determine whether the program was properly implemented. Making policy operative is a fundamental problem in government intervention, and evaluation questions should be framed in the context of program implementation. Although the ideal evaluation includes both outcomes evaluation and implementation evaluation, evaluation research has been dominated by outcomes evaluation: standardized achievement tests illustrate outcomes evaluation in educational research, and comparative recidivism rates illustrate it in criminal justice research. Pure outcomes evaluation gives decisionmakers little information on which to act; implementation information often has greater value than outcomes information. The first step in unlocking the 'black box' approach to evaluation is finding out whether the program moved from an idea to initial implementation. Next, the characteristics of the fully operational program should be determined. Evaluation disasters can result from failure to recognize that program implementation is neither automatic nor certain. Implementation evaluation must be adaptive and focused on users. Three types of implementation evaluation are effort evaluation (the quantity and quality of program activity), process evaluation (the internal dynamics of a program), and treatment specification. Treatment specification is an alternative to labeling, as it permits identification of the independent variables that are expected to affect outcomes; the use of labels to aggregate ineffective programs with effective programs is probably a major source of null findings in evaluation research. Different types of evaluations should be matched to the information needs of decisionmakers and information users. A note is included. (CFW)
Index Term(s): Evaluation; Evaluation of evaluation; Program evaluation; Program implementation
Note: Reprinted from Utilization-Focused Evaluation, 1978, pp. 149-177.
To cite this abstract, use the following link:
http://www.ncjrs.gov/App/publications/abstract.aspx?ID=65336
