
NCJRS Abstract

The document referenced below is part of the NCJRS Virtual Library collection.

NCJ Number: 70982
Title: Interpreting Data - Case Study from the Career Intern Program Evaluation
Journal: Evaluation Review  Volume: 4  Issue: 4  Dated: August 1980  Pages: 481-506
Author(s): L. E. Datta
Date Published: 1980
Page Count: 26
Type: Program/Project Evaluation
Format: Article
Language: English
Country: United States of America
Annotation: Ways to analyze data for magnitude of effects, attribution of causality, and statistical reliability, so as to increase the precision of evaluation, are explored using the example of the Career Intern Program (CIP) for high-risk, low-income youth.
Abstract: Even though the evaluation showed the success of the CIP program, a further examination of results found that some differences between CIP students and the control group were not statistically significant, others were too substantial to be credible, and still others were significant but troubling. Therefore, five questions concerning the magnitude of effects and statistical reliability were applied to the data. For example, it was found that the 'p' for reading achievement was only marginally significant (.05) despite the program's strong emphasis on remedial reading and math. However, observation of students' behavior during pre- and posttesting and analysis of the test results showed that at the posttesting 80 percent of the CIP students did not complete the test on time, as compared with 50 percent of the control group, but almost all of the CIP students' answers were correct, while the control students' papers suggested guessing. With the correction for guessing, the CIP students' pre-to-post scores increased, while the control students' scores decreased. Corroborating a change in response style was the CIP students' performance on the Raven's test, a measure of basic reasoning ability, where accuracy, rather than speed, counts. Examples of interpretation failure, tabular data, and eight references are included.
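The "correction for guessing" mentioned in the abstract is conventionally computed as R - W/(k - 1), where R is the number of right answers, W the number of wrong answers, and k the number of answer options per item, so that blind guessing yields an expected corrected score of zero. The article does not specify which correction the evaluators applied; the sketch below assumes this standard formula and a hypothetical four-option test.

```python
def corrected_score(right: int, wrong: int, options: int = 4) -> float:
    """Correction-for-guessing score: penalize wrong answers by the
    expected gain from random guessing. Omitted items are not penalized."""
    if options < 2:
        raise ValueError("need at least two answer options")
    return right - wrong / (options - 1)

# A slower but accurate test-taker: 30 right, 2 wrong, 8 items omitted.
careful = corrected_score(right=30, wrong=2, options=4)
# A faster test-taker who guessed on unfinished items: 30 right, 10 wrong.
guesser = corrected_score(right=30, wrong=10, options=4)

print(round(careful, 2))  # 29.33
print(round(guesser, 2))  # 26.67
```

Under this scoring, a student who leaves unfinished items blank (as the abstract suggests many CIP students did) is not penalized, while a student who guesses on them loses points for each wrong guess, which is consistent with the reported pre-to-post divergence between the two groups.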
Index Term(s): Data analysis; Evaluation; Program evaluation; Research methods
To cite this abstract, use the following link:
http://www.ncjrs.gov/App/publications/abstract.aspx?ID=70982
