
Recognize the Face

NCJ Number
222014
Date Published
March 2008
Length
2 pages
Annotation

This paper describes research sponsored by the U.S. Justice Department's National Institute of Justice (NIJ) that is intended to improve the quality and resolution of video surveillance frames so that the faces of perpetrators captured on tape are identifiable.

Abstract

Under NIJ funding, the Visualization and Computer Vision Lab at GE Global Research has partnered with Pittsburgh Pattern Recognition (PPR) to develop computer vision and image processing technology that will improve the quality of facial images in video recordings. The underlying video processing technology is composed of face detection, active shape and appearance models, and super-resolution image processing. First, face detection algorithms locate the faces of persons seen by the video camera. Second, active shape and appearance models lock onto the individual three-dimensional shape of the face in each video frame, allowing it to be rotated to a frontal view. This enables frame-to-frame registration of the face, so that all of the images can be combined. Super-resolution processing then reconstructs a higher-resolution image of the face from several lower-resolution video frames. With the core technology now developed, GE is currently building a prototype interactive video application for forensic video analysis. This forensic tool will allow the user to select a face from a surveillance video clip. The system will then accurately lock onto the face in each frame within 1 to 2 seconds. The resulting image will have higher quality and greater clarity than the original frames and can be used for automatic identification with face-recognition software or distributed on bulletins and wanted posters.
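
To illustrate the processing chain described above (detect the face, register it across frames, then fuse the frames into a sharper image), the sketch below approximates the idea with off-the-shelf OpenCV components. It is not the GE/PPR system: the paper's three-dimensional active shape and appearance models are replaced here by a simple 2-D affine registration (cv2.findTransformECC), the fusion step is plain shift-and-add averaging rather than a true super-resolution reconstruction, and the input file name "surveillance.mp4", the fixed crop box, and all parameters are illustrative assumptions.

# Minimal sketch: multi-frame face "super-resolution" by registration and averaging.
# NOT the GE/PPR method -- 3-D active shape/appearance models are replaced by
# 2-D affine registration, and reconstruction is simple shift-and-add averaging.
import cv2
import numpy as np

UPSCALE = 4  # magnification factor for the reconstructed face (assumed)

def detect_face(gray):
    """Return the largest detected face box (x, y, w, h), or None."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    return max(faces, key=lambda f: f[2] * f[3])

def super_resolve(face_crops):
    """Fuse several low-resolution face crops into one higher-resolution image."""
    # Upsample every crop, register each one to the first crop, and average.
    ref = cv2.resize(face_crops[0], None, fx=UPSCALE, fy=UPSCALE,
                     interpolation=cv2.INTER_CUBIC).astype(np.float32)
    accum, count = ref.copy(), 1
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 100, 1e-6)
    for crop in face_crops[1:]:
        up = cv2.resize(crop, None, fx=UPSCALE, fy=UPSCALE,
                        interpolation=cv2.INTER_CUBIC).astype(np.float32)
        warp = np.eye(2, 3, dtype=np.float32)
        try:
            # Sub-pixel alignment of this frame's crop against the reference crop.
            _, warp = cv2.findTransformECC(ref, up, warp, cv2.MOTION_AFFINE,
                                           criteria, None, 5)
        except cv2.error:
            continue  # skip frames where the registration fails to converge
        aligned = cv2.warpAffine(up, warp, (ref.shape[1], ref.shape[0]),
                                 flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP)
        accum += aligned
        count += 1
    return (accum / count).astype(np.uint8)

def main(path="surveillance.mp4", max_frames=15):
    cap = cv2.VideoCapture(path)
    crops, box = [], None
    while len(crops) < max_frames:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if box is None:
            box = detect_face(gray)  # crude "lock": reuse the first detected box
        if box is not None:
            x, y, w, h = box
            crops.append(gray[y:y + h, x:x + w])
    cap.release()
    if len(crops) >= 2:
        cv2.imwrite("face_superres.png", super_resolve(crops))

if __name__ == "__main__":
    main()

Run against a clip with a roughly stationary subject, this writes face_superres.png, a 4x-magnified face image averaged over up to 15 registered frames. The real system's 3-D pose normalization is what would let it handle moving, non-frontal faces that this 2-D approximation cannot.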