

Survey Process

The survey process can be broken down into three major steps: design, collection, and analysis.

Design

The design phase involves the actual writing of the survey. First and foremost, a survey needs to be clear: use plain language and simple sentences to reduce the chance of confusion. Experience also has taught that multiple-choice questions with preset answers (e.g., yes or no, and multipoint scales such as big problem, problem, and not a problem) are more useful than questions that invite open-ended responses (such as "in your own words" and "describe what you think about"). Preset answers make it easier to code responses and analyze data, and respondents often prefer multiple-choice questions because the choices help organize their thoughts.

Available Resources

The nature of a survey also depends on available resources. Staff size, time, money, and survey expertise determine whether a survey:

  • Is best administered by phone or in person.

  • Should be mailed out and filled in by respondents on their own or filled in by trained surveyors.

  • Should cover one topic or several topics.

  • Should be conducted in-house, by consultants, or by volunteers.

Regardless of the method used, a survey should be simple and short. Remember that you are asking citizens to volunteer time; do not keep them any longer than absolutely necessary or you may cause resentment. Some surveyors conduct a survey test run with a few community members and then ask for their feedback. Sample feedback questions include the following:

  • Were there questions you did not like or that did not make sense?

  • Was the wording confusing?

  • Were there questions missing that you wish had been asked?

Types of Questions

In general, community justice planners rely on three types of questions:

  1. Those that relate to specific program components (e.g., Would you support allowing low-level offenders to perform community service in the neighborhood? If so, what kinds of projects would you like to see them perform?).

  2. Those that solicit opinions about neighborhood strengths and weaknesses (e.g., Do you think local schools, churches, or social service agencies are doing an excellent, good, fair, or poor job?).

  3. Those that obtain overall impressions about the community and provide a baseline for future surveys (e.g., Do you feel safe in your home, in the subway, or in the park? What is the most pressing issue facing the community—drug abuse, juvenile delinquency, crime, housing problems, or something else?).

Planners can get ideas for questions by looking at other surveys, including the National Crime Victimization Survey—a telephone survey conducted by the Bureau of Justice Statistics (for more information, visit www.ojp.usdoj.gov/bjs/cvict.htm)—and the Red Hook survey (see appendix A). In addition, the U.S. Department of Justice’s Weed and Seed national evaluation includes a survey that measures community perceptions of neighborhoods, public safety, and the criminal justice system.

Collection

There are many different ways to administer a survey. The Denver district attorney’s community prosecution program sponsored a survey in which teenagers who were members of a special youth empowerment team asked 225 of their peers about things they liked and disliked in their neighborhood. In West Palm Beach, Florida, community court planners adapted the Red Hook survey to meet their community’s needs—sending staff to conduct 60 interviews among targeted residences and businesses in the catchment area and repeating the survey every year to track changes over time. Others partner with local universities: Memphis’ Community Court contacted the University of Memphis’ Department of Criminology and Criminal Justice, which used graduate assistants to conduct both phone and in-person surveys during the course of a semester. Some hire outside consultants: Vermont’s Department of Corrections hired a private market research firm to conduct a statewide telephone survey about the department and its proposed reparative probation program. Similarly, the community court project in Minneapolis hired a local survey research firm to do a targeted telephone survey within the court’s catchment area. The survey asked residents about their knowledge and opinion of the project and its various components (e.g., community service, job training, and drug treatment).

Training

Whether the survey relies on volunteers or paid professionals, everyone conducting the survey should receive basic training that covers how to ask the questions, fill out the survey form, and make the experience pleasant and productive. The goal of training is to make surveyors comfortable with the forms and the target area. For example, some volunteers may be nervous about asking questions and bothering people. More experienced surveyors may be anxious about going into a new neighborhood, especially if certain areas seem dangerous. Training can include role plays of different situations, discussions about safety precautions, and practice runs of the survey to allay these concerns.

Volunteers

Community members make the best survey volunteers. In Red Hook, citizens were more willing to stop and take time to answer questions if the surveyor was also from the neighborhood. In this way, Red Hook volunteers were able to break down a traditional barrier that stands between surveyors and participants and foster the trust needed to engage participants. “The majority of the times, people come to the door and know me. They say, ‘Oh it’s you, Natasha, I’ll do your survey,’” said Natasha Richie, a Red Hook surveyor. “They feel better and more secure in talking to me.”

Target Area

A successful survey also needs a clear target area. This can be a geographic boundary or a demographic group, such as teenagers, the elderly, single parents, or other groups. In Red Hook, the target area is clearly defined by an elevated highway that physically divides the neighborhood from the rest of Brooklyn. This target area is subdivided into different geographic zones, and surveyors are assigned to each zone and work together to ensure adequate coverage of their area. Similarly, West Palm Beach divided its catchment area into six subareas and completed 10 surveys from each area to ensure a representative sample.

Analysis

The extent to which planners can use survey results depends in large part on their ability to analyze the data. Some projects have turned to experts—a graduate student familiar with statistical research or a local college professor. This document does not detail how to analyze data. Instead, here are a few pointers.

  • When selecting a software package, keep in mind the complexity of the survey and the number of questionnaires. To analyze small surveys, like the 60 questionnaires collected in West Palm Beach, a basic spreadsheet such as Lotus or Excel should be sufficient. Larger surveys in which hundreds of questionnaires are completed every year, such as Red Hook’s, probably will require a more flexible program such as Access or even SPSS.

  • Establish protocols for data entry. For example, those entering the data need to know what to do if more than one answer is circled.

  • Remember that a survey can go beyond simply tabulating responses to questions. The survey data allow planners to compare responses between subgroups or to examine trends over time. This kind of analysis can help planners answer key questions such as the following:

—How has the percentage of residents who see public drinking, drug use, litter, loitering, or another specific problem as a very serious problem changed from year to year?

—Do older residents have a different assessment of the seriousness of local crime than younger residents?

—Do men and women want different types of services?

—Do African-American residents approve of police performance more or less than Latino residents?
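The subgroup comparisons above amount to simple cross-tabulation: for each subgroup, count how many respondents chose a given preset answer and express it as a percentage. As a minimal sketch (the field names and records below are hypothetical illustrations, not data from any survey cited in this document), the calculation might look like:

```python
# Sketch of a subgroup comparison, assuming each completed questionnaire
# has been entered as one record with coded, preset answers.
from collections import Counter

# Hypothetical records; "age_group" and "drinking_problem" are
# illustrative field names, not from any real survey instrument.
responses = [
    {"age_group": "under 40", "drinking_problem": "very serious"},
    {"age_group": "under 40", "drinking_problem": "not a problem"},
    {"age_group": "40 and over", "drinking_problem": "very serious"},
    {"age_group": "40 and over", "drinking_problem": "very serious"},
]

def share_rating(records, group, rating):
    """Percentage of a subgroup that chose a given preset answer."""
    subgroup = [r for r in records if r["age_group"] == group]
    if not subgroup:
        return 0.0
    counts = Counter(r["drinking_problem"] for r in subgroup)
    return 100.0 * counts[rating] / len(subgroup)

for group in ("under 40", "40 and over"):
    pct = share_rating(responses, group, "very serious")
    print(f'{group}: {pct:.0f}% rate public drinking "very serious"')
```

Running the same calculation on each year's questionnaires gives the year-to-year trend figures described above; a spreadsheet pivot table or a statistical package can produce the same cross-tabulations without any programming.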


Surveying Communities: A Resource for Community Justice Planners
May 2003