So far, we have talked about a number of different aspects of doing a survey, including
- How to select the cases for your survey (Chapter 2—Volume I);
- The different types of error that can occur in a survey (Chapter 3—Volume I);
- Things you need to think about when planning a survey (Chapter 4—Volume I);
- Different ways of delivering the survey to your sample (Chapter 5—Volume I); and
- Writing good questions (Chapter 2—Volume II).
In this chapter, we’re going to talk about how you carry out the survey. We’re not going to get into the nuts and bolts of doing a survey; there are good books that do this, and we’ll mention them in the annotated bibliography at the end of this chapter. Rather, we’re going to describe the steps that every researcher must go through in carrying out a survey.
Developing the Survey
Let’s assume that you want to do a survey of adults in your county to determine their perception of the quality of life. You know that there are certain areas that you want to explore, including perceptions of crime and the economy. You want to develop a survey that can be repeated on an annual or biannual basis to track how perceived quality of life varies over time. You’re aware of other quality-of-life surveys to which you would like to compare your survey results. What should you do to begin developing your survey?
Looking at Other Surveys
It’s often helpful to look at the types of questions that other researchers have used. Good places to search are Google (http://google.com) and Google Scholar (http://scholar.google.com). If you happen to be on a college campus that subscribes to the Roper Center for Public Opinion Research (http://www.ropercenter.cornell.edu), consider using iPOLL, a database of over 700,000 survey questions. All of these can be searched by keyword; entering the words quality and life, for example, will return all questions containing both words. Often what others have asked will give you ideas of what you might ask.
Focus groups are another tool that you can use in developing your survey. A focus group is a small group of individuals from your study population who meet and discuss topics relevant to the survey.1 Typically, they are volunteers who are paid to take part in the focus group. For example, if your study deals with quality of life, you might explore with the focus group what they think quality of life means and which issues, such as crime and jobs, are critical to quality of life. A focus group gives you the opportunity to discuss the types of information you want to get from your survey with a group of people who are similar to those you will sample.
A cognitive interview is a survey administered to volunteers from your study population that asks them to “think out loud”2 as they answer the questions.3 Cognitive interviews give you the opportunity to try out the questions and discover how respondents interpret them and what they mean by their answers. Let’s say that one of the questions you want to ask in your survey is “What is the most pressing problem facing the community in which you live?” In a cognitive interview, you can ask respondents how they interpret this question. What does “most pressing problem” mean to them? And you can ask them to take you through their thought processes as they think through the question and formulate an answer.
Behavior Coding and Interviewer Debriefing
Another way to pretest a survey is to conduct a pilot study, where you administer the survey to a small sample of respondents. Respondent behavior can be coded to help you identify problem questions. Gordon Willis suggests that you look for questions in which the following events occurred—“(1) Interrupts question reading (2) Requests repeat of question reading (3) Requests clarification of question meaning (4) Provides qualified response indicating uncertainty (5) Provides an uncodeable response (6) Answers with Don’t Know/Refused.”4 Interviewers can also be debriefed about problems they encountered while administering the survey.5
Asking Experts to Review the Survey
When you have a draft of the survey completed, ask survey experts to review it and point out questions that might be confusing to respondents as well as other types of problems. Most colleges and universities will have someone who is trained in survey research and willing to review your draft.
Pretesting the Survey
When you think you are ready to try out your survey, select a small number (25–40) of respondents from your study population and have them take the survey using the same procedures you will use in the actual survey. In other words, if you are using a telephone survey, then do your pretest over the phone. If it’s a web survey, then your pretest should be over the web. You probably won’t be using these responses as part of your data since you are likely to make changes in the survey based on the pretest results.
Here are some of the things that you ought to look for in your pretest.6
- How much variation is there in the answers to each question? Questions that don’t have much variation will not be very useful when you analyze your data. For example, if you want to explore why some people are concerned about being a crime victim and others aren’t and if almost everyone is concerned, then this question doesn’t have much variation and there isn’t anything to explain. Of course, you can point out that there is near-universal concern about being a victim of crime, but that’s about all you will be able to say. You won’t be able to explore why some are more concerned about being a victim than others, since there is little variation in how respondents answer this question.
- How many respondents skip certain questions or say they don’t know how to respond? “No answer” and “don’t know” responses could indicate a problem with the way the question is worded, or they could indicate that the question asks for information that respondents can’t or don’t want to provide.
- Is there evidence of satisficing? Some questions require a lot of effort to answer, and sometimes respondents look for ways to reduce the burden of answering them. This is what is called satisficing. For example, giving one-word answers to open-ended questions can indicate satisficing. A question asking people what is the most pressing problem facing their community requires a lot of effort to answer, and answering with a single word such as “crime” or “education” is one way to reduce the burden. We discussed satisficing in Chapter 3 (Volume I), and you might want to refer back to that chapter.
- If you are asking respondents to skip particular questions based on their answers to previous questions, did the skip patterns work as you intended? For example, you could ask respondents if they are very satisfied, somewhat satisfied, somewhat dissatisfied, or very dissatisfied with their life in general, and then ask only those who are dissatisfied to tell you why. This requires a skip pattern in the questions. If you’re using a telephone or web survey, you can program the skip into the software you are using. If you are using a face-to-face survey, the interviewer will have to be instructed when to skip to the next question. If you are using a mailed survey, the instructions will have to be written in the survey. However you build the skip pattern into your survey, the pretest is the place to make sure it works properly before you begin the actual survey.
- How long did it take for the respondents to complete the survey? Do you think respondents will be willing to spend that much time on your survey? You can ask respondents in the pretest whether the survey took too long to complete.
- If you are using an interviewer-administered survey, did the interviewers report any problems during the survey? Be sure to debrief your interviewers after the pretest.
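The skip-pattern check described above can be sketched in code. This is a minimal illustration with hypothetical question names and the satisfaction categories from the example; it is not the output of any particular survey package.

```python
# Hypothetical skip pattern: only dissatisfied respondents get the follow-up
# question. Question names and categories are illustrative.
def next_question(satisfaction):
    """Return the name of the next question, given the satisfaction answer."""
    if satisfaction in ("somewhat dissatisfied", "very dissatisfied"):
        return "why_dissatisfied"  # follow-up for dissatisfied respondents
    return "next_section"          # everyone else skips the follow-up

# A pretest should exercise every branch of the skip pattern:
for answer in ("very satisfied", "somewhat satisfied",
               "somewhat dissatisfied", "very dissatisfied"):
    print(answer, "->", next_question(answer))
```

Walking every possible answer through the routing logic, as the loop does here, is exactly the kind of check the pretest is meant to perform.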
Pretesting is an essential step in preparing your survey so it is ready for delivery to your sample. Here are some other suggestions for the pretest.
- There are two questions that are always important to ask when preparing a survey. How are respondents interpreting the questions? What do respondents mean by their answers? We talked about the usefulness of cognitive interviews when you are developing your survey. They are just as useful during the pretest. Howard Schuman suggests the following probes. “Could you tell me why you say that?” “Would you explain what you meant by _______?”7 George Bishop suggests asking respondents to “think out loud” while answering the question.8
- Ask respondents to tell you about the problems they encountered while doing the pretest. Were there questions they had difficulty in answering? Were there questions that were confusing?
- If it’s possible, record the pretests so you can go back over them with the interviewers and talk about particular questions. These could be audio or video recordings. Remember that you will need to get the respondent’s permission to record the interviews.
Administering the Survey—Using Probe Questions
Administering the survey depends in part on your mode of survey delivery. In Chapter 5 (Volume I), we talked about the four basic modes of survey delivery—face-to-face, mailed, telephone, and web—and mixed-mode surveys, which combine two or more of these delivery modes. You might want to go back and look at this chapter again and at some of the references mentioned in the annotated bibliography.
One of the most important tasks of survey administration is to clarify the answers of respondents through follow-up questions. These types of questions are referred to as probes. There are a number of different types of probes. For example, we could ask respondents to “tell us more” or what they meant by a particular answer. Patricia Gwartney suggests some other probes.9
- Silence—Don’t be afraid of not saying anything for a few seconds. This can encourage respondents to expand on what they told you.
- Repetition—We could repeat what respondents tell us in their own words to encourage them to expand on their answers.
- Repeating the question—Another type of probe is to simply repeat the question and the response categories.
- Asking for help—Saying that you don’t understand the respondent’s answer and asking for help is a useful probe. Asking for help can encourage respondents to work with you to clarify an answer.
Some questions are particularly likely to require a follow-up question in order to clarify what respondents tell us. Here are some examples.
- Suppose we ask a respondent, “What is the most pressing problem facing your community today?” and the respondent says, “Crime.” We could probe by saying, “Could you tell me a little more about that?”
- Researchers often want to know a person’s race and ethnicity. Often we start with a question such as “Would you describe yourself as being Hispanic or Latino?” This could be followed by “What race do you consider yourself to be?” But what do you do if the person says that he or she is German or Italian? One approach is to probe by rereading the question, but this time asking them to select their answer from among a set of categories, such as White, American Indian, African American or Black, Asian or Pacific Islander, and other. Many surveys allow respondents to select more than one category. There also needs to be a category for “refusal.”10
- Sometimes we want to know what respondents do for a living. We might start by asking them if they are currently employed and, if they are, by asking “What is your current occupation (or job)?” Some respondents may not give you the information you need. Gwartney suggests the following probes: “What kind of work do you do?” “What is your job title?” “What are your usual activities or duties at your job?”11
Probing in Web Surveys
The way in which we probe depends in large part on the mode of survey delivery. Surveys that are interviewer-administered, such as face-to-face and telephone surveys, provide the interviewer with considerable control over the use of probe questions. Web surveys are not interviewer-administered, but technological advances give the researcher considerable control here as well.
There are some questions that you know will require a probe question. For example, if you ask someone their job title, you will need to follow that up with a question asking about the duties and activities of their job. If you ask people what they think is the most pressing problem facing their community, you might want to follow that up with a probe asking, “Why do you feel that way?” This type of probe can easily be built into any survey, including web surveys.
There are other types of probe questions that depend on what respondents tell you. Pamela Alreck and Robert Settle call these interactive or dynamic probes.12 For example, if respondents give you a one-word answer, such as “crime” or “drugs,” to the most-pressing-problem question, you would want to ask them to “tell me a little more about that.” That’s more difficult to carry out in a web survey unless you can identify the specific keywords for which you want to ask a probe question. In addition, you need to be using web survey software that allows you to use this type of probe question.
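A keyword-triggered probe of the kind just described might be sketched as follows. This is a hypothetical illustration; whether and how you can implement it depends on the web survey software you are using.

```python
# Hypothetical keyword-triggered probe for a web survey. The keyword list
# and probe wording are illustrative.
PROBE_KEYWORDS = {"crime", "drugs", "education", "jobs"}

def probe_for(answer):
    """Return a follow-up probe if the answer is a bare keyword, else None."""
    text = answer.strip().lower()
    if text in PROBE_KEYWORDS:
        return f'You said "{text}." Could you tell me a little more about that?'
    return None

print(probe_for("Crime"))                       # one-word answer triggers a probe
print(probe_for("Lack of affordable housing"))  # fuller answer: no probe
```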
Probing in Mailed Surveys
Probing is more difficult in a mailed survey. Mailed surveys are not interactive; there is no contact between the interviewer and the respondent unless you provide the respondent with a telephone number or web address they can use to contact you. Consequently, all instructions and questions have to be written out in the survey. This limits you to probes that can be anticipated in advance. If you are asking about respondents’ occupations or jobs, you can include a probe question asking them to tell you about their job’s duties and activities. If you are asking about attitudes or opinions on some issue, you can ask them to tell you why they feel that way. But there is no opportunity to follow up on respondents’ specific answers. If they tell you that their race is Swedish, you can’t follow that up. You have to make your instructions clear and specific enough that respondents know what you are asking.
Administering the Survey—Record Keeping
Another important part of survey administration is record keeping. It’s essential to keep good records regardless of the survey delivery mode. But the information that is available for your records will vary by the mode of survey delivery. In an interviewer-administered survey, you might have information about individuals you are unable to contact or who refuse to be interviewed. Each time you attempt to reach a potential respondent, a record must be kept of the result. These are often referred to as disposition codes. You should be sure to record the following information.
- Was the person you reached eligible to be part of the survey, ineligible, or is their eligibility unknown? If ineligible or unknown, why?
- Were you able to make contact with the respondent? If not, why?
- Was the interview completed? If not, why?
Patricia Gwartney has a detailed list of disposition codes for telephone interviews, which could be adapted for face-to-face surveys.13 You can also look at the disposition codes published by the American Association for Public Opinion Research.14
Often respondents are unable to do the interview at the time you reach them and the interview needs to be scheduled for a callback. This should be recorded on a callback form. You should attach a call record to each survey, which records each contact, the outcome, the date and time of the contact, the interviewer’s name, and when to call back along with any other information that the interviewer wants to convey to the next interviewer. If you are doing a phone survey and are using CATI software, the program will create this record for you.
In a self-administered survey, you probably won’t have much information about nonrespondents. You may only know that they didn’t respond. However, sometimes respondents will contact you and indicate why they aren’t completing your survey. This could be because they have moved and aren’t part of your study population, because they don’t have the time or aren’t interested, or because they have concerns about survey confidentiality. Be sure to record this information. At the very least, you need to be able to report the response rate15 for your survey.
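At its simplest, a response rate is completed interviews divided by eligible cases, and it can be computed directly from your disposition tallies. The counts below are made up, and this single formula glosses over cases of unknown eligibility; AAPOR publishes several more refined response-rate definitions.

```python
# Simplified response-rate calculation from disposition tallies.
# Counts are illustrative only.
dispositions = {
    "complete": 412,
    "refusal": 155,
    "noncontact": 233,
    "ineligible": 95,  # e.g., moved out of the study population
}

# Eligible cases exclude the ineligibles but include nonrespondents.
eligible = sum(n for code, n in dispositions.items() if code != "ineligible")
response_rate = dispositions["complete"] / eligible
print(f"Response rate: {response_rate:.1%}")  # 412 of 800 eligible cases
```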
Another reason that good record keeping is so important is that it provides a record of the way in which you carried out your survey. For example, when you create a data file, you make decisions about how to name your questions and how you record the responses to these questions. An example is a person’s age. You would probably name this question as age and record the person’s age as a number. But what will you do if a person refuses to answer this question? You might decide to use 98 for any person who is 98 years of age or older and use 99 for refusals. You should record this decision in a permanent file so that you will remember what you did when you come back to this data file after several years. Or you might give someone else permission to use your data sometime in the future, and he or she will need to know how you recorded age. There needs to be a permanent record of the way in which the survey was carried out to enable future use of this survey.
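One way to make a coding decision like the 98/99 convention permanent is to record it in the code (or codebook) that prepares the data file. A minimal sketch, following the convention described above:

```python
# Record coding decisions in one place so future users of the data can
# recover them. The 98/99 convention follows the example in the text.
AGE_TOPCODE = 98  # 98 means "98 years of age or older"
AGE_REFUSED = 99  # refusal; treat as missing in analysis

def clean_age(raw):
    """Return age as an integer, or None if the respondent refused."""
    age = int(raw)
    return None if age == AGE_REFUSED else age

print(clean_age("45"))  # 45
print(clean_age("99"))  # None (refused)
```

A script like this doubles as documentation: anyone who reopens the file years later, including another researcher you share it with, can see exactly how age was recorded.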
Administering the Survey—Linking to Other Information
For some types of surveys, there are other administrative or organizational data that might be available. For example, if your population is students at a university, the registrar will have information about the students. If your population is employees in a large organization, there is bound to be information on these employees, such as length of time at the organization and salary. You might be able to link your survey to these types of administrative or organizational data.
This, of course, raises a series of questions that you must consider and answer before linking to these types of data. Here are just a few of these questions.16
- Do you need the individual’s informed consent? How do you go about getting this consent?
- Is it ethical and legal to access these data?
- What is the quality of the data?
- What is the cost of accessing these data?
- How do you access these data, and how much effort is it going to take to do so?
- How accurate will your matching procedure be? In other words, will you be able to accurately match your survey respondent with the correct person in the organization’s records?
- How do you maintain the confidentiality of the respondents?
Processing the Data
If your survey includes open-ended questions, you will probably want to code the responses into categories. Let’s consider the question we have been using as an example: “What is the most pressing problem facing your community today?” Responses to this question could be coded into categories, such as the economy, crime, education, traffic and transportation, and so on. You will probably want to divide each of these categories into more specific categories, such as lack of jobs, violent crime, and property crime. Once you have developed the categories, have two or more people code the data independently so you can see if the coding done by different individuals is consistent.
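Checking whether two coders assign the same categories can start with simple percent agreement; chance-corrected measures such as Cohen’s kappa are a common next step. The codes below are hypothetical.

```python
# Percent agreement between two independent coders of the same
# open-ended responses. Category labels are illustrative.
coder_a = ["crime", "economy", "education", "crime", "traffic"]
coder_b = ["crime", "economy", "crime", "crime", "traffic"]

matches = sum(a == b for a, b in zip(coder_a, coder_b))
agreement = matches / len(coder_a)
print(f"Intercoder agreement: {agreement:.0%}")  # 4 of 5 codes match
```

Disagreements (here, the third response) are worth reviewing: they often point to category definitions that need to be sharpened before the full data set is coded.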
Editing the Data
In addition to coding answers to open-ended questions, you will want to review all the answers. For example, let’s say that you’re doing a mailed survey and you ask an agree–disagree question with the following categories: strongly agree, agree, disagree, strongly disagree. What are you going to do if someone selects more than one answer? With other survey delivery modes, you have more control over the types of answers that respondents give so you would be able to avoid this type of problem. But you still need to edit the data to check for completeness and consistency. You may need to have a category for uncodable, and you will definitely need categories for people who say they don’t know or refuse to answer questions.
There are several options for data entry. You could enter your data directly into a program, such as Excel, or into a statistical package, such as SPSS. If you are using CATI software or web survey software, such as SurveyMonkey or Qualtrics, the data can be exported into a number of statistical packages, such as SPSS or SAS, or into an Excel file.
Data analysis is beyond the scope of this book. There are many good books on statistical analysis, and we’ll mention some of them in the annotated bibliography at the end of this chapter.
Writing the Report
Writing reports will be one of the topics covered in Chapter 5, Volume II.
Making the Data Available to Other Social Scientists
It has become commonplace for researchers to make their survey data accessible to other researchers by placing their data in archives, such as the Inter-university Consortium for Political and Social Research (ICPSR) at the University of Michigan and the Roper Center for Public Opinion Research at Cornell University. Depending on the nature of your data, you may or may not choose to make your survey data publicly available.
Regardless of your decision to make your data available, it’s important for you to document how the data were collected. If your data are in a file that can be read by a statistical program, such as SPSS, SAS, Stata, or R, you need to document how that file was created. At some later point in time, you may want to reanalyze your data or give it to another researcher for further analysis.
You may want to look in the archives of the ICPSR or the Roper Center for data that you might be interested in accessing. Mary Vardigan and Peter Granda provide an excellent introduction to data archiving, documentation, and dissemination.17
Being a Good Listener
In an interviewer-administered survey, it’s important for the interviewer to be a good listener. Raymond Gorden talks about active listening and suggests that interviewers ask themselves several questions as they are listening to the respondent.
- Is it clear what that means?
- Is that really relevant to the question?
- Is the answer complete?
- What does that tone of voice mean?
- Should I interrupt now to probe or should I wait till later?18
Gorden also suggests a number of keys to being a good listener.19
- “Know your objective.” Interviewers should understand what each question is attempting to find out about the respondent and what the purpose of the question is.
- “Pay attention from the beginning.” Don’t get distracted and miss what the respondent is telling you.
- “Control your urge for self-expression.” Don’t interject your own thoughts into the interview. Remember it’s not about what you think; it’s about what the respondent thinks.
- “Listen actively.” As the respondents are talking, pay attention to what they are saying. Think about possible probes that you may want to ask.
- “Be patient.” Don’t rush. Let the respondents tell you in their own words.
Training Interviewers
In interviewer-administered surveys, interviewers need to be trained. It’s unreasonable to expect them to pick up what they need to know through on-the-job training. Here are some different training techniques. A good training program will combine several of these approaches.
You will need to provide documentation for interviewers to study and to have available for reference during interviews. These should include:
- Copies of the survey questions with skip patterns.
- List of questions that respondents might ask and suggestions for answering them. Questions might include, for example: How did you get my name and address or phone number? How long will it take? Do I have to do it? What’s the survey about? What’s the purpose of the survey? Who is the survey for? Is what I tell you confidential? There are some excellent examples of handouts on answering respondents’ questions in Don Dillman’s and Patricia Gwartney’s books on survey research.20
- Why people refuse and how you might respond to these refusals. For example, people might respond by saying:
- I don’t have the time to do it.
- I never do surveys.
- I’m sick now.
- I’m not interested.
- It’s nobody’s business what I think or do.
For some of these reasons, there’s an easy response. For example, if someone doesn’t have time to do it now or is sick, you should offer to call back at a more convenient time. If someone says they never do surveys, you should explain why this survey is important and worth their time. Don Dillman and Patricia Gwartney also have examples of handouts on how to handle refusals.21
- Interviewer manual including information on the following:
- Getting the respondents to participate
- The structure of the interview
- How to ask questions
- How and when to probe
- What to say when the respondent doesn’t understand a question
- Disposition codes that indicate the result of the contact
- Scheduling callbacks
- Time sheets to record hours worked
- Getting paidi
Interviewers should have the opportunity to practice the interview before actually starting data collection. A good place to start is to practice interviewing themselves. Have them read through the questions and think about how they would answer and what they might find confusing. Then interviewers could pair off with another interviewer and take turns interviewing each other. They could also interview friends and family.
Role playing is often a useful training device. Have experienced interviewers play the role of respondents, and simulate the types of problems interviewers might encounter. For example, problems often arise when asking questions about race and ethnicity. Respondents often give one-word answers to open-ended questions. These types of difficulties could be simulated in a practice session.
Another useful training tool is to have experienced interviewers work with new interviewers and coach them on how to handle difficult problems that arise. Experienced interviewers could listen to practice interviews and then discuss with the new interviewers how they might improve their interviewing technique. If it’s possible, record the practice interviews so you can review them and use them as teaching tools.
Survey Participation and Nonresponse
As has been mentioned in Chapters 3 and 5 (Volume I), a major concern of survey researchers is the declining response rates that all modes of survey delivery have experienced during the last 35 to 40 years.22 This has been one of the factors that have led to the increased cost of doing surveys. But the concern is not just over cost. The concern is also that this will lead to increased nonresponse bias. Bias occurs when the people who do not respond to the survey are systematically different from those who do respond, and these differences are related to the questions we ask. Increasing response does not necessarily decrease bias. Jeffrey Rosen et al. note that increasing response rates among those who are underrepresented is what is necessary to reduce nonresponse bias.23
We discussed survey participation in Chapter 3 (Volume I), so we’re not going to repeat the discussion here. Rather we want to emphasize that declining response to surveys is a serious potential problem since it increases the possibility of nonresponse bias. Take a look at our discussion in Chapter 3 (Volume I) of various theories of survey participation and how you might increase response rates.
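One practical way to spot the underrepresentation Rosen et al. point to is to compare each subgroup’s share of completed interviews with its share of the study population. The groups and figures below are made up for illustration.

```python
# Hypothetical check for underrepresented subgroups. Population shares
# would come from census or sampling-frame data; counts are illustrative.
population_share = {"18-34": 0.30, "35-64": 0.50, "65+": 0.20}
completed = {"18-34": 96, "35-64": 280, "65+": 124}

total = sum(completed.values())  # 500 completed interviews
for group, pop_share in population_share.items():
    sample_share = completed[group] / total
    flag = "underrepresented" if sample_share < pop_share else "ok"
    print(f"{group}: population {pop_share:.0%}, "
          f"sample {sample_share:.0%} ({flag})")
```

In this made-up example, the youngest group supplies 19% of completes but 30% of the population; raising response among that group, rather than overall, is what would reduce nonresponse bias.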
Robert Groves and Katherine McGonagle describe what they call a “theory-guided interviewer training protocol regarding survey participation.”24 It starts with listing the types of concerns that respondents have about participating in the survey and then organizing these concerns into a smaller set of “themes.” Training consists of:
- “Learning the themes”;
- “Learning to classify sample person’s actual wording into these themes”;
- “Learning desirable behavior to address these concerns”;
- “Learning to deliver . . . a set of statements relevant to their concerns”; and
- “Increasing the speed of performance” so this process can be done quickly.25
For example, if the respondent says, “I’m really busy right now!” the interviewer might respond, “This will only take a few minutes of your time.” Basically what the interviewer is doing is tailoring his or her approach and response to the respondent’s concerns.26
Summary
- Tools for developing the survey
- Focus groups allow the researcher to get a sense of how people feel about the issues covered in the survey.
- Cognitive interviewing is a way to find out how respondents interpret the questions and what they mean by their answers. One way to do this is to ask respondents to “think out loud” as they answer the questions.
- Look at other surveys with a similar focus.
- Survey experts can review your survey and point out problems.
- Pretesting the survey
- Try out your survey on a small group of individuals from your survey population.
- Ask respondents in your pretest to talk about the problems they had taking the survey.
- In interviewer-administered surveys, ask the interviewers about the problems they had while administering the survey.
- Administering the survey
- Probes are follow-up questions that elicit additional information or clarify what the respondent said.
- There are many types of probes, including silence, repetition, repeating the question and response categories, and asking respondents for help in understanding their response.
- It’s critical to keep good records of each attempt to conduct an interview and to keep an accurate record of the ways in which the survey is carried out.
- Sometimes you can link your survey data to other administrative records. However, this raises a number of ethical and logistical questions.
- Processing the data includes coding open-ended responses, editing the data, data entry, data analysis, and writing reports (covered in Chapter 5—Volume II).
- You may want to make your survey data accessible to other researchers. There are a number of archives which may be willing to house your data. Regardless of your decision to archive your data, you will want to keep good documentation of the process by which you created the data.
- When an interview is administered by an interviewer, it’s essential for the interviewer to be a good listener. Being a good listener is something people can learn to do.
- There are several approaches to training interviewers for face-to-face and telephone surveys.
- Providing copies of the survey and skip patterns, questions interviewers might be asked, how to respond to refusals, and interviewing manuals.
- Practice interviews
- Survey participation
- Survey response rates have been declining for the last 35 to 40 years.
- This increases the possibility of nonresponse bias.
- Increasing the overall response rate does not necessarily decrease bias unless you increase the response rate for those who are underrepresented in the survey.
Annotated Bibliography
- Developing the survey
- Pretesting the survey
- Earl Babbie’s Survey Research Methods30 has a good discussion of pretesting.
- Jean Converse and Stanley Presser’s Survey Questions: Hand-crafting the Standardized Questionnaire31 is another excellent discussion of pretesting.
- Gordon Willis32 has an excellent review of the various ways you can pretest your survey.
- Administering the survey
There are a number of very good books on how to do various types of surveys. Here are some excellent sources.
- Don Dillman’s series of four books on survey research
- Patricia Gwartney—The Telephone Interviewer’s Handbook 37
- Mick Couper—Designing Effective Web Surveys 38
- An excellent discussion of how to be a good listener is Raymond Gorden’s Basic Interviewing Skills.39
- Interviewer Training
Here are some good references on training interviewers.
These are excellent discussions of nonresponse, nonresponse bias, and increasing response.
- Data Analysis
Data analysis is beyond the scope of this book, but here are some excellent references.
- Earl Babbie—The Practice of Social Research 45
- Jane Miller—The Chicago Guide to Writing about Multivariate Analysis 46
- Your favorite statistics book. If you don’t have a favorite statistics book, take a look at Social Statistics for a Diverse Society 47 by Chava Frankfort-Nachmias and Anna Leon-Guerrero and Working with Sample Data 48 by Priscilla Chaffe-Stengel and Donald N. Stengel.
iThis list is not meant to be exhaustive. It is only meant to give examples.