Key Considerations for Survey Development
What do you want to know?
- Before developing a survey, consider what you are trying to accomplish.
- What overarching question(s) guide your inquiry?
- Can the question(s) be answered by a group of relevant stakeholders (e.g., students, faculty, staff)?
- Consider whether or not the question(s) can be answered through other data sources.
- Institutional Research and Planning (IRP) – LITE
- Assessment Data Online Retrieval System
- Integrated Postsecondary Education Data System (IPEDS)
- National Survey of Student Engagement (NSSE)
- Survey of Earned Doctorates (SED)
Writing Survey Questions
There is an art to writing survey questions. Following are some key considerations:
- Resist the impulse to write specific questions until you have thought through your research questions.
- Write down your research questions and have a hard copy available when working on the questionnaire/survey.
- Every time you write a question, ask yourself “Why do I want to know this?” Answer it in terms of the way it will help you to answer your research questions. “It would be interesting to know” is not an acceptable answer.
Source: (Bradburn, Sudman, & Wansink, 2004)
- Keep your questions neutral.
- Don’t ask for two things at once. Avoid double-barreled questions.
- If the question has ‘and’ or ‘or’ in it, check carefully.
- Use language appropriate for your target population.
- If there is a need to use technical terms, define them in the survey.
Sources for Survey Questions
Ideas for specific survey questions can come from existing instruments, colleagues, members of the target population (collected via focus groups or interviews), and your observations.
There are a variety of ways to collect data in surveys. Following are some common forms.
Multiple choice questions involve two parts: a probe that identifies the question and a set of possible responses. Multiple choice questions fall into two broad categories:
Single vs. Multiple Answer: Some questions have mutually exclusive choices (only one answer can apply); others allow multiple response options to apply.
Scales: A specific type of single-answer question, where the response options vary in intensity on a measure (e.g., strongly agree, agree, disagree, strongly disagree).
Following are guidelines for developing question scales.
- Scales should be balanced – equal positive and negative options. A “neutral” midpoint is optional.
- Response categories should be comprehensive, including “not applicable” or “don’t know” where appropriate.
- Make sure the scale labels match the question. Example: If the question asks participants to rate their level of satisfaction, a dissatisfied – satisfied scale is most appropriate.
Open-ended survey questions ask respondents to write meaningful responses based on their knowledge, experience, and/or feelings. While these items often provide useful contextual information, the responses require considerable effort to analyze. If you use open-ended questions, it is recommended that you place them at the end of a section or at the end of the survey.
When writing open-ended questions, avoid phrasing that can be answered with a simple yes/no.
Building Your Survey
Georgia Tech has an enterprise license for Qualtrics, a cloud-based tool for building, distributing, and analyzing surveys.
It is recommended that surveys be “pre-tested” on a small number of subjects from the population of interest and revised on the basis of their feedback. In addition, allow time for surveys to be reviewed by relevant constituent groups within the Institute [e.g., the Institutional Review Board (IRB) and the Institute Survey Coordination Committee (ISCC)] where applicable.
A sample needs to be large enough to yield reliable estimates and to protect the privacy of respondents, and it should also be representative of the population under consideration. A representative sample is a subset of a population that reflects the characteristics of that population. To calculate the minimum sample size needed to be considered representative within a given margin of error above and below the sample statistic, visit:
Sample Size Calculator
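As a rough sketch of what such calculators compute, the snippet below uses Cochran’s formula with a finite population correction. The function name `min_sample_size` and its defaults (95% confidence, ±5% margin of error, and the conservative proportion p = 0.5) are illustrative assumptions, not values prescribed by this guide.

```python
import math

def min_sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Minimum representative sample size via Cochran's formula
    with a finite population correction.

    z=1.96 corresponds to 95% confidence; p=0.5 is the most
    conservative assumption about the population proportion.
    (Hypothetical helper for illustration.)
    """
    # Sample size for an effectively infinite population
    n0 = (z ** 2) * p * (1 - p) / (margin ** 2)
    # Correct for the finite size of the actual population
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# For a population of 5,000 (e.g., a student body), at 95%
# confidence and a ±5% margin of error:
print(min_sample_size(5000))  # → 357
```

Note that the required sample size grows slowly with population size, which is why surveys of very large populations still need only a few hundred respondents at these confidence levels.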
Unless you are incredibly fortunate, not everyone you invite will participate in your survey, and participation/response rates can be difficult to predict. Response rates vary widely depending on factors such as timing, population, and the focus of the survey. For internal surveys, expect a response rate of roughly 20–30%, depending on the population and nature of the survey. Following are a few considerations that can help improve response rates.
- Keep the survey short. It is helpful to include a progress bar to show participants how much of the survey remains.
- Tell potential respondents the nature of the survey and how the results will be used.
- Who the survey comes from can help boost response rates. Consider an authority figure or someone with name recognition (e.g., the President, Provost, or a Dean).
- The timing of the survey can impact the response rate. Consider times when your population of interest is generally available (e.g., lunchtime).
- Send reminders at strategic intervals to non-responders and partial-responders.
- The survey should be mobile friendly.
- Promote survey through relevant channels.
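Given an expected response rate, you can also work backwards from the sample size you need to the number of invitations to send. The sketch below assumes a uniform response rate across invitees; `invitations_needed` is an illustrative helper, not part of any Institute tooling.

```python
import math

def invitations_needed(target_completes, response_rate):
    """Estimate how many invitations to send to expect a given
    number of completed responses, assuming a uniform response
    rate. (Hypothetical helper for illustration.)
    """
    if not 0 < response_rate <= 1:
        raise ValueError("response_rate must be in (0, 1]")
    return math.ceil(target_completes / response_rate)

# To collect 357 completed responses at a 25% response rate:
print(invitations_needed(357, 0.25))  # → 1428
```

In practice it is safer to plan against the low end of the expected response-rate range, since over-inviting costs little while under-inviting can leave the sample short.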
The Georgia Institute of Technology Institutional Review Board (IRB) is charged with ensuring the Institute’s compliance with all federal regulations and internal policies and procedures governing the protection of human subjects in research. To learn more, visit https://researchintegrity.gatech.edu/irb/policies. Some research projects qualify for exempt status; to find out whether your research project qualifies, visit https://researchintegrity.gatech.edu/irb/central-irb.
Internal surveys such as Point of Service (POS) surveys and satisfaction/engagement surveys administered to the Institute’s internal stakeholders in support of institutional effectiveness initiatives usually do not require a formal IRB application. For more information, please contact the Office of Research Integrity Assurance IRB at email@example.com.