Guidelines for Developing & Conducting Surveys

Request a consultation with ODS through our online Data Request Form as early in the planning process as possible.

Planning Stage

Determine the deadline for reporting the data the survey will collect. The complete survey process usually takes weeks to months from the first step to the final report. Creating a timeline that works backward from the finish date to the first task will help ensure that you complete the survey report on time.
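
As a quick illustration, here is a minimal sketch (in Python) of working backward from a reporting deadline; the task names, durations, and deadline are hypothetical placeholders for your own milestones and estimates:

```python
from datetime import date, timedelta

# Hypothetical tasks and durations (in weeks), listed from last to first;
# substitute your own milestones and estimates.
tasks = [
    ("Write and distribute the report", 2),
    ("Analyze the data", 3),
    ("Collect responses", 4),
    ("Design and pilot the form", 3),
    ("Plan and consult with ODS", 2),
]

deadline = date(2025, 12, 1)  # example reporting deadline

end = deadline
for name, weeks in tasks:  # walk backward from the deadline
    start = end - timedelta(weeks=weeks)
    print(f"{name}: {start} to {end}")
    end = start

print(f"Latest date you can start: {end}")
```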

Identify the questions you want the data to answer. Identify the overall reasons for conducting the survey. What do you want to learn? What will you know after you have conducted the survey? Decide whether this is a one-time or recurring survey.

Identify the data you need to answer those questions. Do you need performance data, opinions, comparisons based on some criteria, or something else? The data you need will form the foundation of the survey form. An ODS consultation can help you decide whether the survey will require reviews by the IRB, Title IX staff, legal staff, and/or MCO at CNM.

Identify the people from whom you will gather the data. Responses from which group of people will provide the most useful data? Surveying students to determine their satisfaction with job placement services, for example, is useful only if the students you survey have used the services of the job placement office. Also, if your questions require data from people who meet specific characteristics, this is the step in which you identify those characteristics (students with 0-9 hours, students with 10-15 hours, students majoring in X, etc.). If your survey is part of a research project with participants who are under 18, IRB approval is required.

Determine when to conduct the survey. In general, the information you are collecting determines when you conduct the survey. If you are mainly interested in students' expectations of a course, for example, conduct the survey at the beginning of the course. If, on the other hand, you are seeking information about students' experience of a course or their use of services, the survey will provide the most useful results at the end of the semester or after the use of the service. If you are measuring impact or change resulting from an experience, you may wish to survey the selected group both before and after the experience.

Design Stage

Design the methodology for conducting the survey. This is the step where you decide the procedures for conducting the survey: the number of people you will survey, how you will survey them (by phone, in class, with a mailed form, or on the internet), and so on. Additional methodological issues to consider are whether to make follow-up contacts, how to prepare the forms for data entry, which software to use if the survey is web-based (Microsoft Forms, SurveyMonkey, etc.), and who might be left out.

Many people are interested in web-based surveys as an alternative to more traditional methods such as phone, in-class, or mailed forms. Web-based surveys do have a number of advantages, including time and cost savings and increased flexibility. They are, however, best suited to groups of people who are comfortable using computers and have internet access. Web-based surveys also tend to have lower response rates than more traditional methods, so give extra thought to creating incentives for participants to take the survey. What will you do to maximize the response rate?

Consult with MCO. For external or institution-wide surveys, developers should consult with MCO for style reviews, CNM branding (unless using a pre-approved survey template), and distribution of surveys via mass emails. If applicable, create a schedule for MCO to announce the survey and send reminders for participation.

Design and produce the survey form. Creating a useful survey form requires careful thought and skillful application of some basic rules. Keep in mind that a survey form should be as brief as possible (aim for no more than one side of a single page) and should create as little frustration as possible, to increase the likelihood that it will be completed and returned. The aim is to help the people you are surveying give you the information you need in a usable form. Include only those questions that are important to the current study; for example, don't ask for "age" if it is not pertinent to answering your outcomes question.

Consider accessibility issues in the survey design and wording of questions. For instance, do not rely on hover-only interactions in the text, and make sure the survey can be operated with the keyboard alone for those without a mouse. Rename URL links with descriptive names for participants using screen readers. Images should include equivalent alternative text (alt text) in the markup/code. Make paper copies available on request, and make Braille copies available (contact CNM's Accessibility Services for help). For additional recommendations, see W3C Accessibility.
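
Automated checks can catch some of these issues before manual testing. The sketch below assumes the survey renders as HTML that you can save locally and uses the third-party beautifulsoup4 package; the file name and the list of vague link phrases are hypothetical. It supplements, rather than replaces, keyboard and screen-reader testing:

```python
from bs4 import BeautifulSoup  # pip install beautifulsoup4

VAGUE_LINK_TEXT = {"click here", "here", "link", "more"}

# "survey.html" is a hypothetical saved copy of the rendered survey page.
with open("survey.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f, "html.parser")

# Flag images with a missing or empty alt attribute.
for img in soup.find_all("img"):
    if not img.get("alt"):
        print("Image missing alt text:", img.get("src"))

# Flag links whose visible text is empty or non-descriptive.
for a in soup.find_all("a"):
    text = a.get_text(strip=True).lower()
    if not text or text in VAGUE_LINK_TEXT:
        print("Non-descriptive link:", a.get("href"))
```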

Start with the most important questions, because participants do not always finish surveys. Choose the first questions with care: they should "hook" the reader into answering the rest of the survey.

Ask questions in a logical order. If using a printed survey, avoid "contingency" questions, where checking "yes" to one question sends the respondent to another set of questions elsewhere. They are confusing and tend to lower the number of completed survey forms returned. For electronic surveys, you can add contingency questions using skip logic, which automatically routes respondents through the question order you specify.
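
Survey tools such as Microsoft Forms and SurveyMonkey configure skip logic in their editors rather than in code, but a small sketch can make the idea concrete. The question IDs, wording, and routing below are hypothetical:

```python
# Each question has an ID; the routing table decides what comes next.
questions = {
    "Q1": "Have you used the job placement office? (yes/no)",
    "Q2": "How satisfied were you with the service?",
    "Q3": "What kept you from using the service?",
    "Q4": "Any other comments?",
}

# (question, answer) -> next question; (question, None) is the default route.
skip_logic = {
    ("Q1", "yes"): "Q2",  # users of the service rate their satisfaction
    ("Q1", "no"): "Q3",   # non-users skip straight to the barriers question
    ("Q2", None): "Q4",
    ("Q3", None): "Q4",
}

current = "Q1"
while current:
    answer = input(questions[current] + " ").strip().lower()
    # Look up an answer-specific route first, then the default; None ends.
    current = skip_logic.get((current, answer), skip_logic.get((current, None)))
```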

Construct response categories carefully. Response categories must be mutually exclusive and allow for all possible responses, yet not be too long. If you are asking students how much time they spend studying, you would want to include "never" as well as "X hours every day", but you would not want to list every possible number of hours in a day. Instead, provide categories of hours within the day, such as "1-3 hours per day", "4-6 hours per day", etc.
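
As a small illustration, the function below (in Python, with hypothetical category boundaries) maps a reported number of daily study hours into categories that do not overlap and that cover every possible answer:

```python
def hours_category(hours: int) -> str:
    """Map reported daily study hours to a survey response category."""
    if hours == 0:
        return "Never"
    if hours <= 3:
        return "1-3 hours per day"
    if hours <= 6:
        return "4-6 hours per day"
    return "7 or more hours per day"  # open-ended top category keeps the list short

# Every possible answer lands in exactly one category.
for h in [0, 2, 4, 8]:
    print(h, "->", hours_category(h))
```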

Avoid double-barreled questions. A double-barreled question asks about two or more separate issues or topics but allows only one answer. A double-barreled question is also known as a compound question or double-direct question. For example: "How satisfied are you with your pay and work hours?" This may confuse respondents, who may want to rate each item separately. Additionally, when doing analysis, you cannot separate out the differences between the two items.

Consider the Following Benefits and Challenges of Different Question Types

  • Closed-ended questions (such as Likert scales, multiple choice, checkboxes, and yes/no) pose answer choices that make analysis easy, as the choices are simple to code and categorize. While closed-ended response choices can clarify the question, they can also be perceived as leading. Closed-ended questions may frustrate respondents if their answer is not among the choices or if there is no way to explain an additional or more complex response.
  • Open-ended questions allow respondents to provide specific, detailed, and more in-depth opinions and feelings in their own voices. Analysis of responses to open-ended questions takes more time than for closed-ended questions and requires specific (qualitative) methodology to understand and describe the additional insights provided by the participants.
  • Combination-type questions bring together the strengths and challenges of both of the previous question types, allowing respondents to choose an answer option and also provide additional information if desired. Providing an "Other" open-ended choice along with defined answer choices is an often-used strategy.

Pilot test the form. When possible, have multiple reviewers who are similar to (or familiar with) those you plan to survey complete the form and give you feedback, then make improvements accordingly. Are the directions clear? Are the questions easy to understand? Does the format invite responses? How long did it take them to complete the form? Did your test respondents provide the types of responses you expect (in other words, did they understand the meaning of the question as you intended it to be understood)? Were accessibility issues addressed?

Begin each survey with an introduction that clearly explains:

  • the reason for the survey
  • that participation is strictly voluntary
  • the level of confidentiality of responses (anonymous: no identifying information is collected; confidential: no identification is released to anyone except the analyst)
  • notification that academic or contact information may be subject to FERPA regulations (if applicable)
  • how the data will be protected (who will see the raw data and/or analysis?)
  • how the data/ responses will be used
  • whether the results or analysis will be published or distributed
  • contact information if respondents have questions or concerns

Example of Introductory Paragraph

Within (program name), we value and respect our students' educational experience and want to offer the highest-quality programs. To inform program development, we would like to take a minute of your time to ask about the experiences that led you to (name) programs. This survey is voluntary, and you may opt out of participating at any time by closing the survey. If you choose to participate, your responses will be anonymous and will be used only to improve our programs or for writing grants. You must be 18 years or older to participate. If you have any questions or concerns about the survey, please email us.

Conducting the Survey

Will the survey be administered by ODS or another department? If MCO will be sending out mass emails or announcing the survey in News Link, have you: 1) sent the survey link to MCO, and 2) confirmed the schedule of announcements and reminders MCO will use to invite participation in the survey?

Distribute survey forms, or send out invitations for web-based surveys, as outlined in your methodology (see the Design Stage above). As they are returned, track the number completed. For web-based surveys, use the software application's tools for extracting and analyzing survey responses. As the deadline for returning the survey form approaches, decide whether you will send a reminder (if this is not being done by MCO). Reminders can be as simple as an email message with the link to the survey or a postcard reading "Did you submit your completed survey form yet?", or more involved, perhaps another copy of the survey form with a reminder note, or even a phone call to people who have not returned the form, requesting that they respond over the phone.
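
The sketch below shows one way to track returns and build a reminder list; the respondent IDs and addresses are hypothetical, and it assumes a confidential (not anonymous) survey in which only the analyst can link IDs to people. Web-survey tools typically provide this bookkeeping for you:

```python
# Everyone invited, keyed by a respondent ID only the analyst can link to a person.
invited = {
    "r001": "student1@example.edu",
    "r002": "student2@example.edu",
    "r003": "student3@example.edu",
}
completed = {"r002"}  # IDs of forms returned so far

print(f"Response rate so far: {len(completed) / len(invited):.0%}")

# Build the reminder list from invitees who have not yet responded.
for rid, email in invited.items():
    if rid not in completed:
        print(f"Remind {email}: Did you submit your completed survey form yet?")
```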

Analysis and Reporting 

Analyze the data. There are several software packages available for analyzing data. ODS can provide consulting services to assist you with your data analysis on an as-available basis.
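
As one example among many, the sketch below uses the third-party pandas package to tabulate a single closed-ended item; the column name and responses are hypothetical:

```python
import pandas as pd  # pip install pandas

# Hypothetical responses to one closed-ended satisfaction question.
responses = pd.DataFrame({
    "satisfaction": ["Satisfied", "Very satisfied", "Satisfied",
                     "Dissatisfied", "Satisfied", "Very satisfied"],
})

# Frequency counts and percentages: the usual first pass at closed-ended items.
counts = responses["satisfaction"].value_counts()
percents = counts.div(counts.sum()).mul(100).round(1)
print(pd.DataFrame({"n": counts, "%": percents}))
```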

Disseminate the findings. The final task of conducting a survey is to communicate the findings clearly and accurately so they can be used for making decisions. Your report should include a meaningful title. "Student Survey Results" says little; "Factors Related to Student Attrition" says much more.

To orient the reader to your report, include the purpose of the study and how the survey was conducted (the methodology used). Consider how you will maintain confidentiality or anonymity of the survey participants. Provide a summary of your results, including any tables or charts displaying data. And finally, draw your conclusions and make recommendations based on your findings.

For every survey, in addition to a full report, consider producing a one-page summary of findings with the following sections:

  1. What did we want to know?
  2. What did we learn?
  3. What do we do with the data next? What actions do we want to take based on the results of the survey?