Creating Surveys
Tips for creating surveys
Need to get information from your users?
Conducting a survey can be a quick and easy way to get data about your users. Creating a great survey is like designing a great user experience: both are a waste of time and money if the audience is not the focus of your process. It’s also easy to create a survey that lies to you, or skews your data with bias. Designing for your user helps you gather more useful and reliable information.
What is a survey, and why use one?
Surveys consist of a set of questions used to gather a participant’s information relevant to a topic. That information can include their attitudes, preferences, or opinions on the topic. As a research method, surveys allow you to quantify concepts.
Surveys provide information that helps you understand your users and design better products. They help mitigate the risk of designing the wrong, or a poor, solution for users. Surveys also give stakeholders confidence that a solution is, or will be, effective.
Surveys can be an effective method of identifying:
- Who your users are
- What your users want or think
- What they do
- How they do it
- Where they do it
- When they do it
Though they take some up-front effort, surveys can help you save time and capture data from a wider group.
Online surveying itself can be cost effective, but it’s good practice to offer an incentive to those who participate. It shows that you value people’s time and responses, while encouraging more to participate. Gift cards often work well, but keep that in mind as you plan out budgets and scale.
If you’re looking for richer information, such as the how and why behind something, consider user interviews or observation sessions. While planning your research for a project, figure out what you need to know and select the methods that will best get you that information. Most survey respondents tend not to type out detailed, explanatory responses. Because of this, open-ended questions can be more difficult to ask effectively in a survey.
Quantitative or Qualitative: What kind of survey do you need?
The type of questions you ask will have everything to do with the kind of analysis you can make. If you won’t act on the data, don’t ask the question.
Surveys are most effective at gathering quantitative information. Use quantitative surveys when you need data that applies broadly across a large number of people, and when your questions can be answered with checkboxes or radio buttons.
Qualitative surveys ask open-ended questions to find out deeper information, sometimes in preparation for doing quantitative surveys. They are good for comments, feedback, suggestions, and other kinds of responses that aren’t as easily classified and tallied as numbers can be. You can survey fewer people than in a quantitative survey and get rich data.
It’s possible to mix the two kinds of surveys, and it’s especially useful to do small, primarily qualitative surveys first to help you generate good answers to count later in a bigger survey.
Types of Questions
In closed questions, respondents have a fixed number of possible responses to choose from. For example: yes/no, multiple choice, checkboxes, or scale questions. In open questions, they can respond however they want, such as essays or short answers.
So when should you use each question type?
Open questions will give you much better qualitative data. If you need insight into how users think about a problem, open questions will give you a lot more detail. They allow for responses that you may not have accounted for if you’d used a closed question instead. Respondents will also be happy that you’ve given them an opportunity to express themselves.
On the downside, qualitative responses tend to take a lot longer to analyze, for the very reason that they can provide so much detail. If you’re expecting a lot of responses, be aware that including open questions can take a ton of work to analyze. Also, since the respondents will have to type out responses, this can lower your response rate. This is especially true for mobile users.
Closed questions tend to have higher response rates. It’s also much easier to analyze closed questions statistically. This can make them very useful when you’re trying to quantify things, such as how many of your users are interested in a given feature you’re proposing.
When using multiple choice questions, include an “Other, please specify” option. This can help you capture answers you might not have accounted for when writing your survey. These answers can help when you need to revise or revisit a survey to capture new or updated data. Also provide a “Not applicable” or “Don’t use” option. It helps prevent people from skipping questions or giving fake answers. People get angry when asked questions they can’t answer, and it skews your data if they try to answer anyway.
Writing your survey
Keep it simple. Keep it short.
Make sure every question is worth it. Every extra question reduces your response rate, decreases validity, and makes all your results suspect. Ask only the necessary questions to get data about what you’re trying to learn. Don’t include unnecessary questions.
SurveyMonkey found people tend to spend less time on each question the longer a survey is. For surveys longer than 7–8 minutes, completion rates dropped by 5% to 20%.
Use simple and direct language. People read more text when there is less of it. You are not your audience: avoid jargon, advanced concepts, and abbreviations, and include clear and concise explanations when you can’t. By avoiding ambiguity, your questions mean the same thing to everybody, so your data will be cleaner.
Avoid double barreled/compound questions. These are questions where you’re asking two things at once:
Would you like to buy a new phone and laptop?
vs.
Would you like to buy a new phone?
Would you like to buy a new laptop?
Make sure your questions are only asking one thing at a time. Double negative questions should also be avoided.
Keep a logical flow to your questions. Group like questions together and order them logically to make them easier and faster to answer.
Focus on behavior, not intent. People are poor predictors of what they would do, so asking what they actually do yields more reliable answers.
Only ask applicable questions. Don’t annoy people by asking questions that don’t apply to them. When respondents choose a particular answer, show them one or two follow-up questions that apply in that case. Choose a survey platform that supports conditional questions: they display particular questions based on earlier responses, and they keep the list of questions as short as possible for each respondent.
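The conditional-question logic above can be sketched in a few lines. This is a minimal illustration, not any survey platform’s API; the question text and the `follow_ups` mapping are hypothetical examples.

```python
# Minimal sketch of conditional (branching) survey questions.
# The questions and the follow_ups mapping are hypothetical.

def next_questions(answer, follow_ups):
    """Return the follow-up questions to show for a given answer."""
    # Only surface follow-ups that apply to this response, keeping the
    # survey as short as possible for each respondent.
    return follow_ups.get(answer, [])

follow_ups = {
    "Yes": ["Which feature do you use most?", "How often do you use it?"],
    "No": [],  # respondents who answered "No" skip the feature questions
}

print(next_questions("Yes", follow_ups))
```

Respondents who answer “No” (or anything unexpected) simply see no extra questions, which is the behavior a conditional-question platform gives you out of the box.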
Minimizing Bias
After putting in the work to craft your survey, you want the data you collect to be as accurate as possible. It’s important that you write your questions in a way that doesn’t bias your respondents.
Biased questions produce biased data. There are a few ways you can minimize and avoid bias when structuring your survey:
Avoid leading questions. A leading question is one that nudges the respondent towards the answer that you’re looking for. You don’t want to imply an answer through the phrasing of a question.
“How helpful is our app?” (Leading)
vs.
“Do you think our app is helpful?” (Good)
Avoid priming responses. Priming is when questions or concepts earlier in a survey influence responses to later questions. Say your survey asks respondents to rank a series of customer support options you offer. In the next question, you ask:
“Does our platform offer enough support options?”
By listing all of the support options first, you may have primed the respondent. They may be more likely to answer that you do offer enough support options, since you just listed a bunch of them, even if they don’t necessarily think that.
Be mindful of order. How you order your answers, especially in long lists, impacts responses. Logical groupings, randomized lists, and short lists work better than long, alphabetical lists. Ordering issues can skew your data, so try alternative list orderings when you test your survey. Be mindful of what you put as the first option in a list.
Because people scan instead of read, the first words of items in lists can cause them to overlook the right choice. Many people choose the first thing that sounds like it might be right and go to the next question. Items at the top and bottom of lists may attract more attention than items in the middle of long lists.
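One common way to counter order effects is to shuffle answer options per respondent while keeping catch-all choices at the bottom. A minimal sketch, with hypothetical option names:

```python
import random

def randomized_options(options, rng=None):
    """Shuffle answer options per respondent, but keep catch-all
    choices (e.g. "Not applicable") pinned to the bottom of the list."""
    pinned = {"Other, please specify", "Not applicable"}
    shuffled = [o for o in options if o not in pinned]
    rest = [o for o in options if o in pinned]
    (rng or random.Random()).shuffle(shuffled)
    return shuffled + rest

# Hypothetical support-channel options for one respondent:
options = ["Email", "Phone", "Live chat", "Help center", "Not applicable"]
print(randomized_options(options, rng=random.Random(42)))
```

Each respondent sees the substantive options in a different order, so no single option benefits from always being first, while “Not applicable” stays where people expect to find it.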
Balance your scales. How you choose your words can influence how people answer. When using scales, you want to avoid using wording that skews people more towards either end. Consider the following scale:
Extremely positive
Very positive
Somewhat positive
Not positive
The wording of this scale skews toward positive: there are three positive options and only one negative option, and even the negative option uses the word “positive.” When using scales, also avoid options that overlap in their answers.
Use neutral wording. You don’t want to imply particular answers or give away your expectations. You can also unintentionally confront people with loaded questions. Avoid writing in a way that forces the respondent into an answer that doesn’t reflect their opinions. It can throw off respondents and is one of the leading contributors to respondents abandoning surveys.
Test your survey
Test your survey before sending or releasing it. Even a quick pilot with a colleague or someone else from your organization helps. Don’t go deep into the survey’s background; provide only the information potential participants would have. Give your testers clear direction for the type of feedback you are looking for. Possible prompts include:
- Are there any questions that didn’t make sense to you?
- Are there any questions you couldn’t answer or were missing the answer you wanted to provide?
Testing helps make sure you have appropriate options available for your questions. It can help you find and remove bias in your wording or structure, and confirm you’re asking the right questions in a way that is easily understandable.
Distributing your survey
You’ve written great questions. You’ve structured your survey to encourage people to complete it. You have incentives lined up to encourage participation. There are still things to be mindful of when you send it out. Be aware of how you distribute your survey and who it reaches.
When running a survey, you’re trying to get a sample (the people who complete your survey) of the population you’re trying to measure (e.g. your potential customers, target persona, or user group). Selection bias is when the sample is not representative of the population. If this occurs, the conclusions drawn from your survey might not represent the population accurately. What can cause selection bias? Here are a couple of common causes:
Undercoverage. This is when a segment of the population isn’t properly represented. If you’re trying to learn about store shopping habits and only ask people at a mall, you miss people who don’t like shopping at malls, people who are restricted from shopping at malls, and people who shop elsewhere. Asking only people who are connected to the mall can also skew your responses.
Nonresponse bias. This happens when the people who choose to complete your survey are meaningfully different from the people who choose not to. You can’t assume nonrespondents would have answered the same way as respondents. There are a number of ways to check for and correct nonresponse bias; following up with the people who didn’t respond is a good place to start.
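One standard correction for a skewed sample is to weight each group’s responses by how under- or over-represented it is. A minimal sketch, assuming you know (or can estimate) each group’s share of the real population; the group names and shares here are hypothetical:

```python
def response_weights(population_share, sample_share):
    """Weight each group's answers by how under- or over-represented
    it is in the sample relative to the population."""
    return {g: population_share[g] / sample_share[g] for g in population_share}

# Hypothetical shares: mobile users are 60% of the population but only
# 40% of respondents, so their answers are up-weighted.
population = {"mobile": 0.6, "desktop": 0.4}
sample = {"mobile": 0.4, "desktop": 0.6}
weights = response_weights(population, sample)
print(weights)
```

Multiplying each response by its group’s weight before tallying makes the weighted sample mirror the population, which is the simplest form of post-stratification.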
Tying it together
It’s not easy to write good surveys, but they’re a useful UX tool for gathering input during the design process. Focus on creating a good experience for your participants by writing clear, appropriate questions. An effective survey takes thought and effort to get right, and it produces the best results. The key lessons outlined in this article are:
- Have an objective in mind
- Know what information you need to gather from your survey
- Ask the right types of questions for what you’re trying to learn
- Write clear and simple questions
- Make the survey short and sweet
- Avoid introducing bias
- Test and refine your survey
Keep it short, keep it simple. Keep your participants and users in mind when writing your survey, and remember to engage with your audience.