Learn how to design a great survey, from planning to asking the right questions.
People get surveys all the time. Why should they take yours? And how do you know if their answers are accurate and thoughtful?
You can’t guarantee that everyone who gets your survey will take it seriously (or take it at all). But you can minimize bias and increase your response rate with survey best practices.
What are survey design best practices? They’re guidelines for optimizing a survey for better engagement and reliable results. In this guide, you’ll learn how to ask great questions, fix common survey errors, and make your survey shine.
Simply put, you conduct surveys to make informed decisions. For example, you might collect event feedback from people who attended your recent conference. Did they find the speakers engaging? Did they like the food? Did they have a good overall experience? Their feedback can then inform planning for your next event, ensuring that it will meet their preferences and expectations.
Yet the answers you get are only as strong as your survey. If your survey is unorganized, hard to read, or has many errors, you might see “satisficing.” Satisficing is when people don’t put effort into answering your survey, skewing your data.
You can make your surveys more engaging and show you’re respectful of respondents’ time by following best practices for offline and online survey design.
Let’s say you want to use surveys to conduct market research, like measuring brand awareness to inform your advertising. You might be tempted to ask lots of questions. The more data, the better. Right?
The problem is, if your survey is a lot of work to fill out, people might be less inclined to complete it. Plus, if your survey seems unfocused or poorly organized, people might lose trust in you and abandon your survey completely.
What are best practices for survey design? Before you start writing survey questions, it’s important to set a goal for your survey.
To do this, ask yourself: What am I trying to learn or measure? Who should take my survey? What do I want to do with their answers?
You might end up with a survey goal such as:
To improve our advertising, I want to survey people aged 25 to 34 to see how familiar they are with our brand.
This will help you stay focused and plan your survey design, from the questions you ask to the question types you use.
Once you have a goal, you can get a better sense of the questions you need to ask. Your survey design plan should include objectives that go back to your goal. If your goal is to measure brand awareness within your target age demographic, you might have objectives to:
Remember to limit your number of objectives—these will help you choose which questions to ask. And if you have too many objectives, you’ll probably have too many questions for one survey.
You have a goal and objectives. Before crafting your survey questions, define what type of data you need to achieve those objectives. Determining whether you require qualitative or quantitative data will shape your entire question design process and ensure you gather information that effectively supports your goals.
Here’s an example of a qualitative survey question:
When you think of this product category, what brands come to mind?
You’re asking people to recall brand names on their own. What you’re collecting is more difficult to analyze, but it also may give you a better understanding of whether or not your brand is top of mind.
The insights you get will be different from those you’ll get from a quantitative survey question, or closed-ended question, which asks people to choose from a list of defined options. For example:
Which of the following brands have you heard of? (Select all that apply.)
The quantitative question takes much less effort to answer. But the data may not be as illuminating. That said, quantitative data is often easier to analyze, since it yields raw numbers and percentages.
It’s up to you to determine the balance of closed- and open-ended questions in your survey. Here’s what we generally recommend for open-ended questions:
If your survey questions confuse, mislead, or offend your audience, you might not get accurate answers. Or any answers at all. Here are a few tips for writing great survey questions.
Rank the following products from your favorite (1) to your least favorite (5).
Which of the following words would you use to describe our product? (Select all that apply.)
Which factors most influenced your recent purchase? (Select up to 3.)
As a researcher or survey creator, you have goals or hypotheses in mind. Unfortunately, it’s common for researcher bias to creep into surveys. Here are some of the most common types of survey question bias and how to avoid them.
A leading question is written in a way that influences survey responses. For example:
At Voltara, we take pride in our product quality. How satisfied are you with your most recent product purchase?
The first part of this question might affect how the respondent views their experience, leading them to answer more favorably.
A better way to ask this question would be to leave out the first sentence altogether and make sure to give people a range of answer options from “Not at all satisfied” to “Extremely satisfied.”
A loaded question assumes something about the respondents that might not be true. The following example of a loaded question isn’t necessarily a problem:
Which factors influenced your most recent purchase? (Select all that apply.)
But if someone who didn’t make a recent purchase is forced to answer the question, their answer will be inaccurate, skewing your data.
A double-barreled question asks people to give only one answer to two different questions. Here’s an example:
How satisfied are you with the price and quality of our product?
If someone chooses “satisfied,” what are they responding to? What if they’re happy with the quality but not the price? It will be challenging to understand from their answer.
Absolutes use words like “every,” “always,” and “all” in the question prompt. These might make the respondent agree with a strongly worded question without allowing for more nuanced opinions. Here’s an example:
Do you always make purchases online?
Your respondents might make online purchases most of the time, half of the time, or on occasion. The absolute nature of this question, including the yes/no answer options, won’t provide useful data.
What’s a sensitive survey question? It depends on who you ask. Generally, questions about religion or faith, ethnicity, race, gender, age, sexual orientation, and income are considered sensitive. And you need to ask these questions in the right way or risk losing your audience.
Here are a few tips for asking sensitive survey questions:
When you planned your survey, you determined the type of data or insights you need. For example, closed-ended and open-ended questions will give you different insights. Plus, there are lots of different survey question types, from sliders to dropdowns, that you’ll need to consider.
Yes, you should generally limit the number of open-ended questions on a survey. But a well-placed open textbox question can give a lot of meaning to your quantitative data.
Take a look at the Net Promoter® Score (NPS) question, which asks someone how likely it is that they’d recommend a company to others. NPS is one of the leading metrics companies use across industries to measure customer loyalty.
People are asked to choose from 0 (not at all likely) to 10 (extremely likely). Those who answer 9 or 10 count as promoters, and those who answer 0 through 6 count as detractors. Subtracting the percentage of detractors from the percentage of promoters yields one number between -100 and +100, which companies can track over time and compare to NPS industry benchmarks.
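As a quick illustration of how that -100 to +100 number comes out of 0–10 ratings, here’s a small Python sketch. The `nps` helper and the sample ratings are hypothetical examples for illustration, not any vendor’s implementation:

```python
def nps(scores):
    """Compute a Net Promoter Score from a list of 0-10 ratings.

    Promoters rate 9-10 and detractors rate 0-6; passives (7-8)
    count toward the total but not toward the score itself.
    """
    total = len(scores)
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    # Percentage of promoters minus percentage of detractors
    return round(100 * (promoters - detractors) / total)

# Hypothetical sample: 5 promoters, 3 passives, 2 detractors
print(nps([10, 9, 9, 10, 9, 8, 7, 8, 4, 6]))  # → 30
```

Note that the score can be negative: a survey with more detractors than promoters yields a number below zero, down to -100 if every respondent is a detractor.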
But a number can only tell you so much. That’s why it’s a good idea to ask people to explain their score with an open-ended question.
It will take more time to dig into the resulting qualitative data. But by taking a look, you can gain important insights into the “why” behind your numbers.
For example, maybe people who give you lower ratings mention a negative interaction with customer service, while those with more positive ratings are happy with your product quality. Now you know where to focus your improvements.
There are many types of multiple choice questions you can use for your survey. Here’s a quick overview of some multiple choice question types, including what to consider.
A multiple choice question can be as simple as asking someone to choose one option from a list. You might ask a demographic question like, “Which of the following best describes your current relationship status?” In this instance, you’d likely only allow someone to choose one answer option.
You can also allow someone to choose multiple answer options by enabling checkboxes. In that case, you’ll want to let people know that they can select more than one answer (e.g. “Select all that apply”).
Other formats, like dropdown questions, can be helpful if you’ve got lots of answer options but don’t want to overwhelm your respondents. For example, an age dropdown question can be much easier to read and use on a mobile device.
You can also use the ranking question type, which allows someone to rearrange answer choices in their order of preference.
Remember that ranking questions don’t indicate how much or how little someone likes an item. For example, someone might love the television shows “The Office” and “Friends,” and rank them one and two in a list of five options.
But they may feel neutral or really dislike the other shows on the list. All you know is that they ranked “The Office” first and “Arrested Development” last. Make sure to review the pros and cons of ranking questions before you use them in a survey.
Sometimes, it makes sense to format your question as a grid, or matrix survey question. For example, you might ask someone to rate how satisfied or dissatisfied they are with five aspects of your customer service.
Instead of asking someone five separate survey questions, you can have them respond to different statements in one survey question.
Of course, there are pros and cons to matrix questions. Matrix questions are susceptible to straightlining, which is when people choose the same response for each question without taking the time to consider their answers. Here are a few tips for writing matrix questions to keep in mind:
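If you do use matrix questions, it’s also worth checking collected responses for straightlining after the fact. Here’s a minimal, hypothetical Python sketch of such a data-quality check (the `is_straightliner` helper and the sample answers are illustrative only):

```python
def is_straightliner(responses):
    """Flag a respondent who gave the identical answer to every
    row of a matrix question -- a common sign of low-effort
    responding. A single-row response is never flagged."""
    return len(responses) > 1 and len(set(responses)) == 1

# Hypothetical answers to a 5-row matrix on a 1-5 scale
print(is_straightliner([3, 3, 3, 3, 3]))  # True  (suspicious)
print(is_straightliner([4, 3, 5, 2, 4]))  # False (varied answers)
```

In practice you might review flagged responses rather than discard them automatically, since a genuinely consistent respondent can also produce identical answers.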
Once you’ve figured out which types of survey questions you want to ask, you need to think about how you want people to answer.
For example, when it comes to multiple choice questions, how do you want people to rate their satisfaction? They could choose from 1 to 10, select a smiley face, or rate you with a number of stars. Or you could ask them to choose from a worded list, from very satisfied to very dissatisfied.
Let’s say you show a website feedback survey to people who just purchased from you online. You want to understand how easy it was for them to find what they were looking for on your site.
Now you’ve got some options. Do you ask them to rate their experience from 1 (not at all easy) to 5 (extremely easy)? Or do you remove numbers altogether and give them worded answer options, from “not at all easy” to “extremely easy”?
When it comes to numbered vs worded lists, here’s what to consider:
Whether you go for words, numbers, or even symbols, you have to pick the right number of answer options for your survey. Here’s how.
Sometimes, for the sake of brevity and clarity, you might use yes/no or agree/disagree survey questions. When you give someone only two options to choose from, they have to take a stance.
For example, if you ask an employee to answer yes or no to a series of statements about their experience, it might be easy to analyze the data.
But two answer options remove the possibility of a neutral answer option or any sort of nuance. How many degrees of “yes” or “yes, but” could get lost when someone has to choose from only two answer options? Think about your survey goal, then decide whether two answer options will get you the most helpful insights.
For more nuance, you should provide more than two answer options. The Likert scale is one of the most widely-used methods for measuring feelings, behaviors, or opinions on a scale.
Not all rating scales include a neutral answer option like “Neither agree nor disagree.” But if you’re going to provide a neutral option, it’s generally accepted that 5- and 7-point scales are a good way to do it.
Of course, it’s up to you how useful different degrees are. The five answer options for the question, “I am satisfied with the culture of my workplace,” might be enough. If you needed a more granular breakdown of employee sentiment, you might provide even more answer options.
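To see why scale granularity matters for analysis, here’s a short Python sketch of how 5-point Likert responses are commonly coded as numbers and summarized. The labels and the `SCALE` mapping below follow one common convention and are illustrative, not a fixed standard:

```python
# One common numeric coding for a 5-point agreement scale,
# applied here to the hypothetical statement
# "I am satisfied with the culture of my workplace."
SCALE = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neither agree nor disagree": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

# Hypothetical responses from four employees
answers = ["Agree", "Strongly agree", "Neither agree nor disagree", "Agree"]
codes = [SCALE[a] for a in answers]
mean = sum(codes) / len(codes)
print(f"Mean score: {mean:.2f}")  # → Mean score: 4.00
```

A 7-point scale works the same way with two extra codes; the finer scale only pays off if respondents can meaningfully distinguish the extra degrees.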
You have lots of questions, and it might be tempting to ask as many as you can in a survey. But covering too many topics and asking too much of your respondents may cause them to abandon your survey or rush through your survey, giving you inaccurate answers.
In fact, our recent research shows that surveys are getting shorter on average, with 53% of surveys containing five or fewer questions per page.
Of course, context matters. If you’re running a mandatory, yearly employee self-evaluation, you can worry less about survey length and more about ensuring the questions are clear and unbiased.
Question order matters because it primes your survey respondents. Priming is when you influence or prepare someone to answer questions a certain way. Here are some question order best practices:
Many times, people take your survey because they want to, not because they have to. For example, you send a patient satisfaction survey to people who recently got treatment at your medical center.
Patients don’t need to tell you about their experience. But with the right survey introduction, you could compel them to give you feedback because it’ll help you make improvements that are important to them.
In other instances, like employee surveys or training surveys, people are required to respond. But if you’re running market research or trying to reach a particular demographic, you’re asking respondents to do you a favor. That’s where incentives come in.
A survey incentive is what you offer someone to take your survey. It could be as simple as a small gift card for their time or entry into a raffle. Here’s what to consider:
Before you send your survey, you want to make sure it’s polished, professional, and accurate. Otherwise people might think it’s suspicious or have trouble submitting their answers. Here are some tips for getting it right before you send.
You might want to add your brand or logo depending on the type of survey you’re running. For example, if you’re surveying customers about a recent purchase, a visually striking survey might capture their attention and be more engaging.
You can also embed your survey into your email or website, making it easy for people to take it and giving it a professional look. Here are a few other tips for improving the visual design of your survey:
Once you polish your survey design, it’s time to review and proofread it. Here’s how:
Don’t have time to read up on all of our survey best practices? Here are 10 ways to make sure you get reliable results from your survey or online form design.
Answer these questions before you start your survey: Why are you running this survey? Who are you sending it to? What are you going to do with the results? This will keep your survey focused and actionable.
Long survey questions? Break them up into shorter sentences. Use straightforward, simple language. Avoid jargon, technical language, and acronyms for general audiences.
Include any special instructions. For example, if you want respondents to choose multiple answers, say, “Select all that apply.”
We recommend sending a one-page survey with 10 or fewer questions. Your survey participants will thank you (and probably give you more useful feedback).
Most of your questions should be closed-ended, meaning people can choose from a list of answers. Limit your survey to one or two open-ended (textbox) questions.
Take a look at your question language. Are you leading someone to answer in a certain way? Are you assuming something about your survey participants that might not be true? If you’re not sure, ask someone with a neutral perspective to review your survey before you send it out.
You can also enable question, page, and answer order randomization in SurveyMonkey (if applicable to your survey).
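If you’re curious what answer-order randomization does under the hood, here’s a minimal, hypothetical Python sketch (not SurveyMonkey’s implementation). One common practice it illustrates is keeping an anchor option like “None of the above” fixed at the end while shuffling the rest:

```python
import random

def randomize_options(options, anchor_last=None):
    """Shuffle answer options to reduce order bias. If anchor_last
    is given, that option is held fixed at the end of the list
    (e.g. "None of the above"), as is common practice."""
    shuffled = [o for o in options if o != anchor_last]
    random.shuffle(shuffled)
    if anchor_last is not None:
        shuffled.append(anchor_last)
    return shuffled

options = ["Price", "Quality", "Brand reputation", "None of the above"]
print(randomize_options(options, anchor_last="None of the above"))
```

Each respondent then sees the substantive options in a different order, so no single option benefits from always appearing first.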
Avoid double-barreled questions, which ask for feedback on two topics in one question. And make sure your answer choices don’t overlap, which will skew your data.
Sometimes you need to collect demographic information, like age or gender identity. Or you need to ask questions about sensitive topics. When you’re not sure how to ask, rely on pre-written sensitive questions designed by research experts.
If you collect health information, enable HIPAA-compliance and let people know their information is protected.
In most cases, when you send your survey, let people know:
If you need lots of survey responses, consider using incentives. This is especially helpful for market research or customer feedback, where people might not be as motivated to respond. Popular incentives include gift cards, sweepstakes entries, or promised donations.
Survey ready to go? Preview your survey, taking it as if you were a survey respondent. Look out for survey writing errors, logic issues, inconsistent rating scales, or basically anything that’ll hurt your data accuracy.
It’s easy to overlook mistakes and bias in your own surveys. Share your survey with others so they can test it out too.
The customer feedback survey is one of the most popular survey types. Are you asking the right questions? Check out these top 5 questions you should ask customers. And get to know how to run an effective customer service survey.
Think about every single interaction a customer, or potential customer, has with your brand: Advertising, website purchases, customer service, loyalty programs, and more. Then, make targeted improvements to your customer experience strategy by collecting customer feedback at those key touchpoints.
Before you write a survey, check out customizable survey templates that are designed to get you reliable data. This customer experience survey template will help you learn more about your customers and track customer sentiment.
Net Promoter Score (NPS) surveys are a versatile way to track customer loyalty across the customer journey. Here’s how.
Want to improve your customer satisfaction surveys? Check out this comprehensive guide on customer satisfaction survey best practices. Here are some other helpful resources:
Companies and institutions of every size should care about how their employees feel. Check out the different types of employee surveys designed to measure and improve all aspects of the employee experience.
For example, regularly run employee satisfaction surveys to measure employee happiness. To collect candid feedback from your employees, enable anonymous responses. You may also want to let employees know that the data will be viewed in aggregate and that individual responses can’t identify them.
Make sure you’re providing a great candidate experience to attract top talent and boost your employer brand. Send a candidate experience survey to anyone who interviews at your company or interacts with your recruiters. Ask questions about the effectiveness of your communication, fairness of the interview process, and more.
How enthusiastic and committed are your employees? The answers have a direct impact on your overall employee satisfaction and loyalty. Here are some tips for using employee engagement surveys:
Organizations use pulse surveys to get real-time insights from employees. These are different from more formal, regularly scheduled employee surveys like performance reviews. Here’s how they work:
When an employee leaves your company, take the opportunity to ask for their feedback. Send them an exit interview survey to find out about their experience and where you could improve.
Only survey people who are leaving your company voluntarily. (Use a separate process for those who are laid off or let go.) And make sure to protect their confidentiality so they feel comfortable giving honest feedback.
Whether you’re writing a survey or questionnaire, consider your audience. As you can see from these survey best practices, there’s not one right way to run a survey. For example, student survey questions will be different from market research surveys. Here are a few more survey best practices and resources by survey type.
How easy is it for people to navigate your product? What’s their experience like? What problems are they having? What do they wish they could change? Will your new product or service meet their needs?
These are just a few questions you can answer with a user experience (UX) survey. If you’re looking to generate new ideas, you can ask more open-ended questions. But if you’re testing a product, give respondents more closed-ended questions that’ll help you make a final decision.
If you’re in charge of fundraising or run a nonprofit, use nonprofit surveys to boost your donations. Here are a few tips:
Net Promoter, Net Promoter Score, and NPS are trademarks of Satmetrix Systems, Inc., Bain & Company, Inc., and Fred Reichheld.