Masterclass in Survey Design: Best Practices for Survey Research

Last updated: 04 Dec 2024

Two colleagues, a woman and a man, planning their survey design on a whiteboard.

Survey design is key to effective survey research. In our recent webinar with Matt Hilburn, a marketing research statistician with over 15 years of experience, we delved into key principles and techniques that make a survey not just successful but invaluable to business decision-making. In this article, we'll explore these best practices. 

Defining the Research Problem and Objectives 

Before diving into survey design, the most crucial step is understanding the research problem you're addressing. This step is often skipped, leading to disjointed or ineffective surveys. According to Matt Hilburn: 

“Before writing any questions, we have to fully identify what the problem is. This is a frequently missed step.” 

Clearly defining the business problem helps streamline your survey research process by ensuring the questions you create can answer the business objectives. 

Start by conducting a problem audit, especially if the problem isn't well-defined by the stakeholders. A problem audit can involve: 

  • Interviews with key stakeholders. 
  • Review of previous research or secondary data. 
  • Understanding the history and context surrounding the issue. 

Once the problem is clear, the next step is to translate this into actionable objectives. These objectives guide the entire survey design process.  

For example, if the business objective is to understand why Gen Z customers are not engaging with a product, the research objective might be to identify the factors that influence Gen Z’s purchasing decisions. 

Clearly defined objectives help avoid scope creep—the tendency to add questions that aren't directly related to the problem you're solving. Keeping the survey focused on three to five objectives ensures that it remains concise and targeted. 


Developing a Research Plan 

Once the research problem and objectives are clear, the next crucial step in survey design is developing a comprehensive research plan. This document serves as the blueprint for your entire survey research project and ensures that all stakeholders are aligned. As Matt Hilburn emphasized during the webinar: 

"By having a documented research plan that has been approved by all the stakeholders, you can always reference it, ensuring the survey stays focused and on track." 

A good research plan outlines the background of the project, the objectives, and the specific approaches you'll take to gather the data. It ensures that there is a clear road map for how the data will be collected, analyzed, and used to address the problem. Here's what your research plan should include: 

  1. Background: Provide context around the research problem. Why is this study being conducted? What are the potential outcomes and how will they benefit the organization? 
  2. Objectives: List the business and research objectives clearly. For each objective, outline the type of data you will need to collect. 
  3. Methodology: Define whether the data collection will be quantitative (e.g., surveys) or qualitative (e.g., in-depth interviews (IDIs) or focus groups). Make sure the chosen method aligns with the problem you’re solving. 
  4. Sample Plan: This section identifies who will be taking the survey, where the respondents will come from, and the sample size needed. The sample plan ensures your data will be representative and collected with enough statistical precision to achieve reasonable confidence intervals or detect important differences. 

Since most surveys involve convenience sampling, it's essential to clarify how you will ensure your sample is representative of your target audience. Additionally, setting expectations early on about the required sample size helps avoid common issues related to statistical significance later in the process. 
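
To make the sample-size discussion concrete, here is a minimal Python sketch (an illustration, not something covered in the webinar) of the standard formula for sizing a sample to estimate a proportion. The defaults of 0.5 for the expected proportion and 1.96 for the z-score (95% confidence) are conventional planning assumptions, and the formula assumes simple random sampling, which convenience samples rarely achieve, so treat the result as a planning figure only.

```python
import math

def required_sample_size(margin_of_error=0.05, confidence_z=1.96,
                         expected_proportion=0.5, population=None):
    """Estimate completes needed to measure a proportion at a given margin of error.

    Uses the standard formula n = z^2 * p * (1 - p) / e^2, with an optional
    finite-population correction when the target population is small.
    """
    n0 = (confidence_z ** 2) * expected_proportion * (1 - expected_proportion) / margin_of_error ** 2
    if population is not None:
        n0 = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n0)

# Roughly 385 completes for +/-5% at 95% confidence; roughly 1,068 for +/-3%
print(required_sample_size(0.05), required_sample_size(0.03))
```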

A well-crafted research plan acts as a safety net for the project. If, at any point, someone wants to change the survey’s direction or add new questions, you can refer to this document to assess whether those changes are consistent with the original objectives. 

Crafting Survey Questions: A Step-by-Step Approach 

Once the problem is defined and the research plan is in place, it’s time to begin crafting the survey questions. While it may seem intuitive to jump straight into question-writing, Matt Hilburn cautioned against this approach: 

"Step one for a lot of people in writing a survey is writing questions, but that’s almost the last step. The first step is everything we've already discussed—defining the problem, setting objectives, and creating a research plan." 

In other words, writing survey questions comes after all the foundational work is complete. The key is to ensure that every question aligns with the research objectives. Here’s how to approach writing your questions: 

  1. Convert Objectives into Questions: 
    Each research objective should be translated into one or more survey questions. For example, if your objective is to understand brand awareness, the questions might include: 
    1. “When you think of [product category], which brands come to mind?” (Unaided awareness) 
    2. “How familiar are you with [specific brand]?” (Aided awareness) 
  2. Stick to 3-5 Objectives: 
    Surveys that attempt to cover too many objectives risk becoming overwhelming for respondents. A good rule of thumb is to limit your survey to three to five key objectives. This keeps the survey focused and concise, while still gathering the necessary information. 
  3. Avoid Scope Creep: 
    As you develop your questions, it can be tempting to add extra queries that seem interesting but don’t directly relate to the objectives. Resist the urge: if a question doesn’t tie back to one of your objectives, it doesn’t belong in the survey. 
  4. Create a Logical Flow: 
    Surveys should follow a logical progression, moving from general to specific questions. Start broad, with questions that warm up respondents, and gradually narrow down to more specific or technical items. This is often called the funnel approach. 
  5. Keep Questions Clear and Neutral: 
    Avoid common pitfalls such as double-barreled or leading questions. A double-barreled question asks about two things at once, like "How satisfied are you with our product’s quality and price?" These should be split into two separate questions. Similarly, leading questions (e.g., "Don’t you think our product is the best on the market?") can bias responses. Neutrality is key to obtaining accurate data. 

Writing effective questions is as much about strategy as it is about clarity. Always ensure that the language is simple and understandable to all respondents, avoiding industry jargon unless you are targeting a specialized audience. This ensures you collect high-quality data that directly informs your research objectives. 

Inside of a corrugated tube or funnel: Illustrating the funnel approach to survey design.

The Funnel Approach to Survey Structure 

The funnel approach is a proven method in survey design where the structure moves from broad, general questions to more specific and detailed ones. This strategy ensures that respondents are eased into the survey, reducing fatigue and keeping them engaged throughout. 

Key Components of the Funnel Approach: 

Introduction and Warm-Up Questions 

Start the survey with an introductory section that explains its purpose, estimated completion time, and any incentives being offered. This is also where respondents are reassured that their responses will be confidential or anonymous, if applicable. 

General to Specific Questions 

Following the introduction, include a few general questions that are easy to answer and get respondents into the flow of the survey. For example, if you’re conducting a survey on customer satisfaction, you might start with a question like, “How familiar are you with our brand?” 

Once respondents are engaged, you can start diving into more specific questions that align with your research objectives. The funnel approach ensures that respondents are ready to tackle more detailed questions because they’ve had time to think about the broader context.  

For instance: 

  • Begin with a general question like “How satisfied are you with your overall experience with our product?” 
  • Gradually introduce more specific questions like “How would you rate the quality of customer service you received during your last interaction?” 

Sensitive Questions and Demographics 

Any sensitive questions, such as personal income or demographics, should typically be saved for the end of the survey. By the time respondents reach these questions, they’ve already invested time in completing the survey and are more likely to provide accurate answers. 

However, there are exceptions. For instance, if demographic information is necessary for screening respondents (e.g., you need to ensure you’re only surveying a specific age group), those questions may need to come at the beginning of the survey.  

Screening and Filter Questions 

Incorporate screening questions at the start of the survey to ensure you are reaching the right audience. For example, if you're surveying only Gen Z customers, you could start with a question like, “What is your age range?” This ensures that respondents who don’t meet the criteria are filtered out early. 

By structuring the survey with a funnel approach, you create a natural flow that mirrors how people think. It reduces drop-off rates, increases data quality, and keeps respondents engaged from start to finish. 

Key Principles for Writing Survey Questions 

Effective survey design hinges on the quality of the questions. Poorly written questions can result in biased responses, low data quality, and ultimately, flawed insights. In the webinar, Matt Hilburn stressed the importance of ensuring each question serves a clear purpose and is directly linked to the research objectives: 

"It's really easy to say, ‘Let's ask this, let's ask that,’ but if those questions don't tie back to an objective, they don’t belong in the survey." 

Here are some key principles to follow when writing questions for your survey research: 

Avoid Double-Barreled Questions 

A double-barreled question asks about two things at once, which can confuse respondents and lead to unreliable data. For example: 

  1. Bad question: “How satisfied are you with our product’s quality and price?” 
  2. Better questions: “How satisfied are you with our product’s quality?” followed by “How satisfied are you with our product’s price?” 

Each question should focus on a single issue to ensure clarity and accuracy. 

Prevent Leading Questions 

Leading questions subtly push respondents toward a particular answer. This can bias results. For instance: 

  1. Leading question: “Don’t you think our product is the best on the market?” 
  2. Neutral question: “How would you rate our product compared to others on the market?” 

Keeping questions neutral ensures that you capture respondents' true feelings, rather than influencing them. 

Keep the Language Simple and Clear 

Surveys should be easy to understand. Avoid jargon or technical terms unless you are certain your audience is familiar with them. For example, instead of using industry-specific language, aim for clear, concise wording that is accessible to all respondents. This is especially important if you are surveying a general audience or consumers with varying levels of expertise. 

Use Scales Wisely 

When using rating scales, it’s important to consider both the number of points and the balance between positive and negative options. A common question in survey design is whether to use an even or odd number of points. Odd scales allow for a neutral middle point, while even scales force respondents to lean one way or the other. 

Matt suggested tailoring the scale to your specific needs: 

"If you’re conducting a regression analysis, you’ll want at least seven points on your scale. But if you’re just looking for a quick pulse, a three-point scale might be sufficient." 

Randomize Options to Avoid Bias 

Primacy and recency biases can influence how respondents answer questions. People tend to choose options that are listed first (primacy bias) or last (recency bias). Randomizing the order of response options can help mitigate this effect. 

Use Open-Ended Questions Sparingly 

Open-ended questions are valuable for gathering qualitative insights, but they can be time-consuming for respondents and difficult to analyze (though recent advances in AI are significantly reducing the cost of analyzing open-ended data). Use them sparingly and only when you need rich, detailed feedback. Matt recommends including at least one open-ended question as a data quality check: 

"An open-ended response is a great way to gauge data quality. If there’s gibberish in that response, it can be a red flag." 

By following these principles, you ensure that your questions are clear, unbiased, and directly related to your research objectives, which is essential for collecting reliable and actionable data. 


Survey Length and Engagement Strategies 

One of the most common challenges in survey design is balancing the length of the survey with the engagement of respondents. Surveys that are too long can lead to respondent fatigue, higher drop-off rates, and ultimately, lower data quality. However, surveys that are too short may not provide enough detail to address the research objectives.  

Matt Hilburn highlighted this balance during the webinar, emphasizing the importance of keeping surveys as short as possible while still collecting valuable data: 

"We should be targeting around 7 to 10 minutes for a survey, maybe 10 minutes max. People like you and I might enjoy taking surveys, but the average person doesn’t." 

Here are some strategies to help you maintain respondent engagement while optimizing survey length: 

  1. Focus on Core Objectives 
    Stick to the three to five main objectives you identified in your research plan. Adding unnecessary questions dilutes the survey’s focus and can frustrate respondents. By keeping the survey tightly aligned with your core objectives, you ensure that every question serves a purpose. This prevents “scope creep,” where additional, non-essential questions extend the survey unnecessarily. 
  2. Consider Split-Sampling 
    If you find that your survey is becoming too long, but all the questions are critical, consider using a split-sample approach (illustrated in the sketch after this list). This method randomly divides respondents into groups, with each group answering a subset of the questions. For example, if you have 40 questions, you might divide respondents so that Group A answers 20 questions and Group B answers the other 20. This reduces the time burden on each respondent while still gathering all the necessary data across the full sample. 
  3. Use Incentives to Boost Engagement 
    Offering incentives is a powerful way to encourage respondents to complete your survey, especially when the survey is longer or more complex. Incentives can include discounts, gift cards, or entry into a prize draw. However, it’s essential that the incentive is proportional to the time required to complete the survey. Incentives help reduce non-response bias, where only the most motivated respondents (e.g., those who are very happy or very unhappy with a product) complete the survey, which can skew the data and reduce its overall validity. 
  4. Eliminate Redundant or Repetitive Questions 
    During the question-writing process, you may find multiple questions addressing similar issues. While some overlap may be necessary for validation purposes, be mindful not to overdo it. Review your survey carefully to ensure that no question is redundant, and each one contributes uniquely to your research objectives. 
  5. Pre-Test and Time Your Survey 
    Pre-testing your survey with a small group is an invaluable way to gauge whether it’s too long or if the questions flow smoothly. During this process, time how long it takes respondents to complete the survey. If the average time exceeds your 7-10 minute goal, it may be time to cut a few questions or simplify complex ones. Remember that surveys taking less than five minutes may indicate missed opportunities to gather in-depth data, so aim for balance. 

By focusing on these strategies, you can ensure that your survey stays within an optimal time frame, keeping respondents engaged without sacrificing data quality. A well-designed survey will always strike the right balance between depth and efficiency, ensuring you gather actionable insights without overwhelming your participants. 

Two men at a desk working on a laptop: illustrating analyzing data for data quality.

Maximizing Data Quality in Survey Research 

Here are some key strategies to ensure the data you collect is both valid and reliable: 

Pre-Testing and Soft Launches 

Before launching your survey to a full audience, it’s essential to pre-test it with a small group of colleagues or target respondents. Pre-testing helps identify any confusing or poorly worded questions and allows you to make adjustments before the full launch. Matt suggests conducting a soft launch to about 10% of your sample size before the full rollout: 

"You’ll want to time it, review the responses, and see if there are any issues before sending it out to the entire group." 

This helps catch issues like faulty logic, broken skip patterns, or unanticipated respondent behavior, giving you a chance to fix them early. 

Use Quality Control Questions 

Including attention-check questions is a simple yet effective way to identify respondents who may be rushing through the survey. These are typically easy-to-answer questions that verify whether the respondent is paying attention, such as: 

  • "Please select ‘Strongly Agree’ for this question." 

If respondents fail these checks, it may indicate that they are not taking the survey seriously, allowing you to filter out their data before analysis. 
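
As a simple sketch of how failed checks might be filtered out before analysis, here is a hypothetical Python example; the question key and expected answer are placeholders that depend on how your survey tool exports its data.

```python
def split_by_attention_check(responses, check_question="attention_check",
                             expected_answer="Strongly Agree"):
    """Separate responses that passed an attention check from those that failed.

    `responses` is a list of dicts keyed by question id. The question id and
    expected answer here are hypothetical; adapt them to your survey export.
    """
    passed, failed = [], []
    for response in responses:
        if response.get(check_question) == expected_answer:
            passed.append(response)
        else:
            failed.append(response)
    return passed, failed
```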

Eliminate Responses from Speeders and Straight-Liners 

Speeders are respondents who complete the survey much faster than the average completion time, suggesting that they are not fully engaging with the questions. As a rule of thumb, Matt advises discarding responses for anyone who completes the survey in less than one-third of the median time. 

Straight-lining is another indicator of low-quality data. This occurs when a respondent selects the same answer for all items in a grid or matrix question, regardless of the content. While some straight-lining might occur naturally, extreme cases should be flagged for review. 
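
Both checks translate naturally into a data-cleaning step. Below is a minimal Python sketch, assuming each exported response record carries a completion time in seconds and the answers to one grid question; the field names are hypothetical and depend on your survey tool.

```python
import statistics

def flag_low_quality(responses, grid_fields, time_field="duration_seconds"):
    """Flag speeders (under one-third of the median completion time) and straight-liners.

    `responses` is a list of dicts; `grid_fields` lists the column names of one
    grid/matrix question. A respondent who gives the identical answer to every
    grid item is flagged as a straight-liner. Field names are hypothetical.
    """
    median_time = statistics.median(r[time_field] for r in responses)
    flagged = []
    for r in responses:
        is_speeder = r[time_field] < median_time / 3
        is_straight_liner = len({r[field] for field in grid_fields}) == 1
        if is_speeder or is_straight_liner:
            flagged.append(r)
    return flagged
```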

Monitor Open-Ended Responses 

Including at least one open-ended question in your survey serves as a quality check. Responses that consist of gibberish or unrelated text can signal poor engagement or even fraudulent responses. Matt suggests reviewing these responses to identify and remove any outliers: 

"An open-ended response is a great way to gauge data quality. If there’s gibberish in that response, it can be a red flag." 

Randomize and Rotate Options 

To reduce biases like primacy (favoring the first option) and recency (favoring the last option), consider randomizing or rotating the order of response options. This ensures that no single option benefits from its position on the list, leading to more reliable data. 

Quota Controls and Sample Monitoring 

When using panel providers or targeting specific demographics, it's important to set quotas and closely monitor the sample. For example, if you're aiming for an even split between age groups, you may need to cap the number of respondents in a certain category once the quota is met. This ensures that your final dataset is representative of the population you wish to study. 
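
The quota check itself is simple logic: admit a respondent only while their group's quota is still open. Here is a minimal sketch of that rule (hypothetical; real panel providers and survey platforms manage quotas automatically).

```python
def admit_respondent(age_group, current_counts, quotas):
    """Return True if the respondent's age group is still below its quota.

    `current_counts` and `quotas` map group labels to counts; once a group's
    quota is met, further respondents from that group are screened out.
    """
    return current_counts.get(age_group, 0) < quotas.get(age_group, 0)

quotas = {"18-24": 250, "25-34": 250, "35-44": 250, "45+": 250}
counts = {"18-24": 250, "25-34": 198}
print(admit_respondent("18-24", counts, quotas))  # False: quota met, screen out
print(admit_respondent("25-34", counts, quotas))  # True: still open
```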

By implementing these techniques, you can improve the overall integrity of your survey research and ensure that the insights you derive are valid, reliable, and actionable. 

Best Practices for Mobile-Friendly Surveys 

In today’s mobile-first world, ensuring your survey design is optimized for mobile devices is no longer optional—it’s essential. With a significant portion of respondents completing surveys on smartphones or tablets, creating a mobile-friendly survey can greatly improve response rates and data quality. Matt Hilburn emphasized the importance of this during the webinar, noting: 

"More and more respondents are using mobile devices to take surveys, so you have to design with mobile in mind from the start." 

Here are some practical tips to ensure your survey is mobile-optimized: 

Keep Questions Concise 

Mobile users tend to have shorter attention spans, so it's important to keep your questions clear, concise, and to the point. Long-winded questions can be difficult to read and may cause respondents to abandon the survey. 

Best Practice: Limit questions to one or two sentences whenever possible. Avoid unnecessary words or complex language that could confuse respondents. For instance, instead of asking, "To what extent do you feel that our product meets your needs?" try "How well does our product meet your needs?" 

Use Simple and Short Answer Choices 

When designing questions with multiple-choice answers, ensure the response options are easy to read and select on a small screen. Large grids or long lists of options can overwhelm mobile users and lead to poor data quality. 

Best Practice: Use radio buttons or dropdowns for single-choice questions and checkboxes for multiple-choice questions. Keep the number of answer choices to a minimum, ideally four to six. If you have more than six, consider breaking them into smaller, more manageable sets of questions. 

Optimize for Touchscreens 

Mobile users will be interacting with your survey via a touchscreen, so it’s crucial that your survey is easy to navigate. Buttons and interactive elements that are too small can frustrate respondents, leading to a higher drop-off rate. 

Best Practice: Ensure that all clickable elements—buttons, radio buttons, checkboxes, and dropdowns—are large enough to be easily tapped on a mobile screen. A good rule of thumb is to make these elements at least 44px by 44px in size to accommodate fingers of all sizes. 

Test on Multiple Devices 

Before launching your survey, it’s important to test it on different devices (smartphones, tablets, and desktops) to ensure it renders correctly across all screen sizes. Some surveys may look fine on a desktop but become unreadable on a smaller mobile screen. 

Best Practice: Conduct internal tests on multiple devices (iOS and Android) and operating systems. Look for any display issues, such as questions running off the screen, text overlapping, or buttons that are too small to click. Fix any problems before sending the survey to your audience. 

Limit the Use of Grids and Matrix Questions 

While grids and matrix questions (where respondents rate multiple items at once) are useful for gathering a lot of data quickly, they are not ideal for mobile devices. These types of questions often require horizontal scrolling, which can frustrate mobile users. 

Best Practice: Instead of using a single grid or matrix question, break it down into multiple, simpler questions. This makes the survey easier to navigate on a mobile device and reduces cognitive load for respondents. 

Consider the Total Length of the Survey 

Survey length is especially important on mobile devices. If a survey is too long, mobile respondents may abandon it before completion. Given the smaller screen size and the need for simplicity, it’s crucial to minimize the number of questions while still capturing the data you need. 

Best Practice: Aim for a survey length of around 5-7 minutes when targeting mobile users. Remove any non-essential questions and consider whether certain sections of the survey can be broken into a separate survey at a later time. 

Use Mobile-Specific Question Types 

Some survey tools offer mobile-specific question types designed to improve the user experience on smaller screens. These might include slider scales, drop-down menus, or tappable images. 

Best Practice: Where possible, incorporate mobile-friendly question types like sliders, which allow respondents to answer questions by dragging their finger across the screen. These interactive elements not only improve the experience but can also keep respondents engaged. 

By keeping these mobile-first principles in mind, you can ensure that your survey is accessible, user-friendly, and engaging for respondents, no matter what device they’re using. Optimizing your survey research for mobile helps you capture a more representative sample and reduces the likelihood of respondents dropping out mid-survey. 

Final Quality Checks  

Once your survey is designed, structured, and optimized, the final step in the survey design process is conducting thorough quality checks. Quality assurance ensures that the survey functions as intended, that all logic and routing work properly, and that you will collect valid, actionable data. 

Matt Hilburn stressed the importance of these final checks during the webinar: 

"Once the survey is launched, there’s no turning back. It’s crucial to check everything beforehand to avoid mistakes that could affect the quality of your data." 

Here are some key quality checks and best practices to follow before launching your survey: 

  1. Internal Review and Testing 
    Have colleagues from different departments review the survey on multiple devices to catch errors or usability issues. This ensures clarity, functionality, and readability across platforms. 
  2. Pre-Test with Target Audience 
    Test the survey with a small group of respondents from your target audience to identify confusing questions or navigation issues. Conduct live feedback sessions to pinpoint problems in real-time. 
  3. Soft Launch for Data Validation 
    Send the survey to 10% of your sample first to catch any issues with logic or data quality before a full rollout. Review responses for inconsistencies or technical errors and adjust as needed. 
  4. Quota and Sampling Accuracy 
    Use tools that automatically manage quotas and monitor sample diversity to ensure you’re reaching the right demographics. Stop collecting data for groups once quotas are met. 
  5. Logic and Routing Check 
    Test all skip patterns and display logic to confirm that respondents only see questions relevant to their answers. Walk through every possible respondent path to catch logic errors. 
  6. Align with Analysis Plan 
    Double-check that every question aligns with your analysis plan and contributes to your research objectives. Eliminate questions that don’t directly serve a purpose in your final analysis. 
  7. Run Final Spell and Grammar Check 
    Review the survey for typos, unclear instructions, or inconsistencies in language. While perfect grammar isn’t necessary, clarity is essential to avoid confusion. 

These steps ensure your survey design is polished, logical, and ready to yield high-quality data. Taking the time to verify every element prevents costly mistakes and sets the stage for actionable insights. 

Conclusion: The Path to Successful Survey Design 

Effective survey design requires a thoughtful, structured approach that prioritizes clarity, focus, and data quality. From defining the research problem and developing a clear plan to crafting questions and optimizing for mobile devices, every step contributes to the success of your survey research project. By adhering to the best practices outlined in this article, you’ll be well-equipped to create surveys that not only engage respondents but also yield the high-quality data necessary for making informed decisions. 

As Matt Hilburn reminded us during the webinar: 

"A well-designed survey leads to actionable insights that drive business decisions. Don’t skip the critical steps—take the time to get it right, and the data will speak for itself."