If you were to ask anyone on our Support Team what we recommend you do before fielding, everyone would say the same thing: “TEST!” Test your survey, then test it again. Then maybe test it several more times for good measure. Nothing saddens our fellow researcher hearts more than seeing a problem come into support that could have been easily avoided with thorough testing. Discovering these errors after you have already fielded your survey is costly: you might have accidentally terminated respondents who qualified, or your custom code might not have worked and the data for all your respondents was never saved. Respondents are expensive, and so are live survey mistakes. There are five key things we recommend testing before fielding your survey.
1. Survey look and feel (spelling and display)
You want to make sure your survey looks good and reads well. Spelling errors happen to the best of us and are harder to notice after we have spent weeks on the same survey. Comb through your text to catch sneaky grammar or spelling mistakes, and have a fresh pair of eyes check it as well. Ask a colleague to catch what you might have missed.
You will also want to check that your survey displays each screen correctly. Previewing the question in the survey building platform is not enough: each browser can behave differently with graphics, videos, and custom code. Checking how your survey looks on different screen sizes is also helpful in an era where we spend so much time on our phones. Sawtooth Software’s Lighthouse Studio platform has a handy “Test Mode” that lets you resize the survey screen to match small cellphones, tablets, desktops, etc. No additional software required!
2. Survey flow (survey link, skip logic, quotas, and survey termination)
Be sure to go through the survey using different approaches when you test. Test the actual links you will be sending your respondents. Make sure that every question a respondent should see appears, and that questions they do not qualify for are correctly skipped. Quota questions have skip logic for respondents who do not qualify: try setting the quota limit to a low number and confirm that you are properly disqualified. Do the survey redirect links or termination text appear correctly? You get the basic idea.
3. Survey questions are not ambiguous
You may understand what your survey is saying – but does your respondent? Will a respondent in one country read a question the same way as a respondent in another? Your survey data is only as good as your survey quality. Check with a colleague (preferably one who is not familiar with your survey's subject matter), or even a focus group if you have the time, to make sure you are asking the right questions in the right way.
4. Data is saving correctly
Whether or not you are using custom code (JavaScript, jQuery, HTML, CSS, etc.), looking over your latest test data for data-saving problems is always a good idea. Custom code is probably the most likely cause of missing data or other issues, but problems can arise with simpler surveys as well. Try running through the live survey as a fake respondent with predictable answers and check your data for each question. Are you assigned to the correct quota? Are any data inputs missing? A simple gut check can help you feel more confident that your survey is ready for real data collection.
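If your survey platform exports test data as a CSV, a few lines of scripting can flag blank cells automatically instead of eyeballing every column. This is only a sketch: the inline sample data and column names below are hypothetical, not the actual export format of any particular platform.

```python
import csv
import io

# Hypothetical export. In practice, open the CSV file your survey platform
# exports; these column names are examples only.
sample_export = """respondent_id,age,quota_cell,q1_rating,q2_choice
1,34,18-34,5,B
2,51,35-54,,A
3,29,18-34,4,
"""

reader = csv.DictReader(io.StringIO(sample_export))
problems = []
for row in reader:
    for column, value in row.items():
        if value == "":  # a blank cell means that answer was never saved
            problems.append((row["respondent_id"], column))

for respondent, column in problems:
    print(f"Respondent {respondent}: missing value in {column}")
```

Run against your real test export, this kind of check surfaces exactly which question failed to save, which is usually the fastest clue to a custom-code bug.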
(*From one researcher to another: do not forget to delete all your test data before collecting real responses! All sorts of problems can arise from test data mixed in with real data.)
5. Make sure you can answer your research questions
If you were an engineer building a bridge, you would not wait until halfway through construction to check that the bridge will reach the other side. A bridge that stops short fails its whole purpose! The same goes for survey building. Even if your spelling is perfect, your survey follows the correct flow, your respondents understand the questions, and your data saves correctly – your client will not be happy if the results do not answer their research questions.
Use your test data to run through the analysis. If you used random data, your results will be random, but they can still show you what your report or simulator may look like. If you want to go the extra mile, create your own data that demonstrates known preferences. You may even want to send your client a simulator with fake data to make sure the setup will meet their needs. Bonus: since your survey is not live yet, it is not too late to add or modify questions or change design settings. Modifying live surveys can be messy or even destructive and should be handled with caution.
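One way to build fake data with a known preference, rather than purely random answers, is to simulate respondents whose choices lean a set amount toward one option and then confirm your analysis recovers that lean. The choice task and the 70/30 split below are purely illustrative:

```python
import random

random.seed(42)  # fixed seed so the fake data is reproducible

# Hypothetical choice task: each fake respondent picks between options A and B.
# We bake in a 70% preference for A, so any sane analysis should recover
# a share for A near 0.70.
options = ["A", "B"]
fake_choices = [random.choices(options, weights=[0.7, 0.3])[0]
                for _ in range(500)]

share_a = fake_choices.count("A") / len(fake_choices)
print(f"Share choosing A: {share_a:.2f}")  # should land near the 0.70 we built in
```

If your report or simulator shows option A winning by roughly the margin you baked in, you have some evidence the analysis pipeline is wired up correctly before a single real respondent arrives.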
You will be more confident in your survey and potential survey results if you test these five things. If you do run into problems or questions, feel free to reach out to our Sawtooth Software Support Team. Best of luck on your research journey!
Christina is the manager of Client Services at Sawtooth Software. For more than four years, she has consulted with customers on their research and software usage. Christina has a background in psychology and business.