Surveys / Quantitative Research


To gather data on a specific area or concept. Quantitative research gives you a measure of attitudes, opinions and behaviours across a larger sample, but not the underlying reasons for them, i.e. it tells you the “what” but not the “why”.

When To Run This

You want to understand broad trends of attitude or behaviour.


~1 hour


  • A significant number of people willing to answer your survey - see point #4 below for more info


  • Host platform



  • Decide if a survey is right for your research. What is it that you really want to find out? 

    1. Will a survey help you with this problem?
    2. How does the survey answer the problem?
    3. Who are the right people to ask?
    4. How can you reach them?
    5. What do you need to make them understand your questions?
    6. Which (if any) statistical methods do you need to measure the data?
  • Prepare your questionnaire. If you decide to go ahead, structure your questions (see the section below for tips), and host it on an intuitive, easy-to-use online platform, such as SurveyMonkey, Google Forms or Typeform.

  • Decide how many responses you need for your data to be statistically significant. Discuss this with your client and adjust as you go. The more precision you want, the bigger the sample size you’ll need.
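As a rough starting point for that discussion, the standard sample-size formula for estimating a proportion can be sketched in Python. The 95% confidence level, ±5% margin of error and p = 0.5 below are illustrative defaults, not values from this guide - agree the real targets with your client.

```python
import math

def sample_size(margin_of_error=0.05, confidence_z=1.96, p=0.5):
    """Minimum responses needed to estimate a proportion.

    Standard formula: n = z^2 * p * (1 - p) / e^2.
    p = 0.5 is the most conservative (largest-sample) assumption.
    """
    n = confidence_z ** 2 * p * (1 - p) / margin_of_error ** 2
    return math.ceil(n)

# 95% confidence, +/-5% margin of error -> 385 respondents
print(sample_size())
# Tightening the margin to +/-3% roughly triples the sample
print(sample_size(margin_of_error=0.03))
```

This illustrates the guide’s point that more precision means a bigger sample: halving the margin of error roughly quadruples the required responses.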

  • Find people to do your questionnaire. If you have specific user criteria, recruitment can be done through an external agency. If it’s a general subject that everyone can relate to, you can recruit through the DAN network or on your social media networks, e.g. Facebook and LinkedIn.

  • Test your survey on a small group to make sure there are no issues, and refine it based on the responses and feedback.

Action It:

  • Send the survey out. Keep an eye on the response count to make sure the survey is working. 

  • Update the questions and answers if necessary. If you have a feedback field in your survey, such as an open field for text comments, read a few of them just in case respondents bring up areas where they felt there was no suitable option as a response. Update the survey as required.

Next Steps:

  • Analyse your data. Read the responses, measure your results against the key objectives of your questionnaire, and draw your conclusions.
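For a multiple-choice question, the analysis step above can start as a simple tally. A minimal sketch in Python - the question wording and responses below are hypothetical, not data from this guide:

```python
from collections import Counter

# Hypothetical answers to "How satisfied or dissatisfied are you
# with this process?" (a neutral, non-leading question)
responses = [
    "Satisfied", "Very satisfied", "Neutral", "Satisfied",
    "Dissatisfied", "Satisfied", "Neutral", "Very satisfied",
]

counts = Counter(responses)   # tally each answer option
total = len(responses)

# Report counts and percentages, most common first
for answer, n in counts.most_common():
    print(f"{answer}: {n} ({n / total:.0%})")
```

Comparing these percentages against the key objectives of your questionnaire is where the conclusions come from; the tally itself is only the first step.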



  • Start with questions that are easy to answer - people are less likely to stop once they have already put some work into answering the survey.
  • Be clear on the questions you want answers to. Use simple language (no jargon).
  • Avoid biases introduced by the question itself. For example: 
    • Biased: “Are you satisfied?”
    • Neutral: “How satisfied or dissatisfied are you with this process?”
  • You can use aided and unaided questions 
    • Open field - e.g. “Which brands of chocolate do you know?”, followed by an empty box.

      Pros: respondents might include more information, including feelings, attitudes and their ideas of the subject. This helps researchers to better understand the respondents' true feelings on an issue.

      Cons: it takes more time for respondents to answer and might produce a lot of unnecessary data.  
    • Multiple choice - e.g. “Which of the following brands of chocolate do you know?”, followed by a list of brands.

      Pros: This is versatile, intuitive, and produces clean data that’s easy to analyse.

      Cons: the results are limited to the choices you provide, and if your answers aren’t comprehensive, your results might be biased. 
  • You can use the following types of fields to display your answers:
    • Radio button - for single choice. Selections are listed and only one selection can be made.
    • Checkbox - for multiple choices. Selections are listed and more than one selection can be made.
    • Short/long answer - these fields are open and you can set a maximum number of characters to limit responses. 
    • Matrix - multiple questions presented on a grid. They’re generally easy to interpret (and answer) since the scales and answer options stay the same across all items. 
  • Allow for neutral and “N/A” responses. When you force a respondent to give an answer, it can pollute your results with non-responses masquerading as real answers.


It's Going Well When:

A variety of responses are coming through

Watch Out For:

  • Respondents who answer mostly neutral - consider excluding that participant’s data from the final analysis, as they may not have engaged with the questions
  • Outlier data - consider excluding the data point so it doesn’t skew the average
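Both checks above can be automated once responses are in. A minimal sketch, assuming a hypothetical 1-5 rating question - the data and the straight-lining rule are illustrative, not prescribed by this guide:

```python
import statistics

# Hypothetical 1-5 ratings; 100 is clearly an entry error / outlier
ratings = [4, 5, 3, 4, 5, 4, 100]

mean_all = statistics.mean(ratings)      # pulled far upward by the outlier
median_all = statistics.median(ratings)  # robust to the outlier

# Drop values outside the valid 1-5 scale before averaging
valid = [r for r in ratings if 1 <= r <= 5]
mean_valid = statistics.mean(valid)

# Flag a respondent who answered "Neutral" (3) to every question -
# a possible non-response masquerading as real answers
respondent = [3, 3, 3, 3, 3]
straight_lined = len(set(respondent)) == 1
```

Reporting the median alongside the mean, or excluding out-of-range values, keeps a single bad data point from distorting the overall reading.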