10 Things I Try to Find Out About Surveys

I like taking surveys. Not because I prefer being a random data point that helps tip the scale on statistical significance, or because I enjoy answering how likely I am to recommend the product to my friends and colleagues (on a 0-10 scale, no less), but because I see every survey as a puzzle begging to be solved. Here is a list of what I try to find out about surveys:

  1. What decision is the marketer trying to make? I am not interested in those surveys that are simply collecting data for the sake of it or selectively seeking information to add data lipstick to something they are already doing.
  2. Is the data actionable? For instance, for pricing decisions, are they asking only about attitudinal willingness to pay?
  3. Is each question necessary or could they have figured out the answers without some of the questions they are asking? For instance, “How much are you paying for Microsoft services?”
  4. Have they done the necessary qualitative research and not simply cut-n-pasted a template survey?
  5. Are they finding all the information they will need with the survey? For instance, what use is there in asking about school preferences if the survey does not ask whether the respondent is a decision maker for the child?
  6. How likely is it that their questions will confuse other respondents? For instance, giving options like “Never”, “Rarely”, “Seldom”, and “Occasionally” all for just one question.
  7. Are they sampling the right target population?
  8. Is the survey designed to find psychographic segments and not just demographics?
  9. What kind of cross-tabs and regressions will they be running on the data, and how reliable will those be? (See the sketch after this list.)
  10. Finally, are they trying to solve too many decision problems with just one survey?
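
To make item 9 concrete, here is a minimal sketch of the kind of cross-tab and regression a marketer might run on the resulting data. Everything in it is hypothetical: the column names, the segments, the sample size, and the numbers are invented purely to illustrate why the analysis plan should be settled before the questionnaire goes out.

    import numpy as np
    import pandas as pd

    # Hypothetical responses: a coarse psychographic segment, the price point
    # each respondent saw, and a 0-10 recommend rating. All values are invented.
    rng = np.random.default_rng(0)
    n = 200
    df = pd.DataFrame({
        "segment": rng.choice(["price-sensitive", "convenience", "enthusiast"], size=n),
        "price_seen": rng.choice([9.99, 14.99, 19.99], size=n),
        "recommend_0_10": rng.integers(0, 11, size=n),
    })

    # Cross-tab: mean rating by segment and price point.
    print(pd.crosstab(df["segment"], df["price_seen"],
                      values=df["recommend_0_10"], aggfunc="mean").round(1))

    # Simple regression of rating on price (ordinary least squares via numpy).
    # With 200 responses spread across nine cells, the cell sizes (not the
    # question wording) often decide how reliable these estimates turn out to be.
    X = np.column_stack([np.ones(n), df["price_seen"]])
    coef, *_ = np.linalg.lstsq(X, df["recommend_0_10"], rcond=None)
    print(f"intercept={coef[0]:.2f}, slope per dollar={coef[1]:.3f}")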

What is your take on surveys?

4 thoughts on “10 Things I Try to Find Out About Surveys”

  1. Robert
    I am with you on this. If the numbers do not help solve a decision problem they are not relevant. Besides, the question itself could be worded wrong or in a biased way to elicit a more favorable response.
    I remember reading (not sure how true it is, or the exact book name) that Ogilvy found the flaw in asking respondents “Have you read the book Gone with the Wind?” and changed it to “Do you plan to read Gone with the Wind?”.
    There is also the case of stated preferences vs. revealed preferences.
    The net is you need to push back on how it is relevant to the problem at hand.

    That said, when used correctly, rating questions do help.
    Sometimes the likelihood rating could be used as a stand-in for computing metrics like price elasticity (e.g., how stated demand changes at different price points). Other times the corresponding numeric rating could be used in regression analysis to predict other customer preferences.
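
    A minimal sketch of that elasticity idea, assuming a hypothetical study where each respondent saw one randomly assigned price and gave a 0-10 purchase-likelihood rating (every value below is invented for illustration):

        import numpy as np

        # Hypothetical stated-preference data: one randomly assigned price per
        # respondent and a 0-10 likelihood-to-buy rating. All values are made up.
        rng = np.random.default_rng(1)
        prices = rng.choice([8.0, 10.0, 12.0, 15.0], size=300)
        likelihood = np.clip(10 - 0.5 * prices + rng.normal(0, 1.5, size=300), 0, 10)

        # Use the mean rating at each price point as a rough proxy for demand,
        # then fit log(demand) on log(price); the slope approximates elasticity.
        # Stated likelihood is not revealed behavior, so treat it as directional.
        price_points = np.unique(prices)
        mean_rating = np.array([likelihood[prices == p].mean() for p in price_points])
        slope, intercept = np.polyfit(np.log(price_points), np.log(mean_rating), 1)
        print(f"stated price elasticity is roughly {slope:.2f}")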


  2. Hey Rags, I work in advertising (you may already know that).
    Why do some clients give me “stats” like: 98% of respondents say they are “somewhat likely” to “very likely” to agree that such and such does blah, blah, blah?
    This is from people who’ve presumably earned MBAs and are working at Fortune 500 companies, and they combine their numbers in ways that make them meaningless. I’m just a guy who thinks for a living. Why don’t they?

