So you ran a survey, received a great response rate, and found that a great percentage of people prefer your product. Or you see a study quoting a major survey from a renowned market research firm that finds, “customers are willing to pay a price premium for great customer service”. We can take these results at face value and act on them, or we can ask some key questions to uncover the errors in the way the survey was conducted or the data was analyzed.
In today’s WSJ, Carl Bialik, who writes The Numbers Guy column, points out some of the common yet not-so-easy-to-recognize errors in conducting surveys and interpreting their results.
- Leaving Out Key Groups: While researchers take care to find a representative sample of people, the survey population may limit or omit a few key groups, which can skew the results. The worst form of this is using a “convenience sample”: surveying only those who are available to us rather than the target population.
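To see how a convenience sample skews an estimate, here is a minimal simulation. All group sizes and preference rates are invented for illustration: the easy-to-reach subgroup prefers the product far more than the population as a whole, so surveying only that subgroup inflates the estimate.

```python
import random

random.seed(0)

# Hypothetical population: the easy-to-reach subgroup (e.g. an existing
# mailing list) prefers the product at ~60%, the rest at ~15%.
# Every number here is made up for illustration.
population = (
    [("reachable", random.random() < 0.60) for _ in range(2000)]
    + [("hard_to_reach", random.random() < 0.15) for _ in range(8000)]
)

def preference_rate(sample):
    """Fraction of respondents in the sample who prefer the product."""
    return sum(prefers for _, prefers in sample) / len(sample)

true_rate = preference_rate(population)

# Convenience sample: only the easy-to-reach group responds.
reachable = [p for p in population if p[0] == "reachable"]
biased_rate = preference_rate(random.sample(reachable, 500))

# Representative sample drawn at random from the whole population.
fair_rate = preference_rate(random.sample(population, 500))

print(f"true rate:        {true_rate:.2f}")
print(f"convenience rate: {biased_rate:.2f}")
print(f"random rate:      {fair_rate:.2f}")
```

The random sample lands near the true rate, while the convenience sample overshoots it badly, even though both samples are the same size.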
- Respondent Honesty: There are inherent challenges in getting respondents to answer a survey honestly. Be it a sensitive subject or a simple intention to purchase, we tend to mask our responses or give the answers we think the survey taker wants to hear. The problem is worse when the survey is administered in person or over the telephone.
- Losing Segmentation Differences: There may not be enough representation of sub-groups to detect any segmentation differences. On the other hand, even when there are enough samples, ignoring segmentation differences and treating the data only in aggregate may produce a result completely different from what each segment shows.
- Hidden Variables: This is the flip side of the point above. Responses to a question could show a statistically significant difference between two segments. For example, women may state a higher willingness to pay for Green products than men, but that difference may be driven by hidden variables the survey did not account for.
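A minimal sketch of how a hidden variable can drive a difference like the one above (all counts invented for illustration). Here a hypothetical buyer-frequency segment is the hidden variable: men report higher willingness to pay within every segment, yet the aggregate shows women higher, because women are concentrated in the high-willingness segment. This reversal is known as Simpson’s paradox.

```python
# Hypothetical survey counts: segment -> group -> (willing, respondents).
# All numbers are invented for illustration.
data = {
    "frequent_buyers":   {"women": (72, 90), "men": (9, 10)},
    "occasional_buyers": {"women": (2, 10),  "men": (27, 90)},
}

def rate(pair):
    """Willingness-to-pay rate from a (willing, respondents) pair."""
    willing, total = pair
    return willing / total

# Per-segment rates: men are higher in each segment.
for segment, groups in data.items():
    print(segment, {g: round(rate(v), 2) for g, v in groups.items()})

# Aggregate the counts across segments: the comparison reverses.
agg = {}
for groups in data.values():
    for g, (w, t) in groups.items():
        aw, at = agg.get(g, (0, 0))
        agg[g] = (aw + w, at + t)

print("aggregate", {g: round(rate(v), 2) for g, v in agg.items()})
```

A survey that only reports the aggregate would conclude women are more willing to pay, when the real driver is which segment a respondent belongs to.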
- Not Asking the Right Question: The survey simply may not have asked the right question, be it using the vernacular of the target population or phrasing questions unambiguously.
- Not Seeking Data for All Hypotheses: The survey may narrowly focus on one hypothesis and seek only data that will prove or disprove it. Data can fit any number of hypotheses; before designing the survey, all of those hypotheses must be surfaced and included in it. For example, a WSJ survey of parents about their children’s performance and WSJ subscription may fail to ask about other things the children do, or about the parents’ education and involvement.
Tags: Customer Metric, Hypothesis