Creating a useful customer survey is no easy task, but it’s worth pursuing. Few other forms of feedback allow you to gather such a large volume of data so quickly on any set of questions.

While some of our other favorite ways to gather customer feedback focus on active listening during one-on-one sessions with customers, customer surveys provide an opportunity to poll your users on questions that might otherwise go unanswered.

But surveys inherently have a few serious problems, and these issues are only compounded when you create survey questions without a game plan.

Today we’ll look at some proven ways to turn your surveys into a reliable source of insightful customer information.

10 customer survey tips for collecting genuine insights

Customer surveys have a few glaring problems, but that doesn’t mean they should be written off as a source of useful information.

Not surprisingly, most of these problems revolve around getting accurate answers from respondents:

  • No matter what you do, research has shown that there will always be a small minority of people who will lie on your survey, especially when the questions pertain to the three B’s: behavior, beliefs or belonging. (Here’s a review of this topic from Cornell University.)
  • Furthermore, sometimes people will give inaccurate answers completely by accident. Numerous publications have noted that predicting future intentions can be quite difficult, especially when done via survey.

Fortunately, research also offers solutions to these persistent problems with surveys. A joint study by SurveyMonkey and the Gallup Group offers some good insights on creating and structuring surveys that can keep these problems to a minimum.

Below, let’s look at the study’s most important takeaways so you can get a clear picture of how to improve your surveys.

1. KISS (Keep it short, silly)

Applying this spin on the traditional KISS principle is important for assembling a successful survey.

Your biggest concern is clarity and concision: finding the shortest way to ask a question without muddying its intent. It’s not just about reducing the character count; you also need to eliminate superfluous phrasing from your questions.

At the same time, overall survey length remains important for keeping abandon rates low. Think about the last time you sat around and excitedly answered a 30-minute questionnaire. It’s probably never happened.

2. Ask only questions that fulfill your end goal

Be ruthless when it comes to cutting unnecessary questions from your surveys.

Every question you include should have a well-defined purpose and a strong reason for being there. Otherwise, it should be put on the chopping block. Depending on the survey’s purpose, it may not matter how a customer first came in contact with your site. If that’s the case, then don’t ask. Do you need to know a customer’s name? If not, again, don’t ask.

Including that question you thought couldn’t hurt to ask only adds unnecessary bloat that could send survey takers hunting for the “back” button.

3. Construct smart, open-ended questions

Although it’s tempting to stick with multiple-choice questions and rating scales, some of your most insightful feedback will come from open-ended questions, which allow customers to spill their real thoughts onto the page.

However, nothing makes a survey more intimidating than a huge text box connected to the very first question. It’s best to take on brief questions first to create a sense of progress, and then give survey takers who’ve made it to the closing questions the opportunity to elaborate on their thoughts.

One strategy is to get people to commit to a question with a simple introduction, and then follow up with an open-ended question such as, “Why do you feel this way?”

4. Ask one question at a time

We’ve all been hit with an extensive series of questions before: “How did you find our site? Do you understand what our product does? Why or why not?”

It can begin to feel like you’re being interrogated by someone who won’t let you finish your sentences. If you want quality responses, you need to give people time to think through each individual question.

Bombarding people with multiple points to consider leads to half-hearted answers by respondents who will just be looking to get through to the end — if they even stay with the survey at all. Make things easy by sticking to one main point at a time.

5. Make rating scales consistent

Common scales used for surveys can become cumbersome and confusing when the context begins to change.

Here’s an example: While answering a survey’s initial questions, we were told to respond by choosing a number from 1 to 5, with 1 = “Strongly Disagree” and 5 = “Strongly Agree.”

Later on in the survey, however, we were asked to evaluate the importance of certain items. The problem: Now 1 was assigned as “Most Important,” but we had been using 5 as the agreeable answer to every previous question.

That’s incredibly confusing. How many people missed this change entirely and gave inaccurate answers completely by accident?

6. Avoid leading and loaded questions

Questions that lead respondents toward a certain answer due to bias in their phrasing are not useful for your surveys. SurveyMonkey offers a great example of a leading question to avoid:

“We have recently upgraded SurveyMonkey’s features to become a first-class tool. What are your thoughts on the new site?”

This is a clear case of letting pride in your product get in the way of asking a good question. A neutral phrasing such as, “What do you think of the recent SurveyMonkey upgrades?” works far better.

Remember to cut out language that caters to ego or contorts a respondent’s understanding of what’s being asked. To avoid loaded questions, stay away from any presupposed facts or assumptions.

A well-known example concerning disciplinary action with children is as follows:

“Should a smack as part of good parental correction be a criminal offence in New Zealand?”

The assumption here is that smacking a child is inherently a part of “good” parental correction, when in fact that is simply the opinion being argued. You can avoid loaded questions in your surveys by eliminating emotionally charged language that hints at preferences or assumed facts.

For the sake of this article, we’re assuming any survey you’d run to collect feedback from your customers would be genuine in that intention. Critical readers can tell whether the intention of your survey is to collect authentic insights versus building support for a cause or other purposes. When the goal is to honestly learn something, don’t risk annoying your participants (and muddying your data) with leading questions or other tactics designed to get the responses you want to see.

It’s safe to assume the goal of this survey was to collect a different kind of feedback.

7. Make use of Yes/No questions

When you are asking a question that has a simple outcome, try to frame the question as a Yes/No option.

The SurveyMonkey study showed that these closed-ended questions make for great starter questions because they are typically easier to evaluate and complete.

These questions can also be used to qualify the respondent with less of an ego bias, such as asking a question like, “Are you considered an expert in [blank]?” vs. “What level of expertise do you have in [blank]?”

8. Get specific and avoid assumptions

When you create questions that implicitly assume a customer is knowledgeable about something, you’re likely going to run into problems (unless you are surveying a very targeted subset of people).

One big culprit is the language and terminology you use in questions, which is why I’d recommend staying away from industry acronyms, jargon, or references.

One of the worst assumptions you can make is that people will answer with specific examples or explain their reasoning. It’s better to ask them to be specific and let them know you welcome this sort of feedback:

“How do you feel about [blank]? Feel free to get specific; we love detailed feedback!”


9. Timing is important

Interestingly, the SurveyMonkey study found that survey open and click-through rates were highest on Monday, Friday, and Sunday.

There was also no discernible difference in response quality between weekdays and weekends, so your best bet is to reach survey takers first thing in a new week or to wait for the weekend. Perhaps Monday has such high response rates because nobody feels like working.

With regard to sending frequency: Companies might conduct customer surveys once a year, or at most, once per quarter. And while that’s great, it’s not enough to keep a pulse on customer satisfaction — you don’t want to wait 90 days to find out your customer is disgruntled. Between surveys, you’ll want to keep a keen eye on your customer satisfaction ratings and other metrics. Reporting tools (such as Help Scout reports) can help you turn every conversation with a customer into a feedback session.

10. Give them a bonus

It sometimes makes sense to entice customers to take your survey: a variety of data show that incentives can increase survey response rates by 5 to 20 percent. These incentives could be a discount, a giveaway or account credit.

A valid fear is that a freebie may detract from the quality of responses, but a few studies show that this isn’t likely to be the case.

Last but not least, to ensure you don’t lose your shirt, make sure your incentives are something you can financially handle, such as an extended trial of your software.

Editor’s note: This post has been updated for accuracy and freshness. The original version first appeared on the Help Scout blog on April 24, 2013.

Help Scout

Help Scout makes customer support tools that keep customers happy as you grow. Try it free today!