There are many ways to get a quantitative look at how your support team is doing, but far fewer ways to assess the care and thought being put into your replies.

Customer satisfaction thus becomes an easy metric to spin your wheels on. But you shouldn’t throw the baby out with the bathwater; for many teams, a simple set of questions will do.

The customer satisfaction metrics that matter

Here are seven questions you can use to measure customer satisfaction and assess the overall health of your customer service:

  1. Are customers’ expectations being met when they talk to support?
  2. What has the workload been like?
  3. What has customer activity been like?
  4. Have there been any outliers recently?
  5. What have our response times been like?
  6. How likely are customers to recommend us to their friends and colleagues?
  7. How much effort do customers expend when they solve their problems with us?

1. Are customers’ expectations being met?

Help Scout - happiness ratings

Getting a good read on the overall quality of your support means collecting feedback in volume across an extended period of time. Making it easy to collect is the only way to go; the easier you make it to give feedback, the more feedback you’ll get.

This is why we built customer satisfaction ratings right into our help desk — once enabled, customers can rate their service at the bottom of every reply you send. Now you’re able to collect reactions over multiple conversations to get a rough gauge on the satisfaction you’re delivering over the week, month, or quarter.

We purposefully calculate these ratings like the Net Promoter Score. We take the percentage of “Great” ratings and subtract the percentage of “Not Good” ratings to get the overall customer satisfaction score.
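To make that arithmetic concrete, here’s a minimal sketch of the percentage math; the function name and rating labels are illustrative, not Help Scout’s actual implementation:

```python
from collections import Counter

def happiness_score(ratings):
    """Percent "Great" ratings minus percent "Not Good" ratings.

    `ratings` is a list of strings like "Great", "Okay", "Not Good".
    The result ranges from -100 (all negative) to 100 (all positive).
    """
    counts = Counter(ratings)
    total = len(ratings)
    pct_great = counts["Great"] * 100 / total
    pct_not_good = counts["Not Good"] * 100 / total
    return pct_great - pct_not_good

# 7 Great, 2 Okay, 1 Not Good -> 70% - 10% = 60.0
print(happiness_score(["Great"] * 7 + ["Okay"] * 2 + ["Not Good"]))
```

Note that neutral “Okay” ratings lower the score only indirectly, by shrinking the share of “Great” ratings.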

Help Scout - Happiness Score

Once you have enough conversations, you’ll use Reports (above) to get a bird’s eye view of how customers feel about your replies.

2. What has the workload been like?

Help Scout Conversation Report

In Help Scout, the Conversations Report covers this ground: it’s where you outline what the volume of requests has been like and what that means for the future. Conversations are tied to customer satisfaction because you can’t deliver great service if your team is overloaded, or if you’re spending more time on FAQs than helping your customers succeed.

Total volume

Does the team need to discuss bringing on another person (or two)? Why was Monday so busy? Was there an outage or an issue with the product? It looks like far more customers were helped this month than last—why was that?

Types of questions

When using tags to categorize and sort conversations, you’re also going to have data available on top tags for any given time period. Why was the “Refund” tag used 13% more often this month? Better get to the bottom of it.

Types of responses

If your team is using Saved Replies to answer conversations, you’ll see how many instances a particular reply has been used. Say you notice that the Pricing: Subscriptions reply has been inserted quite a few times. Maybe you need to make the pricing options clearer on the website? Your update is a good time to bring it up.

3. What has customer activity been like?

Help Scout Conversation Report - Busiest Time of Day

It’s helpful to showcase what times are hectic and when things quiet down, and if there are any trends that you’ve seen over the long-term. This will help the team know when they are most needed, and consistent trends will sway how you hire in the future.

We ended up searching for a customer champion in Europe to better cover our bases with activity we were seeing in the evenings and early mornings. You don’t want your customer satisfaction scores to plummet because of a single busy time period that you could easily handle by shifting schedules.

4. Have there been any outliers recently?

Help Scout - Team and User Reports

The Team Report and User Report will help you drill down into how much work—and what kind of work—each of your team members is doing. Use them as jumping-off points to identify red flags.

As Cassie Marketos explains, common narratives that can crop up are seeing someone moving too fast or noticing that one person is being overloaded with difficult conversations:

Users moving too fast or overloaded

Discussing these numbers publicly can let the team know that adjustments are needed. One support manager I spoke with gave the following example (paraphrased):

If I see someone is being unfairly burdened with too many of the long, difficult requests we are getting, I’m going to say something in the update. I know that it’s happening by accident, meaning the team doesn’t realize it and the person probably thinks she’s just doing her part.

It’s my job to step in and say, “Hey folks, Stacy has contributed in a big way this past month with the toughest tickets, but we need to lighten the load for her a little and make sure we’re pitching in and taking on some of the more difficult conversations.”

For the best people, finding out that a teammate needs help is all they need to hear.

Internal transparency can resolve the issue easily for the right teams.

5. What have our response times been like?

Help Scout Productivity Report

The Productivity Report is what we use at Help Scout.

There are only so many “Sorry for the wait!” messages you can send before customers stop waiting and start getting in line for your competitors. Speed may not count for everything when it comes to customer satisfaction, but it sure counts for a whole lot.

Your update is a place where you can discuss how quickly the team has been replying and why those response times were what they were.

Sharing this information with the whole team is what gets results. What steps will be taken next to improve? Is there a certain time where the team is lagging? Why is that? What goals are you setting, and how are you staying accountable? Will a new escalation Workflow be used to keep older emails moving?

6. How likely are customers to recommend us?

Help Scout - nps rating

The Net Promoter Score — a loyalty measurement approach first put forward by Fred Reichheld of Bain & Company — is still a popular and fairly useful way to gain a snapshot of how your company is perceived through customers’ eyes.

As a quick refresher, NPS is based around one question: “Would you recommend XYZ Company to a friend?” Responses are most often collected through a survey that asks participants to rank their likelihood of recommending you on a scale of 0 to 10.

  • Those who indicate a 9 or 10 are “promoters”.
  • Those who indicate a 7 or 8 are “passive”.
  • Those who indicate 6 or below are “detractors”.

The model is simple: we all want more promoters than detractors. But bear in mind that this score is ephemeral; it’s about understanding current sentiments from a 50,000 foot view.
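The score itself uses the same subtract-the-detractors arithmetic as the happiness ratings above. A minimal sketch, with bucket boundaries following the standard 0–10 NPS scale:

```python
def nps(scores):
    """Net Promoter Score: percent promoters (9-10) minus percent detractors (0-6).

    Passives (7-8) count toward the total but not toward either bucket.
    """
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return (promoters - detractors) * 100 / len(scores)

# 5 promoters, 3 passives, 2 detractors out of 10 -> 50% - 20% = 30.0
print(nps([10, 9, 9, 10, 9, 8, 7, 8, 5, 3]))
```

Because passives dilute both percentages, two companies with the same share of promoters can still land on very different scores.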

If your score is negative, then this is a red flag that customers are dissatisfied. If your score is positive, great—keep up the good work. Either way, follow up on some of your ratings; if you don’t understand why your customers rated you the way they did, you’ll be left with no idea of how to improve. Start with asking:

  • Why the customer gave you the rating they did.
  • What your company could do to get to a 9 or 10. Opinionated customers will have plenty to share.

7. How much ‘effort’ are customers expending?

Here’s where things get interesting. Matthew Dixon, Karen Freeman, and Nicholas Toman from the CEB shook up the customer service world a few years back in a Harvard Business Review article titled Stop Trying to Delight Your Customers.

Armed with a convincing set of data, they sought to prove that extra effort spent on delight was overrated, and that true customer satisfaction and loyalty come from reducing real and perceived effort.

The chart below, adapted from data featured in The Effortless Experience, shows the disparity between effort and delight:

Potential impact of effort vs. delight (Source: The Effortless Experience)

Moments of “wow” do add a bit of delight, but extra (perceived) effort severely sabotages how loyal customers are to a company. If dealing with you feels like a pain, customer satisfaction and loyalty take a nosedive, and no amount of delight can save you.

The CEB now recommends gauging your customer effort score by asking customers how easy they felt it was to get the answer they wanted. A seven-option survey is too heavy for day-to-day emails, so a lightweight approach would be to edit your signature to include a single link to “Rate My Reply.” Help Scout users often do this to ask about overall customer satisfaction.

But if you wanted to measure effort, it’d be better to ask: “How easy was it to get the help you needed today?” Very Easy, Okay, and Not Easy should work.

customer satisfaction rating

When a customer responds with “Not Easy,” you now have an opportunity to follow up and ask why: “Because I followed your documentation step-by-step and still had to contact you! It was incredibly confusing!” That’s feedback you can dig into and act on.
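If you export those effort ratings, a quick tally shows where you stand and which conversations deserve that follow-up question. The data shape here is hypothetical; adapt it to whatever your help desk export actually looks like:

```python
from collections import Counter

# Hypothetical export: one (conversation_id, effort_rating) pair per reply
responses = [
    (101, "Very Easy"),
    (102, "Okay"),
    (103, "Not Easy"),
    (104, "Very Easy"),
    (105, "Not Easy"),
]

# How did the week break down overall?
tally = Counter(rating for _, rating in responses)
print(tally)

# Pull out the "Not Easy" conversations for a follow-up "why?" email
follow_up = [cid for cid, rating in responses if rating == "Not Easy"]
print(follow_up)  # [103, 105]
```

The list of “Not Easy” conversation IDs becomes your follow-up queue: each one is a customer who can tell you exactly where the experience got hard.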

Having a clear answer for these questions and doing what you can to reduce customer effort across every experience with your support team is a key part of creating true customer satisfaction.

Gregory Ciotti

Greg is a writer, marketing strategist and alum of Help Scout. Connect with him on Twitter and LinkedIn.