12 Key Customer Service Metrics + 4 Real Example Reports

Mathew Patterson

Customer service is a highly measurable activity, and the support software you use inevitably gives you access to a ton of customer service metrics. Call volume, chat times, resolution rates, interaction counts, and myriad other numbers are more easily recorded and measured today than ever before.

But having access to that data is only the first step. The bigger challenge is deciding what data matters, how to report that data to your leadership, and what context is needed to help the rest of the company understand the impact your work is having on the business (and your customers).

In this post, we’re going to try to simplify that challenge by presenting you with 12 meaningful customer service metrics, a process for choosing the right metrics for your team and company, and some sample customer service reports shared with us by other support leaders.


12 meaningful customer service metrics

Customer service metrics can easily be measured at the level of the individual support request and then aggregated to report on overall team performance and on individual customer service agents.

Case-level metrics

  • Cases by time created: Review the volume of new conversations created in any given timeframe. This can help you identify times when your customers are most active and help you better set staffing levels to match demand.
  • Cases by topic: If you use tags or custom fields to label conversations, you can quickly spot changes in volume that might indicate a problem in your product or the effectiveness of an improvement. For example, has that new redesign reduced questions about how to update a password?
  • Cases by locale: Understand quickly where you have the most customers needing help so you can support them appropriately, or perhaps consider adding options like localization or support in other time zones. (A quick aggregation sketch covering all three of these metrics follows this list.)
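
If your help desk can export conversations to CSV, all three case-level views take only a few lines of analysis. Here’s a minimal sketch in Python with pandas; the file name and the created_at, tags, and locale columns are hypothetical, so map them to whatever your own tool actually exports.

```python
import pandas as pd

# Hypothetical export of one month of conversations.
cases = pd.read_csv("conversations.csv", parse_dates=["created_at"])

# Cases by time created: hourly volume, useful for matching staffing to demand.
by_hour = cases.groupby(cases["created_at"].dt.hour).size()

# Cases by topic: assumes a single tag per conversation in a "tags" column.
by_topic = cases["tags"].value_counts()

# Cases by locale: where your customers most often need help.
by_locale = cases["locale"].value_counts()

print(by_hour, by_topic.head(10), by_locale, sep="\n\n")
```

Run weekly or monthly, these same three tables become the raw material for the trend lines discussed later in this post.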

Individual agent metrics

  • Resolved cases: How many conversations did this person close in a given time period? Averages aren’t always illuminating, but trends over time can reveal top performers and those who may need some more help.
  • Customer interactions: A team member can be doing a wonderful job while showing few resolved cases. Measuring individual interactions is useful in comparing workload and working style.
  • Customer satisfaction: When customers rate their service experience, they may also be rating the product or service, so any individual rating isn’t necessarily meaningful. Looking at longer-term rating averages for individuals and across the team is more useful for spotting champions or those needing to improve.
  • Average handle time: For individuals, having a low average handle time can reflect their level of comfort and skill with the work, meaning they get through cases quickly. Be careful to review it in the context of the type and complexity of questions they are answering.

Team-level metrics

  • Time to first response: How soon after a customer requests your help are they getting an initial reply? Customer expectations for response time will vary from channel to channel, so it’s worth splitting your metrics out by channel, too.
  • Interactions per resolution: Generally, the fewer interactions it takes to satisfactorily resolve a conversation, the happier customers will be. If that number is rising, it can be an indicator of product or service issues — or of a shift in the type of customer you are helping.
  • Customer satisfaction: Keep an eye on changes in your team-wide satisfaction rates as an indicator of the success (or otherwise) of improvements in the products and services as well as in customer support itself.
  • Average handle time: Handle time reflects how long a conversation is open before the next action is taken by your team. Long gaps might mean there are opportunities to improve processes, training, or tooling to get that next answer back more quickly.
  • Customer contact rate: This rate measures the percentage of your active customers who request help in a given month. Improved self-service options, bug fixing, copywriting, and product design can all help reduce that rate as you grow. (A quick calculation sketch follows this list.)
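
Customer contact rate is one number your help desk probably won’t calculate for you, because it needs your active customer count as well as your support volume. Here’s a minimal sketch; both input numbers below are made up for illustration:

```python
def contact_rate(customers_contacting: int, active_customers: int) -> float:
    """Percentage of active customers who requested help in a given month."""
    return 100 * customers_contacting / active_customers

# e.g. 1,200 of 40,000 active customers wrote in last month:
print(f"{contact_rate(1200, 40000):.1f}%")  # -> 3.0%
```

Tracked monthly, a falling contact rate is good evidence that your self-service content and product fixes are working.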

How to measure and report on the right metrics

As a customer service leader, you have access to most of the numbers above — and probably a ton more. The challenge is deciding which ones to report on, who to report them to, and how to present them.

To figure out the most important metrics for your team, consider these three questions:

Why is measuring a specific metric important?

The point of your customer service team is (I hope!) not to generate nice-looking graphs and reports. It’s to provide great service to your customers. Metrics are just a more measurable proxy for the real outcome.

For example, Kristin Aardsma is head of support for Basecamp, a company that considers their great service and fast response times to be product features. For Aardsma’s team, the combination of first-response time and customer satisfaction is a meaningful way to tell if they are staying on track.

Another example: During the high growth days of Mailchimp, Bill Bounds’ single most important job was hiring enough new staff to maintain support quality. In his words, “We were so focused on growth and getting enough people in that my primary concern was really on, ‘Hey, we’re not done hiring yet.’” So Bounds’ primary metrics were trends of volume per agent and customer satisfaction level.

When you are clear about why you are reporting, you can decide more easily what you should measure and report on and — equally important — what not to measure and report on.

Who are you reporting to?

Understanding your audience is critical to communication in all forms. What matters most to your frontline support team might not make any sense to your CEO, who doesn’t have that ground-level perspective.

What you show and how you explain it might differ considerably depending on who you are reporting to. At Campaign Monitor, customer service reporting is done at three levels, and the contents of those reports are slightly different each time:

  • Individual agents are emailed daily reports on their personal activity and their team’s activity.
  • A monthly report is shared on the internal wiki with the whole company. These reports remove some of the individual agent details but add some long-term perspective.
  • The highest level of reporting is presented on a couple of slides to the senior management team with some written comments to explain the trends on display.

For a global, distributed company, that’s a great way to make sure everyone stays up to date.

SurveyGizmo’s team, by contrast, is all in one building. The director of customer service presents the weekly reports in person to the support team, and there is an open discussion that senior managers are invited to attend. Physical proximity means the whole team gets the full context and can easily ask for clarification.

Make sure to determine who you are reporting to and what they care most about. That will help direct you to the right measures.

What outcome do you want to see?

“What gets measured gets managed,” said Peter Drucker, America’s father of management philosophy.

It’s an appealingly concise piece of wisdom: You will effect change on those things you pay attention to. But as unemployed phrenologists will attest, something that is measurable is not necessarily meaningful.

“There can be too much emphasis on fluff numbers in support,” says Help Scout’s Justin Seymour. “The team likes to know what our goals are, what types of conversations we’re having, and how we’re moving the needle month to month.”

The customer service leader is in the best position to understand where the biggest opportunities are for the company. Bounds at Mailchimp needed to quantify his case for more support staff, so he focused his reports on telling that story clearly and accurately.

Campaign Monitor, meanwhile, is a product company at its core, and identifying ways to improve the customer experience through a better product is a big focus of customer service reporting.

Your management team can’t have the perspective you have as the customer service lead, so you need to lead them honestly and efficiently to a greater understanding of what action needs to be taken — and you can do that through consistent, clear reporting.

The qualities of a perfect customer service metric

Ultimately, the metrics you choose to report should meet all of the following criteria:

  • Meaningful — They should tie back to something your company wants to achieve. For example, when your goal is highly responsive support, time to first response is an ideal metric. Resolution time may not matter.
  • Moveable — You should measure things your team can actually affect. If something you’re measuring turns out not to matter, you have the freedom to drop that metric.
  • Authentic — Your reports must tell a true story. It’s possible to use real numbers to send a misleading message. Be honest even when it hurts.
  • Contextualized — Numbers in isolation can be stripped of meaning, so provide them in context.
  • Consistent — The trends over time are usually more important than specific data, and looking back over a quarter or a year can give you some fantastic insights and encouragement.

Building an impactful customer service report

When creating reports, follow these guidelines to make sure your reports are truly impactful:

  • Focus on trends — The direction of change usually matters most. Having an 80% customer satisfaction rate may not sound great, but a month-on-month increase from 70% to 80% is excellent news.
  • Direct limited attention to anomalies and changes — Your leaders are busy people, and they have a limited amount of attention to give you. Make sure it’s easy for them to understand what your reports mean. Consider providing an overall summary. For example: “We received 20% fewer questions about exporting this month, so the reworking we did in the app saved us 12 hours of support time already!”
  • Look for correlations that tell a bigger story — Looking at individual metrics is useful, but understanding the connections between them is where the real insight comes from.

Combining metrics can also help you identify deeper issues. For example:

  • “When our email time to first response goes above four hours, we see consistent dips in customer satisfaction.”
  • “Answering billing questions takes us three times as long as an average ticket.”

Below is an example from my experience at Campaign Monitor. Our reporting tool could tell us when tickets arrived and how long customers were waiting for a first reply, but it couldn’t show us how many tickets were waiting for us to respond to at any given time.

By exporting data from our help desk and combining it with a week’s worth of manual measurements, we could produce a single chart that showed the correlation between larger queues and higher waiting times.

[Chart: larger queues lead to longer waiting times]

Our support team reviewed this chart, and it sparked a discussion about the stress and impact of a large queue of waiting tickets. Davida, our head of support, worked with her team to split our main queue into smaller, more manageable chunks. That change produced a significant decrease in response times without adding any new resources or changing the volume of tickets.
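
If you want to reproduce that kind of analysis, the mechanics are straightforward once both series share a timestamp. Below is a minimal sketch in Python with pandas; the file names and the timestamp, wait_minutes, and queue_size columns are all hypothetical stand-ins for your own help desk export and manual measurements.

```python
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical exports: first-reply wait times from the help desk, and
# manually recorded queue sizes, each sampled hourly.
waits = pd.read_csv("wait_times.csv", parse_dates=["timestamp"])
queue = pd.read_csv("queue_sizes.csv", parse_dates=["timestamp"])

# Align the two series on the hour they were measured.
merged = pd.merge(waits, queue, on="timestamp")

# One summary number: a correlation near +1 means larger queues really
# do go hand in hand with longer waits.
print(merged["wait_minutes"].corr(merged["queue_size"]))

# Chart both series together to make the pattern visible at a glance.
merged.plot(x="timestamp", y=["wait_minutes", "queue_size"], secondary_y=["queue_size"])
plt.show()
```

The single correlation number is what makes the chart persuasive: it turns “the queue feels stressful” into “larger queues measurably slow us down.”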

4 customer service report examples

Whether you’re building out your first customer service reports or you’ve been producing reports for years, there is always an opportunity to make those reports more effective at driving improvements in your business.

Consider the four example customer service reports below — each from a real customer service team — to brainstorm some new ideas for your own reports.

Note: The format and structure of these reports are real, but we’ve obscured the actual numbers.

1. Help Scout

The Customers team at Help Scout meets weekly to discuss general team business. We believe reports are best evaluated as part of a conversation, not a simple list of metrics. Individual goals are discussed in each person’s weekly one-on-one with their coach.

The head of support presents the team’s goals once a month during the company leadership meeting. In Help Scout’s quarterly company-wide Town Hall meetings, the head of support presents a slide or two refreshing the company on team goals, the progress we’ve made, and any upcoming changes and hiring plans.

When evaluating a reporting goal, we aim to define four things for the team:

  • Why do we care about this?
  • How are we currently doing?
  • What are the limitations of this metric?
  • What is the main takeaway we want the team to know?

[Example report from Help Scout’s customer service team]

A few notes:

  • While we rely on our own reporting tools, our internal support reporting focuses on the narrative these metrics tell.
  • We use reports to keep a quantitative eye on our goals, but we never treat these numbers as “hit at all costs.” An overly rigid focus on quotas can often backfire and lower quality and team motivation.
  • Volume of data should always be taken into account, and different timeframes may be useful to examine different metrics. For example, we may evaluate the team happiness score once a month but individual happiness scores looking back six months.

2. Shinesty

“Here at Shinesty, all stakeholders share reports from their department in what we call our Q4 post mortem,” says Antonio King, Director of Experience. “We build reports and list findings within the information/data we’re sharing. Additionally, we share insights to gain feedback or to deploy another set of eyes.”

King came on as Support Leader in 2016. Since then, Shinesty has begun looking at self-service statistics to identify any service gaps, as well as looking at more high-level metrics.

Shinesty looks at the following self-service metrics:

  • content views
  • top articles
  • bounce rate (Google Analytics)
  • sessions (Google Analytics)
  • searches (Google Analytics)
  • pages/session (Google Analytics)
  • missing articles/content gaps
  • feedback ratings
  • deflections
  • handling time per deflection

A few notes:

  • Contextual explanations are included directly in the reports to frame them with an overall story.
  • Data comparisons to previous periods help add meaning to the graphs.

3. Celtra

“My primary purpose in reporting is to show that we’re doing a consistently good job — and that there are no red flags to be aware of,” says Vuk Lau, Director of Client Support at Celtra. “I share my reports monthly in a Google Doc with our Sales and Service executives — and with my team.”

Lau makes these reports available for everyone in the company to view, and he also produces more detailed reports, including hourly and daily distribution, client comments, and CSAT metrics, quarterly and annually.

A few notes:

  • The support volume is broken down by region, team, and tier.
  • Individual agent performance is also tracked.
  • The label breakdown helps identify the major sources of incoming support requests.

4. Jayride

At Jayride, the team stays on top of reports by touching base daily, weekly, and monthly. Reports are posted in a Google Sheet so each team can track their own progress.

Aaron Lewin, Head of Customer Service, says they hold a daily 10-minute meeting with management and department heads where they talk about “what we did yesterday, what we’re doing today, roadblocks, and wins. At the end of standup we also review the overall company targets (Passengers travelled, Booking Unit Profitability). All team members are encouraged to attend and listen.”

Lewin meets weekly with the head of operations to discuss his personal reports. Then, each month, each team showcases its progress to the entire company. These reports include conversions, resolution time, and support unit costs.

A few notes:

  • The support team has conversion targets that are tracked separately for pre-booking and post-booking interactions.

What metrics will you report on next month?

Customer service metrics matter. What you choose to report on and how you report it can make a real difference in the level of service you provide.

Don’t waste your valuable time compiling reports that provoke no questions and generate no action. Bill Bounds said it beautifully: “Metrics only tell you where to look for the story; they don’t tell you the story itself.”

Pick the right metrics and use them to tell a compelling story about how your customer service team is contributing to your company’s goals.

Mathew Patterson

After running a support team for years, Mat joined the marketing team at Help Scout, where we make excellent customer service achievable for companies of all sizes. Connect with him on Twitter and LinkedIn.
