AI and the Gravity of Mediocrity

I work on the support team at Help Scout, and we have recently started using AI drafts — automatically generated answers added as notes for support agents to use, edit, or refer to — in our own support queue. It’s been an enlightening experience so far. 

I’ll admit, at first I was skeptical about how much value AI-generated writing could bring to our own support queue, which is varied and can get pretty complex. After several weeks of using AI drafts, though, I’m becoming a believer. This tool has definitely saved me time and effort in just the first few weeks of use. Sometimes I only need to make one small tweak to the automatically generated draft. Other times, the draft is maybe 60% of the way there and I’ll need to make some smaller edits or provide specific customer account information. So far, I’ve been thoroughly impressed with what this tool is capable of and I completely understand the hype. 

That said, it’s also been making me think about FX’s hit drama “The Bear” (streaming on Hulu). Well, more specifically, about the crisp white t-shirts worn by Jeremy Allen White in his role as Carmy. He’s an undeniable heartthrob playing a great character, but those shirts deserve their own IMDb entry.

A $3 white tee from Target does ostensibly the same job as Carmy’s $50 shirts — they both cover your skin — but they are worlds apart in effect. The difference in quality is apparent, even if it can be hard for the average person to articulate. You might need an expert to tell you why they feel so different.

Now, often a basic shirt is all you need, and a lower-cost product is perfectly acceptable. When you want to make a real impact, though, quality shows.

That’s really how I’m feeling about AI-written answers. 

In my experience so far, AI-generated answers fall closer to a $3 tee — not all of them, but enough that I wanted to talk about it here. They just aren’t exceptionally good yet. Generative AI can easily handle basic questions like “What does Customers Helped mean in my email report?” — but even then, it often feels like something is … missing.

Missing pieces

Let me be clear: What generative AI does is incredible, especially compared to older generations of AI tools we have trialed internally. Generative AI basically does the job of answering the customer’s question; it’s polite, its replies are well formatted, and the Docs links it includes are correct. It’s definitely good enough. If we were starting from a lower level of quality, or if I were less experienced, it might actually be better. But as someone with a lot of answers under my belt, the quality of these AI-written replies is sometimes lacking. Many of the AI drafts I see in our queue feel a little plastic and robotic compared to what my support team colleagues write.

One of our values at Help Scout is Craft over Convention. We take our craft seriously, and one way we put that into practice is by holding an incredibly high standard across the company, especially on our support team.

The AI tools available today just can’t deliver that standard in writing without (often significant) human editing. AI drafts aren’t able to consistently anticipate customer needs or elaborate on why the proposed solution would work. They aren’t able to match the customer’s tone or suggest a new feature that was just released. AI writing is also missing the human elements that endear customers to a support team: the small gesture of saying, “Wow, that bug really bothers me too. We’re working on it!” or the “Happy Friday!” sign-offs, or the playful “I Dream of Jeannie” GIF that Kristi uses when she’s completing a simple task that makes a support interaction feel like magic. AI is missing the connection part of the customer experience. 

Now, I know this. I know that sometimes the draft is missing this human element. I could go through and edit that AI draft, but my concern is that I’m feeling a pull to just hit send. Is that laziness? Is that carelessness? Maybe! But this pull to hit send may also be coming from the idea that it seems more efficient in the short term not to make the edits. It sometimes feels like AI answers can create this new opportunity to send a “just OK” response where our team would never have drafted something so dull on our own. 

It’s not that our team is hyper-focused on volume or speed metrics, either. Our support team isn’t encouraged to answer as many customer questions as possible; in fact, we take great care to focus on quality. And still I’m feeling this pull to take the more “efficient” action and just send the AI draft as is. When “pretty good” feels so much faster than “really good” and there are plenty of other customers waiting, the temptation gets very strong.

The incredible value of AI drafts

So AI-generated writing isn’t exceptional (so far). Should we reject AI writing entirely? Absolutely not! There are so many situations in which an AI draft is hugely valuable.

Whether it is overcoming the blank-page problem, or providing an answer that is 80 or 90% of the way there in a few moments, or prompting my memory with a fact from a conversation years ago that’s relevant today, AI drafts can be truly impactful. We would be foolish not to make use of those capabilities in service of our customers. Your competitors surely will.

What I do think is that if your goal, like ours, is to go beyond “good enough” and provide consistently high-quality customer support, you should be thinking about how to mitigate this pull toward mediocrity as AI tools roll out.

Alternatively, could it be true that such a level of quality doesn’t actually matter?

If AI “gets the job done,” do we need humans? 

If customers can get the answers they’re looking for more quickly and continue on about their day, we’ve got to step back and ask, “Is an AI-generated answer enough?” And sometimes, the answer is yes! There are times when customers are looking for answers that already exist in your public-facing resources, like Help Docs or FAQs, and they might just need help surfacing that answer. AI is a perfect fit for that scenario. There is absolutely a place for the fast food version of support, where a solid answer right now is worth much more than a hand-crafted response hours later.

That said, if you’re hoping to offer Michelin star-level customer support (or even something more on the Zippy’s end of reasonably priced dining), most of the time there is additional nuance and context needed. Excellent customer support means the aim is not only to answer customers’ questions but to delight those customers. 

People love Help Scout because of the product, yes. But they also love it because of our Help Scout-iness. It’s not efficient (in the moment) to spend time screen recording a video for a new customer to explain in a different way what’s being said in the help article that you already sent them, but it is effective and valuable. Long-term efficiency shows up as retention and loyalty and purchasing your service in their next job three years from now. All that human nuance and context eventually saves money and creates lasting customer relationships. 

Overcoming AI’s urge to be average 

Thoughtfully making decisions about exactly where to place AI in customer support teams — and how to use it — will have a big impact on both support pros’ day-to-day work and the overall customer experience that they’re able to provide. 

As we invite AI into our support teams, I think there are a few things we can do to make sure AI makes a positive impact on customers and support pros alike: 

  • Define great. How do you know when a draft (whether one AI wrote or one that you wrote yourself) isn’t meeting your standards? First you’ve got to set those expectations for your team. What does a great email look like? We’ve got some inspiration for you here, but every team is different. Create your own style guide to help set the standard that all customer communication is held to.

  • Be thoughtful about incentives. Which metrics does your team prioritize? Some teams might focus on the number of replies sent by a support agent or how fast customer inquiries are being resolved. While speed and volume are obviously important, it’s helpful to balance these along with metrics like customer satisfaction and retention. Incentives drive behavior, so make sure your incentives aren’t inadvertently pulling down quality.

  • Create a culture of ownership. Another great Help Scout value is Owning the Outcome. This applies to each and every support email. Make fostering a sense of ownership in the support queue part of your team’s culture. Drafts might get started using AI, but ultimately, it’s a support pro’s responsibility to make sure the reply is top notch. If the AI draft is fine, then great, but you have to stand behind it.

  • Track it. Are responses that started with an AI draft answered or resolved more quickly? Do happiness ratings differ when AI writing is used? Regularly reporting on the impact that AI is having on your team and your customers will be crucial in determining how and where to best implement AI in your support team. “What gets measured gets managed” is a cliché because it’s true.

  • AI is only as good as its data. There are times when an AI draft comes across the queue that knocks it out of the park; every definition is correct, each detail is in the right place, and it makes me wonder how it is even possible for a machine to be so smart. Actually, it’s very simple how that is possible: The AI generating these drafts is being trained on Help Scout’s dozen years of excellent historical replies and incredibly detailed help documentation. To get high-quality AI drafts, your team will need to be meticulous about continually providing the most up-to-date information for your AI tools to ingest. This includes reviewing AI drafts for quality, keeping documentation current, and always looking for ways to improve.

Be the Bear

When new manufacturing technologies emerged, t-shirt makers faced a difficult decision: They could continue using expensive machines to create high-quality t-shirts, or replace those machines with new technologies and mass-produce t-shirts of lower quality. Carmy’s $50 crisp white tees can’t be created using the newer technologies alone.

The same is true with AI. You can’t get the same high-quality customer support experience using AI alone — at least not in every case. You need your support team to make it great.

Lucky for us, our decision in the support world isn’t binary. We don’t have to choose between AI technology and humans. We have the opportunity to bring in new technologies to work alongside highly skilled support professionals rather than replace them. Our hope is that these kinds of AI tools will make it easier for support teams to amplify their own skills and provide excellent customer experiences. 

That bright future only exists if we’re able to use AI tools thoughtfully. Remember that using AI does not automatically produce the same great customer support; it only makes support faster. In fact, if left unchecked, it could actually mean worse customer support at scale.

As you decide how AI will fit into your support team moving forward, I would encourage you to try and remember Carmy’s crisp white tees. The detail and care in something as simple as a t-shirt made a real impact on The Bear’s success. Fight for moments for your support team to have that same powerful impact on customers as well.

Christine Chavez

Chrissy is a Technical Support Specialist at Help Scout, where we’re focused on providing people-first support to all of our customers. You can connect with her on LinkedIn.