Sure, we follow our fair share of best practices here at Help Scout. We can’t be innovating all the time. Some tools just work. That’s why we keep them in the toolbox.

The problem with following best practices, however — as Jay Acunzo writes in his excellent new book, Break the Wheel — is that “Just because something is the most common approach doesn’t mean it’s the best approach for us.” No one ever innovated beyond what was previously thought possible by playing the same game everyone else was playing. Making the best decisions for our business requires trying new things.

That’s why we’re committed to experimentation, too. We played with a number of new ideas this year — some turned out great; some landed with a thud. But we learned something from each one, and we’re sharing some of those learnings with you here.

We share these stories not as warnings or as new best practices to add to your own business's playbook, but rather in a spirit of transparency and vulnerability. We hope any insights found here will inspire more experiments (and more community-building-via-story-sharing!) in the years to come.

The Beacon Launch

We started talking publicly about Beacon way before it was ready. Starting in January, we wrote monthly blog posts previewing the product we were building and giving updates along the way. It’s not something we had done before, but it felt like a strategy worth trying because chat has become a popular topic in our market.

We wanted our customers to know we were working on a new product — one that had the potential to change their business in a positive way. We also wanted potential customers to know we were developing a chat product. Since Beacon takes a different approach than other products in the market, we thought it was important to explain why, and to include our audience in many of the design decisions.

Lessons learned: We got thousands of current and potential customers excited about the product, and our approach to designing it resonated with many of them. The process, however, put undue stress on the product teams. It felt like we were releasing the product before it was ready, and we spent the first couple of months getting it up to par.

If we had to do it over, we might have run the same campaign over a shorter period of time, and not until we knew the product met our standards for quality. We got excited, and the marketing got a little too far out in front of the product.

Cross-functional crews

Our Design and Product teams used to work on whatever project was next, and engineers were organized by expertise (e.g., Java versus JavaScript) and software layers (e.g., APIs versus Web app). This year, we reorganized Engineering, Product and Design into cross-functional crews that owned the long-term business missions of Communications, Productivity and Growth.

As a globally distributed team, we were concerned that reorganizing into new teams (where four hours of overlap among all team members may not always exist) would create new silos and communication challenges. But because these cross-functional teams would be staffed with every skill needed to deliver on their missions, we expected to minimize cross-team dependencies and deliver higher quality software faster, with less rework.

Lessons learned: The experiment is ongoing and we’re still learning, but it feels successful so far! The crews have improved our ability to execute on multiple projects at the same time, and everyone feels more empowered to do what’s needed to complete their missions.

More video content

On our customer support education platform HelpU, we’d settled into a regular article-publishing schedule. It’s a great way to share ideas, but longform articles take time to read, and not everyone likes to learn that way. So in 2018, we tried some new video formats, ranging from much shorter videos to in-depth live webinars.

We thought we might reach a new audience and provide useful information to people who had never read our articles. One concern was that the short videos might not provide enough value, because being actionable is an important part of HelpU’s identity.

We learned that producing video can be time intensive — but also that there are ways to create value without investing a ton of time. The constraint of communicating an idea within one or two minutes forces us to carve away the fluff and get quickly to the point without compromising Help Scout’s personality.

On the other end of the scale, longform webinars have become an effective way to explore new topics, learn from experts and build a larger audience by partnering with those experts.

Lessons learned: People have been responding well to our video efforts, although we haven’t yet settled on a path to consistently releasing videos and growing a video audience. Content should never be “one and done,” because different kinds of learners will find it and use it in whatever format we make available.

Product management

Our product management approach used to be leadership-driven; in fact, we caused a bit of a stir with a blog post about how we didn’t have any dedicated product managers. Since then, our approach has evolved to the point that this year we introduced Product Leads and a Product Management function to the company, focused on more collaborative, distributed decision making.

The goal of this change was to make teams more effective and deliver more customer value by decentralizing decision making and introducing better processes for building quality products.

Lessons learned: We expected resistance to the new process and a general fear of change, but everyone from top to bottom was open to new ways of working and to new people being involved in decision making. What we thought would be the hard part turned out to be the easiest. Making these changes remotely, however, was unexpectedly difficult: building trust, collaboration and a new process without being in the same room. We’ve since shifted the balance toward more synchronous communication, which has created more opportunities to collaborate and build trust.

Refreshing older blog posts

We’ve periodically refreshed older pieces of content, but this year we put a significant amount of time and effort toward updating older content for relevancy and re-publishing it. We were hoping this would significantly increase our traffic through organic search.

We refreshed about 30 pieces.

Lessons learned: Updating old pieces of content keeps them fresh, so the work is worthwhile on its own, but we didn’t get the traffic boost we were hoping for. Some refreshes performed well; others resulted in lower traffic than before, and we reverted a couple of those to their previous state. Overall, the effort netted a 3-4% increase in organic traffic to the content we refreshed.

Pricing A/B tests

The Growth, Product and Design teams worked on multiple pricing experiments this year and developed a framework for A/B testing our pricing. We expected that testing our prices would yield valuable data about how we should price.

Lessons learned: We learned quickly that pricing tests for a SaaS business model can take a long time to mature, so we developed an approach where we roll out a new test while another continues to soak and evolve. This lets us run several tests in parallel without waiting for each cohort to reach statistical significance. We’ll continue to refine, but so far it seems we have a scalable, repeatable process for A/B testing prices.
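
To make the staggered approach concrete, here’s a minimal sketch (in Python) of how deterministic cohort assignment for overlapping price tests can work. The test names, account IDs and variant splits below are illustrative assumptions, not our actual framework:

import hashlib

def assign_variant(account_id, test_name, variants):
    # Hash the account ID together with the test name so assignments are
    # stable across sessions and independent across concurrent tests.
    # That way a new test can launch while an older cohort keeps soaking.
    digest = hashlib.sha256(f"{test_name}:{account_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Illustrative usage: the same account evaluated for two tests running at once.
print(assign_variant("acct_42", "standard-plan-price", ["control", "variant_a"]))
print(assign_variant("acct_42", "company-plan-price", ["control", "variant_b"]))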

Related: Pricing Psychology: 10 Timeless Strategies to Increase Sales

Outreach on Twitter

Our Support-Driven Growth team experimented with reaching out on Twitter to new trial customers with welcome tweets. Kristi hunted down Twitter handles for each customer in this segment and reached out with clever, personalized tweets.

We thought we might discover a new channel for engagement and build a stronger relationship with trial customers beyond email.

Lessons learned: This was a complete flop. No one responded. We saw maybe one or two likes, and zero correlation between tweeting and conversion. Oh, well! No effort is entirely wasted when you learn something, right?

Chat on the marketing site

We experimented twice with using Chat on our marketing site to drive more large-company leads to sales, once using Drift and once using Beacon.

Chat has a huge impact on increasing conversions from trial to paid accounts. Prospects convert to customers more readily when they’ve chatted with our team in-app during their trials, so we thought it would have a similar effect earlier on in the funnel. We hoped it would increase engagement from our marketing site to trial signups and lead to more large deals.

Lessons learned: The number of chatters from large companies was low. Instead, we saw lots of existing customers, job seekers asking questions, small prospects — and no direct correlation between starting a chat and moving to trial in any customer segment. Chat is labor intensive, and the extra muscle didn’t provide much extra movement on that goal.

‘Pops Pals’

Previously, when our People Ops team consisted of two people, one handled hiring and the other handled onboarding. When we hired a third People Ops team member, we changed the model so three people now share all talent acquisition and management activities, and every team in the company has a designated People Ops team member, or “Pops Pal.”

With a Pops Pal embedded in each team, they can see that team’s shifting priorities and needs and recruit proactively. They can also build solid relationships with the people on those teams to support them as they evolve.

Lessons learned: We’ve been able to hire great people faster. This type of shift does not happen overnight, of course: there are logistics and permissions to untangle, and processes that aren’t yet fully baked into the new structure still require training and context sharing. Our Pops Pals are building new relationships with team leads and coaches, and learning new areas of the business.

The Company Plan

We launched the new Help Scout Company plan so teams of 25+ can get up and running with Help Scout for a flat fee during the first year. After the first year, the plan includes a flexible User count each year to keep the price predictable.

We felt this plan would be unique in the market and would eliminate several barriers to switching products. It doesn’t get any easier than a single flat fee per month for a year, unlimited Users and free concierge onboarding services with your account rep. In short, we thought it’d be an easy decision.

Lessons learned: We’re still in the learning process, and while the results have been good, we still think the plan has more potential. When pricing in an established market is different or unique in any way, you have to educate folks and build trust. What seems like an easy decision to us hasn’t yet proven to be an easy decision for customers. We remain enthusiastic about this plan and how it solves so many of our larger customers’ challenges, but we have more learning and experimenting to do.

Any insights to share from experiments your team ran this year? Tell us in the comments!

Emily Triplett Lentz

Emily is the blog editor and content strategy lead at Help Scout. You can find her on Twitter.