
How Intercom's CS Team Started Using Our CX Score to Improve Support Quality

  • July 15, 2025
  • 7 replies
  • 351 views

Laura Sicat
Employee

Hi folks 👋

 

I’m Laura, Intercom’s Senior Program Manager of Quality Assurance and Continuous Improvement. In this post, I wanted to share how we’ve been using our very own CX score to improve our support quality.

 

What is the CX Score?
 

The CX score is a composite measure of support quality that blends three key elements:

  1. Resolution status – Were the customer’s issues actually resolved?
  2. Customer sentiment – How did the customer feel about the interaction?
  3. Service quality – Was the response clear, helpful, and timely?

Unlike traditional metrics like CSAT, which rely on voluntary customer responses, the CX score leverages AI to evaluate every conversation. This gives us more consistent and comprehensive insights into the quality of our support.
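
To make the “composite” part concrete, here’s a purely illustrative sketch of how three per-conversation signals could roll up into a single 1-5 number. This is not how Intercom actually computes the CX score (that comes from AI evaluation of each conversation) - the weights, scales, and field names below are invented just to picture the blend:

```python
from dataclasses import dataclass

# Illustrative only: a toy blend of the three signals described above.
# The real CX score comes from Intercom's AI evaluation of each conversation;
# these weights, scales, and field names are invented for illustration.

@dataclass
class ConversationSignals:
    resolved: bool          # resolution status: was the issue actually resolved?
    sentiment: float        # customer sentiment, 0.0 (negative) to 1.0 (positive)
    service_quality: float  # clarity / helpfulness / timeliness, 0.0 to 1.0

def toy_cx_score(signals: ConversationSignals) -> int:
    """Blend the three signals into a single 1-5 score (toy example)."""
    blended = (
        0.4 * (1.0 if signals.resolved else 0.0)
        + 0.3 * signals.sentiment
        + 0.3 * signals.service_quality
    )
    return round(1 + blended * 4)  # map 0..1 onto a 1..5 scale

# e.g. toy_cx_score(ConversationSignals(True, 0.9, 0.8)) -> 5
```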

 

Why We Use the CX Score
 

1. It’s a More Stable Metric Than CSAT

We’ve found that the CX score provides more consistent week-over-week insights than CSAT. That stability makes it easier to spot meaningful trends and prioritise where to focus our efforts - without being distracted by one-off anomalies or the randomness of who left feedback that week. It’s now fully ingrained in our weekly support operations meetings, and it’s strange to think back on when it wasn’t.
 

2. It Helps Us Promote Quality

The CX score gives us a data-backed way to identify conversations that need attention, whether we’re looking for coaching opportunities, deciding where to manually QA (e.g. conversations with CX = 1, 2, or 3), or deep diving into core processes. Instead of chasing sporadic feedback, we can zoom in on the conversations that actually matter for the customer experience. I will note that it’s not supposed to be a quality metric for your agents, so keep in mind that the focus is on your customer’s experience, not whether your agents followed your processes.
 

3. It Supports Customer-Focused Coaching and Accountability

The score has been instrumental in bringing our human support team along on our AI-first journey. It creates a shared language for discussing quality — and it holds both Fin (our AI agent) and human teammates accountable for delivering great support.

 

Change Management: Bringing the Team on the Journey
 

Rolling out the CX score was more than just a technical change; it was also a cultural one. We knew that for it to succeed, we had to bring our team along with us. That meant:

  • Setting clear expectations about how the score would be used
  • Listening to feedback from teammates
  • Introducing gradual targets to guide adoption

By treating the rollout as a collaborative effort, we were able to foster trust in the metric and align the team around our shared goal: delivering better support for our customers.

 

One last thing…
 

Using our own tools gives us invaluable insights into how they perform in real-world scenarios, and using the CX score internally has been a game-changer for our support team. It helps us focus on what really matters: quality, resolution, and customer experience.

If you’re already using CX score, we’d love to hear how you’re applying it in your own workflows. And if you’re curious about how it could fit into your support strategy, don’t hesitate to reach out!

7 replies

  • New Participant
  • July 28, 2025

Thanks for sharing this, Laura. It’s really insightful to see how the CX score is being used beyond just metrics and actually driving meaningful improvements in support quality. I especially liked the point about focusing on the customer’s experience rather than just agent performance, such a powerful shift in mindset. Appreciate the transparency around change management too; bringing the team along with the process really makes a difference. If anyone's interested in seeing how this approach works in practice, definitely check this out.


Laura Sicat
Employee
  • Author
  • Employee
  • July 30, 2025

Appreciate your comment @aristotle345! Very interested to hear whether you or your team have been using the score and how you’re finding it, if you’re willing to share.


  • New Participant
  • August 9, 2025

Interesting read! Tracking a CX score is a smart way to spot where the customer experience can be improved. I’ve found a similar concept works in gaming, too. In competitive titles or even casual ones like BSD Brawl, feedback loops and performance tracking can really shape how players engage and improve over time. It’s all about giving people the right data to make their experience better.


  • New Participant
  • August 12, 2025

Hi Laura, thanks for sharing! I love how the CX score uses AI to give a clearer, more consistent picture of customer experience compared to traditional metrics. It’s great that it focuses on quality and resolution without just policing agents. Also, your approach to rolling it out with team collaboration really stands out, building trust is key. Looking forward to hearing more success stories!


  • Connector
  • August 15, 2025

Hi Laura, 

Thank you for this post. It couldn’t have come at a better time! I’m the quality manager in support and I was so glad to read your article! We’re a little over one month into our Intercom journey and I’m looking at ways to utilize the CX score to expand into areas of the customer journey we’ve never had visibility into before.

I’d like to learn more about the manual QA checks on conversations with 1, 2 or 3 ratings. Are there guidelines/forms used to review the conversation? How are those reviews tracked/validated? How is a score manually changed if findings show the score was incorrect? See, lots of questions 😅

Being new to Intercom I’m a sponge just wanting to soak up all the info I can! 

 


Laura Sicat
Employee
  • Author
  • Employee
  • August 19, 2025

Hey there @Katie L 👋 Welcome to Intercom! I hope you’re finding the first few weeks smooth - I know any new tool is a bit of a challenge! If you’ve not already checked it out, our Intercom Academy could be a great resource for you.

 

We haven’t published exact guidelines or forms, but I can give you my two cents on what we’re checking in those reviews!

  1. Do we agree with the CX score? A really important note here: the CX score is based on the customer’s opinion, so we don’t exclude any conversations on our side - this keeps CX top of mind across the team. But if you are using it as a metric for your CS agents, you may want to create a tag (e.g. #CXExclusion), add the tag to any conversations you disagree with, and then make sure your CX reporting excludes conversations with this tag (see the rough API sketch after this list).
    • Keep in mind there’s no way to change the CX score applied by the system, so this tagging-and-excluding method is more of a workaround for those who want to use the metric as a KPI. On our side, we use it as a team goal, so we don’t use exclusions.
  2. Fin performance - did Fin have all the information it needed to answer this customer query? How did the guidance used by Fin impact the conversation, positively or negatively?
  3. Agent performance - could the agent have shown better problem ownership or centred the customer experience more in the interaction?
  4. Was the customer’s issue ultimately resolved? If not, we reopen the conversation.
  5. Did the processes used in this conversation help expedite a resolution or improve CX?
  6. Do we need to follow-up with the customer on this conversation? This could mean we reach out to apologise for an experience, or reopen the convo to ensure it’s resolved.
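
On the tagging workaround from point 1: if you ever wanted to script it rather than tagging by hand in the Inbox, here’s a rough sketch against Intercom’s public REST API. Treat it as a starting point only - the token, IDs, and tag name are placeholders, and you should double-check the endpoints against the current API reference before relying on them.

```python
import requests

API_BASE = "https://api.intercom.io"
HEADERS = {
    "Authorization": "Bearer <YOUR_ACCESS_TOKEN>",  # placeholder access token
    "Accept": "application/json",
    "Content-Type": "application/json",
}

def get_or_create_exclusion_tag(name: str = "CXExclusion") -> str:
    """Create (or update) the exclusion tag and return its id."""
    resp = requests.post(f"{API_BASE}/tags", headers=HEADERS, json={"name": name})
    resp.raise_for_status()
    return resp.json()["id"]

def exclude_conversation(conversation_id: str, tag_id: str, admin_id: str) -> None:
    """Attach the exclusion tag to a conversation whose CX score you disagree with."""
    resp = requests.post(
        f"{API_BASE}/conversations/{conversation_id}/tags",
        headers=HEADERS,
        json={"id": tag_id, "admin_id": admin_id},  # admin_id = the reviewer applying the tag
    )
    resp.raise_for_status()

# Example (placeholder IDs): tag the conversation, then filter this tag out of your CX report.
# tag_id = get_or_create_exclusion_tag()
# exclude_conversation("1234567890", tag_id, admin_id="987654")
```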

 

We’re getting teammates involved in these reviews and prioritising CX = 1 and 2 to ensure we’re learning as much as we can about our customer experiences and where we need to move the needle.

 

I hope that’s helpful!


  • Connector
  • August 21, 2025

This is great and very helpful as we embark on this journey! Your time and response are much appreciated!

 

Thank you!