How can I keep continuously updated message data in my data warehouse using the Intercom Message Export API?

  • 14 August 2022
  • 3 replies

Hi everyone, I currently use the export job API to retrieve the message stats we use to calculate series and message performance metrics such as open rate, CTR, bounce rate, spam rate, and so on.


We do this by running a Google Apps Script that performs the following steps:


  1. Create an export job with the desired (user-defined) time range.
  2. Poll the job status until it reports completed.
  3. Download the completed export and write it to a spreadsheet page, from which all metrics are calculated.


Since this process currently runs on Google Sheets, it is highly limited in the volume of messages it can analyze: Google Sheets takes a severe performance hit when handling more than 50,000 rows. Our current row count for a full week of messaging is about 100,000 rows, which means we can only retrieve results for intervals shorter than a week at a time.


Additionally, this format doesn't preserve historical data: each time we run the process, the spreadsheet pages are cleared so we don't append duplicate rows. (This could be solved in Google Apps Script itself, but since Google Sheets struggles with large amounts of rows, it would make no difference.)


I'm looking to develop an ETL solution that, on a recurring schedule, extracts messages from a given time interval (such as the last full week or month) and appends them to a messages database, updating previously added rows.
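The "append new rows, update existing ones" step is a classic upsert, and most warehouses support it natively (`INSERT ... ON CONFLICT` in Postgres, `MERGE` in BigQuery/Snowflake). A minimal sketch using SQLite as a stand-in for the warehouse, with illustrative column names (`message_id`, `opens`, `clicks`) rather than the real export schema:

```python
# Minimal upsert sketch: SQLite stands in for the warehouse, and the
# columns (message_id, opens, clicks) are illustrative, not the actual
# Intercom export schema.
import sqlite3

def upsert_messages(conn: sqlite3.Connection, rows: list[dict]) -> None:
    """Insert new message rows; update stats for ids already present."""
    conn.executemany(
        """
        INSERT INTO messages (message_id, opens, clicks)
        VALUES (:message_id, :opens, :clicks)
        ON CONFLICT(message_id) DO UPDATE SET
            opens = excluded.opens,
            clicks = excluded.clicks
        """,
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE messages (message_id TEXT PRIMARY KEY, opens INT, clicks INT)"
)
# First run: insert a week's rows
upsert_messages(conn, [{"message_id": "m1", "opens": 10, "clicks": 2}])
# Later run over an overlapping interval: m1 is updated, m2 is appended
upsert_messages(conn, [{"message_id": "m1", "opens": 12, "clicks": 3},
                       {"message_id": "m2", "opens": 5, "clicks": 1}])
```

Because re-running the job over an overlapping interval just refreshes existing rows, the pipeline stays idempotent and historical data accumulates instead of being cleared each run.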


I have little knowledge of Python programming and an intermediate knowledge of SQL DML. Has anyone else solved this in a similar way?


I'm eager to hear ideas and tips on this one, and I can also share the summarization queries I have built to consume the data from this source.




Hey @user1207, do you think you could advise on this? You seem well versed in this subject 🙌


Hey @user38, Racheal from the Support Engineering team here 👋


This falls a bit outside the scope of the Intercom product, since you're looking to analyze the data outside of Intercom after export. I wonder if there might be some useful suggestions on Stack Overflow about building out a solution in Python as well. Hopefully another community member can chime in here!

Hi Racheal, this makes sense!

Upon some more investigation, we understood that it is up to our teams to design and develop these data extraction pipelines. I am now bringing this need to our management.

Thank you!