Our company offers software that contains databases our end users need for their work. The databases sometimes use terminology that is difficult for our users to interpret. In an attempt to improve the ‘readability’ of the databases, I recently uploaded a bunch of ‘metadata’ associated with the most-used databases in our software, breaking FIN in the process.
Prior to starting the upload in December 2025, I spoke multiple times with my Intercom account manager to determine the ‘optimal’ way to upload a large amount of qualitative (text) data to Intercom. The data on our end is originally stored in JSON format, which we can’t upload directly to Intercom. After some discussion, we opted to split the JSON file into smaller .txt files, which we manually uploaded as ‘snippets’ to Intercom. We spent a considerable amount of time setting this up. The system has now been ‘stuck’ since Christmas 2025, with no viable solution in place: nothing we have uploaded since the end of 2025 has been ingested by FIN.
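For context, the splitting step was done roughly along the lines of the sketch below. This is a minimal illustration only, assuming the JSON file holds one top-level list of records; the file names, chunk size, and record fields shown here are placeholders, not our actual schema:

```python
import json
from pathlib import Path

# Sketch: split one large JSON file into smaller .txt files that can be
# uploaded as individual snippets. Assumes (hypothetically) that the JSON
# is a list of records, each with "term" and "description" fields.

SOURCE = Path("database_metadata.json")   # hypothetical input file
OUT_DIR = Path("snippets")                # hypothetical output folder
RECORDS_PER_FILE = 50                     # arbitrary chunk size

def main() -> None:
    records = json.loads(SOURCE.read_text(encoding="utf-8"))
    OUT_DIR.mkdir(exist_ok=True)

    for i in range(0, len(records), RECORDS_PER_FILE):
        chunk = records[i : i + RECORDS_PER_FILE]
        # Flatten each record into plain text, one paragraph per record.
        lines = [f"{r['term']}: {r['description']}" for r in chunk]
        out_file = OUT_DIR / f"snippet_{i // RECORDS_PER_FILE + 1:03d}.txt"
        out_file.write_text("\n\n".join(lines), encoding="utf-8")

if __name__ == "__main__":
    main()
```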
A simple example relates to our pricing: FIN keeps quoting our 2025 pricing to users, even though the content has been updated for 2026. FIN is literally stuck in the past. The only short-term ‘solution’ we have been offered is to remove the uploaded content and pause the sync. Given the amount of time we spent setting this up, I am open to removing the content / pausing the sync, provided there is a good alternative way to upload the content so that FIN can use it to help our users.
It’s a shame, and the issue has gone unaddressed for roughly 1.5 months now. Intercom claims to be AI-first, yet it either can’t handle large text files or lacks clear guidance on how to upload such data in a sound manner.
Has anyone experienced something similar?
Is anyone aware of a fix or workaround for the issue described above?
What is the optimal way to upload large text files to avoid this issue in the future?