
I have duplicates of some users, all with the same email address. How did this happen? How can I fix it?

Hi Eric,

We are specifying both the user's email and ID so that we can differentiate between accounts, but the script gets an HTTP 409 response whenever more than one Intercom account exists with the same email address (but different user IDs).

The script we are using from our website is as follows:

    window.intercomSettings = {
      app_id: 'l234ab5c'
    };
    window.intercomSettings.email = 'my.email@gmail.com';
    window.intercomSettings.userid = '1633067';

Why does this not uniquely identify the single Intercom account?
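
For what it's worth, Intercom's documented Messenger settings keys are snake_case: the identification attribute is user_id, not userid. As far as I know, a key the Messenger doesn't recognise is stored as a custom attribute rather than used for identification, which would leave email as the only matching field here. A minimal sketch of the same snippet with the documented key (values are the placeholders from above):

    // Same settings, but using Intercom's documented snake_case key user_id.
    // An unrecognised key such as userid is treated as a custom attribute,
    // not an identifier, so matching falls back to email alone.
    window.intercomSettings = {
      app_id: 'l234ab5c',
      email: 'my.email@gmail.com',
      user_id: '1633067'
    };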



@eric f11 does this also work for eliminating duplicate leads? I have leads duplicated up to four times, with the same email but different user IDs. I have already fixed the issue that was generating the duplicates, but now I need to remove the existing ones.

The problem is that I can't match on the user ID column (only email) when importing the CSV, so I'm not sure how to select the leads to eliminate.
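
One possible workaround, if you can use the REST API instead of the CSV import: Intercom's Contacts API lets you search contacts by email and archive them individually. The sketch below assumes the POST /contacts/search and POST /contacts/{id}/archive endpoints from the current Contacts API, plus a "keep the oldest" rule you would want to adapt; verify both endpoints against the API reference for your workspace's API version before running it.

    // Sketch: archive all but one contact sharing an email address.
    // Assumes Node 18+ (global fetch) and an access token in INTERCOM_TOKEN.
    const headers = {
      Authorization: `Bearer ${process.env.INTERCOM_TOKEN}`,
      'Content-Type': 'application/json',
      Accept: 'application/json',
    };

    async function archiveDuplicates(email) {
      // Find every contact with this email, regardless of user ID.
      const res = await fetch('https://api.intercom.io/contacts/search', {
        method: 'POST',
        headers,
        body: JSON.stringify({
          query: { field: 'email', operator: '=', value: email },
        }),
      });
      const { data = [] } = await res.json();
      if (data.length < 2) return; // nothing to de-duplicate

      // Keep the oldest contact and archive the rest.
      const [keep, ...dupes] = data.sort((a, b) => a.created_at - b.created_at);
      for (const dupe of dupes) {
        await fetch(`https://api.intercom.io/contacts/${dupe.id}/archive`, {
          method: 'POST',
          headers,
        });
        console.log(`kept ${keep.id}, archived ${dupe.id} for ${email}`);
      }
    }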


It is super annoying that Intercom still does not have this basic feature of merging duplicates after so many years, while useless features seem to proliferate.


Hi, thanks for the CSV suggestion. I will try using this. Is there an easy way of re-assigning the tickets/conversations from the people you have identified as needing to be archived? I have currently identified 200 user-user duplicates, so manually going through each one and re-assigning the tickets isn't viable.
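
If scripting is an option, the hand-off can probably be automated with the REST API before archiving each duplicate: search conversations by the duplicate's contact ID, then attach the surviving contact to each one. This is a rough sketch only; it assumes POST /conversations/search accepts the contact_ids field and that POST /conversations/{id}/customers attaches a contact (both appear in the current API reference, but double-check them, and note the sketch ignores pagination). moveConversations, dupeId, keepId and adminId are names made up for the example.

    // Sketch: re-attach a duplicate's conversations to the surviving contact.
    // dupeId/keepId are Intercom contact IDs; adminId is the acting teammate.
    const headers = {
      Authorization: `Bearer ${process.env.INTERCOM_TOKEN}`,
      'Content-Type': 'application/json',
      Accept: 'application/json',
    };

    async function moveConversations(dupeId, keepId, adminId) {
      // Find conversations the duplicate participates in.
      const res = await fetch('https://api.intercom.io/conversations/search', {
        method: 'POST',
        headers,
        body: JSON.stringify({
          query: { field: 'contact_ids', operator: '=', value: dupeId },
        }),
      });
      const { conversations = [] } = await res.json();

      // Attach the surviving contact so nothing is orphaned once the
      // duplicate is archived.
      for (const conv of conversations) {
        await fetch(`https://api.intercom.io/conversations/${conv.id}/customers`, {
          method: 'POST',
          headers,
          body: JSON.stringify({
            admin_id: adminId,
            customer: { intercom_user_id: keepId },
          }),
        });
      }
    }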


Hi @Shaunt, Ebenezer here from Engineering Support👋.

I am more than happy to submit this as a feature request for our engineers to look into.

Could you reach out to us with this suggestion via Messenger? It will be a bit easier to keep track of this feature request there.


Any update on the ability to merge duplicate users?
Like Intercom, we have evolved and improved over the years. So has our user ID scheme, and people on old app versions keep creating duplicates. I don't want to lose historic data by just arbitrarily archiving the old record.
Has any viable solution been scoped in the last 3 years?

