How to INSERT IGNORE with the Write Database operator?

tmyers · Member · Posts: 21 · Contributor I
edited December 2018 in Help

Hi all. I'm using a Write Database operator to append rows to a MySQL table. The destination table has a unique index consisting of 2 columns. When my example set contains a row with a pair of values that already exists in the table, the Write Database operator errors out and halts. I would like the operator to ignore such errors and continue appending all the rows where the pair of values DOESN'T already exist in the table (i.e. to behave like an INSERT IGNORE statement).
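For reference, the behavior being asked for can be sketched outside RapidMiner. The snippet below uses an in-memory SQLite database (whose INSERT OR IGNORE is the analogue of MySQL's INSERT IGNORE); the table and column names are made up for illustration:

```python
import sqlite3

# In-memory SQLite stand-in for the MySQL table, with a two-column unique index.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER, b INTEGER, val TEXT, UNIQUE (a, b))")
conn.execute("INSERT INTO t VALUES (1, 1, 'existing')")

rows = [(1, 1, "duplicate"), (2, 2, "new")]
# OR IGNORE silently skips any row whose (a, b) pair already exists
# instead of raising an integrity error and halting.
conn.executemany("INSERT OR IGNORE INTO t VALUES (?, ?, ?)", rows)

print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # prints 2, not 3
```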

How can I run this operator so it would ignore such errors and only append "new" pairs of values?

Thanks in advance for any suggestions,

Tim


Best Answer

  • BalazsBarany · Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert · Posts: 949 · Unicorn
    Solution Accepted

    Hi,

    if you're fine with updating existing values (based on the unique index), you can use the Update Database operator. It will look up existing records (you need to select the two columns that make up the index), insert the record if the key isn't found, and update it otherwise.
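The insert-or-update behavior Update Database provides corresponds to an SQL upsert. A minimal sketch, again using SQLite (ON CONFLICT ... DO UPDATE, available since SQLite 3.24) as a stand-in for MySQL's INSERT ... ON DUPLICATE KEY UPDATE, with hypothetical table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER, b INTEGER, val TEXT, UNIQUE (a, b))")
conn.execute("INSERT INTO t VALUES (1, 1, 'old')")

rows = [(1, 1, "updated"), (2, 2, "inserted")]
# Insert when the (a, b) key is new, update the payload otherwise --
# the behavior the Update Database operator provides inside RapidMiner.
conn.executemany(
    "INSERT INTO t VALUES (?, ?, ?) "
    "ON CONFLICT (a, b) DO UPDATE SET val = excluded.val",
    rows,
)
```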

    If you really only want to insert new data, you could read the existing keys from the table (Read Database), do a Join (e.g. a left join to identify existing records), and keep only the new records for Write Database.
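The read-then-filter approach can be sketched in plain Python: fetch the existing key pairs, drop incoming rows whose pair is already present, and append only the rest. Table and column names are made up; in RapidMiner the three steps map to Read Database, Join, and Write Database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER, b INTEGER, val TEXT, UNIQUE (a, b))")
conn.execute("INSERT INTO t VALUES (1, 1, 'existing')")

# Step 1 (Read Database): fetch the existing key pairs.
existing = set(conn.execute("SELECT a, b FROM t"))

# Step 2 (Join): keep only rows whose (a, b) pair is not already present.
incoming = [(1, 1, "duplicate"), (2, 2, "new")]
fresh = [row for row in incoming if (row[0], row[1]) not in existing]

# Step 3 (Write Database): append only the new rows.
conn.executemany("INSERT INTO t VALUES (?, ?, ?)", fresh)
```

Note that step 1 pulls every key pair to the client, which is exactly what becomes painful with a large table, as discussed below.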

    Regards,

    Balázs

Answers

  • tmyers · Member · Posts: 21 · Contributor I

    Thanks Balázs. I was afraid of that. The table I need to read has 30M rows!

    Regards,

    Tim

  • BalazsBarany · Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert · Posts: 949 · Unicorn

    I sometimes solve this problem with an additional "staging" table that has the same structure as the main table.

    You append to the staging table, then, using Execute SQL, you select only the new rows (e.g. with a LEFT OUTER JOIN), insert them into the main table, and truncate the staging table.

    This should scale to huge example sets as the database is doing most of the work internally.
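The staging-table pattern above can be sketched end to end. The snippet uses SQLite in place of MySQL and invented table names; the final SQL script is the part that would go into an Execute SQL operator (SQLite has no TRUNCATE, so DELETE stands in):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE main_t (a INTEGER, b INTEGER, val TEXT, UNIQUE (a, b))")
conn.execute("CREATE TABLE staging (a INTEGER, b INTEGER, val TEXT)")
conn.execute("INSERT INTO main_t VALUES (1, 1, 'existing')")

# Append everything to the staging table first (the Write Database step);
# no unique index here, so duplicates cannot cause an error.
conn.executemany("INSERT INTO staging VALUES (?, ?, ?)",
                 [(1, 1, "duplicate"), (2, 2, "new")])

# Execute SQL step: copy over only the rows whose key pair is absent
# from the main table, then empty the staging table.
conn.executescript("""
    INSERT INTO main_t
    SELECT s.a, s.b, s.val
    FROM staging s
    LEFT OUTER JOIN main_t m ON m.a = s.a AND m.b = s.b
    WHERE m.a IS NULL;
    DELETE FROM staging;
""")
```

Because the anti-join runs inside the database, the 30M existing rows never have to travel to the client.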
