Workaround for inserting into a table with a unique constraint

  • I have a file that sometimes contains similar records, and I have a table with a unique constraint because we do not want duplicate records getting into the table. While running my DTS package, I get an error when the process tries to insert a duplicate record into the table. Is there a workaround? For example, can I put the data in a staging table and then use a stored procedure to do the insert? I would like the duplicate values not to be inserted into the table.

    Thanks

    It is what it is.

  • Yes, you can use a staging table, but you'll still have to deal with the duplicates in your data at some point.

    John
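
    A minimal sketch of the staging-table approach, assuming a hypothetical target table dbo.Customers with a unique constraint on CustomerID and a matching staging table dbo.Customers_Staging (all names are made up for illustration):

    ```sql
    -- Load the raw file into the staging table first (no unique constraint),
    -- then copy across only the rows whose key is not already in the target.
    INSERT INTO dbo.Customers (CustomerID, CustomerName)
    SELECT DISTINCT s.CustomerID, s.CustomerName
    FROM dbo.Customers_Staging s
    WHERE NOT EXISTS (
        SELECT 1
        FROM dbo.Customers c
        WHERE c.CustomerID = s.CustomerID
    );
    ```

    DISTINCT takes care of exact duplicates within the file itself; rows that share a key but differ in other columns still need a rule for which one to keep, which is the point John makes next.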

    It's a data file from a vendor, and they are aware of the problem. In the meantime, how do I insert while ignoring duplicates?

    It is what it is.

  • I take it that where duplicates exist in the column that is supposed to be unique, you want to insert one row only? If that is the case, you will need to decide how to choose which one. For example, your data may contain Name, Age and Height, and you may decide that where two people have the same name, you want to keep the tallest. In that case, you'd insert everything into your staging table and then search these forums for information on how to remove duplicates.

    John
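
    To make the "keep the tallest" example concrete, here is one way to thin out a staging table before the final insert. The table and column names (dbo.People_Staging, Name, Height) are hypothetical:

    ```sql
    -- Delete every staging row that is shorter than the tallest row
    -- sharing the same Name, leaving (at most) the tallest per Name.
    DELETE s
    FROM dbo.People_Staging s
    WHERE s.Height < (
        SELECT MAX(s2.Height)
        FROM dbo.People_Staging s2
        WHERE s2.Name = s.Name
    );
    ```

    Note that ties on Height would still leave more than one row per Name, so a further tie-breaking rule may be needed before inserting into the target table.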

  • We are assuming you do not want to cleanse the offending rows into proper data; in other words, the rows will either be exact duplicates or meaninglessly different. In that case, the first row is OK to process and the rest of the duplicate rows need to be ignored.

    The easy way to handle this type of duplication error is to work with the Options tab of the transform. You will see exception-handling capabilities there:

    1. Enter the full path to the exception file you want to create. This will at least let you know which rows were duplicates.

    2. Adjust the max error count to 9999.

    3. Uncheck the Use Fast Load option.

  • You, sir, are a genius. I will try it; I feel it will work. I will get back to y'all.

    Thanks for the help.

    It is what it is.

  • If you just want to ignore the duplicates and don't need them copied to an exception file, you can also use "CREATE UNIQUE INDEX ... ON ... (..) WITH IGNORE_DUP_KEY" to eliminate the errors.
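
    A sketch of that option, using the SQL Server 2000-era syntax the post quotes; the index, table and column names are illustrative. With IGNORE_DUP_KEY, an attempt to insert a row with a duplicate key raises a warning and skips that row instead of failing the whole statement:

    ```sql
    -- Duplicate-key inserts against this index are skipped with a warning
    -- rather than rolling back the entire INSERT.
    CREATE UNIQUE INDEX IX_Customers_CustomerID
    ON dbo.Customers (CustomerID)
    WITH IGNORE_DUP_KEY;
    ```

    The rejected rows simply disappear, so unlike the exception-file approach above, you get no record of which rows were duplicates.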

    Thanks.

    It is what it is.
