Import Records... a lot of records

  • Hello all. First, sorry for my English. I need to insert more than 10 million records from a text file into a table that already has millions of records, all from a program. I understand that if I run a BULK INSERT and a non-unique record exists (a unique key violation), it will cause an error and cancel the import. To get around that I could import the file into a staging table and work from there, but performance is very important. So I ask: is there any other way I can do this? Please help!
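    The staging approach I mean is something like this; the table, column, and file names are just examples:

    ```sql
    -- Load the text file into an empty staging table with no constraints,
    -- so a duplicate key in the file cannot abort the load
    BULK INSERT dbo.StagingTable
    FROM 'C:\data\import.txt'
    WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', TABLOCK);

    -- Then copy only the rows whose key is not already in the target table
    INSERT INTO dbo.TargetTable (KeyCol, Col1, Col2)
    SELECT s.KeyCol, s.Col1, s.Col2
    FROM dbo.StagingTable AS s
    WHERE NOT EXISTS (
        SELECT 1 FROM dbo.TargetTable AS t WHERE t.KeyCol = s.KeyCol
    );
    ```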

  • I don't know if this helps, but I recently had to load 43 million rows from one SQL Server to another.  Here were my steps:

    1. Move the 43 million to a text file.

    2. Script out all of the indexes and constraints from the target table.

    3. Remove all indexes and constraints from the target table.

    4. Make sure I had plenty of room on the drives.

    5. Create a DTS package that moved the data from the text file to the table.

    6. Run it.  Monitor it if you have little wiggle room for growth on the drives, or let it run through the night.  I monitored it.

    7. Rebuild the constraints and indexes from your scripts (a rough T-SQL sketch of steps 3, 5, and 7 follows at the end of this post).

    Hope that helps.
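
    Here is roughly what those steps look like in T-SQL, using BULK INSERT in place of the DTS package; the table, index, and file names are hypothetical, so substitute the scripts you generated in step 2:

    ```sql
    -- Step 3: drop the indexes and constraints (names here are made up)
    DROP INDEX dbo.TargetTable.IX_TargetTable_CustomerId;
    ALTER TABLE dbo.TargetTable DROP CONSTRAINT PK_TargetTable;

    -- Steps 5/6: load the text file (BULK INSERT shown instead of a DTS package)
    BULK INSERT dbo.TargetTable
    FROM 'C:\loads\export_43_million.txt'
    WITH (
        FIELDTERMINATOR = '\t',
        ROWTERMINATOR   = '\n',
        TABLOCK          -- table lock helps bulk-load performance on an unindexed table
    );

    -- Step 7: rebuild the constraint and index from your step-2 scripts
    ALTER TABLE dbo.TargetTable
        ADD CONSTRAINT PK_TargetTable PRIMARY KEY CLUSTERED (Id);
    CREATE NONCLUSTERED INDEX IX_TargetTable_CustomerId
        ON dbo.TargetTable (CustomerId);
    ```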

  • When I've used DTS to transfer a lot of records, I set the "Insert Batch Size" to 100,000 records or so. I sometimes got timeout errors when it tried to do all the records in one huge batch. Another advantage of setting the batch size is that you can watch your destination table grow by getting a count, which gives you an idea of how it's progressing. Perhaps there's a downside to using a batch size, but it hasn't caused me any trouble.
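    The same idea works outside DTS: BULK INSERT has a BATCHSIZE option that commits every N rows instead of running one huge transaction, and you can check progress from another connection with a count. File and table names below are placeholders:

    ```sql
    -- Commit every 100,000 rows rather than one huge batch
    BULK INSERT dbo.TargetTable
    FROM 'C:\data\import.txt'
    WITH (
        FIELDTERMINATOR = '\t',
        ROWTERMINATOR   = '\n',
        BATCHSIZE       = 100000
    );

    -- From another connection, see how far the load has gotten
    SELECT COUNT(*) FROM dbo.TargetTable WITH (NOLOCK);
    ```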

