Trouble moving 650 million records from one table to another

  • I am trying to import 650 million records from a table in a database that will eventually be removed from the server into a table in my database. I'm using an OLE DB Source with a query to get the records and sending it to an OLE DB Destination. I have set the connection property's packet size to 32,000 and set the rows per batch and maximum insert commit size of the OLE DB Destination to 100k each. My problem is that my server's 60 GB of memory fills up and the process pretty much slows to a crawl. What can I do to correct this? I thought setting the max commit size and rows per batch would prevent this.

  • Maybe the initial fetch is causing problems.

    Can you loop/batch the whole thing to SELECT and process x thousand rows at a time?

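    In T-SQL, the loop being suggested might look like the sketch below (all object, database, and column names are hypothetical, and it assumes the source table has an indexed, unique key column):

    ```sql
    -- Hypothetical keyset-paging copy loop; OldDB, TargetTable, SourceTable,
    -- Id, Col1, Col2 are placeholders. Id is assumed indexed and unique.
    DECLARE @LastId    BIGINT = 0;
    DECLARE @BatchSize INT    = 10000;

    WHILE 1 = 1
    BEGIN
        INSERT INTO dbo.TargetTable (Id, Col1, Col2)
        SELECT TOP (@BatchSize) Id, Col1, Col2
        FROM   OldDB.dbo.SourceTable
        WHERE  Id > @LastId
        ORDER BY Id;

        IF @@ROWCOUNT = 0 BREAK;   -- nothing left to copy

        -- Advance to the highest key copied so far (cheap with an index on Id).
        SELECT @LastId = MAX(Id) FROM dbo.TargetTable;
    END
    ```

    Each pass commits a small transaction, so neither the transaction log nor memory has to hold anything close to the full 650 million rows at once.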

  • Unfortunately not. There is no column that I could use that way. I did notice that I hadn't removed the index on the destination table; hopefully dropping it will make the load run faster.

  • dndaughtery (8/29/2009)


    ..I have set the connection property's packet size to 32,000 and set the rows per batch and max commit size of the OLE DB Destination to 100k each. My problem is that my memory of 60 gig is getting full and the process pretty much slows to a crawl. What can I do to correct this? I thought setting the max commit size and rows per batch would prevent this.

    100,000 rows is a *very* large rows-per-batch and max commit size. Remember that more than one batch may have to co-exist at the same time, both because clean-up/startup of each batch can overlap and because the pipeline processor tries to run parallel streams (5 simultaneous streams would not be odd if you do not restrict it). 1,000 is a more typical setting, and 10,000 is usually considered the upper limit of workable settings.

    -- RBarryYoung, (302)375-0451, blog: MovingSQL.com, Twitter: @RBarryYoung
    Proactive Performance Solutions, Inc.
    "Performance is our middle name."
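    For reference, the settings being recommended map to these OLE DB Destination fast-load properties in the SSIS designer (values per the advice above; labels as they appear in the editor):

    ```
    Data access mode:           Table or view - fast load
    Rows per batch:             1000
    Maximum insert commit size: 10000
    Table lock:                 Checked (enables bulk-load optimizations)
    ```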

  • Export in native format using BCP and then import it.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
        Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.
    "Change is inevitable... change for the better is not".

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
    Intro to Tally Tables and Functions
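    From the command line, that approach might look like this (server, database, and table names are placeholders): `-n` exports in SQL Server's native format, `-T` uses a trusted connection, `-b` commits in batches, and the `TABLOCK` hint speeds up the load:

    ```shell
    REM Export the source table in native format (all names are placeholders).
    bcp OldDB.dbo.BigTable out bigtable.dat -n -S OldServer -T

    REM Import in 10,000-row batches so no single transaction grows too large.
    bcp MyDB.dbo.BigTable in bigtable.dat -n -b 10000 -h "TABLOCK" -S MyServer -T
    ```

    Because the file is in native format, no character conversion happens on either side, which keeps both the export and the import fast.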
