Suggestions for inserting 420 million rows into a table?

  • Outline:

    Destination table is a flat table for reporting.

    A job runs over the weekend inserting to the destination table selecting from our reporting database.

    Uses SELECT ... INTO (a simplified sketch follows the error message below).

    Errors out with:

    Executed as user: ''. The transaction log for database 'tempdb' is full. To find out why space in the log cannot be reused, see the log_reuse_wait_desc column in sys.databases [SQLSTATE 42000] (Error 9002) The transaction log for database 'tempdb' is full. To find out why space in the log cannot be reused, see the log_reuse_wait_desc column in sys.databases [SQLSTATE 42000] (Error 9002). The step failed.
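    The step is essentially a single statement along these lines (object names are placeholders, not the real ones):

        -- Simplified version of the job step; the real statement lists many columns.
        SELECT *
        INTO ReportDB.dbo.DestinationFlat
        FROM SourceDB.dbo.SourceData;   -- ~420 million rows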

    Thanks,

    DK

  • Please give more detail about your problem!!!! 😀


    [font="Times New Roman"]rfr.ferrari[/font]
    DBA - SQL Server 2008
    MCITP | MCTS

    remember is live or suffer twice!
    the period you fastest growing is the most difficult period of your life!
  • I'd export the data using bcp and import it into the target db using bcp again.

    Make sure the target db is set to bulk-logged or simple recovery.

    The current approach will be fully logged during the insert to ensure transactional consistency (regardless of the recovery model of the target db), hence filling up the transaction log.
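    A minimal sketch, assuming a target db Reporting with a flat table dbo.DestinationFlat, a source db SourceDB, and Windows authentication (all server/db/table names below are placeholders). First, on the target db:

        -- Minimize logging for the duration of the bulk load.
        ALTER DATABASE Reporting SET RECOVERY BULK_LOGGED;

    Then, from a command prompt:

        bcp SourceDB.dbo.SourceFlat out flat.dat -n -S SOURCESRV -T
        bcp Reporting.dbo.DestinationFlat in flat.dat -n -b 100000 -h "TABLOCK" -S TARGETSRV -T

    The -b 100000 makes bcp commit every 100,000 rows instead of loading everything as one transaction, and the TABLOCK hint lets the load itself be minimally logged. Switch the recovery model back and take a full backup once the load is done.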

    There are several alternatives (which I personally rank as second-best options):

    a) insert the data in batches (with the risk of incomplete data at the target); see the sketch after this list

    b) increase the usable size of tempdb and its log so the job can finish
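
    A rough sketch of option a), assuming the destination table has been created up front (batching rules out SELECT ... INTO) and that the source has an ever-increasing integer key Id (all object names are placeholders):

        -- Copy in chunks of 1 million rows, keyed on an increasing Id column.
        -- Each batch commits on its own, so log space can be reused between batches.
        DECLARE @BatchSize int    = 1000000,
                @LastId    bigint = 0,
                @Rows      int    = 1;

        WHILE @Rows > 0
        BEGIN
            INSERT INTO Reporting.dbo.DestinationFlat (Id, Col1, Col2)
            SELECT TOP (@BatchSize) Id, Col1, Col2
            FROM SourceDB.dbo.SourceFlat
            WHERE Id > @LastId
            ORDER BY Id;

            SET @Rows = @@ROWCOUNT;

            -- Safe because Id only ever increases.
            IF @Rows > 0
                SELECT @LastId = MAX(Id) FROM Reporting.dbo.DestinationFlat;
        END;

    If the job dies halfway, @LastId can be recomputed from the destination and the loop restarted from there, which is the "incomplete data at the target" risk mentioned above.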



    Lutz
    A pessimist is an optimist with experience.

