Parsing tab delimited text into SQL table

  • Hello! I have a weekly tab delimited file that is currently being parsed row by row in .NET code and then inserted into a new table. The table created has 503 columns, with varying row counts from week to week. The current process takes way too long because each row is parsed and inserted one at a time.

    I'm looking desperately for a quicker way to accomplish this process.

    Ingredients:

    1. Tab delimited text file

    2. MS SQL Server database

    3. Program is run in .NET

    Has anyone run across a similar situation or can anyone suggest a time efficient way to get this done?

    Thanks in advance for any help!

  • You can use the data import wizard and create a DTSX package to do this. Setting it up can be a pain, but that's coming from someone who was more accustomed to DTS. Or you can go into Business Intelligence Development Studio and create a DTSX package there as well. I don't know if you can add attachments to these posts, but if you can, I'm sure one of these guys could put that together rather quickly.

  • You can also write VBA, I think it's called, which is code behind the actual Excel file, to create a connection and insert the records into the database. It probably won't help time-wise, though.

  • Use SSIS or bcp. If you play with SSIS at all, you can preview the contents of the text file before running the import.

  • While using bcp, SSIS, etc. will undoubtedly help you, they tend to be problematic when the number of columns coming in varies.

    Therefore, if you already have a program parsing the file and inserting into SQL Server, I would look at ways to speed that up, since you have presumably already solved most of the problems with varying column counts.

    I have had similar problems with varying-format files to load, sometimes inserting millions of rows, and have managed to get excellent performance through a combination of:

    a) Do not build a fresh INSERT statement as a string for each row. Work out the insert once, set it up with parameters, then prepare it. Then for each row, execute it with the appropriate parameter values (see the sketch after this list).

    b) Define a reasonable number of rows to insert per batch. Start a transaction, insert lots of rows, and commit, say, every 1000. That is far less log file I/O than one transaction per row.

    c) If (a) and (b) are not enough and you can get the data into a data reader of some sort, then investigate the bulk insert API. A few hours of work later and it will really fly.
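
    Here is a minimal C# sketch of (a) and (b) combined. The connection string, the staging table dbo.WeeklyStaging, the three columns (instead of the full 503), the file path, and the batch size of 1000 are all placeholder assumptions for illustration, not the poster's actual code.

    // (a) one prepared, parameterized INSERT reused for every row;
    // (b) commits batched every 1000 rows instead of one transaction per row.
    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    class BatchedLoader
    {
        static void Main()
        {
            const int batchSize = 1000;   // commit every 1000 rows
            const string connStr = "Server=.;Database=Weekly;Trusted_Connection=True;";

            using (var conn = new SqlConnection(connStr))
            {
                conn.Open();

                // build the INSERT once, with typed parameters, then prepare it
                var cmd = new SqlCommand(
                    "INSERT INTO dbo.WeeklyStaging (Col001, Col002, Col003) " +
                    "VALUES (@p1, @p2, @p3)", conn);
                cmd.Parameters.Add("@p1", SqlDbType.VarChar, 255);
                cmd.Parameters.Add("@p2", SqlDbType.VarChar, 255);
                cmd.Parameters.Add("@p3", SqlDbType.VarChar, 255);
                cmd.Prepare();

                SqlTransaction tx = conn.BeginTransaction();
                cmd.Transaction = tx;
                int rowsInBatch = 0;

                foreach (string line in File.ReadLines(@"C:\data\weekly.txt"))
                {
                    string[] fields = line.Split('\t');

                    // bind this row's values to the already-prepared command
                    cmd.Parameters["@p1"].Value = fields[0];
                    cmd.Parameters["@p2"].Value = fields[1];
                    cmd.Parameters["@p3"].Value = fields[2];
                    cmd.ExecuteNonQuery();

                    // commit in batches to cut per-row log overhead
                    if (++rowsInBatch == batchSize)
                    {
                        tx.Commit();
                        tx = conn.BeginTransaction();
                        cmd.Transaction = tx;
                        rowsInBatch = 0;
                    }
                }

                tx.Commit();   // commit whatever is left in the final partial batch
            }
        }
    }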

    Good luck

    Mike

  • I have used BULK INSERT to do this, but my files have been comma separated.

    I also have a table specially for this purpose. In your case you would have a staging table with the 503 columns. Do a BULK INSERT into it, then write the logic to update the other tables.
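
    The poster is describing the T-SQL BULK INSERT statement; for the .NET side of the same idea, SqlBulkCopy is the bulk-load API mentioned earlier in the thread. A rough sketch under assumed names (dbo.WeeklyStaging, C:\data\weekly.txt, and only three of the 503 columns shown):

    // Stream the tab-delimited file into a DataTable and push it to the wide
    // staging table with SqlBulkCopy, then do the follow-up updates in T-SQL.
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    class BulkCopyLoader
    {
        static void Main()
        {
            const string connStr = "Server=.;Database=Weekly;Trusted_Connection=True;";

            // Shape the DataTable to match the staging table
            // (503 columns in the real case; three shown to keep the sketch short).
            var table = new DataTable();
            for (int i = 1; i <= 3; i++)
                table.Columns.Add("Col" + i.ToString("000"), typeof(string));

            foreach (string line in File.ReadLines(@"C:\data\weekly.txt"))
                table.Rows.Add(line.Split('\t'));   // one split field per column

            using (var bulk = new SqlBulkCopy(connStr))
            {
                bulk.DestinationTableName = "dbo.WeeklyStaging";
                bulk.BatchSize = 5000;              // rows sent per round trip
                bulk.WriteToServer(table);          // bulk-load path, not row-by-row inserts
            }
        }
    }

    Note that this sketch buffers the whole file in memory; for very large files, feeding SqlBulkCopy an IDataReader over the file instead of a DataTable avoids that.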

