July 30, 2013 at 2:17 pm
Is there a way to import CSV files with a varying number of fields (all nvarchar(max)) into one fixed table (with an import ID) using a single stored procedure, without having to make many adjustments? The files are flat staging CSV files. Maybe with the PIVOT/UNPIVOT functions, etc.
The field and row delimiters are always the same.
Thank you
Kind Regards
Nicole
😉
July 30, 2013 at 3:33 pm
If you want a fixed table, the only option would be to import the lines as they are and crack them apart later. Or size the table for the maximum possible number of fields.
But if you want to import them with a stored procedure in T-SQL, your only option is BULK INSERT, which is not very flexible.
A stored procedure in the CLR is a better option, but an even better option is probably a console-mode C# program - or why not SQL Server Integration Services? (Not that I know SSIS myself, but I've heard about it. 🙂)
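A minimal sketch of the "import the lines as they are" approach (the file path, table name, and assumption that the data contains no tab characters are all hypothetical):

```sql
-- One wide column per CSV line; the fields get cracked apart later in T-SQL
CREATE TABLE dbo.RawImport
(
    RawLine nvarchar(max) NULL
);

-- Specify only a row terminator; since the default field terminator is a tab
-- and the file (assumed) contains none, each whole line lands in RawLine
BULK INSERT dbo.RawImport
FROM 'C:\Staging\import.csv'   -- hypothetical path
WITH (ROWTERMINATOR = '\n', CODEPAGE = '65001');
```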
[font="Times New Roman"]Erland Sommarskog, SQL Server MVP, www.sommarskog.se[/font]
July 30, 2013 at 3:49 pm
info 58414 (7/30/2013)
Is there a way to import CSV files with a varying number of fields (all nvarchar(max)) into one fixed table (with an import ID) using a single stored procedure, without having to make many adjustments? The files are flat staging CSV files. Maybe with the PIVOT/UNPIVOT functions, etc. The field and row delimiters are always the same.
Thank you
Kind Regards
Nicole
😉
Use BULK INSERT to read just the first line of a file into a single-column temp table. Split that column on the delimiter to extract the column names. Use those column names to create a table dynamically and to create a dynamic BULK INSERT.
Make sure that you "de-louse" the column names so that they're "Bobby Tables" proof.
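The header-driven steps above can be sketched roughly as follows. The file path, delimiter, and target table name are assumptions, and a CHARINDEX loop is used for splitting so the column order is guaranteed (set-based splitters work too, provided they return the item position):

```sql
-- Read only the header line of the (hypothetical) file into a temp table
CREATE TABLE #Header (HeaderLine nvarchar(max));

BULK INSERT #Header
FROM 'C:\Staging\import.csv'           -- hypothetical path
WITH (ROWTERMINATOR = '\n', LASTROW = 1);

DECLARE @Header nvarchar(max), @Delim nchar(1) = N',';   -- assumed delimiter
SELECT @Header = HeaderLine FROM #Header;

-- Split the header on the delimiter, in order, building the column list.
-- QUOTENAME "de-louses" each name so the dynamic SQL is "Bobby Tables" proof.
DECLARE @cols nvarchar(max) = N'', @pos int, @col nvarchar(256);

WHILE LEN(@Header) > 0
BEGIN
    SET @pos = CHARINDEX(@Delim, @Header);
    SET @col = CASE WHEN @pos = 0 THEN @Header ELSE LEFT(@Header, @pos - 1) END;
    SET @Header = CASE WHEN @pos = 0 THEN N'' ELSE STUFF(@Header, 1, @pos, N'') END;
    SET @cols += CASE WHEN @cols = N'' THEN N'' ELSE N', ' END
               + QUOTENAME(LTRIM(RTRIM(@col))) + N' nvarchar(max) NULL';
END;

DECLARE @sql nvarchar(max) = N'CREATE TABLE dbo.Import_Target (' + @cols + N');';
EXEC (@sql);

-- A second dynamic BULK INSERT with FIRSTROW = 2 then loads the data rows.
```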
--Jeff Moden
July 30, 2013 at 3:56 pm
Jeff Moden (7/30/2013)
Use Bulk Insert to read just the first line of a file into a single column temp table. Split that column on the delimiter to extract the column names. Use those column names to create a table dynamically and to create a dynamic Bulk Insert.
Make sure that you "de-louse" the column names to make sure they're "Bobby Tables" proof.
So that reinforces what I said: you should do this outside T-SQL, that is, in C# or SSIS. What Jeff suggests leads to an orgy of dynamic SQL. Dynamic SQL is all about string manipulation, for which C# is much better suited than T-SQL.
[font="Times New Roman"]Erland Sommarskog, SQL Server MVP, www.sommarskog.se[/font]
July 30, 2013 at 5:56 pm
Erland Sommarskog (7/30/2013)
Jeff Moden (7/30/2013)
Use Bulk Insert to read just the first line of a file into a single column temp table. Split that column on the delimiter to extract the column names. Use those column names to create a table dynamically and to create a dynamic Bulk Insert.
Make sure that you "de-louse" the column names to make sure they're "Bobby Tables" proof.
So that reinforces what I said: you should do this outside T-SQL, that is, in C# or SSIS. What Jeff suggests leads to an orgy of dynamic SQL. Dynamic SQL is all about string manipulation, for which C# is much better suited than T-SQL.
While it certainly can be done outside of SQL, it wouldn't lead to an "orgy" of dynamic SQL any more than it would lead to an orgy of loops in C# (which it wouldn't). It's just not that difficult.
I can't speak of SSIS because I don't use SSIS.
--Jeff Moden