SSIS question

  • I need to write to a CSV file from a table in the DB. This is fairly easy and I have done this. But I have to create multiple files, each file can have only 99 records, and the file names should be file1, file2, ... filen.

    In my query I have SELECT TOP 99 * FROM Table1, then I write to the CSV file... after that I am stuck.

    Any help would be greatly appreciated.

    Thanks

  • It seems to me that you'll need to create a script object in a package to do the job. That will give you the most flexibility. Select the entire record set, then count the records as you write them to the file from within your script.

    I don't know of a way to do this with SSIS objects, other than a script object.

    Chris

    Learning something new on every visit to SSC. Hoping to pass it on to someone else.

  • just an idea:

    select *, (row_number() over (order by [object_id], [column_id]) - 1) / 99 as FileNo
    from sys.columns

    As you can see, each group of up to 99 records in the table will be given an incrementing FileNo.

    Then you can query the result per FileNo...
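
    For instance, a minimal sketch of consuming that per group, assuming a 99-row group size (run one FileNo at a time and send each result set to its own file):

    ;with numbered as
    (
        select *,
               (row_number() over (order by [object_id], [column_id]) - 1) / 99 as FileNo
        from sys.columns
    )
    select *
    from numbered
    where FileNo = 0;   -- 0 feeds file1, 1 feeds file2, and so on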

    _____________________________________________
    "The only true wisdom is in knowing you know nothing"
    "O skol'ko nam otkrytiy chudnyh prevnosit microsofta duh!":-D
    (So many miracle inventions provided by MS to us...)

    How to post your question to get the best and quick help

  • The rownumber idea looks like a great way to start. At some point it seems you are going to have to perform the file creation process an indeterminate number of times.

    Some options:

    1) Put DTEXEC in a cursor. Use the DTEXEC /SET option to pass a parameter to your SSIS query (so it returns the correct records) and the file name itself to the package (a rough T-SQL sketch follows below).

    2) Put a For Each Loop in your package and loop through a record set that contains a distinct list of your filenames/record sets.

    3) Scripting - not so familiar with that.

    So choose your method of iteration (sounds better than loop or cursor).

    Think about your skill sets. Option 1 emphasizes T-SQL skills, 2 SSIS skills, 3 scripting skills.
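
    For option 1, a hedged sketch of what the cursor might look like, purely as an illustration: the package path, variable names, /SET property paths, table, and key column are all placeholders you would adjust, and it assumes xp_cmdshell is enabled and that the package filters its source query on User::FileNo and builds its flat-file connection from User::FileName.

    declare @FileNo int, @cmd varchar(1000);

    declare file_cursor cursor local fast_forward for
        select distinct (row_number() over (order by SomeKeyColumn) - 1) / 99
        from dbo.SourceTable;   -- placeholder table/key column

    open file_cursor;
    fetch next from file_cursor into @FileNo;

    while @@FETCH_STATUS = 0
    begin
        -- pass the group number and target file name into the package
        set @cmd = 'dtexec /F "C:\Packages\ExportChunk.dtsx"'
                 + ' /SET \Package.Variables[User::FileNo].Properties[Value];'
                 + cast(@FileNo as varchar(10))
                 + ' /SET \Package.Variables[User::FileName].Properties[Value];file'
                 + cast(@FileNo + 1 as varchar(10)) + '.csv';

        exec master..xp_cmdshell @cmd;

        fetch next from file_cursor into @FileNo;
    end

    close file_cursor;
    deallocate file_cursor;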

  • I created a file that has 1000 records, and now I need to split that file into numerous files with 99 records each. How can I do that in SSIS? I tried using a For Loop and then a Row Count, but could not go further.

  • What is your source? First it is a table, then it is a file.

    We need to know this or we can't help you.

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • You could try using an expression on the connection string for your destination that constructs the file name from variables that are incremented each time you SELECT your TOP 99.

    sqlmunkee
    Bringing joy and happiness via SQL Server since 1998

  • Koen Verbeeck (3/8/2012)


    What is your source? First it is a table, then it is a file.

    We need to know this or we can't help you.

    I could not figure out how to write to multiple files from a table source, so I created one file and am now trying to split that one big file into multiple files based on N rows.

    e.g. BigFile.csv - 1000 rows

    file1 - 100 rows

    file2 - 100 rows

    ....

    ....

    file10 - 100 rows

    Thanks for your help.

  • OK, then it sounds like you have left the realm of SQL Server. Maybe this:

    http://www.freewaregenius.com/2009/07/30/how-to-split-a-very-large-text-or-csv-file-by-a-specific-number-of-lines-rows/

    Or back the situation up to a single recordset and explain to us what you have tried when you failed to do this in SQL Server.

  • Do you know upfront how many little files you need to create?

    Need an answer? No, you need a question
    My blog at https://sqlkover.com.
    MCSE Business Intelligence - Microsoft Data Platform MVP

  • Koen Verbeeck (3/8/2012)


    Do you know upfront how many little files you need to create?

    No I don't.

  • If you can put a row number on it, what about using a conditional split and something like this?

    RowNumber >= @[User::RecordLow] && RowNumber <= @[User::RecordHigh]

    and update the variables after each write to a file, setting them back to 0 at the end of the run?

    I haven't tried this myself, but it looks like it might work.
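
    For what it's worth, the row number itself can come straight from the source query; a minimal sketch with a placeholder table and key column, so the conditional split has a RowNumber column to compare against @[User::RecordLow] and @[User::RecordHigh]:

    select *,
           row_number() over (order by SomeKeyColumn) as RowNumber
    from dbo.SourceTable;   -- table and key column are placeholders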

  • Recently did this.

    Create a proc performing the file grouping like above with row_number().

    Use an Execute SQL Task to determine the max file iteration you need from the query. This can also be done with a MAX() aggregate in a data flow.

    Use this as the iterator's upper bound in a For Loop. Filter on the iterator value and create your files via an expression-fed data flow.
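
    A rough T-SQL sketch of that approach; the proc name, table, and key column are assumptions for illustration:

    create procedure dbo.GetExportGroups
    as
    begin
        set nocount on;

        -- tag every row with the file it belongs to (99 rows per file)
        select *,
               (row_number() over (order by SomeKeyColumn) - 1) / 99 as FileNo
        from dbo.SourceTable;
    end
    go

    -- Execute SQL Task query: the highest FileNo, i.e. the For Loop's upper bound
    select (count(*) - 1) / 99 as MaxFileNo
    from dbo.SourceTable;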


    - Craig Farrell

    Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.

    For better assistance in answering your questions | Forum Netiquette
    For index/tuning help, follow these directions. | Tally Tables

    Twitter: @AnyWayDBA

  • Have to ask, but why the restriction on the number of records in each file? Personally, I can't think of a good business case for this.

  • Lynn Pettis (3/8/2012)


    Have to ask, but why the restriction on the number of records in each file? Personally, I can't think of a good business case for this.

    In my case, it was an arbitrary restriction due to feeding a mainframe system. Some hills aren't worth dying on, so I implemented it instead of fighting the good fight.


    - Craig Farrell

    Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.

    For better assistance in answering your questions | Forum Netiquette
    For index/tuning help, follow these directions. | Tally Tables

    Twitter: @AnyWayDBA
