Run multiple foreach loop containers concurrently

  • I created a package that exports data into flat files, compresses them, and then sends them via FTP. I would like to save time by starting the FTP process for a file as soon as its compression completes, while the next file is still being compressed. Is this possible?

    In SQL there are no absolutes, it always depends...

  • I think this can be done, but there should not be any dependency between the FTP job and the file-creation job; that is, they should run asynchronously.

    I assume all the files created have dynamic names.

    Create two separate SSIS packages. The first one should have some logging step that writes each created filename into a table, say filesToFtp, with a column such as fileName.
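
    A minimal sketch of what that logging table and the per-file insert might look like (the table and column names are just the ones suggested above; the createdDate column is my own addition for troubleshooting and isn't required):

        CREATE TABLE dbo.filesToFtp
        (
            fileName    varchar(260) NOT NULL,                  -- full path of the compressed file
            createdDate datetime     NOT NULL DEFAULT GETDATE() -- optional, handy for troubleshooting
        );

        -- Package 1: after compressing each file, an Execute SQL Task logs it.
        -- The ? parameter is mapped to the SSIS variable holding the file path.
        INSERT INTO dbo.filesToFtp (fileName) VALUES (?);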

    The 2nd package would have a Foreach Loop container that loops through the table, picks up the filename/location, and does the required FTP.
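
    One way to wire that up is to have an Execute SQL Task load the pending file names into an object variable and let the Foreach Loop enumerate it with the ADO enumerator (the variable names below are just placeholders):

        -- Execute SQL Task: ResultSet = Full result set, stored in User::oFileList
        SELECT fileName
        FROM   dbo.filesToFtp;

        -- Foreach Loop: Foreach ADO Enumerator over User::oFileList,
        -- column 0 mapped to User::sFileName.
        -- The FTP Task inside the loop then takes its local file path
        -- from User::sFileName.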

    If you require any heads-up on this, you can refer to:

    Once a file is processed, delete its record from filesToFtp. When there are no records left in filesToFtp, the package finishes successfully.
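
    A rough sketch of that cleanup, assuming the FTP of the current file succeeded (same placeholder variable as above):

        -- Inside the loop, after a successful FTP of the current file:
        -- the ? parameter is mapped to User::sFileName.
        DELETE FROM dbo.filesToFtp
        WHERE  fileName = ?;

        -- At the start of the package: if nothing is queued, end straight away.
        -- Put this in an Execute SQL Task and use a precedence constraint
        -- so the loop only runs when pendingCount > 0.
        SELECT COUNT(*) AS pendingCount
        FROM   dbo.filesToFtp;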

    Now schedule the 2nd package to run cyclically (with a period of, say, 1 minute, or whatever is suitable) and kick it off in parallel with package 1.
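
    If you run it from SQL Agent, a one-minute schedule on the job that calls package 2 would look something like this (the job name here is hypothetical):

        EXEC msdb.dbo.sp_add_jobschedule
             @job_name             = N'FTP pending files',  -- hypothetical job name
             @name                 = N'Every minute',
             @freq_type            = 4,   -- daily
             @freq_interval        = 1,
             @freq_subday_type     = 4,   -- units of minutes
             @freq_subday_interval = 1;   -- every 1 minute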

    This would need some more tweaking (so that package 2 doesn't keep running indefinitely), but do let me know if this sounds sensible.

    [font="Comic Sans MS"]--
    Sabya[/font]

  • Thanks for that. Something similar occurred to me this morning; I was hoping there would be something a little simpler. But I have quite a few similar FTP jobs, so I can build a more robust solution in which several packages insert into a filesToFtp table and a single FTP package monitors the table and sends the files as they appear.

    In SQL there are no absolutes, it always depends...

