How to loop over source text files and send data to respective database tables in SQL Server 2008

  • Hi All,

    My problem is looping over the source text files and sending the data to the respective SQL Server tables.

    I have tried it like this:

    I have kept a For Each Loop container and created a variable ("textfilename") to store the text file names,

    and another variable to store the table names of that particular schema.

    Inside that For Each Loop, I kept an Execute SQL Task to truncate the table data before I load it.

    Under this, I kept a Data Flow Task. In the Data Flow Task, I used one Flat File Source and one SQL Server Destination.

    In the Flat File Source, under Expressions > ConnectionString, I have given my text files' path along with the variable name.

    I am thinking that I need to use a Script Task to correlate the text file name and the table name.

    (For example: if my text file name is "products.txt", my table name would be "stage.product"; if the text file is "Accounts.txt", my table name would be "stage.accounts". "stage" is the schema name.)
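    Note that the naming above is not a simple mechanical rule ("products.txt" drops the trailing "s" but "Accounts.txt" keeps it), so a lookup table is the safest way to derive the table name. A minimal sketch in plain Python of the logic such a Script Task would implement (the dictionary entries are just the two examples above; extend it with your real pairs):

```python
import os

# Hypothetical lookup from source file name to destination table.
# A lookup is safer than string munging here because the naming is
# irregular: "products.txt" -> "stage.product" (drops the "s") while
# "Accounts.txt" -> "stage.accounts" (keeps it).
FILE_TO_TABLE = {
    "products.txt": "stage.product",
    "accounts.txt": "stage.accounts",
}

def table_for_file(path):
    """Return the destination table for a source text file."""
    name = os.path.basename(path).lower()
    if name not in FILE_TO_TABLE:
        raise ValueError("no table mapped for file: " + name)
    return FILE_TO_TABLE[name]
```

    In the actual package the same idea is: read the file name variable, look up the table name, and write it into the table-name variable used by the truncate task.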

    But I am unable to write the code in the Script Task. I feel I am going wrong somewhere.

    Can anybody help, please? I need to finish this task as early as possible.

    Thanks a lot,

    Veena

  • How many files are we talking about? My main issue is that you have to set the source structure, field mapping, and destination mapping at design time. What this means is that if you have 5 different structures (source to destination) then you need 5 different data-flows. What you can do is use expressions so that the right data-flow runs and the wrong ones do not.

    Within a For-Each-Loop I often add an Execute SQL Task that does something like SELECT 1, just so I have a place to start. Then I add On Success arrows to the data-flows and use expressions on the arrows to add an additional requirement; this means a data-flow runs On Success AND only when the expression succeeds.
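    That routing, one anchor task and then an expression on each On Success arrow, can be modelled outside SSIS as a predicate per data-flow. The data-flow names and patterns below are illustrative, not from a real package; inside SSIS each predicate would be an expression on the precedence constraint testing the loop's file name variable:

```python
# Sketch of the pattern: one anchor task (SELECT 1), then each data-flow
# hangs off an On Success arrow whose expression tests the loop variable.
# Data-flow names and patterns below are illustrative assumptions.
DATAFLOW_PREDICATES = {
    "DFT Load Products": lambda f: "products" in f.lower(),
    "DFT Load Accounts": lambda f: "accounts" in f.lower(),
}

def dataflows_to_run(filename):
    """Return the data-flows whose constraint expression would evaluate true."""
    return [name for name, pred in DATAFLOW_PREDICATES.items() if pred(filename)]
```

    For a given iteration of the loop exactly one predicate should be true, so exactly one data-flow runs; the others are skipped by their constraint.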

    Clear?

    CEWII

  • Hi CEWII,

    Thanks for the reply.

    Actually, my task is to use a single Flat File Source (for several text files; suppose there are 10 text files) and a single SQL Server Destination (there are 10 tables in SQL Server). This should be done within a single Data Flow Task. I think I need to use two different variables to store the text file name and the table name.

    To loop over these variables I am using a For Each Loop container, and I am trying to use a Script Task to correlate them, so that text file 1 loads into table 1, text file 2 loads into table 2, and so on.

    Please guide me.

    Thanks

    Veena

  • Are all the files/tables the exact same structure? If not, you have a problem, and you can't do this with a single data-flow.

    The input schema and mapping of source columns to destination columns MUST be set at design time. Not the filename that it is coming from or the table it is going to. What this means is that you can't just take FileA and say put it in TableM.

    Given this requirement how many different structures are you working with? If every file has a different structure this is going to be painful.

    CEWII

  • Hi CEWII,

    Thanks for the reply.

    The text file format is the same (all are .txt files). Is it possible, using a Script Task / Script Component, to map each text file source to each destination table?

    Mainly, I need to take each .txt file dynamically and then finally load it into the respective table in one particular destination. Any other ideas to accomplish this?

    Thanks

    Veena

  • vithasun (2/19/2010)


    Hi CEWII,

    Thanks for the reply.

    The text file format is the same (all are .txt files). Is it possible, using a Script Task / Script Component, to map each text file source to each destination table?

    Mainly, I need to take each .txt file dynamically and then finally load it into the respective table in one particular destination. Any other ideas to accomplish this?

    If I read your post correctly, not only are they all .txt files but the internal structure is always the same. I assume that the destination tables have different structures (a different name isn't a problem). Keep in mind that field order would not be considered a difference in the structure. I know of no way to remap at runtime. The different file name and table name isn't an issue; that's totally easy to do. How many different mappings are there?

    CEWII

  • Hi CEWII,

    Thanks for the reply.

    Actually, I have all text files only, but each text file has a different number of columns.

    i) So, I have kept a For Each Loop container, configured it, and created a variable to store the text file name.

    ii) Inside the For Each Loop container, I have kept a Data Flow Task.

    iii) In the Data Flow Task, I have kept one Flat File Source and one OLE DB Destination.

    iv) In the flat file connection string expression, I gave the text file name variable which I created previously.

    v) And I executed it. But as each text file has a different number of columns, the mapping of columns from the text file to the destination table is now a problem, since the mapping happens at design time.

    vi) I tried to use a Script Task to map the text file columns to the destination table(s).

    But I am not aware of how to write the code in the Script Component: do I need to use the Script Component as a source/destination/transformation, what do I give in its Connection Manager property, do I need to choose anything in the input and output columns, or do I need to write code?

    I really need this one; could you please help me do this?

    Thanks a lot,

    Veena

  • Ok, I can't be any more clear on this.

    Field mapping CANNOT be done at run-time. It doesn't matter what script you write.

    CEWII

  • I'm sorry to be a little short with you but you don't appear to be listening to what I'm saying.

    The mapping you want to do CANNOT be done at run-time, so stop trying to force that path. If you want to continue to push that then I have helped you all that I can.

    If you want to make this work then you need to do some analysis. I have asked this question before; this will be the last time.

    How many variations of source to destination mappings are there? This defines the number of data-flow components needed.

    You still just need one for-each-loop.

    Once you answer this question we will move on to the next step.

    CEWII

  • Hi CEWII,

    I don't think I understand what you mean by "how many variations of source to destination mappings are there?" (Sorry, I am really new to SSIS; I just started learning it.)

    Thanks

    Veena

  • We'll use this example: 10 files, 6 tables. Of those 10 files, there are 3 that have the same structure and go into a table with the same structure. What I'm trying to find out is how many mappings we have to build:

    F1(10 fields) -> Table1(10 fields)

    F2(same struct as F1) -> Table1(10 fields)

    F3(11 fields) -> Table2(11 fields)

    F4(9 fields) -> Table3(9 fields)

    F5(8 fields) -> Table4(8 fields)

    F6(same struct as F4) -> Table3(9 fields)

    F7(same struct as F4) -> Table3(9 fields)

    F8(same struct as F1) -> Table1(10 fields)

    F9(10 fields) -> Table5(10 fields)

    F10(15 fields) -> Table6(15 fields)

    What we show here is 6 different mappings that we have to build. There are some variations possible even in this, but those variations are not tied to SSIS; they are tied directly to the layout of the source data and the destination table. IF you have 10 input files and 10 tables and there are no variations, only a 1-1 match, then the problem is you have to build 10 data-flows. We CAN get it to call the correct data-flow at run-time for the input file, but all 10 must exist.
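    The analysis step, counting distinct source-to-destination structures, can be sketched like this. The list mirrors the example above, with each file described by its field count and destination table (field count stands in here for the full column structure); files sharing both belong to one mapping:

```python
# Mirror of the example above: (file, field_count, destination_table).
# Field count is a stand-in for the full column structure of each file.
FILES = [
    ("F1", 10, "Table1"), ("F2", 10, "Table1"), ("F3", 11, "Table2"),
    ("F4", 9, "Table3"), ("F5", 8, "Table4"), ("F6", 9, "Table3"),
    ("F7", 9, "Table3"), ("F8", 10, "Table1"), ("F9", 10, "Table5"),
    ("F10", 15, "Table6"),
]

def count_mappings(files):
    """Each distinct (structure, destination) pair needs its own data-flow."""
    return len({(fields, table) for _, fields, table in files})
```

    For the example this yields 6, i.e. 6 data-flows to build, even though the loop itself only needs to exist once.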

    Is this clearer?

    CEWII
