DTS problem (ONE source table to Multiple tables)

  • hello,

I want to build a DTS job that transfers data from one table into several tables. Which DTS tasks can I use for this? As far as I can see, the Transform Data task only supports ONE target table.

    here's the case:

    Source table:[User]

    Target table:[Person],[Address],[Userlogin]

I need to fetch the data from [User], insert some of the fields into [Person], get the primary key that was generated (auto increment), and then insert the rest of the data into [Address] and [Userlogin].

    thanks in advance!

    ps: i'm using sql server 2000

You could perhaps use an updatable view?
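
If you go the view route: in SQL Server 2000 a view over the three target tables can be made insertable with an INSTEAD OF INSERT trigger. A rough sketch, guessing at the column names and assuming [Person] keeps the original UserID so the generated PersonID can be looked up again:

CREATE VIEW vw_UserImport
AS
SELECT p.UserID, p.Name, a.AddressLine1, a.City, a.State, a.Zipcode, l.LoginInfo
FROM Person p
     INNER JOIN Address a ON a.PersonID = p.PersonID
     INNER JOIN Userlogin l ON l.PersonID = p.PersonID
GO

CREATE TRIGGER trg_vw_UserImport_ins ON vw_UserImport
INSTEAD OF INSERT
AS
BEGIN
    -- fan one incoming row set out across the three base tables
    INSERT INTO Person (UserID, Name)
    SELECT UserID, Name FROM inserted

    INSERT INTO Address (PersonID, AddressLine1, City, State, Zipcode)
    SELECT p.PersonID, i.AddressLine1, i.City, i.State, i.Zipcode
    FROM inserted i
         INNER JOIN Person p ON p.UserID = i.UserID

    INSERT INTO Userlogin (PersonID, LoginInfo)
    SELECT p.PersonID, i.LoginInfo
    FROM inserted i
         INNER JOIN Person p ON p.UserID = i.UserID
END
GO

The Transform Data task should then be able to point at the single view, with the trigger splitting the rows for you.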

Or 3 different tasks, one for each table. I'd imagine this could also be done with a Data Driven Query.


  • It's hard to say if an updateable view will work without knowing the DDL for each table, but that may be a good choice.

You could put the three inserts in a single Execute SQL task, or put them in a stored procedure and execute that in an Execute SQL task.
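
If you go the stored procedure route and process one user at a time, a rough sketch might look like this (the @UserID parameter and the column names are just placeholders; SCOPE_IDENTITY() is available in SQL Server 2000 for picking up the generated key):

CREATE PROCEDURE usp_SplitUser
    @UserID int
AS
BEGIN
    DECLARE @PersonID int

    BEGIN TRANSACTION

    -- copy the Person fields and capture the new identity value
    INSERT INTO Person (UserID, Name)
    SELECT UserID, Name
    FROM [User]
    WHERE UserID = @UserID

    SET @PersonID = SCOPE_IDENTITY()

    -- reuse that key for the remaining tables
    INSERT INTO Address (PersonID, AddressLine1, City, State, Zipcode)
    SELECT @PersonID, AddressLine1, City, State, Zipcode
    FROM [User]
    WHERE UserID = @UserID

    INSERT INTO Userlogin (PersonID, LoginInfo)
    SELECT @PersonID, LoginInfo
    FROM [User]
    WHERE UserID = @UserID

    COMMIT TRANSACTION
END

A set-based variant (one INSERT per table, joining back on UserID) would avoid calling the procedure once per row.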

Greg

  • hi, greg

    thanks for your input, I'm using the Execute SQL task now.

as for the Data Driven Query task, it's too complicated for me...

Now that I've re-read your question, I'd suggest not using a Data Driven Query task anyhow.

You should be able to accomplish your objective with joins...

First, copy your data from the source table to the Person table.
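
For example (guessing at the column names, as with the rest of this):

INSERT INTO Persons (UserID, FirstName, LastName)
SELECT U.UserID, U.FirstName, U.LastName
FROM Users U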

Once you have that done, you can join back to your original table to create the other rows. This can be done with a simple insert:

INSERT INTO Address
SELECT P.PersonID, U.AddressLine1, U.City, U.State, U.Zipcode
FROM Users U
      INNER JOIN Persons P
       ON U.UserID = P.UserID

    Then do the same for UserLogin

INSERT INTO UserLogin
SELECT P.PersonID, U.LoginInfo
FROM Users U
      INNER JOIN Persons P
       ON U.UserID = P.UserID

Obviously, without seeing your DDL and such I'm guessing on a lot of this, but you get the idea...


  • thanks, luke

    I got another problem when using the Execute SQL task:

I simply call a stored procedure in an Execute SQL task. The stored procedure failed due to an insert error (even though I do a rollback and return 0), and the task failed, as well as the whole package. I just want to "swallow" the error inside the stored procedure, so that the Execute SQL task always runs successfully.

    how can I achieve this?

     

You could always catch the error within your stored procedure so that the process is not aborted half way through. You could log the errors to a log table so you can examine the inserts later on.

    something like:

set @error = @@ERROR
if @error != 0
begin
    insert into your_LOG (error_code, description, process_name)
    values (@error, ....)
end
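
Slotted in next to the rollback and return 0 you already have, the overall shape might be roughly this (table, column and procedure names are just placeholders):

create procedure usp_LoadUserData
as
begin
    declare @error int

    begin transaction

    insert into Person (UserID, Name)
    select UserID, Name from [User]

    set @error = @@ERROR
    if @error != 0
    begin
        rollback transaction
        -- the log insert runs after the rollback, so the entry is kept
        insert into your_LOG (error_code, description, process_name)
        values (@error, 'insert into Person failed', 'usp_LoadUserData')
        return 0
    end

    -- ...repeat the same check after the Address and Userlogin inserts...

    commit transaction
    return 0
end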

    Kathy

     
