Validation error while executing package due to ##Temp table

  • Hi,

    My task is:

    Step 1: create a ##temp table.

    Step 2: insert data into the ##temp table from a flat file.

    Step 3: retrieve values from the ##temp table and insert them into another table.

    When I execute the package, it throws this error:

    Error at Get Data from Temp Table to PayerDB [DTS.Pipeline]: "component "OLE DB Source" (576)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".

    Because the ##temp table does not exist in the database, the OLE DB Source has no metadata to validate against. If I create the ##temp table manually before executing the package, the package works fine.

    So how do I handle this?
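    (For illustration, a minimal sketch of what the Execute SQL Task in step 1 might run; the column names here are assumptions, since the actual flat file layout isn't shown in the thread:)

    ```sql
    -- Step 1: create the global temp table before the data flows run.
    -- Hypothetical columns: substitute the real flat file layout.
    IF OBJECT_ID('tempdb..##Temp') IS NOT NULL
        DROP TABLE ##Temp;

    CREATE TABLE ##Temp
    (
        PayerID   INT          NULL,
        PayerName VARCHAR(100) NULL
    );
    ```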

  • Hi,

    Why are you inserting into a temp table and then into the main table? Are you able to perform the transforms in the data flow and insert directly into the main table?

    Kindest Regards,

    Frank Bazan

  • Hi,

    Yes, I am doing that in Data Flow Tasks.

    Creating the temp table is done in an Execute SQL Task.

    Inserting into the temp table from the flat file is in a Data Flow Task.

    Inserting into the main table from the temp table is in another Data Flow Task.

    This is the flow.

  • Try setting the "RetainSameConnection" property to True on the connection manager you are using.

    If you elaborate a little, we may be able to help you find a way to avoid the temp table.
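    The reason RetainSameConnection matters: a global temp table lives only as long as the session that created it (and any sessions still using it). If each task opens its own connection, ##Temp can be dropped between tasks. A minimal T-SQL illustration, using placeholder columns:

    ```sql
    -- With RetainSameConnection = True the package reuses one session,
    -- so a table created in an earlier Execute SQL Task ...
    CREATE TABLE ##Temp (PayerID INT NULL, PayerName VARCHAR(100) NULL);

    -- ... is still visible to later tasks on that same connection:
    INSERT INTO ##Temp (PayerID, PayerName) VALUES (1, 'Example');
    SELECT PayerID, PayerName FROM ##Temp;

    -- When the creating session closes, SQL Server drops ##Temp.
    ```

    Note that RetainSameConnection only keeps the table alive at run time; to get past the design-time VS_NEEDSNEWMETADATA error you will typically also need to set DelayValidation=True on the tasks that reference ##Temp (or ValidateExternalMetadata=False on the OLE DB Source).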

    HTH

    Kindest Regards,

    Frank Bazan

  • Hi,

    I have a flat file.

    Temp table name: ##Temp

    Main table name: Main_Table

    This is the flow:

    Execute SQL Task 1 ---> Data Flow Task 1 ---> Execute SQL Task 2 ---> Data Flow Task 2

    Execute SQL Task 1 creates the ##Temp table.

    Data Flow Task 1 reads data from the flat file and loads it into ##Temp.

    Execute SQL Task 2 runs a stored procedure that pre-validates the inserted rows, so that ##Temp holds only valid data.

    Data Flow Task 2 then inserts the data into Main_Table from ##Temp.

    What are the drawbacks of using temp tables?

    How can I avoid the temp table?
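    (The stored procedure itself isn't shown in the thread; as a minimal sketch, assuming the same placeholder columns as above and a simple rejection rule, the pre-validation in Execute SQL Task 2 might look like:)

    ```sql
    -- Hypothetical pre-validation procedure for Execute SQL Task 2.
    -- Deletes invalid rows so only clean data reaches Main_Table.
    CREATE PROCEDURE dbo.usp_ValidateTempData
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Example rule: reject rows with a missing key or a blank name.
        DELETE FROM ##Temp
        WHERE PayerID IS NULL
           OR LTRIM(RTRIM(PayerName)) = '';
    END;
    ```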

  • 1. By using a temp table you are making unnecessary trips to the database.

    2. If you lose your connection for some reason, your base data is lost.

    3. SSIS is perfect for most types of validation. You can use a Derived Column (or even a Script transformation if the validation logic is more complex) with an expression that checks that the data in the pipeline is valid, and in doing so you won't sacrifice performance. Then you simply use this new column in a Conditional Split: records that pass the validation go straight into your main table, and records that don't, you send to an error log or handle in whatever way suits (see the expression sketch below).

    I guess it depends how much time the current execution takes and how much time and freedom you have to work on your solution.
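    As an example of what such an expression might look like (SSIS expression syntax; the column names are placeholders, not taken from this thread), a Derived Column called IsValid could use:

    ```
    ISNULL(PayerID) || LEN(LTRIM(RTRIM(PayerName))) == 0 ? FALSE : TRUE
    ```

    The Conditional Split then routes rows where IsValid is TRUE to the Main_Table destination and sends everything else to an error output.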

    Kindest Regards,

    Frank Bazan

  • Frank is right on with his advice. SSIS is an ETL tool. You can do all of your validations and transformations inside the data flow before loading the data. You don't need to load the data into a staging table, run your transformations/validations, and re-load the data into your destination. That would be an Extract-Load-Transform-Reload (ELTL) tool! 🙂

    John Rowan

    ======================================================
    Forum Etiquette: How to post data/code on a forum to get the best help - by Jeff Moden
