DTS Rollback

  • Can anybody help me understand why the following is happening?

    I have created a customized DTS package that reads data from a text file, loads it into the database, and creates a log file for SUCCESS and FAIL plus a GENERAL log covering both. What I cannot work out is this: when a rollback happens, the package still generates a SUCCESS log, but when I check the database, no data has been transferred.

    Please, could somebody help me with this?

  • How are you creating your success/fail/general logs? 

    Are they being written by workflow tasks inside your package? 

    Are they being written to from the Step/Advanced Tab of a SQL Agent Job? 

    Are they being written to from the package itself (Package Designer/Package/Properties)?

    If you're not logging in those places, I'd suggest you start, so that you can try to debug this.  You may also want to use the exception file from the Transform Data Task properties; it's on the Options tab.

    With logging like this you may be able to figure out what's causing the rollback.

    Does the package execute properly when you run it from Enterprise Manager?


  • First of all, thank you for your reply.

    We are writing the log as per your third point (from the package itself).

    I have created three packages.

    Our actual process for the logging is as follows:

    The first package truncates the existing temp tables.

    In the second package, we push the data from the text file into the temporary tables, perform all the validations, and transfer the correct data into the physical tables, all within that one package.

    The third package generates the log from the data (successful or failed) that was transferred into one temporary table.

    Can you guide us on how to go ahead? How can we trap the problems that cause the rollback?

  • The way I've done success/fail logging before is, instead of creating different packages, to do it in one package with different tasks.  Set up your workflow properties to keep the package from executing all steps at the same time.

    The first task would be to truncate your temp tables in an Execute SQL task, although if they are actual temp tables you'd perhaps need to create them first.
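
    For example, the SQL in that first Execute SQL task might look something like this sketch (the staging table names here are only placeholders for whatever your temp tables are actually called):

        -- Clear down the staging tables before each load.
        -- Table names are assumptions -- substitute your own.
        TRUNCATE TABLE dbo.stg_TextFileImport;
        TRUNCATE TABLE dbo.stg_ImportLog;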

    The second task would be your data transformation, and the third task would have two parts.

    On success of step 2, write your success message to your log table, log file, or wherever, in an Execute SQL task, using an On Success workflow constraint.  Also use an On Failure workflow constraint pointing to another Execute SQL task that handles your failure logic.
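
    The success and failure tasks themselves can be as simple as a one-row insert into a log table. Something like this sketch, where the PackageLog table, its columns, and the package/step names are just assumptions to illustrate the idea:

        -- Task wired to the On Success constraint: record a SUCCESS row.
        INSERT INTO dbo.PackageLog (PackageName, StepName, Status, LogTime, Detail)
        VALUES ('TextFileLoad', 'Transform data', 'SUCCESS', GETDATE(), 'Rows loaded into physical tables');

        -- Task wired to the On Failure constraint: record a FAIL row.
        INSERT INTO dbo.PackageLog (PackageName, StepName, Status, LogTime, Detail)
        VALUES ('TextFileLoad', 'Transform data', 'FAIL', GETDATE(), 'Transform task failed - see exception file');

    That way a rollback of the data load can't leave behind a bogus SUCCESS entry, because the SUCCESS insert only fires if the transform step actually completed.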

    What I've always done is build in some sort of notification via email to the DBA or whoever is in charge of that data, to make sure they know as soon as there is a problem with the task.  Then they can log into EM, check the logs, and see what failed.
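
    On SQL Server 2000 that notification can be another Execute SQL task calling xp_sendmail, assuming SQL Mail is set up on the server (the recipient, subject, and message below are only placeholders):

        -- Fire this from the failure branch to alert the DBA.
        -- Requires SQL Mail to be configured for xp_sendmail to work.
        EXEC master.dbo.xp_sendmail
             @recipients = 'dba@yourcompany.com',
             @subject    = 'DTS package TextFileLoad failed',
             @message    = 'The text file load failed and was rolled back. Check the log table and the exception file.';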

    This isn't an elegant solution, but it's one that has proven to work for me.  If you want your package to keep rolling on instead of failing and rolling back when an error occurs, think about using the exception file (Data Transform Task, Properties, Options) and setting the maximum number of errors before the package fails... http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dtsdsgnr/dtsdsgnr3_8dkh.asp

     
