"Parent" SQL statement that causes the TRIGGER?

  • Hi,

    For my audit table, I'd also like to record the "parent" SQL statement that caused the INSERT, UPDATE, or DELETE trigger to fire. Would this be technically feasible on SQL 2000?

    Thanks in advance.


  • Personally, I don't think it is logically feasible to audit parent-level activity from the child level.

  • Yes, it can be done with DBCC INPUTBUFFER() from within the trigger. However, I don't have a code sample.

  • Nija: you've confused me.

    Could you explain how DBCC INPUTBUFFER relates to modifications of a parent table and the triggers they fire?

    Many thanks

  • DBCC INPUTBUFFER (with the proper syntax, which is to pass the SPID that identifies the process that is running) should return the last statement that ran on that SPID.

    Russel Loski, MCSE Business Intelligence, Data Platform

  • You first capture the DML statement using DBCC INPUTBUFFER. Then you can insert that statement into its own history table, save the id of the record, and then use that id to audit all changes as you normally would!

  • Here is an example (http://vyaskn.tripod.com/tracking_sql_statements_by_triggers.htm)

    DECLARE @ExecStr varchar(50), @Qry nvarchar(255)

    -- Temp table matching the result shape of DBCC INPUTBUFFER
    CREATE TABLE #inputbuffer
    (
     EventType nvarchar(30),
     Parameters int,
     EventInfo nvarchar(255)
    )

    -- Capture the last statement run on the current connection
    SET @ExecStr = 'DBCC INPUTBUFFER(' + STR(@@SPID) + ')'

    INSERT INTO #inputbuffer
    EXEC (@ExecStr)

    SET @Qry = (SELECT EventInfo FROM #inputbuffer)

    SELECT @Qry AS 'Query that fired the trigger',
           SYSTEM_USER AS LoginName,
           USER AS UserName,
           CURRENT_TIMESTAMP AS CurrentTime
    

    SQL guy and Houston Magician

  • What's the cost of using this code on all tables, all the time, in a very low-transaction environment?

    An average of about 2-3 transactions/second over an 8-hour period. I'm looking to implement an audit system on all tables, and I would like to add this information to the audit log (actually in a second table, using a TransactionID as PK/FK).
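    As a rough sketch of what that could look like on SQL 2000, following the earlier suggestion of saving the statement to its own history table and keeping the id for the per-row audit records (all table, column, and trigger names here are illustrative, not from this thread):

    ```sql
    -- Hypothetical statement-history table
    CREATE TABLE dbo.AuditStatement
    (
     StatementID int IDENTITY(1,1) PRIMARY KEY,
     EventInfo   nvarchar(255),
     LoginName   sysname  DEFAULT SYSTEM_USER,
     AuditTime   datetime DEFAULT CURRENT_TIMESTAMP
    )
    GO

    -- Hypothetical audited table: dbo.Customer
    CREATE TRIGGER trg_Customer_Audit ON dbo.Customer
    FOR INSERT, UPDATE, DELETE
    AS
    BEGIN
     DECLARE @ExecStr varchar(50), @StatementID int

     CREATE TABLE #inputbuffer
     (
      EventType  nvarchar(30),
      Parameters int,
      EventInfo  nvarchar(255)
     )

     -- Capture the statement that fired this trigger
     SET @ExecStr = 'DBCC INPUTBUFFER(' + STR(@@SPID) + ')'
     INSERT INTO #inputbuffer EXEC (@ExecStr)

     -- Save the statement once and keep its id
     INSERT INTO dbo.AuditStatement (EventInfo)
     SELECT EventInfo FROM #inputbuffer

     SET @StatementID = SCOPE_IDENTITY()

     -- ... insert per-row audit records referencing @StatementID here ...
    END
    ```

    Writing straight to a permanent table avoids re-creating the temp table shape per call, and the identity value gives you the PK/FK link mentioned above.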

  • That's a good question. I've never really benchmarked it, but I'm sure it would be an interesting project. 2-3 transactions/sec is a pretty light load; I doubt you would notice anything, but that depends on so much.

    From my experience, I've never seen an application that collects data like that add less than 15-20% overhead. In your case, you're writing to a physical table, so you can remove the overhead of creating and maintaining the temporary table. You'll probably want to optimize your audit table for inserts.

    What specifically would you like to audit? If you are just auditing who runs what code when, then perhaps running Profiler and writing to a table (perhaps even in a different database) might impose less overhead (I don't know). It's worth playing with.

    SQL guy and Houston Magician

  • I want a full audit: old row version / who / when / where (application and station) / what (SQL statement).

    I would most likely split the duplicate information into its own table and use PK/FK so that I don't bloat the system too much, but that is still in the analysis phase.

    That, combined with my error log, will allow me to troubleshoot almost any error on the spot, without guessing for 3 hours what the heck the user did to get that error.
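    One possible shape for those who/when/where/what columns, combining the DBCC INPUTBUFFER technique above with the built-in functions SQL 2000 provides for each piece (names here are illustrative, not a definitive design):

    ```sql
    CREATE TABLE dbo.AuditLog
    (
     AuditID      int IDENTITY(1,1) PRIMARY KEY,
     TableName    sysname,
     LoginName    sysname,        -- who
     AuditTime    datetime,       -- when
     AppName      nvarchar(128),  -- where: application
     HostName     nvarchar(128),  -- where: station
     SqlStatement nvarchar(255)   -- what: from DBCC INPUTBUFFER
    )
    GO

    -- Hypothetical audited table: dbo.Customer
    CREATE TRIGGER trg_Customer_FullAudit ON dbo.Customer
    FOR INSERT, UPDATE, DELETE
    AS
    BEGIN
     DECLARE @ExecStr varchar(50), @Qry nvarchar(255)

     CREATE TABLE #inputbuffer
     (EventType nvarchar(30), Parameters int, EventInfo nvarchar(255))

     SET @ExecStr = 'DBCC INPUTBUFFER(' + STR(@@SPID) + ')'
     INSERT INTO #inputbuffer EXEC (@ExecStr)
     SET @Qry = (SELECT EventInfo FROM #inputbuffer)

     INSERT INTO dbo.AuditLog
      (TableName, LoginName, AuditTime, AppName, HostName, SqlStatement)
     VALUES
      ('Customer', SYSTEM_USER, CURRENT_TIMESTAMP, APP_NAME(), HOST_NAME(), @Qry)

     -- Old row versions come from the "deleted" pseudo-table, e.g.:
     -- INSERT INTO dbo.Customer_History SELECT * FROM deleted
    END
    ```

    APP_NAME() and HOST_NAME() cover the application and station, and the `deleted` pseudo-table supplies the old row version for UPDATE and DELETE.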
