SP Performance Opt

  • Hi All,

    I have recently started a new job and have been given the task of optimizing a huge and poorly performing stored procedure that basically contains all of the logic for an import process. We load a staging table, run check after check, and then update a series of tables based on different conditions.

    My first question is whether or not I will see performance benefits by simply breaking this down into a series of smaller procs. I am assuming this would make things easier for the optimizer and would definitely make it easier to read.

    My next question is about indexing. The staging table has an ID column with a clustered index on it that is never used in any query; that is the only index on the table. When I add some nonclustered indexes to speed things up, the optimizer does use them. The problem is that it then performs a sort operation using the clustered index even though there is no ORDER BY clause. In some cases I am also getting index spools. These are actually making things slower.
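    To make the setup concrete, here is a minimal sketch of the kind of staging table and index involved. All table and column names are invented for illustration; only the shape (clustered index on an unused ID, plus an added nonclustered index) reflects the actual situation:

    ```sql
    -- Hypothetical staging table: the only original index is the
    -- clustered one on ID, which no query ever touches.
    CREATE TABLE dbo.ImportStaging (
        ID         int IDENTITY(1,1) NOT NULL,
        AccountNo  varchar(20)       NOT NULL,
        Amount     decimal(18, 2)    NOT NULL,
        StatusCode char(1)           NOT NULL,
        CONSTRAINT PK_ImportStaging PRIMARY KEY CLUSTERED (ID)
    );

    -- One of the nonclustered indexes added to speed up the checks.
    -- The optimizer uses it, but the plan then shows a Sort
    -- (and sometimes an Index Spool) on top of it.
    CREATE NONCLUSTERED INDEX IX_ImportStaging_AccountNo
        ON dbo.ImportStaging (AccountNo);
    ```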

    I'm not sure if this is enough for anybody to give me some basic suggestions on how I should go about attacking these problems but any help would be greatly appreciated.

    Thanks in advance,

    Paul

  • Breaking it into smaller chunks probably won't help performance, though it will definitely help readability.

    You mentioned a sort operation without an ORDER BY; are there any joins on this table? I'm wondering if something else is triggering that sort. You might trace it and see which particular queries are the problem.
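    One way to find the expensive statements without a full trace is to pull statement-level stats from the plan cache. This is a generic sketch; `YourImportProc` is a placeholder for the actual procedure name:

    ```sql
    -- Top statements by elapsed time for a given proc, from the plan cache.
    SELECT TOP (10)
        qs.execution_count,
        qs.total_worker_time  / qs.execution_count AS avg_cpu_us,
        qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
        SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
            ((CASE qs.statement_end_offset
                  WHEN -1 THEN DATALENGTH(st.text)
                  ELSE qs.statement_end_offset
              END - qs.statement_start_offset) / 2) + 1) AS statement_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    WHERE st.text LIKE '%YourImportProc%'   -- placeholder proc name
    ORDER BY qs.total_elapsed_time DESC;
    ```

    Running the proc in SSMS with "Include Actual Execution Plan" turned on will also show you exactly which statement produces the Sort and Spool operators.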

    I would continue to think about the indexes; adding them should help, provided they are the right indexes.
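    On "the right indexes": a sort after an index seek often means the index doesn't deliver rows in the order a downstream join or aggregate needs, and a spool often means it doesn't cover the columns being read back. A covering index keyed to match the query can remove both. This is a hypothetical example with invented names, just to show the pattern:

    ```sql
    -- If a query filters on AccountNo, groups or merge-joins on StatusCode,
    -- and returns Amount, keying the index to match that order and
    -- INCLUDE-ing the output column can eliminate the Sort and the Spool.
    CREATE NONCLUSTERED INDEX IX_ImportStaging_AccountNo_StatusCode
        ON dbo.ImportStaging (AccountNo, StatusCode)
        INCLUDE (Amount);
    ```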

    All in all, I think we need to know much more about the process to give useful guidance.

    CEWII
