Need to update statistics several times a day to prevent a slow database

  • Hi,

    I have a strange issue on a (Microsoft Dynamics AX 4) table. Here are the facts:

    - table has 1,566,532 rows

    - table has a PK (clustered) and 2 non-unique, non-clustered indexes

    - in 99% of cases the statement is an insert, 1% an update

    - auto create/update statistics is on

    - every day at 4:30 fragmented indexes are reorganized/rebuilt

    - every day at 5:15 sp_updatestats runs on this database

    - I am logging the table stats and I can see the statistics are updated (rowmodctr drops to 0)

    Today (and also several times this month) customers are complaining. If I take a look at the table stats, I see 903 modifications. That's 0.06%.

    If I do a manual statistics update on that table, performance comes back.

    Since I don't want to spend my day monitoring rowmodctr columns and running stats updates (sketched below), can somebody explain why performance drops with such a small number of modifications?
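
    For reference, the check and the manual fix look roughly like this (dbo.MyAxTable is just a placeholder, not the real table name):

        -- how many modifications the statistics have seen since the last update
        SELECT name, rowcnt, rowmodctr, STATS_DATE(id, indid) AS last_stats_update
        FROM sys.sysindexes
        WHERE id = OBJECT_ID('dbo.MyAxTable')   -- placeholder table name
          AND indid > 0;

        -- the manual statistics update that brings performance back
        UPDATE STATISTICS dbo.MyAxTable WITH FULLSCAN;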

    Thanks!

    Wilfred
    The best things in life are the simple things

  • Forgot to say:

    - OS is Windows Server 2003 Enterprise 64-bit

    - SQL Server 2005 Enterprise 64-bit with SP3

    - 32 GB memory, 27 GB for SQL Server

    - dedicated (non-virtual) server

    Wilfred
    The best things in life are the simple things

  • It may have nothing to do with the number of modifications; you may be suffering from a "parameter sniffing" problem, where you end up with a cached plan that just doesn't work for some parameter values.

    Updating the statistics then forces the plan to be flushed from the cache and a better one created... until the parameters change again and the new plan is the wrong one.

    Can you narrow down the slow-performing statements using Profiler?
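
    If it helps, something along these lines (purely illustrative; change the LIKE filter to a fragment of your own statement) will also show whether the statement is reusing a single cached plan and when that plan was compiled:

        -- cached plans matching the suspect statement, with compile time and workload
        SELECT TOP 20
               qs.creation_time,                                 -- when the plan was compiled
               qs.execution_count,
               qs.total_elapsed_time / qs.execution_count AS avg_elapsed,
               st.text
        FROM sys.dm_exec_query_stats AS qs
        CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
        WHERE st.text LIKE '%YourStatementFragment%'             -- example filter only
        ORDER BY qs.total_elapsed_time DESC;

    A plan with an old creation_time and a large execution_count that suddenly goes slow would fit the parameter sniffing picture above.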

  • Thanks for your reply.

    I added the execution plan, but this is how it runs now (performing well).

    I'll add another version if it starts performing badly.

    Wilfred
    The best things in life are the simple things

  • Chances are that you will get the same execution plan, but with different actual counts. It would be helpful if you can determine the range of parameter values that might be used in the query. Do you know why the FAST 1 hint was added?
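
    For example, something like this (the statement, column names and values are stand-ins, not the actual Axapta query) lets you run the statement with different parameter values and compare the actual plans the optimizer produces:

        -- return the actual plan rows so estimated vs. actual counts can be compared
        SET STATISTICS PROFILE ON;

        EXEC sp_executesql
             N'SELECT * FROM dbo.MyAxTable WHERE DATAAREAID = @P1 AND ITEMKEY = @P2 OPTION (FAST 1)',
             N'@P1 NVARCHAR(3), @P2 INT',
             @P1 = N'abc', @P2 = 12345;   -- repeat with the other value combinations you expect

        SET STATISTICS PROFILE OFF;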

  • This statement is directly copied from Axapta (ask Bill 🙂 )

    The 4 parameters are actually not very special: @P1 and @P3 are the same, a 3-digit company identifier. @P4 is an integer and @P3 is a string.

    @P1 and @P3 are on almost every table

    Wilfred
    The best things in life are the simple things

  • Have a look at the fill factor on the table; it may be better if it were adjusted to prevent page splits and fragmentation.
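
    For example (index and table names are placeholders only), rebuilding with an explicit, slightly lower fill factor leaves free space on the leaf pages for the constant inserts:

        -- example only: rebuild one index with a lower fill factor to reduce page splits
        ALTER INDEX IX_MyAxTable_SomeKey ON dbo.MyAxTable
        REBUILD WITH (FILLFACTOR = 80);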

    Facts are stubborn things, but statistics are more pliable - Mark Twain
    Carolyn
    SQLServerSpecialists

  • Added a new execution plan. Have a look at the key lookup on ProjProposalRevenue; it now costs 96% (instead of 61%). The operator cost/subtree cost is much higher.

    Also attached the details from sys.dm_db_index_physical_stats.
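
    (Roughly this query, for reference; the table name is the one from the plan:)

        -- fragmentation details per index; LIMITED mode keeps the scan cheap
        SELECT index_id, index_type_desc, avg_fragmentation_in_percent, page_count
        FROM sys.dm_db_index_physical_stats(
                 DB_ID(), OBJECT_ID('dbo.ProjProposalRevenue'), NULL, NULL, 'LIMITED');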

    Fill factor is 90%.

    Wilfred
    The best things in life are the simple things

  • If the problem is parameter sniffing (and it seems likely) then there are two paths you can take. The first is the correct path - determine which parameters could be passed into the query and whether any combinations would result in a totally different query plan, then resolve accordingly. The second path is a shortcut - it effectively brute-forces the optimizer into recompiling the query plan every time it runs. To do this, change the hint to OPTION (RECOMPILE, FAST 1), as illustrated below.
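
    Illustratively (table, column and variable names are stand-ins; only the OPTION clause is the point here):

        -- stand-in for the real Axapta statement, showing the combined hint
        DECLARE @P1 NVARCHAR(3);
        SET @P1 = N'abc';

        SELECT RECID
        FROM dbo.ProjProposalRevenue
        WHERE DATAAREAID = @P1
        OPTION (RECOMPILE, FAST 1);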

    A few warnings if you choose the shortcut:

    1) Recompiling means CPU time; if this query runs a few times a day then it's not an issue, but at 10 times a second it might be a big deal.

    2) If the stats are skewed in a way that prevents the optimizer from making a good decision, then you may still get bad plans.

    I don't know how your latest plan got generated; did you do something to force it? If not, then either the existing plan was dropped from the cache or a statistics update forced a new plan. Unfortunately, unless you know the parameters and the data distribution, it's hard to say whether it's the best possible plan - all you can really say is that it was the best plan the optimizer could generate at the time, given the existing statistics. You also need to review whether the existing hint is really necessary, as that could change the plan.
