How to troubleshoot performance issues?

  • Hi,

    I have a database that is losing performance very quickly when the db size grows large. If I run my insertion benchmark on a clean database, I can get 80 insertions per second. However, if I run it on a db that has about 25 GB of data already in it, it will start off at around 70 insertions per second, drop to 50 per second after 15 minutes, 40 per second after 30 minutes, and 20 per second after about 1.5 hours. The DB grew approx 2GB in the 1.5 hour period.

    Thanks!

  • Firstly, don't rely on autogrow. If you know the DB is growing, manually increase the size of the files to a sensible value.
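    As a minimal sketch of that (database and logical file names here are placeholders; pick a size that fits your own growth rate):

    ```sql
    -- Grow the data file once, up front, instead of paying for
    -- repeated autogrow events during the insert workload.
    ALTER DATABASE MyDatabase
    MODIFY FILE (NAME = MyDatabase_Data, SIZE = 40GB);
    ```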

    These articles may help in finding the problem:

    http://www.simple-talk.com/sql/performance/finding-the-causes-of-poor-performance-in-sql-server,-part-1/

    http://www.simple-talk.com/sql/performance/finding-the-causes-of-poor-performance-in-sql-server,-part-2/

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Yes, make sure you are not waiting on the file to grow. Create the file at a size that will accommodate all of your data, with some room for new data. The same goes for the transaction log: make sure it is large enough to handle the activity taking place between transaction log backups (full recovery model).

    You don't mention indexes or how you maintain them. If you are inserting into tables that have clustered and non-clustered indexes, you may need to leave room for the new inserts by using padding (a lower fill factor) and rebuilding the indexes frequently to reduce fragmentation and restore that padding for new rows.
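    A sketch of such a rebuild (index, table, and fill factor are illustrative; the right fill factor depends on your insert pattern):

    ```sql
    -- Rebuild, leaving 10% free space on each leaf page for new rows;
    -- PAD_INDEX applies the same fill factor to intermediate pages.
    ALTER INDEX IX_MyTable_MyColumn ON dbo.MyTable
    REBUILD WITH (FILLFACTOR = 90, PAD_INDEX = ON);
    ```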

  • If you *are* relying on autogrow, the default setting (grow the data file by 1 MB each time) is ludicrously small for pretty much any non-trivial database. Increasing it (once you've got the database at a suitable size for the number of transactions occurring) would help.
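    If you do keep autogrow as a safety net, a larger fixed increment is one hedge (names and sizes here are only examples):

    ```sql
    -- Replace the tiny default growth increment with a fixed 512 MB step.
    ALTER DATABASE MyDatabase
    MODIFY FILE (NAME = MyDatabase_Data, FILEGROWTH = 512MB);
    ```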

  • Another thing to consider: make sure that your queries are set-based and not operating on a row-by-row basis. Jeff Moden has an awesome article detailing RBAR ("Row By Agonizing Row", as he calls it) and its many pitfalls.

    More RBAR and “Tuning” UPDATEs
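    As a rough illustration of the difference (table and column names are made up), contrast a row-at-a-time loop with a single set-based statement:

    ```sql
    -- RBAR: a WHILE loop or cursor issuing one INSERT per row of
    -- dbo.Staging pays per-statement overhead on every row (slow).

    -- Set-based: one statement moves all the rows at once.
    INSERT INTO dbo.Target (ID, Payload)
    SELECT ID, Payload
    FROM dbo.Staging;
    ```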

  • Jake50515 (9/24/2010)


    I have a database that is losing performance very quickly when the db size grows large. If I run my insertion benchmark on a clean database, I can get 80 insertions per second. However, if I run it on a db that has about 25 GB of data already in it, it will start off at around 70 insertions per second, drop to 50 per second after 15 minutes, 40 per second after 30 minutes, and 20 per second after about 1.5 hours. The DB grew approx 2GB in the 1.5 hour period.

    Tell us more about your "insertion benchmark" table.

    Do you have a clustered index on your "insertion benchmark" table?

    Is the "insertion benchmark" process inserting rows out of order (relative to the clustered index column)?
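    To sketch why that question matters (these table definitions are hypothetical, not from the original post):

    ```sql
    -- An ever-increasing clustered key: new rows always append to the
    -- end of the index, so inserts stay fast as the table grows.
    CREATE TABLE dbo.GoodForInserts (
        ID      INT IDENTITY(1,1) PRIMARY KEY CLUSTERED,
        Payload VARCHAR(100)
    );

    -- A random clustered key: new rows land in the middle of full
    -- pages, causing page splits and fragmentation, so insert
    -- throughput degrades as the table grows.
    CREATE TABLE dbo.ProneToSplits (
        ID      UNIQUEIDENTIFIER DEFAULT NEWID() PRIMARY KEY CLUSTERED,
        Payload VARCHAR(100)
    );
    ```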

    _____________________________________
    Pablo (Paul) Berzukov

    Author of Understanding Database Administration available at Amazon and other bookstores.

    Disclaimer: Advice is provided to the best of my knowledge, but no implicit or explicit warranties are provided. Since the advisor explicitly encourages testing any and all suggestions in a non-production test environment, the advisor should not be held liable or responsible for any actions taken based on the given advice.
