Same SQL; Diff Params; Diff Execution Times

  • I have a problem that is starting out in Reporting Services but I believe also involves basic SQL issues.

    In SSRS, I have a report that executes in 30 seconds when the default parameters are used. If I change the parameters, the report takes 9 minutes to execute. If I swap which values are the defaults, I get the same behavior: the defaults run in 30 seconds and the manually entered values take 9 minutes.

    So to recap the scenarios. First test:

    Default set to US - run time 25 seconds

    Change country to UK - runs in 9 minutes

    Then change the programming of the defaults and:

    Default set to UK - run time 25 seconds

    Change default to US - runs in 9 minutes

    Upon doing some further investigation, I've discovered that the SQL is being executed with two different execution plans. With the default parameters (regardless of what they are), execution Plan A is used. When the parameters are changed, execution Plan B is used. I do not know enough about execution plans to read through them and understand the differences; however, the plans are not really the issue. The issue is: why does SQL Server create two different execution plans?

    In Plan A, default set to US, the execution plan does not see the passed parameters. The values are directly entered into the SQL. In this example, the query will run in about 30 seconds.

    WHERE (REGION IN (N'US',N'Canada')) AND (SUB_REGION IN (N'Canada',N'US')) AND (COUNTRY IN (N'Canada',N'United States of America'))

    In Plan B, changed to UK, the parameters are passed and are seen in the SQL. In this example, the query will take about 9 minutes to run.

    WHERE (REGION IN (@Region_Selected)) AND (SUB_REGION IN (@Sub_Region_Selected)) AND (COUNTRY IN (N'Great Britain',N'Ireland'))

    For some reason, the parameter @Country_Selected isn't passed; the values (in this example, Great Britain and Ireland) are entered directly instead.

    Why does SSRS substitute the parameters for the values in Plan A but not in Plan B?

    Why does the SQL engine care and create two different execution plans?

    Rob

  • It's called plan cache / data cache.

    Or you could be a victim of parameter sniffing.
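
    If you want to check whether the same cached plan is being reused for both parameter sets, you can peek at the plan cache with something like this (the LIKE filter is just a guess at a distinctive bit of your query text, adjust as needed):

    SELECT cp.usecounts, cp.objtype, cp.cacheobjtype, st.text
    FROM sys.dm_exec_cached_plans AS cp
    CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
    WHERE st.text LIKE N'%SUB_REGION IN%';

    If usecounts keeps climbing on one plan while you run both parameter sets, parameter sniffing is the likely culprit.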


    N 56°04'39.16"
    E 12°55'05.25"

  • Peso (3/27/2008)


    It's called plan cache / data cache.

    Or you could be victim of parameter sniffing.

    When you say plan cache/data cache, I'm guessing this refers to the system caching the plan and/or data from a previous execution, then reusing the cache and thus executing faster. Maybe.

    But the same thing will occur immediately after rebooting the server. The FIRST execution with the default parameters will occur in 30 seconds. Is the plan still in cache? Shouldn't it have been wiped out after the reboot?

    Regardless, why is it that I can change the default parameters from US to UK and then have the UK data pull in 30 seconds while the US takes 9 minutes? In that case the SQL for the UK execution has changed. The WHERE clause will now have the UK values. Shouldn't the server view this as different SQL and not use the cache?

    If I run UK twice in a row, shouldn't the second execution use the cache? It doesn't appear to because the second (third, fourth in a row) execution takes 9 minutes each.

    After running the UK several times, each run taking 9 minutes, I enter the values back to US and the results return in 30 seconds.

    Rob

  • Try eliminating the plan caching issue:

    CREATE PROCEDURE MyProc
    WITH RECOMPILE
    AS
    ....

    This will force the procedure to generate a new plan every time it runs. If this solves the issue, you need to spend some time on your procedure making sure the parameters being passed in are ok with the plan it will cache. One option may be to split it into two procedures and have a main procedure decide which sub-procedure to call based on the parameters.
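
    Just to illustrate the idea, a rough sketch (procedure and parameter names are made up):

    CREATE PROCEDURE dbo.GetReport_Main
        @Country nvarchar(100)
    AS
    BEGIN
        -- Each sub-procedure gets its own cached plan suited to its typical parameter values
        IF @Country IN (N'United States of America', N'Canada')
            EXEC dbo.GetReport_NorthAmerica @Country = @Country;
        ELSE
            EXEC dbo.GetReport_Other @Country = @Country;
    END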

    You could leave the RECOMPILE option in the procedure, but spending the time to solve the problem will not only provide better performance, but it will be a great learning experience if you have not done a lot of procedure tuning.

  • Michael, I don't see where the OP mentions using a stored procedure. He does mention that he is using Reporting Services.

    Robert,

    How are you generating the SQL? Is it being dynamically generated in the report like:

    ="Select * From Table where " & IIF(Parameters!prmOne.Value="X", "country=" & Paramters.prm2.Value, "country='US'"

    Or is it straight SQL always using the same parameters?

    What is the difference between the execution plans? If you post the plans we might be able to offer some better advice.

    Could you run profiler to see what is happening behind the scenes? You would want to make sure you get under Stored Procedures:

    RPC:Completed

    SP:CacheMiss

    SP:CacheHit

    SP:CacheInsert

    SP:Completed

    And under TSQL:

    SQL:BatchCompleted

    I've thought about converting SSRS so it calls a stored procedure, which would give me more control over the SQL. The downside is that the way SSRS passes parameters is a pain, but if it solves the problem...

    I'll give it a test.

    Rob

    Jack - no, I am not currently using a stored procedure. And no, I am not creating the SQL dynamically within SSRS. I'm just using plain SQL with parameters.

    I've attached the execution plans I captured using the Performance Dashboard in SSRS (from Microsoft). I've also attached an Excel file from the output of the report. I'm hoping I've captured the correct files.

    Default US = 30 seconds

    UK Parameters = 9 minutes

    I'll try to capture the data you suggested with Profiler.

    Rob

  • Ah - sorry, read the post quickly.

    For a test, run the report, clear the cache manually, and then run the report again.
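
    To clear the cache manually you can run the following (just don't do it on a busy production server, since it flushes everything):

    CHECKPOINT;
    DBCC DROPCLEANBUFFERS;  -- clears the data cache
    DBCC FREEPROCCACHE;     -- clears all cached plans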

    At the end of your query add:

    OPTION (RECOMPILE)

    This will ignore cached execution plans (actually it will not store the plan in the cache).
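
    On your query it would sit right after the WHERE clause, something like this (the SELECT list and table name are placeholders for whatever you have now; the hint is the only change):

    SELECT *                          -- your existing column list
    FROM dbo.YourReportTable          -- placeholder for your actual FROM / JOINs
    WHERE (REGION IN (@Region_Selected))
      AND (SUB_REGION IN (@Sub_Region_Selected))
      AND (COUNTRY IN (@Country_Selected))
    OPTION (RECOMPILE);               -- compile a fresh plan for this execution and don't cache it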

    If that works, the same thing applies, spend some time to modify the query you are running if possible.

  • Robert,

    I have looked at the plans and the executed statements. If you look at the executed statements, they are two different statements, because the one using the US defaults has only one parameter passed to SQL while the UK one has two parameters passed. I would call this a bug in SSRS in the way it is executing the SQL. I believe when you run Profiler you will find that SSRS is running sp_executesql, and when running with the defaults it is creating the SQL statement with literals instead of parameters, so the statement executed on the SQL Server looks like this:

    EXEC sp_executesql N'sql statement', N'@ForeCastDate datetime', @ForeCastDate = @ForeCastDate

    and when running with selected values it looks like:

    EXEC sp_executesql N'sql statement',
        N'@ForeCastDate datetime, @Region_Selected nvarchar(4), @Sub_Region_Selected nvarchar(2)',
        @ForeCastDate = @ForeCastDate, @Region_Selected = @Region_Selected, @Sub_Region_Selected = @Sub_Region_Selected

    Another thing I would note is that in your query plans an index is used only once, and that is in the UK example where you are hitting an index on WatchListLog. So, if this is a regularly run query, you would benefit from some indexes, particularly on ForeCast_Raw_Data, as this is the biggest cost in each of the queries.
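
    As a rough starting point, something along these lines might help (I'm guessing that the REGION/SUB_REGION/COUNTRY columns actually live on that table, so adjust to match your schema):

    CREATE NONCLUSTERED INDEX IX_ForeCast_Raw_Data_Geography
        ON dbo.ForeCast_Raw_Data (COUNTRY, SUB_REGION, REGION);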

    Also, you are calling a table-valued function and, not knowing what is in there, I would be inclined to convert that to a stored procedure and handle the defaults in there.

  • Jack Corbett (3/27/2008)


    Robert,

    I have looked at the plans and the executed statements. If you look at the executed statements they are 2 different statements because the one using the US defaults only has 1 parameter passed to SQL while the UK one has 2 parameters passed. I would call this a bug in SSRS in the way it is executing the SQL.

    That is my thought that SSRS is passing the SQL in two different formats and thus causing the problems.

    Jack Corbett (3/27/2008)


    Another thing I would note is that in your query plans there is only 1 time an index is being used and that is in the UK example where you are hitting an index on WatchListLog. So, if this is a regularly run query, you would benefit from some indexes, particularly on ForeCast_Raw_Data as this is the biggest cost in each of the queries.

    The test plans you used were from Tuesday. Last night (Wednesday), I added some indexes to the tables. The performance when running the default parameters has improved a little, maybe from 30 seconds down to 20. The changed parameters went from nine minutes down to about 4.5 minutes. Quite a bit better, but still a problem.

    Jack Corbett (3/27/2008)


    Also, you are calling a table-valued function and, not knowing what is in there, I would be inclined to convert that to a stored procedure and handle the defaults in there.

    I chose to use a table-valued function because it allows me to do the parameters within SSRS vs. having to pass the parameters back to a stored procedure. When using a stored procedure in SSRS, it is a pain to configure the passing of parameters. However, I think I may have to go that way.

    I'm going to rewrite the table-valued function as a stored procedure and test it. With other work, I won't get to this until late today. I should have an update tomorrow morning.

    Thank you for your input.

    Rob

  • Rob,

    I usually explain things in detail because you never know the expertise of the poster so I like to try to make it so anyone can understand it.

    I would have to disagree about passing parameters to stored procedures in SSRS. I found it to be painless; of course, I am firmly entrenched in the always-use-stored-procedures camp.

    I look forward to hearing what happens with the stored procedure version.

  • Jack Corbett (3/27/2008)


    Rob,

    I usually explain things in detail because you never know the expertise of the poster so I like to try to make it so anyone can understand it.

    I would have to disagree about passing parameters to stored procedures in SSRS. I found it to be painless; of course, I am firmly entrenched in the always-use-stored-procedures camp.

    I look forward to hearing what happens with the stored procedure version.

    Holy Hanna Batman! errr...Jack!

    Now, regardless of selection, the data pulls in 10-15 seconds. Whew!

    I appreciate everyone's help.

    But now I have another problem to solve. One reason I liked using a view or table-valued function is that I could use one view to pull all the data and then change the parameters within SSRS. Some reports will have five parameters and others will have 15.

    Now that I'm using a stored procedure, I need to deal with that issue. I'd prefer not having to write a stored procedure for each report. My thought is to write one stored procedure that accepts all 15 parameters. In SSRS, if the parameter is not needed, pass a bogus value like "XXXXX". Then in the stored procedure, use dynamic SQL to include/exclude items in the WHERE clause.
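
    Roughly what I have in mind (table, column, and parameter names simplified just to show the pattern):

    CREATE PROCEDURE dbo.rpt_GetData
        @Country_Selected nvarchar(100) = N'XXXXX'   -- 'XXXXX' means this report doesn't use the parameter
    AS
    BEGIN
        DECLARE @sql nvarchar(max);
        SET @sql = N'SELECT * FROM dbo.ForeCast_Raw_Data WHERE 1 = 1';
        IF @Country_Selected <> N'XXXXX'
            SET @sql = @sql + N' AND COUNTRY = ''' + @Country_Selected + N'''';
        EXEC (@sql);
    END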

    Any suggestions?

    Rob

    Well, Rob, I also fall into the performance-always-matters camp, so I normally don't mind writing specific stored procedures because that is how you get the best performance. I used to be one of the guys who tried to write stored procedures in a flexible manner, but I have found that the more flexible a stored procedure is, the less efficient it is. I guess if the users of your report are willing to wait for it then you can go with flexible, but I know I get frustrated when I wait for a page to load, so I like things to be fast.

    If you do go with one sp and dynamic SQL within it, I have found that sp_executesql usually outperforms EXEC (SqlStatement). The other option is to call the same sp from the report(s) and have the parameters passed in determine which sub-sp gets called. Still gotta write all the sp's, but only reference one from the reports. I figure I gotta write the code for all the variations anyway, so might as well get the best performance I can.
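
    For example, the dynamic part of the procedure sketched above could be run like this instead (same made-up names):

    DECLARE @sql nvarchar(max);
    SET @sql = N'SELECT * FROM dbo.ForeCast_Raw_Data WHERE 1 = 1';
    IF @Country_Selected <> N'XXXXX'
        SET @sql = @sql + N' AND COUNTRY = @Country';
    -- the value is passed as a real parameter, so the plan can be reused and you avoid quoting problems
    EXEC sp_executesql @sql,
        N'@Country nvarchar(100)',
        @Country = @Country_Selected;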

    Glad I could offer some help.

    Sometimes, in an attempt to keep things simple and easy, I actually wind up complicating them, and possibly hurting performance. I suppose it is all part of learning.

    BTW - when I converted to the stored procedure, all I did was take the SQL out of SSRS and put it into the stored procedure. The stored procedure is still calling the table-valued function.

    With this in mind, I've decided to go with separate stored procedures for each report. While there will be more SPs to manage, the actual code will be far less complex than having one SP that manages ALL of the possible parameters. And, as you mentioned, it'll probably perform a lot better.

    Thanks again.

    Rob
