January 13, 2014 at 9:59 am
Hi All,
I have this query that runs in 36 seconds in SSMS but takes an unbelievable 40 minutes to run in SSRS. I have read all I could about parameter sniffing and have tried several methods, like using WITH RECOMPILE in the proc or OPTION (RECOMPILE) in the statement. The latest method was to change the procedure to use local variables, as in the query below, but my report still takes too long. What is wrong? I have attached the execution plan of the query.
EO
ALTER PROC [dbo].[testing1]
(
@startDate DATETIME,
@endDate DATETIME,
@CutOffPeriod INT,
@FromDate DATETIME,
@ToDate DATETIME,
@Employee INT,
@Employees INT,
@office VARCHAR (5),
@EmployeeType VARCHAR (5),
-- @WorkStatus VARCHAR (5),
@FROMPERIOD INT,
@ToPeriod INT ,
@CUT_OFF_DATE DATETIME,
@Maternity varchar (100)
)
AS
DECLARE @startDate1 datetime
DECLARE @endDate1 datetime
DECLARE @CutOffPeriod1 int
DECLARE @FromDate1 datetime
DECLARE @ToDate1 datetime
DECLARE @Employee1 int
DECLARE @Employees1 int
DECLARE @office1 varchar(5)
DECLARE @EmployeeType1 varchar(5)
--DECLARE @WorkStatus1 varchar(5)
DECLARE @FROMPERIOD1 int
DECLARE @ToPeriod1 int
DECLARE @CUT_OFF_DATE1 datetime
DECLARE @Maternity1 varchar (100)
SET @startDate1 = @startDate
SET @endDate1= @endDate
SET @CutOffPeriod1 = @CutOffPeriod
SET @FromDate1= @FromDate
SET @ToDate1= @ToDate
SET @Employee1= @Employee
SET @Employees1= @Employees
SET @office1 = @office
set @EmployeeType1 = @EmployeeType
--SET @WorkStatus1= @WorkStatus
SET @FROMPERIOD1= @FROMPERIOD
SET @ToPeriod1= @ToPeriod
SET @CUT_OFF_DATE1 = @CUT_OFF_DATE
SET @Maternity1 = @Maternity
;WITH MissingHours ([00-MISSING],[MISSING HOUR],[Missing Hours],[Empl Uno],[Employee Code],[Employee Name], [NULL], [Missing Times], Missing, [NULLs] , [''], [Missing Time])
AS
(SELECT
'00-MISSING','MISSING_HOURS',
MIN(sh.specified_hrs) -
(Isnull((Select sum(tobill_hrs) from BO_LIVE3.dbo.tat_time times where times.WIP_Status <> 'X'
AND ((CoD.cut_off_date is null
AND CASE when times.Period = 0 then @FROMPERIOD1 else times.Period end
between @FROMPERIOD1 and @CutOffPeriod1) or
( CoD.cut_off_date is not null and times.post_date <= CoD.cut_off_date ))
and times.entry_status in ( 'U','R')
and Dateadd(dd,cast(Tran_date as int),0) between
case when @FromDate1 < (min (Dateadd(dd,case when datepart(dw,dates.dt) < 7
then 0-Datepart(Dw,dates.dt) else 0 end,dates.dt)))
then (min (Dateadd(dd,case when datepart(dw,dates.dt) < 7
then 0-Datepart(Dw,dates.dt) else 0 end,dates.dt)))
else @FromDate1 end and
Case when @ToDate1 > (max (Dateadd(dd,case when datepart(DW,dates.dt) < 7
then 6-Datepart(Dw,dates.dt) else 6 end,dates.dt)))
then (max (Dateadd(dd,case when datepart(DW,dates.dt) < 7
then 6-Datepart(Dw,dates.dt) else 6 end,dates.dt)))
else @ToDate1 end
and tk_empl_uno = per.empl_uno),0)) AS [Missing Hours], per.empl_uno, b.employee_code,b.employee_name, 0, 'Missing Time', 'Missing', 0 , '', 'Missing Time'
FROM bo_custom.dbo.Calendar Dates
--Tat_time times
Inner join BO_LIVE3.dbo.GLM_period_End p on dateadd(dd,datediff(dd,0,Dates.dt),0) between Per_begin_date and per_end_Date
Cross join BO_LIVE3.dbo.tbm_persnl per --on times.tk_empl_uno = per.empl_uno
inner join BO_LIVE3.dbo.hbm_persnl b on per.empl_uno = b.empl_uno
inner join BO_LIVE3.dbo.tbl_work_status g on per.wkstatus_code = g.wkstatus_code
inner join BO_LIVE3.dbo.hbl_office f on --f.offc_code = b.offc
f.offc_code = (
SELECT top 1 isnull(mods.Offc, pers.Offc)
from BO_LIVE3.dbo.HBM_persnl pers
left join BO_LIVE3.dbo.HBA_Persnl_modp mods on mods.empl_Uno = pers.empl_uno and mods.period >= @CutOffPeriod1
--and dateadd(dd,0,datediff(dd,0,mods.last_modified)) >= (Select per_end_date from GLM_period_end where period = @CutOffPeriod1)
where pers.empl_uno = b.empl_uno order by mods.last_modified asc)
inner join BO_LIVE3.dbo.HBA_offc_prof op on f.offc_code = op.offc and op.prof in (select min(opop.prof) from BO_LIVE3.dbo.HBA_offc_prof opop where opop.offc = op.offc)
inner join BO_LIVE3.dbo.hbl_prof_ctr Pc on pc.prof_ctr_code = left(op.prof,2)
left outer join bo_custom.dbo.CUT_OFF_DATES Cod on @CutOffPeriod1 = Cod.period -- PJM 15/06/11
inner join [BO_CUSTOM].[dbo].[BTtimeValuations] (@ToDate1) value on value.empl_uno = b.empl_uno
cross apply bo_custom.dbo.BTSpecifiedHoursOffice (Case when @ToDate1 > ( (Dateadd(dd,case when datepart(DW,dates.dt) < 7 then 6-Datepart(Dw,dates.dt) else 6 end,dates.dt))) then ( (Dateadd(dd,case when datepart(DW,dates.dt) < 7 then 6-Datepart(Dw,dates.dt) else 6 end,dates.dt))) else @ToDate1 end , case when @FromDate1 < ( (Dateadd(dd,case when datepart(dw,dates.dt) < 7 then 0-Datepart(Dw,dates.dt) else 0 end,dates.dt))) then ( (Dateadd(dd,case when datepart(dw,dates.dt) < 7 then 0-Datepart(Dw,dates.dt) else 0 end,dates.dt))) else @FromDate1 end ) SH
WHERE sh.empl_uno = b.empl_uno
AND sh.offc = f.offc_code
AND( dates.dt >= (case when Datepart(dw,per.rank_code_date) <= 2
THEN dateadd(dw, -1* Datepart(dw,per.RANK_CODE_DATE),per.RANK_CODE_DATE)
ELSE dateadd(dw,7-Datepart(dw,per.RANK_CODE_DATE),per.RANK_CODE_DATE) end)
OR rank_code_date > @CUT_OFF_DATE1)
-- Employee Filters
--AND g.WKStatus_code in (@WorkStatus1)
AND f.offc_code in ( @office1 )
AND sh.[PERSNL_TYP_CODE] in ( @EmployeeType1 )
AND (b.empl_uno = @Employee1 or @Employee1=0000 )
-- Date Filters
AND (dates.dt <= b.terminate_date or isnull(b.terminate_date, 0) = 0)
AND dateadd(dd,0, datediff(dd,0, Dates.dt)) between @FromDate1 and @ToDate1
AND (dw = 3 Or (datepart(dw,@FromDate1)>3 and dt=@FromDate1))
-- Simple Exclusions to reduce over -flow
AND b.inactive = 'N'
AND Dates.dt >= '2009/01/01'
AND b.employee_code <> 'DM1'
AND per.weekly_hrs <> 0
-- Exclude From missing time (Part timers at year end/start who do not do the same hours each day)
AND b.empl_uno NOT IN
(SELECT MEX.empl_uno from BO_custom.dbo.MissingTimeExclude MEX
WHERE [Week_end] = Case when @ToDate1 > ((Dateadd(dd,case when datepart(DW,dates.dt) < 7 then 6-Datepart(Dw,dates.dt) else 6 end,dates.dt)))
THEN ( (Dateadd(dd,case when datepart(DW,dates.dt) < 7 then 6-Datepart(Dw,dates.dt) else 6 end,dates.dt)))
ELSE @ToDate1 end)
Group By per.empl_uno, per.Weekly_hrs, (case when datepart(dw,dates.dt) = 7 then dateadd(dw,6,dates.dt) else Dateadd(dd,6-Datepart(Dw,dates.dt),dates.dt) end),f.offc_desc, sh.[PERSNL_TYP_CODE], b.employee_name, b.employee_code
,per.wkstatus_code, g.is_working, g.description, pc.prof_ctr_desc, f.offc_code,CUT_OFF_DATE
,(Dateadd(dd,case when datepart(dw,dates.dt) < 7 then 0-Datepart(Dw,dates.dt) else 0 end,dates.dt))
, dates.dt
--Removes 0 Missing time lines
having
0 < min(sh.specified_hrs) - (Isnull((Select sum(tobill_hrs)
from BO_LIVE3.dbo.tat_time times where times.WIP_Status <> 'X' And
((CoD.cut_off_date is null and
case when times.Period = 0
then @FROMPERIOD1 else times.Period end
between @FROMPERIOD1 and @CutOffPeriod1) or
( CoD.cut_off_date is not null
and times.post_date <= CoD.cut_off_date ))
and times.entry_status in ( 'U','R')
and Dateadd(dd,cast(Tran_date as int),0) between (Dateadd(dd,case when datepart(dw,dates.dt) < 7 then 0-Datepart(Dw,dates.dt) else 0 end,dates.dt)) and (Dateadd(dd,case when datepart(DW,dates.dt) < 7 then 6-Datepart(Dw,dates.dt) else 6 end,dates.dt))
and tk_empl_uno = per.empl_uno),0))
),
modifiedTime (time_uno, mod_amt, mod_hrs)
AS
(select xxx.time_uno, sum(case when xxx.type <> 'X' then tobill_amt_var else tobill_amt_var * -1 end) , sum(case when xxx.type <> 'X' then tobill_hrs_var else tobill_hrs_var * -1 end)
from bo_live3.dbo.tat_time_mod xxx
inner join bo_live3.dbo.tat_time yyy on xxx.time_uno = yyy.time_uno
where ( (xxx.period > @ToPeriod1 or xxx.post_date > @ToDate1 or xxx.tran_date > (select per_end_date from bo_live3.dbo.glm_period_end where period = @ToPeriod1)) )
or (yyy.wip_status = 'X'
)
Group by xxx.time_uno, yyy.wip_status)
select client.client_code, matter.matter_code, sum(base_hrs) as hours, empl.Empl_uno, empl.employee_code, empl.employee_name, avg(BO_CUSTOM.dbo.BT_GET_RATE(empl.empl_uno)) as rate, client.client_name, time.action_code,
sum(BO_Custom.dbo.BT_Get_Value( BO_CUSTOM.dbo.BT_GET_RATE(empl.empl_uno),time.base_hrs,client.client_code)) as charge, txt1, case when client.Client_code = '00NCHG' then 'Non-Chargeable Time' else 'Chargeable Time' end Chargeable
from BO_LIVE3.dbo.tat_time time
inner join BO_LIVE3.dbo.hbm_persnl empl on time.tk_empl_uno = empl.empl_uno
inner join bo_live2.dbo.tbm_persnl empl2 on empl2.empl_uno = empl.empl_uno
inner join BO_LIVE3.dbo.hbm_matter matter on time.matter_uno = matter.matter_uno
inner join BO_LIVE3.dbo.hbm_client client on matter.client_uno = client.client_uno
inner join BO_LIVE3.dbo.tat_text txt on time.nar_text_id=txt.text_id
--left Outer join bo_custom.dbo.Cut_off_dates cod on time.period = cod.period
left join modifiedTime mod on mod.time_uno = time.time_uno
where empl.empl_uno in(@Employees1) and time.tran_date between @startDate1 and @endDate1
and wip_Status <> 'X'
and (entry_status in ('R','U') OR (select Period from BO_LIVE3.dbo.GLM_Period_END pe where Dateadd(dd,0,datediff(dd,0,time.tran_date)) between pe.per_begin_date and pe.Per_end_date) in (Select period from BO_Custom.dbo.BT_INCLUDE_UNRELEASED) )
and
(('Maternity' in (@Maternity1) and empl2.wkStatus_Code in ('AFLMA','APLMA'))
OR
('Secondment' in (@Maternity1) and empl2.wkStatus_Code in ('AFSEC'))
OR
empl2.wkstatus_code in ((select wkstatus_code from bo_live3.dbo.tbl_work_status where wkstatus_code not in ('AFSEC','AFLMA','APLMA')) )
)
Group by client.client_code, matter.matter_code, empl.Empl_uno, client.client_name, empl.employee_code, empl.employee_name, time.action_code,txt1
UNION ALL
SELECT [00-MISSING],[MISSING HOUR],sum([Missing Hours]),[Empl Uno],[Employee Code],[Employee Name], [NULL], [Missing Times], Missing, [NULLs] , [''], [Missing Time]
FROM MissingHours
GROUP BY [00-MISSING],[MISSING HOUR],[Empl Uno],[Employee Code],[Employee Name], [NULL], [Missing Times], Missing, [NULLs] , [''], [Missing Time]
January 13, 2014 at 10:41 am
There is a LOT of low hanging fruit here for performance improvements. I would say that 36 seconds is outside of the acceptable range for a query running in production. Obviously 40 minutes isn't going to work either.
You might start by updating statistics on your Calendar table. The stats on that appear to be way out of sync.
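For example, a manual statistics update on that table might look like the sketch below (table and database names are taken from the posted query; WITH FULLSCAN is optional, but it gives the most accurate histogram):

```sql
-- Rebuild all statistics on the Calendar table from a full scan
-- rather than a sample, so the optimizer's row estimates improve.
UPDATE STATISTICS bo_custom.dbo.Calendar WITH FULLSCAN;
```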
Other performance improvements:
You have dozens and dozens of functions in your where clause.
You have scalar functions in select columns. (Some of them are even nested; that is never going to perform well.)
Sum(BO_Custom.dbo.Bt_get_value(BO_CUSTOM.dbo.Bt_get_rate(empl.empl_uno)
You have some triangular joins. http://qa.sqlservercentral.com/articles/T-SQL/61539/
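As one illustration of the WHERE-clause function problem: a predicate from the posted query wraps the column in functions, which forces the expression to be evaluated for every row and prevents an index seek. A sketch of a sargable equivalent, assuming @FromDate1 and @ToDate1 hold whole dates with no time component:

```sql
-- Non-sargable (from the posted query): the column is wrapped in
-- functions, so no index seek on Dates.dt is possible.
-- WHERE dateadd(dd, 0, datediff(dd, 0, Dates.dt)) BETWEEN @FromDate1 AND @ToDate1

-- Sargable sketch: compare the bare column against an adjusted range
-- instead, so an index on dt can be used.
WHERE Dates.dt >= @FromDate1
  AND Dates.dt <  DATEADD(dd, 1, @ToDate1)
```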
_______________________________________________________________
Need help? Help us help you.
Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.
Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.
Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/
January 13, 2014 at 10:53 am
If you're seeing one set of behavior from SSMS and a different one from SSRS, bad parameter sniffing is a possibility, but the way to address that issue is not to put in RECOMPILE or use local variables immediately. What you need to do is identify whether bad parameter sniffing is the issue.
Putting in RECOMPILE means that each compilation of the procedure, or the query, is unique. This means that the values passed in as parameters are still used, but the plan will be unique for each set of parameter values. That may be desirable or not depending on the circumstances, but if what you're dealing with is data skew or out-of-date statistics, neither of those underlying problems is addressed. Putting in local variables eliminates parameter sniffing, but it means that you'll get an average plan rather than a plan specific to your data. That may or may not be part of the underlying problem if you have data skew or your statistics are out of date.
In short, you've tried to implement two diametrically opposed solutions, one that goes for a generic plan and one that goes for a specific plan. If neither has helped, it may be that the underlying situation is not one of bad parameter sniffing.
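To make the distinction concrete, here is a sketch of the three approaches side by side (procedure, table, and parameter names are illustrative, not from the posted code):

```sql
-- 1. Default: the plan is compiled once, "sniffing" the first value
--    of @CustId passed in, and that cached plan is reused for everyone.
CREATE PROC dbo.GetOrders_Sniffed (@CustId INT) AS
    SELECT OrderId FROM dbo.Orders WHERE CustId = @CustId;
GO

-- 2. OPTION (RECOMPILE): every execution still sniffs the actual
--    value, but compiles a fresh, uncached plan specific to it.
CREATE PROC dbo.GetOrders_Recompile (@CustId INT) AS
    SELECT OrderId FROM dbo.Orders WHERE CustId = @CustId
    OPTION (RECOMPILE);
GO

-- 3. Local variable: sniffing is disabled; the optimizer can't see
--    the value at compile time, so it estimates from the column's
--    average density rather than the statistics histogram.
CREATE PROC dbo.GetOrders_Local (@CustId INT) AS
BEGIN
    DECLARE @CustIdLocal INT = @CustId;
    SELECT OrderId FROM dbo.Orders WHERE CustId = @CustIdLocal;
END;
```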
Rather than immediately addressing bad parameter sniffing when dealing with differences in behavior, I'd suggest first identifying whether bad parameter sniffing is the issue at all.
First off, have you looked at plans for both queries, one when it was slow and one when it was fast? Are they different? Do the differences reflect the reality of the data being returned, for example one set of parameters returns thousands of rows and the other set returns five? If the differences don't reflect reality, then you want to look at estimated versus actual row counts. If these are way off, you may just be dealing with out-of-date statistics. Simply updating your statistics, possibly with a full scan, could address the issue. In fact, this may be the case here: your estimates and actuals are off, although not crazily so, but by one order of magnitude, and that's enough.
Also, if you're seeing two sets of behavior, is the data identical? If not, it could be that easy. Do you have two different sets of ANSI settings for the connections? You can validate that by looking at the two different plans again.
But if the plans are the same, you're likely not dealing with bad parameter sniffing at all and may be heading down a bad path. You have a number of issues on display in this plan. You're missing statistics on at least one table. You've also got a lot of lookups, table scans, and other operations, including at least one index spool, that are indicative of other issues. I can't tell for sure because it's not in the properties (I do hate when that happens), but based on the complexity of the query, I'll bet the optimizer timed out, which could easily explain differences in behavior: on different systems under varying load, you could see different timeout points resulting in different execution plans.
----------------------------------------------------
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood..." - Theodore Roosevelt
The Scary DBA
Author of: SQL Server 2017 Query Performance Tuning, 5th Edition and SQL Server Execution Plans, 3rd Edition
Product Evangelist for Red Gate Software
January 13, 2014 at 10:55 am
Sean Lange (1/13/2014)
There is a LOT of low hanging fruit here for performance improvements. I would say that 36 seconds is outside of the acceptable range for a query running in production. Obviously 40 minutes isn't going to work either.
You might start by updating statistics on your Calendar table. The stats on that appear to be way out of sync.
Other performance improvements:
You have dozens and dozens of functions in your where clause.
You have scalar functions in select columns. (Some of them are even nested; that is never going to perform well.)
Sum(BO_Custom.dbo.Bt_get_value(BO_CUSTOM.dbo.Bt_get_rate(empl.empl_uno)
You have some triangular joins. http://qa.sqlservercentral.com/articles/T-SQL/61539/
Nice. I didn't even get so far as to look at the code. I agree. I'd go after all this. I think we're way ahead of where this may or may not be bad parameter sniffing at work.
And, because of that, I'd strongly suggest removing the local variables and going with parameters. Most of the time, all day, every day, parameter sniffing is helping performance on everyone's queries, not hurting.
January 13, 2014 at 11:09 am
First is that the query plan from SSMS (the faster version)? Is it possible to provide both?
I took a quick look at this and there does appear to be a lot of parameter sniffing, but there are a lot of other problems too. The query is complex but, at first glance, it looks like your statistics are messed up. There is a huge discrepancy between the estimated number of rows and the actual number of rows for almost every index seek and scan in your plan. I noticed, too, that there are no statistics on [BO_LIVE2].[dbo].[HBA_PERSNL_MODD].PARTIME_PCNT.
Using RECOMPILE or the SET @something = @something trick will not help you if your statistics are messed up.
I also see a lot of table scans; tables such as tbl_work_status are small (15 rows), but others, such as tbm_persnl, have a couple thousand; you may want to add clustered indexes to those tables.
Here are a couple of articles to look at:
UPDATE STATISTICS (Transact-SQL)
SQL Server Statistics: Problems and Solutions.
Update/Edit: I did not see that Sean or Grant replied to this when I posted my response.
-- Itzik Ben-Gan 2001
January 13, 2014 at 3:10 pm
Alan.B (1/13/2014)
I took a quick look at this and there does appear to be a lot parameter sniffing but there's a lot of other problems too.
I'm confused. Please let me know where I'm going off on this. Since this query doesn't use parameters at all and is instead using local variables, it can't possibly be using the precise values passed through parameters to look up those specific values within the statistics (the definition of sniffing). Since it's local variables, unless it's getting a recompile during execution, it's not sniffing these values. And, since we can also tell by looking at the execution plan whether there is a compile-time value for the parameters, and these don't have one, it couldn't possibly be sniffing the values. That means it has to be using sampled values, not sniffing. Or am I way off somewhere?
January 14, 2014 at 3:20 pm
I really don't get it. I queried sp_autostats('HBA_PERSNL_MODD'), which is the table that was showing the column with no statistics, and it turns out the column actually does have statistics, and the last update was this morning at 6:30 am. My question now is why I would see in the execution plan that the column has no statistics when statistics exist and are up to date.
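For reference, the statistics metadata can be inspected directly; a sketch (replace 'PARTIME_PCNT' with the actual statistic name sp_autostats reported, and note that sys.dm_db_stats_properties requires SQL Server 2008 R2 SP2 / 2012 SP1 or later):

```sql
-- List every statistic on the table and when each was last updated.
SELECT s.name, sp.last_updated, sp.rows, sp.rows_sampled
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE s.object_id = OBJECT_ID('dbo.HBA_PERSNL_MODD');

-- Or the classic route: histogram and density for one statistic.
DBCC SHOW_STATISTICS ('dbo.HBA_PERSNL_MODD', 'PARTIME_PCNT');
```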
EO
January 14, 2014 at 3:52 pm
Not sure. Could be they weren't there when you created the plan? Could be you have async statistics enabled on the database, which means plans can be created without statistics being created first. Umm... did the optimizer time out? Good chance that's an unstable plan. Something along those lines.
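A quick way to check the async-statistics setting mentioned above (the database name is taken from the posted query; substitute your own):

```sql
-- If is_auto_update_stats_async_on = 1, a plan can compile before
-- a needed statistics update has finished, which can explain a plan
-- that reports missing statistics even though they exist now.
SELECT name,
       is_auto_update_stats_on,
       is_auto_update_stats_async_on
FROM sys.databases
WHERE name = 'BO_LIVE3';
```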
January 15, 2014 at 8:40 am
Grant Fritchey (1/13/2014)
Alan.B (1/13/2014)
I took a quick look at this and there does appear to be a lot of parameter sniffing but there's a lot of other problems too.
I'm confused. Please let me know where I'm going off on this. Since this query doesn't use parameters at all and is instead using local variables, it can't possibly be using the precise values passed through parameters to look up those specific values within the statistics (the definition of sniffing). Since it's local variables, unless it's getting a recompile during execution, it's not sniffing these values. And, since we can also tell by looking at the execution plan whether there is a compile-time value for the parameters, and these don't have one, it couldn't possibly be sniffing the values. That means it has to be using sampled values, not sniffing. Or am I way off somewhere?
You are correct Grant. I did not mean to contradict anything you were saying. Frankly, I would not have posted anything if I had first seen that you or Sean had responded to the OP; I would have had nothing to add. There were no replies when I started my response.
I mistakenly assumed there was parameter sniffing because I looked at the query plan but didn't really look at the query. I saw a bunch of estimated row counts that were different from the actual row counts, which is why I thought there was parameter sniffing. Note to self: parameter sniffing is not the only cause of estimated/actual row count discrepancies.
Anyhow, the topic of parameter sniffing is new to me; something I still have a lot to learn about. I first heard about it on this forum within the last couple of years. The first time I ever heard someone talk about it was when I attended your session on it at PASS 2012 in Seattle. If I remember correctly, you said that when you do the SET @x = @x thing, you always include a comment above explaining why.
January 15, 2014 at 9:39 am
Alan.B (1/15/2014)
You are correct Grant. I did not mean to contradict anything you were saying. Frankly, I would not have posted anything if I had first seen that you or Sean had responded to the OP; I would have had nothing to add. There were no replies when I started my response.
I mistakenly assumed there was parameter sniffing because I looked at the query plan but didn't really look at the query. I saw a bunch of estimated row counts that were different from the real row counts which is why I thought there was parameter sniffing. Note to self: parameter sniffing is not the only cause of est/act row count discrepancies.
Anyhow, the topic of parameter sniffing is new to me; something I still have a lot to learn about. I first heard about it on this forum in the last couple years ago. The first time I ever heard someone talk about it was when I attended your Session on it at PASS 2012 -- Seattle. If I remember correctly you said that, when you do the SET @x=@x thing, you always include a comment above explaining why.
Oh no worries. I really thought I had missed something on the structure or the code. God knows, I don't always spend a lot of time looking at some of the queries that come through and have made my share of mistakes.
Yeah, if I'm going to use local variables as a mechanism for fixing bad parameter sniffing, I'll put in a comment something like this:
-- I'm not stupid, this is for fixing bad parameter sniffing
In theory, it'll slow people down before they decide that doing something as crazy as taking a perfectly good parameter and assigning it to a local variable is a mistake to be "fixed." It is nuts, with this one exception.
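Applied to the OP's procedure, that comment convention might look something like this sketch (one parameter shown):

```sql
-- DELIBERATE: local variable disables parameter sniffing for this
-- procedure; the sniffed plans were unstable between SSMS and SSRS.
-- Do not "fix" this back to using the parameter directly.
DECLARE @startDate1 DATETIME;
SET @startDate1 = @startDate;
```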
Sorry if that comment came off wrong. I just thought I was completely out to lunch or something. I went back through everything double checking.