A Single-Parameter Date Range in SQL Server Reporting Services

  • Fernando Margueirat-273899 (11/18/2015)


    Thanks for this great tip!

    One small change I would make, to avoid the "DR.EndDate+1" part and still be able to easily link to dates in other tables, is to create a new column that adds end of day (i.e. 23:59:59.997) to the EndDate of periods that have already ended, and uses GETDATE() for current periods (e.g. last month has ended, but month to date is current). This way you can have something like

    SELECT b.Period, sales = SUM(a.trn_amt)
    FROM T_TRANSACTION a
    JOIN V_DATE_RANGE b ON a.trn_dt BETWEEN b.BeginDate AND b.EODEndDate
    GROUP BY b.Period

    FM

    And what about those people who are using DATETIME2?

    If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.

  • Phil Parkin (11/18/2015)


    Fernando Margueirat-273899 (11/18/2015)


    Thanks for this great tip!

    One small change I would make, to avoid the "DR.EndDate+1" part and still be able to easily link to dates in other tables, is to create a new column that adds end of day (i.e. 23:59:59.997) to the EndDate of periods that have already ended, and uses GETDATE() for current periods (e.g. last month has ended, but month to date is current). This way you can have something like

    SELECT b.Period, sales = SUM(a.trn_amt)
    FROM T_TRANSACTION a
    JOIN V_DATE_RANGE b ON a.trn_dt BETWEEN b.BeginDate AND b.EODEndDate
    GROUP BY b.Period

    FM

    And what about those people who are using DATETIME2?

    Good point, Phil.

    I haven't used DATETIME2 yet, so I'm not familiar with its inner workings. Would adding ' 23:59:59.9999999' instead work?

    Adding 1 day to the new column will work if you use < and >= instead of BETWEEN. Functionally it's still the same as using 'DR.EndDate+1', but I would prefer it for clarity. I also believe it would be less error prone; people don't have to remember to add the +1 to all their queries.
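    A minimal sketch of what that might look like (my own illustration; the DayAfterEndDate column name is made up, while the table and view names are the ones from the example above):

    -- In the view, expose the day after the period's end instead of (or alongside) EndDate:
    --     DATEADD(day, 1, EndDate) AS DayAfterEndDate
    -- Then the report query can use a half-open range instead of BETWEEN:
    SELECT b.Period, sales = SUM(a.trn_amt)
    FROM T_TRANSACTION a
    JOIN V_DATE_RANGE b
      ON a.trn_dt >= b.BeginDate
     AND a.trn_dt <  b.DayAfterEndDate
    GROUP BY b.Period;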

    FM

  • Fernando Margueirat-273899 (11/18/2015)


    Thanks for this great tip!

    One small change I would make, to avoid the "DR.EndDate+1" part and still be able to easily link to dates in other tables, is to create a new column that adds end of day (i.e. 23:59:59.997) to the EndDate of periods that have already ended, and uses GETDATE() for current periods (e.g. last month has ended, but month to date is current). This way you can have something like

    SELECT b.Period, sales = SUM(a.trn_amt)
    FROM T_TRANSACTION a
    JOIN V_DATE_RANGE b ON a.trn_dt BETWEEN b.BeginDate AND b.EODEndDate
    GROUP BY b.Period

    FM

    This is another great idea! Thanks for sharing.

    John.

  • Nice, except the Fiscal Year To Date will not return correct results when the fiscal year begins in a different calendar year from the current date. If the current date were 01/01/2016 you would get a date range of 2016-10-01 to 2016-01-01, which is not valid. You would want to change it to something like this.

    SELECT
        CONVERT(varchar(100), 'Fiscal Year To Date') AS [Period]
        ,CASE WHEN DATEPART(month, GETDATE()) < 10
              THEN CONVERT(datetime, '10/1/' + CAST(YEAR(DATEADD(year, -1, GETDATE())) AS varchar(4)))
              ELSE CONVERT(datetime, '10/1/' + CAST(YEAR(GETDATE()) AS varchar(4)))
         END AS BeginDate
        ,CONVERT(datetime, CONVERT(date, GETDATE())) AS EndDate

  • rick.foster (11/18/2015)


    Nice, except the Fiscal Year To Date will not return correct results when the fiscal year begins in a different calendar year from the current date. If the current date were 01/01/2016 you would get a date range of 2016-10-01 to 2016-01-01, which is not valid. You would want to change it to something like this.

    SELECT
        CONVERT(varchar(100), 'Fiscal Year To Date') AS [Period]
        ,CASE WHEN DATEPART(month, GETDATE()) < 10
              THEN CONVERT(datetime, '10/1/' + CAST(YEAR(DATEADD(year, -1, GETDATE())) AS varchar(4)))
              ELSE CONVERT(datetime, '10/1/' + CAST(YEAR(GETDATE()) AS varchar(4)))
         END AS BeginDate
        ,CONVERT(datetime, CONVERT(date, GETDATE())) AS EndDate

    Thanks! I will look into this and submit a revision once I have resolved both this and the previously identified ISO date issue.

  • I remember reading an article that tested this with many thousands of records and there was a material impact. Since then I have always used the more efficient approach as a matter of course.

    In this instance though, I think the cost would be immaterial. It's just not called enough to matter.

    EDIT: I was referring to the performance cost of the date calculations

    Great article, I cannot wait to use the concept! I often need to mix and match different ranges of financial data (monthly, quarterly, etc.) compared with the same figures from the prior year, though not in SSRS but in a proprietary reporting engine.

    What I think I might do is implement it as a TVF rather than a View, so that I can allow the user to enter a specific seed date rather than getdate().
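    As a rough sketch of that idea (the function and parameter names here are my own, and only one period is shown for brevity):

    -- Hypothetical inline table-valued function; @SeedDate replaces GETDATE()
    -- so the caller can build the ranges relative to any date.
    CREATE FUNCTION dbo.fn_DateRange (@SeedDate date)
    RETURNS TABLE
    AS
    RETURN
    (
        SELECT CONVERT(varchar(100), 'Month To Date') AS [Period],
               CONVERT(datetime, DATEFROMPARTS(YEAR(@SeedDate), MONTH(@SeedDate), 1)) AS BeginDate,
               CONVERT(datetime, @SeedDate) AS EndDate
        -- UNION ALL ... one SELECT per period, as in the original view
    );

    -- Usage: SELECT * FROM dbo.fn_DateRange('20151118');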

    Thanks for sharing your idea!

    I just want to chime in with thanks for the great original idea, and to everybody else with ideas to make it even better. I can imagine using this some day and look forward to an improved version of the code to steal. 🙂 On that note, my one suggestion would be to include a header in the sample script with attribution and maybe a URL. I try to respectfully leave a note about the original author in there if I "borrow" code to use elsewhere.

  • Awesome article, thank you very much for sharing this.

    Thanks. This is very adaptable to a broad range of applications, locales, and industries.

  • Jack (11/18/2015)


    I remember reading an article that tested this with many thousands of records and there was a material impact. Since then I have always used the more efficient approach as a matter of course.

    In this instance though, I think the cost would be immaterial. It's just not called enough to matter.

    EDIT: I was referring to the performance cost of the date calculations

    I've been a SQL Server developer for a long time, and I've learned over that time that no harm ever comes from making every bit of code you write run as quickly as reasonably possible.

    There's also a Zen-like satisfaction to be had after writing optimum code, regardless of its immediate impact on performance.

    No one ever got told off because their code ran too fast.

    If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.

  • Phil Parkin (11/19/2015)


    Jack (11/18/2015)


    I remember reading an article that tested this with many thousands of records and there was a material impact. Since then I have always used the more efficient approach as a matter of course.

    In this instance though, I think the cost would be immaterial. It's just not called enough to matter.

    EDIT: I was referring to the performance cost of the date calculations

    I've been a SQL Server developer for a long time, and I've learned over that time that no harm ever comes from making every bit of code you write run as quickly as reasonably possible.

    There's also a Zen-like satisfaction to be had after writing optimum code, regardless of its immediate impact on performance.

    No one ever got told off because their code ran too fast.

    True, but IMHO, unless the improvement in performance is significant, code maintainability takes priority over performance. In most cases, every minuscule saving in the cost of hardware, network bandwidth, user time, etc. goes out the window when you have to pay a coder for an extra week to find and fix a problem, or make an enhancement, in messy code.

    FM

  • Fernando Margueirat-273899 (11/19/2015)


    Phil Parkin (11/19/2015)


    Jack (11/18/2015)


    I remember reading an article that tested this with many thousands of records and there was a material impact. Since then I have always used the more efficient approach as a matter of course.

    In this instance though, I think the cost would be immaterial. It's just not called enough to matter.

    EDIT: I was referring to the performance cost of the date calculations

    I've been a SQL Server developer for a long time, and I've learned over that time that no harm ever comes from making every bit of code you write run as quickly as reasonably possible.

    There's also a Zen-like satisfaction to be had after writing optimum code, regardless of its immediate impact on performance.

    No one ever got told off because their code ran too fast.

    True, but IMHO, unless the improvement in performance is significant, code maintainability takes priority over performance. In most cases, every minuscule saving in the cost of hardware, network bandwidth, user time, etc. goes out the window when you have to pay a coder for an extra week to find and fix a problem, or make an enhancement, in messy code.

    FM

    Hence my use of the phrase 'reasonably possible'. Not 'at all costs'.

    And if you think that my proposed amendments to the code in this thread are more complex or difficult to maintain than the originals, I'm afraid I disagree.

    If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.

  • One advantage of the varchar solution is that, if it were converted to use ISO date formats, the end dates could have 'T23:59:59.997' appended so the date ranges actually include the end date rather than excluding it. Of course, that fractional second assumes datetime, which should be a big no-no for any columns created in a version that supports datetime2; and with datetime2 you have to match the number of 9s after the decimal point to the precision of the datetime2 column, which is another issue entirely.
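    A quick illustration of the precision point (my own example, not from the article): the end-of-day literal has to match the column type.

    DECLARE @EndOfDay_dt  datetime     = '2015-12-31T23:59:59.997';     -- highest datetime value on that day
    DECLARE @EndOfDay_dt2 datetime2(7) = '2015-12-31T23:59:59.9999999'; -- highest datetime2(7) value on that day
    SELECT @EndOfDay_dt AS EndOfDay_datetime, @EndOfDay_dt2 AS EndOfDay_datetime2;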

  • John Meyer-239467 (11/20/2015)


    One advantage of the varchar solution is that, if it were converted to use ISO date formats, the end dates could have 'T23:59:59.997' appended so the date ranges actually include the end date rather than excluding it. Of course, that fractional second assumes datetime, which should be a big no-no for any columns created in a version that supports datetime2; and with datetime2 you have to match the number of 9s after the decimal point to the precision of the datetime2 column, which is another issue entirely.

    My preference is to make the end date 'midnight on the day after the final day of the period' and then use < in any comparisons.

    For example, for 2015

    DECLARE @StartDate date = '20150101';   -- first day of the period
    DECLARE @EndDate   date = '20160101';   -- first day of the *next* period

    SELECT *
    FROM SomeTable                           -- placeholder table and column names
    WHERE SomeDate >= @StartDate
      AND SomeDate <  @EndDate;

    Works with all date data types and keeps things clean.

    If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.
