giving error for the calculation

  • I am using SQL Server 2008.

    I am writing the query with CTEs. Its structure is like the one below:

    ;with
    cteA as
    (
        select * from tableA
    ),
    cteB as
    (
        select A.*, (other derived column based on cteA) from cteA as A
    ),
    cteC as
    (
        select B.*, (other derived column based on cteB) from cteB as B
    )
    select * from cteC

    Now in one of the inner CTEs I am getting an error; below is the error I am getting:

    Msg 8632, Level 17, State 2, Line 2

    Internal error: An expression services limit has been reached. Please look for potentially complex expressions in your query, and try to simplify them.

    Below is the query I am using. I don't have the option of storing that calculation in a temp table or anywhere else. I searched Google and found that I have to rewrite the query without the complex calculation, but I don't have any choice; I can't break the query apart.

    cteK as
    (
        select  J.*,
                case when (limits <> 0 or limits is not null) then
                    (case when limits = 1000000  then (m_1a)
                          when limits = 2000000  then (m_1a + m_2a)
                          when limits = 3000000  then (m_1a + m_2a + m_3a)
                          when limits = 4000000  then (m_1a + m_2a + m_3a + m_4a)
                          when limits = 5000000  then (m_1a + m_2a + m_3a + m_4a + m_5a)
                          when limits = 6000000  then (m_1a + m_2a + m_3a + m_4a + m_5a + m_6a)
                          when limits = 7000000  then (m_1a + m_2a + m_3a + m_4a + m_5a + m_6a + m_7a)
                          when limits = 8000000  then (m_1a + m_2a + m_3a + m_4a + m_5a + m_6a + m_7a + m_8a)
                          when limits = 9000000  then (m_1a + m_2a + m_3a + m_4a + m_5a + m_6a + m_7a + m_8a + m_9a)
                          when limits = 10000000 then (m_1a + m_2a + m_3a + m_4a + m_5a + m_6a + m_7a + m_8a + m_9a + m_10a)
                          when limits = 11000000 then (m_1a + m_2a + m_3a + m_4a + m_5a + m_6a + m_7a + m_8a + m_9a + m_10a + m_11a)
                          when limits = 12000000 then (m_1a + m_2a + m_3a + m_4a + m_5a + m_6a + m_7a + m_8a + m_9a + m_10a + m_11a + m_12a)
                          when limits = 13000000 then (m_1a + m_2a + m_3a + m_4a + m_5a + m_6a + m_7a + m_8a + m_9a + m_10a + m_11a + m_12a + m_13a)
                          when limits = 14000000 then (m_1a + m_2a + m_3a + m_4a + m_5a + m_6a + m_7a + m_8a + m_9a + m_10a + m_11a + m_12a + m_13a + m_14a)
                          when limits = 15000000 then (m_1a + m_2a + m_3a + m_4a + m_5a + m_6a + m_7a + m_8a + m_9a + m_10a + m_11a + m_12a + m_13a + m_14a + m_15a)
                          when limits = 16000000 then (m_1a + m_2a + m_3a + m_4a + m_5a + m_6a + m_7a + m_8a + m_9a + m_10a + m_11a + m_12a + m_13a + m_14a + m_15a + m_16a)
                          when limits = 17000000 then (m_1a + m_2a + m_3a + m_4a + m_5a + m_6a + m_7a + m_8a + m_9a + m_10a + m_11a + m_12a + m_13a + m_14a + m_15a + m_16a + m_17a)
                          when limits = 18000000 then (m_1a + m_2a + m_3a + m_4a + m_5a + m_6a + m_7a + m_8a + m_9a + m_10a + m_11a + m_12a + m_13a + m_14a + m_15a + m_16a + m_17a + m_18a)
                          when limits = 19000000 then (m_1a + m_2a + m_3a + m_4a + m_5a + m_6a + m_7a + m_8a + m_9a + m_10a + m_11a + m_12a + m_13a + m_14a + m_15a + m_16a + m_17a + m_18a + m_19a)
                          when limits = 20000000 then (m_1a + m_2a + m_3a + m_4a + m_5a + m_6a + m_7a + m_8a + m_9a + m_10a + m_11a + m_12a + m_13a + m_14a + m_15a + m_16a + m_17a + m_18a + m_19a + m_20a)
                     end)
                end as Curr_Limit
        from cteJ as J
    )
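    For reference, the same cumulative logic can also be written as a sum of per-column case terms instead of one twenty-branch case, which keeps the expression tree much flatter (fewer and much smaller branches, which is what Msg 8632 complains about). This is only a sketch against the column names shown above, untested, and it assumes limits only ever holds exact multiples of 1000000, as the branches above imply:

    ```sql
    -- Sketch: one short CASE term per column instead of one branch per limit value.
    -- Assumes limits is always an exact multiple of 1000000 (as the query above implies);
    -- for other values the two forms differ (the original returns NULL).
    select J.*,
           case when limits >= 1000000 then
                m_1a
                + case when limits >= 2000000  then m_2a  else 0 end
                + case when limits >= 3000000  then m_3a  else 0 end
                + case when limits >= 4000000  then m_4a  else 0 end
                + case when limits >= 5000000  then m_5a  else 0 end
                + case when limits >= 6000000  then m_6a  else 0 end
                + case when limits >= 7000000  then m_7a  else 0 end
                + case when limits >= 8000000  then m_8a  else 0 end
                + case when limits >= 9000000  then m_9a  else 0 end
                + case when limits >= 10000000 then m_10a else 0 end
                + case when limits >= 11000000 then m_11a else 0 end
                + case when limits >= 12000000 then m_12a else 0 end
                + case when limits >= 13000000 then m_13a else 0 end
                + case when limits >= 14000000 then m_14a else 0 end
                + case when limits >= 15000000 then m_15a else 0 end
                + case when limits >= 16000000 then m_16a else 0 end
                + case when limits >= 17000000 then m_17a else 0 end
                + case when limits >= 18000000 then m_18a else 0 end
                + case when limits >= 19000000 then m_19a else 0 end
                + case when limits >= 20000000 then m_20a else 0 end
           end as Curr_Limit
    from cteJ as J
    ```

    Whether this stays under the expression-services limit in the full query would need to be tested.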

  • I think the root cause is a CTE definition issue (or a design issue in general).

    Instead of adding up various columns based on a certain limits value, you should normalize (unpivot) the CTE and perform a rather standard aggregation.

    I think the basic concept of the CTE needs to be revised, since otherwise you'd need to rewrite the query just because a limit of 21000000 needs to be covered.

    Based on the limited information available so far there's little we can do, I guess.



    Lutz
    A pessimist is an optimist with experience.

    How to get fast answers to your question
    How to post performance related questions
    Links for Tally Table, Cross Tabs and Dynamic Cross Tabs, Delimited Split Function

  • Thanks for the response, Lutz.

    I am still stuck on that problem.

    I am not adding a separate column based on the limit; based on the limit I am just getting the value for Curr_Limit.

    I tried so many different ways but I am still getting the same error.

    I rewrote the same query using a subquery but got the same error.

  • Please provide the table definition and sample data of the source table as described in the first article referenced in my signature. Also, please post your expected result based on your sample data.

    Based on that we might be able to help you (at least regarding a general concept).



    Lutz

  • It looks to me like what you're looking for is something like this:

    IF OBJECT_ID(N'tempdb..#T') IS NOT NULL
        DROP TABLE #T ;
    IF OBJECT_ID(N'tempdb..#Rules') IS NOT NULL
        DROP TABLE #Rules ;

    CREATE TABLE #T
        (
          ID INT IDENTITY PRIMARY KEY ,
          Limits BIGINT ,
          m_1a INT ,
          m_2a INT ,
          m_3a INT
        ) ;

    INSERT INTO #T ( Limits, m_1a, m_2a, m_3a )
            SELECT 2000000, 1, 2, 4
            UNION ALL
            SELECT 3000000, 1, 2, 4 ;

    CREATE TABLE #Rules
        (
          Limits BIGINT NOT NULL ,
          Measure VARCHAR(10) NOT NULL ,
          PRIMARY KEY ( Limits, Measure )
        ) ;

    INSERT INTO #Rules ( Limits, Measure )
            SELECT 2000000, 'm_1a'
            UNION ALL
            SELECT 2000000, 'm_2a'
            UNION ALL
            SELECT 3000000, 'm_3a' ;

    ;WITH Unpiv
    AS ( SELECT *
         FROM ( SELECT * FROM #T ) p
         UNPIVOT ( Vals FOR Measures IN ( m_1a, m_2a, m_3a ) ) AS unpvt -- Add all your columns here
       )
    SELECT Unpiv.Limits ,
           SUM(Unpiv.Vals) AS TotalMeasureVals
    FROM Unpiv
    INNER JOIN #Rules AS Rules
            ON Unpiv.Limits >= Rules.Limits
           AND Unpiv.Measures = Rules.Measure
    GROUP BY Unpiv.Limits ;

    The way I worked this was by creating a smaller version of your dataset, which I put in temp table #T. You have more columns, and I'm guessing they're "measures" of some sort, but what you want is to turn those into rows. UNPIVOT will do that for you quite efficiently. It would be better if the data were already stored that way in your database, but I'm guessing that's not an option.

    All you need to do to get the Unpivot to work correctly is add all the columns to the list, on the row indicated, and (of course) use your dataset instead of #T.

    Then the key is the #Rules table. That will be a representation of the complexity of your calculation. What I did is make it so each Limits value adds one more column to it. To add Limits 4-million, all you'd need to do is add one row with that in the Limits column and "m_4a" in the Measure column, and so on.
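    For illustration, covering the 4-million limit under this scheme would just be one more row in the #Rules table from the script above (assuming an m_4a column also exists in the data and in the UNPIVOT column list):

    ```sql
    -- Sketch: rows with Limits >= 4000000 will now also pick up the m_4a measure,
    -- because the join condition is Unpiv.Limits >= Rules.Limits.
    INSERT INTO #Rules ( Limits, Measure )
    SELECT 4000000, 'm_4a' ;
    ```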

    I think that'll get you what you want, without error messages.

    Note that this won't work at all in SQL 2000 or earlier. It requires 2005 or later. Based on the forum you posted in, I'm assuming you're using a version it will work in. If not, it can still be done, just not as easily and efficiently.

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon

  • Thanks, Lutz.

    My query goes more than 10 layers down in the hierarchy, and if I made a sample query it would be useless.

    I can't post the real query. It has company data.

    Is there any way that I can send you a private message?

    Thanks again

  • krishusavalia (3/7/2011)


    Thanks, Lutz.

    My query goes more than 10 layers down in the hierarchy, and if I made a sample query it would be useless.

    I can't post the real query. It has company data.

    Is there any way that I can send you a private message?

    Thanks again

    You don't have to post company data or the actual query.

    What we need to see is the basic concept.

    So, instead of ten layers, include just three. Instead of 30-plus columns, include just five or ten.

    Instead of company data, use some fake data.

    We need to see and understand the concept you're using currently as well as the goal you're trying to achieve. Maybe Gus is already close to what you're looking for...

    Regarding the private message: that would be consulting. I think in that case you might be better off looking for someone local.



    Lutz


  • Hey G,

    Your solution is awesome. It was interesting and I learned something new.

    But in my scenario, I have more than 2 million rows, and for every row I have to check the limit and compute the TotalMeasureVals. So it's giving me bad performance.

    Do you have any alternative for that?

  • krishusavalia (3/9/2011)


    Hey G,

    Your solution is awesome. It was interesting and I learned something new.

    But in my scenario, I have more than 2 million rows, and for every row I have to check the limit and compute the TotalMeasureVals. So it's giving me bad performance.

    Do you have any alternative for that?

    At the very least, I'd have to see the data structure (table definitions) and some sample data before I could begin to tune performance.
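    That said, one pattern that often helps at that scale is to materialize the unpivoted rows once into an indexed temp table, so the join to #Rules can be supported by an index seek instead of re-unpivoting 2-million-plus rows on every run. A sketch against the sample tables from earlier in the thread (untested without the real schema and data):

    ```sql
    -- Sketch: materialize the unpivoted measures once instead of unpivoting per query.
    SELECT unpvt.ID, unpvt.Limits, unpvt.Measures, unpvt.Vals
    INTO #UnpivData
    FROM ( SELECT * FROM #T ) p
    UNPIVOT ( Vals FOR Measures IN ( m_1a, m_2a, m_3a ) ) AS unpvt ;

    -- Index chosen to match the join predicate: equality on Measures, range on Limits.
    CREATE CLUSTERED INDEX IX_UnpivData ON #UnpivData ( Measures, Limits ) ;

    SELECT u.Limits ,
           SUM(u.Vals) AS TotalMeasureVals
    FROM #UnpivData AS u
    INNER JOIN #Rules AS r
            ON u.Measures = r.Measure
           AND u.Limits >= r.Limits
    GROUP BY u.Limits ;
    ```

    Whether the indexing cost pays off depends on how often the aggregation runs against the same materialized data.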

    - Gus "GSquared"

  • GSquared (3/9/2011)


    krishusavalia (3/9/2011)


    Hey G,

    Your solution is awesome. It was interesting and I learned something new.

    But in my scenario, I have more than 2 million rows, and for every row I have to check the limit and compute the TotalMeasureVals. So it's giving me bad performance.

    Do you have any alternative for that?

    At the very least, I'd have to see the data structure (table definitions) and some sample data before I could begin to tune performance.

    +1



    Lutz
