April 2, 2013 at 11:20 am
I am trying to make improvements to a job that does the following.
1) Load a table with about 40,000 rows (straightforward, so there isn't much to change here)
2) Call a stored proc that joins this table's data with a large data set on a linked server. This proc reads rows from the above table (where one of the columns is flagged '0'), does a lot of processing on them, and sets the flag to '1'
3) This stored proc takes a parameter for the number of rows to process per call. Currently the call is made with @rowsToProcess = 250. The call looks like this:
WHILE EXISTS (SELECT 1 FROM MyTable WHERE MyFlag = 0)
BEGIN
EXEC dbo.MyProc
@Server = 'MYREMOTESQLSERVER',
@DBName = 'MYRemoteDBN',
@rowsToProcess = 250
END
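One thing to watch with that loop: if the proc ever fails to flag any rows (an error, or rows it can't process), the EXISTS check stays true and the loop spins forever. A guarded variant, a sketch assuming the same proc signature shown above, stops when a batch makes no progress:

```sql
-- Sketch: same call as above, but bail out if a batch
-- leaves the un-flagged row count unchanged.
DECLARE @remaining INT = (SELECT COUNT(*) FROM MyTable WHERE MyFlag = 0);
DECLARE @previous  INT = @remaining + 1;

WHILE @remaining > 0 AND @remaining < @previous
BEGIN
    EXEC dbo.MyProc
        @Server        = 'MYREMOTESQLSERVER',
        @DBName        = 'MYRemoteDBN',
        @rowsToProcess = 250;

    SET @previous  = @remaining;
    SET @remaining = (SELECT COUNT(*) FROM MyTable WHERE MyFlag = 0);
END
```

The extra COUNT(*) per batch is cheap against a 40,000-row local table compared to the linked-server round trips.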
Can anyone suggest a better way to run this more efficiently? For now, I only have room to play with how this proc is called to process all rows in "MyTable". In other words, I can't influence how the data is organized/arranged in the tables on the linked server.
Thanks in advance,
Rex
April 3, 2013 at 8:49 am
Can you put the procedure on the remote server and call it from your primary? That has worked for me in some scenarios.
EXEC RemoteServer.RemoteDB.dbo.MyProc
April 3, 2013 at 10:07 am
I can't do that, unfortunately.
-Rex
April 3, 2013 at 10:26 am
If I understand correctly, the procedure is called multiple times until all rows are processed. Every time it's called it needs to join with the large remote data set, so I would think that remote data set is being dragged over the network on every call.
Can you pull the remote data set back to the local server into a temporary or permanent table? If the data is staged locally then your joins will be much faster.
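For example, something like this could stage the remote rows once up front (the remote table and column names here are hypothetical, since the thread doesn't show them; only the server and database names come from your call):

```sql
-- Sketch: pull the remote rows you actually need into a local
-- temp table once, so later joins run against local data.
IF OBJECT_ID('tempdb..#RemoteStage') IS NOT NULL
    DROP TABLE #RemoteStage;

SELECT r.KeyCol, r.OtherCol    -- hypothetical columns
INTO   #RemoteStage
FROM   MYREMOTESQLSERVER.MYRemoteDBN.dbo.RemoteTable AS r    -- hypothetical table
WHERE  r.KeyCol IN (SELECT KeyCol FROM MyTable WHERE MyFlag = 0);
```

If the four-part-name query still drags too much data across, OPENQUERY can push the filtering to the remote server so only matching rows travel over the network.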
My other thought is to call the procedure once and process all the data in one pass. Is the @rowsToProcess parameter an attempt to address a performance issue?