Writing to a File Using the sp_OACreate Stored Procedure and OSQL: The

• I'm glad there are people like you to do this stuff for us - nice article.

A comment about the VB code. I think it's unnecessary to have the If statement to determine what to write to the file. As with the other approaches, just write the Start Time, then iterate through the loop, then write the End Time. I'm not sure what sort of performance difference you'd see, but I think it would be a fairer test.

• On our system we get out-of-memory errors when we use code that utilizes sp_OACreate and the other associated extended stored procedures. The MemToLeave area seems to get fragmented, and the fix is stopping and starting SQL Server to reallocate the virtual memory.

  • Be very careful with how you interpret the article’s conclusions!

     

    Real world scenario:  I am currently having to take narrative information stored in a SQL Server 2000 db and convert it into a text document.  The narratives are stored with a header record and detail records (one line of narrative per detail record, many detail records per header record).  I have to use the header record to create a report title, then add the narrative.

     

To accomplish this (in general), I create a title from the header record and store it to a new text file. I then use a query to collect the detail information and append it to the text file. (See the sketch at the end of this post.)

     

    I have to generate roughly 3 million text files this way.

     

    Using the sp_OACreate (and related) procedures, I start out generating files at the rate of 5000-6000 per minute, with spikes as high as 8000/minute.  This quickly degrades so that 30 minutes later I’m down to about 1100/minute, and 30 minutes after that I’m down to 600/min.  Eventually I get to the point where I’m only generating about one file per second.

     

    Using OSQL, the results are more consistent but far less dramatic.  I plod along at about 300/minute.  To generate 3 million documents at this rate is going to take a week!

     

    Any suggestions or thoughts?
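
    [Editor's note] A minimal sketch of the create-then-append pattern described above, using the sp_OA procedures. This is an illustration with hypothetical file names and literal text, not the poster's actual code:

    DECLARE @fso int, @file int, @hr int
    -- Create a Scripting.FileSystemObject instance
    EXEC @hr = sp_OACreate 'Scripting.FileSystemObject', @fso OUT
    -- Open (create) the text file for writing: 2 = ForWriting, 1 = create if missing
    EXEC @hr = sp_OAMethod @fso, 'OpenTextFile', @file OUT, 'C:\narratives\report1.txt', 2, 1
    -- Write the report title built from the header record
    EXEC @hr = sp_OAMethod @file, 'WriteLine', NULL, 'Report Title From Header'
    -- Append the narrative lines (in practice, inside a loop over the detail records)
    EXEC @hr = sp_OAMethod @file, 'WriteLine', NULL, 'One line of narrative text'
    -- Close the file and release both COM objects; leaked objects consume MemToLeave memory
    EXEC @hr = sp_OAMethod @file, 'Close'
    EXEC @hr = sp_OADestroy @file
    EXEC @hr = sp_OADestroy @fso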

• Thanks for sharing your research on this. Just wanted to point out one typo... In your summary and graph you list test 2 (OSQL - From Query Analyzer) at 32.6 seconds, but in your notes you list it at 5.86 seconds.

    David

• I am really glad that the conversation on this topic is continuing and that people are pointing out differences in results, and yes, my typos too.

Looking back at when I wrote this article (it was my first), I realize that I had a limited understanding of the sp_OA procedures and that I didn't know how to use them appropriately.

I guess I started out just sharing what I was doing, then I tried to prove myself right. If I had it to do over again, I would have looked for better options (trust me, there are better methods) than the ones that I came up with and let the tests prove themselves out.

There have been a lot of good suggestions and findings in the comments on this article, and I'm glad to see that the discussions continue. More than anything, I'm glad that my conclusions didn't finalize anything, but that people shared their ideas and findings with the community too.

    -Zach



    --------------------------
    Zach

    Odds_And_Ends Blog

• Todd, let me make sure that I understand your scenario before I post an answer. Am I right in saying that you have Parent table A, which can or does have multiple Child table B records? Each Parent table A record requires that a single file be created to contain...

    1. A single Parent table A record and...
    2. Multiple Child table B records for the Parent A record.

    Are you saying there are around 3 million of these parent records?

I think that the slowdown in your file creation rate "sounds" like MemToLeave or other memory pressure.

    Can you correct or confirm my summation of the issue before we go on?

    Thanks,

    Zach



    --------------------------
    Zach

    Odds_And_Ends Blog

  • Zach, you have this correct.

     

To flesh out the scenario a bit, this is a conversion process from one application to another involving as much as 10-15 years' worth of data. In the original application, every time a real-life event happened (which could be tens or hundreds of times per day), a report (or narrative) was written. These narratives vary in length from one 70-character line to as many as 1000 70-character lines.

     

    The database in the new application is designed differently, so these old narratives are converted to text documents and then “attached” to the converted event in the new application.  This allows users to see all the old information, especially when the old info doesn’t fit well into the new database.

     

    This conversion process will happen when the customer goes live on the new application.  They will stop the old system, the data will be converted, and the customer will go live on the new system that now contains their converted data.  The text documents don’t have to be generated exactly within that window, but will come very shortly thereafter.  So we are attempting to create these documents as quickly as possible.

     

    For the particular customer I am currently helping, they have roughly 3 million events for which the narratives have to be converted to text files.

     

    Any thoughts/suggestions are welcome.

     

    I agree that this sounds like some sort of memory-pressure issue.  I'm working to diagnose 1) exactly what the cause is, and 2) how I can fix it/work around it.

  • Hi Zach,

It's worth pointing out to your readers that using OLE Automation in T-SQL not only has performance issues, but also real drawbacks when it comes to security. For example, you need to be a member of the sysadmin role to execute these SPs. Bart Duncan wrote a good paper on the topic at http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnsqldev/html/sqldev_10182004.asp. In addition, it is very easy to cause memory leaks in the server that could impact the server's memory space.

    Thank you,

    Bill Ramos

    Lead Program Manager

    SQL Server Manageability

• Very good summary of methods, but you forgot the oldest one in the book. We routinely use CMD calls to write files, and surprisingly, using the ECHO command is just as fast for writing any amount of data necessary. Something like:

Declare @cmd varchar(255), @data varchar(1000), @append varchar(2), @path varchar(255), @filename varchar(255), @output varchar(2000)

    Set @cmd = 'echo'

    Set @data = 'some sort of data'

    Set @append = '>>'  -- to create a file

    Set @append = '>'  -- to append a file

Set @path = '\\someservershare\share\'  -- or a local path such as 'c:\localpath\localdir\' (note the trailing backslash, since @path and @filename are concatenated)

    Set @filename = 'somefilename'

    Set @output = @cmd + ' ' + @data + ' ' + @append + ' ' + @path + @filename

    EXEC master..xp_cmdshell @output

There are other techniques to create an empty file using the echo command, and dozens of other things you can do with echo. Plus, if you wrap the whole thing in parameters in a stored procedure (sketched below), you can pass in other commands and do most any DOS command you want. Of course, be sure the security is set correctly so it can't be abused. This method outputs about a 1.5MB text file in about 30 seconds.
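
    [Editor's note] A minimal sketch of that wrapper idea. The procedure and parameter names are illustrative, not from the original post, and it uses the conventional cmd.exe redirection semantics (> creates or overwrites, >> appends):

    CREATE PROCEDURE dbo.usp_EchoToFile
        @data     varchar(1000),
        @path     varchar(255),
        @filename varchar(255),
        @append   bit = 0  -- 0 = create/overwrite the file, 1 = append to it
    AS
    BEGIN
        DECLARE @redirect varchar(2), @output varchar(2000)
        SET @redirect = CASE WHEN @append = 1 THEN '>>' ELSE '>' END
        -- Build the full shell command and hand it to xp_cmdshell
        SET @output = 'echo ' + @data + ' ' + @redirect + ' ' + @path + @filename
        EXEC master..xp_cmdshell @output
    END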


  • Set @append = '>>'  -- to create a file

    Set @append = '>'  -- to append a file


I'm thinking the above are backwards... >> is the append symbol and > is the create symbol in DOS (Cmd Window)...
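
    For clarity, the corrected assignments would read:

    Set @append = '>'   -- to create (or overwrite) a file
    Set @append = '>>'  -- to append to a file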

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.
    "Change is inevitable... change for the better is not".

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
    Intro to Tally Tables and Functions

• I just tried the first osql command you wrote in SQL 2005 and replaced osql with sqlcmd.

    osql command runs in 7 seconds

sqlcmd runs in 2 minutes 26 seconds

SQL 2005 docs say that sqlcmd is the replacement for osql. Can this be right? Obviously they are structured very differently, but wouldn't you expect that continuity of performance might be desirable?
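
    [Editor's note] For anyone wanting to reproduce the comparison, a representative pair of invocations. The server, query, and output file names here are illustrative; the article's actual command is not reproduced in this post:

    osql   -S MyServer -E -Q "SELECT name FROM master..sysdatabases" -o C:\temp\osql_out.txt
    sqlcmd -S MyServer -E -Q "SELECT name FROM master..sysdatabases" -o C:\temp\sqlcmd_out.txt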


    Regards,

    Steve

    Life without beer is no life at all

    All beer is good, some beers are just better than others

  • Heh... let's hear it for progress... pay more, get less.  Gotta love that

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.
    "Change is inevitable... change for the better is not".

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
    Intro to Tally Tables and Functions

  • 11/13/2006

     

    I have discovered the problem with my text file generation degradation.

     

It all comes back to the file system. As more and more files accumulate in a single folder, each subsequent file creation gets slower. By moving to a new folder, the file creation rate picks right back up.

     

    In my final test scenario I did the following:

     

1. Using sp_OACreate, sp_OAMethod, and sp_OADestroy, I generated 75000 text files into test folder 1.

2. Using sp_OACreate, sp_OAMethod, and sp_OADestroy, I generated 75000 text files into test folder 2, subfolder K, where K was incremented every 10000 files.

3. Using sp_OACreate, sp_OAMethod, and sp_OADestroy, I generated 75000 text files into test folder 3.

     

    Step 1 ran in 25 minutes with the file generation degrading from 16800/min down to 600/min.

    Step 2 ran in 5 minutes with no noticeable degradation, averaging about 17000/min.

    Step 3 ran in 24 minutes with the file generation degrading from 18000/min down to 350/min.

     

    So I will include multiple subfolders in my file generation process.
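
    [Editor's note] A minimal sketch of that bucketing idea (the folder name and variable are illustrative): derive the subfolder from a running file counter so that each folder receives at most 10000 files.

    DECLARE @fileNumber int, @subFolder varchar(300)
    SET @fileNumber = 123456  -- running count of files generated so far
    -- Integer division buckets files 0-9999 into folder 0, 10000-19999 into folder 1, and so on
    SET @subFolder = 'C:\narratives\batch' + CAST(@fileNumber / 10000 AS varchar(10)) + '\'
    -- @subFolder is now 'C:\narratives\batch12\'; create it first (e.g. with
    -- FileSystemObject's CreateFolder via sp_OAMethod) before writing files into it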

     

     

11/15/2006

     

    I modified my process to include subfolders and ran a partial test.  446,000 documents were generated in 68 minutes, giving an average rate of over 6500 documents generated per minute.

     

  • Additional note: The 446,000 documents totaled 274MB in size.

  • I modified a stored procedure that moved DTS packages from one server to another.  It is from http://www.dbazine.com/sql/sql-articles/larsen8

It would stop after moving 255 packages, every time, no matter how the packages were sorted.

    It would always stop at sp_OAMethod @object, 'LoadFromSQLServer'.  Make of it what you will. Smells like a memory leak.
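
    [Editor's note] When an sp_OAMethod call starts failing like this, it helps to capture the OLE Automation error rather than just watching the procedure stop. A minimal diagnostic sketch (variable names are illustrative; the LoadFromSQLServer parameters are whatever the original procedure passes):

    DECLARE @hr int, @src varchar(255), @desc varchar(255)
    -- Capture the HRESULT from the failing call:
    -- EXEC @hr = sp_OAMethod @object, 'LoadFromSQLServer', NULL, <parameters as before>
    IF @hr <> 0
    BEGIN
        -- Retrieve the OLE error source and description for the failed call
        EXEC sp_OAGetErrorInfo @object, @src OUT, @desc OUT
        PRINT 'LoadFromSQLServer failed: ' + ISNULL(@src, '') + ' - ' + ISNULL(@desc, '')
    END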
