File checksum

  • I'm working on a script that will import data from a flat file. The file will be placed manually into a network directory each day, and a scheduled job will trigger a DTS package to import data from it. How can I run a file checksum from a DTS package to make sure the file is not a duplicate? Or are there other ways of identifying a duplicate file using SQL Server 2000?

  • If you're worried about picking up the same file two days in a row, the best and easiest option would be to move the file to an archive location once it has been processed.

    A more complicated method would be to store things like the file size, date last updated, and date created, then compare these against the incoming file before you load it.
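
    A minimal sketch of that check in an ActiveX Script Task (VBScript). The file path and the LastFileFingerprint global variable are placeholders, not part of the original suggestion; a small tracking table is more robust than a global variable, since global variables only persist between runs if the package is re-saved.

    Option Explicit

    Function Main()
        Dim fso, f, sFingerprint

        ' Read the incoming file's attributes (path is an assumption -- adjust it)
        Set fso = CreateObject("Scripting.FileSystemObject")
        Set f   = fso.GetFile("\\server\share\daily.txt")

        ' Combine size, created and modified dates into one fingerprint string
        sFingerprint = f.Size & "|" & f.DateCreated & "|" & f.DateLastModified

        ' Compare with the fingerprint kept from the previous run in a package
        ' global variable named LastFileFingerprint
        If CStr(DTSGlobalVariables("LastFileFingerprint").Value) = sFingerprint Then
            Main = DTSTaskExecResult_Failure   ' same file as last time -- skip the load
        Else
            DTSGlobalVariables("LastFileFingerprint").Value = sFingerprint
            Main = DTSTaskExecResult_Success
        End If
    End Function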

     

    --------------------
    Colt 45 - the original point and click interface

  • Do you know how to check the size, date created, and date last updated from a DTS package?

  • Ella,

    Take a look at this site - it has examples for working with files.

    http://www.sqldts.com/default.aspx?292

     

    Good Luck,

    Darrell

  • If you need a file checksum, there is a Win32 console app that will calculate MD5 thumbprints for you:

    http://www.fourmilab.ch/md5/
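
    One way to use it from the package is an ActiveX Script Task that shells out to md5.exe and checks the result against a tracking table of files already loaded. A sketch under assumed names: the md5.exe location, the file path, the connection string, and the dbo.LoadedFiles table are all placeholders.

    Option Explicit

    Function Main()
        Dim shell, exec, sChecksum, conn, rs

        ' Run md5.exe against the incoming file and capture its output
        Set shell = CreateObject("WScript.Shell")
        Set exec  = shell.Exec("C:\tools\md5.exe \\server\share\daily.txt")

        ' An MD5 digest in hex is 32 characters; take them from the start of the output
        sChecksum = Left(Trim(exec.StdOut.ReadAll), 32)

        ' Look the checksum up in a tracking table of files already loaded
        Set conn = CreateObject("ADODB.Connection")
        conn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
                  "Initial Catalog=MyDB;Integrated Security=SSPI;"
        Set rs = conn.Execute("SELECT 1 FROM dbo.LoadedFiles " & _
                              "WHERE Checksum = '" & sChecksum & "'")

        If Not rs.EOF Then
            ' Duplicate file -- fail the task so the import steps do not run
            Main = DTSTaskExecResult_Failure
        Else
            conn.Execute "INSERT dbo.LoadedFiles (Checksum, LoadedOn) " & _
                         "VALUES ('" & sChecksum & "', GETDATE())"
            Main = DTSTaskExecResult_Success
        End If

        conn.Close
    End Function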

     

    hth jg

  • Thank you, everybody. That was very useful info.

