Dynamically set Text delimiters

  • I am trying to complete a DTS package that can handle the import of several different text files. They all contain the same general information, but they have different column orders, delimiters, text qualifiers, etc. I am trying to store the delimiters in a table so I can set them dynamically at run-time. However, I have been having difficulty setting non-printable delimiters (such as {CR}{LF}) from a global variable. For example:

    cnTextFile.ConnectionProperties("Row Delimiter").Value = vbCrLf

    vs.

    cnTextFile.ConnectionProperties("Row Delimiter").Value = DTSGlobalVariables("gv_RowDelimiter").Value 

    where the Global Variable is set to vbCrLf.

    The first one works, the second doesn't. My only guess is that when it's set through a variable it is treated as a literal string, but when you set it manually the vbCrLf constant is used appropriately. Does anyone have any suggestions on how to do this?
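
    A quick way to check that guess from an ActiveX Script task would be something like this (gv_RowDelimiter is the same global variable as above):

    ' If the designer stored the literal text "vbCrLf", Len() returns 6;
    ' the real vbCrLf constant is only two characters, Chr(13) & Chr(10).
    Dim sDelim
    sDelim = DTSGlobalVariables("gv_RowDelimiter").Value
    MsgBox "Len = " & Len(sDelim) & "  Value = [" & sDelim & "]"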

    Thanks in advance

  • Doing a quick test: if you set your global variable to {CR}{LF} rather than the VB equivalent and use the Dynamic Properties task, you should be able to set it that way. I changed the global variable to the various appropriate delimiters, and it seemed to work. I did not test an actual transformation after setting the delimiter, but you can handle that.

    Lee

  • Thanks for the reply. I have tried that, and the results are the same.

    Since I posted, I have found one way to make it work. By using the Eval function on the global variable, it was apparently converted into an actual vbCrLf that was usable in the transformation. I had to do this in an ActiveX Script task, as I don't see a way to use such expressions in a Dynamic Properties task.

    The only problem is that some of my delimiters are literal text (such as |) and some are tabs, carriage returns, etc. It seems I will have to deal with these two types differently, as Eval cannot be used on the text delimiters.
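
    One way to handle both cases in one place (just a sketch, not fully tested) would be to skip Eval entirely and translate known tokens in the ActiveX Script task, letting anything unrecognised through as a literal delimiter. The token spellings, the connection name, and the gv_ColDelimiter variable below are just my own conventions:

    Option Explicit

    ' Translate a delimiter token stored in the delimiter table / global
    ' variable into the actual character(s) DTS needs. Anything not
    ' recognised (e.g. "|") is returned unchanged as a literal delimiter.
    Function ResolveDelimiter(sToken)
        Select Case UCase(Trim(sToken))
            Case "{CR}{LF}", "VBCRLF"
                ResolveDelimiter = vbCrLf
            Case "{CR}", "VBCR"
                ResolveDelimiter = vbCr
            Case "{LF}", "VBLF"
                ResolveDelimiter = vbLf
            Case "{TAB}", "VBTAB"
                ResolveDelimiter = vbTab
            Case Else
                ResolveDelimiter = sToken    ' literal delimiter such as |
        End Select
    End Function

    Function Main()
        Dim oPkg, oConn
        Set oPkg  = DTSGlobalVariables.Parent
        Set oConn = oPkg.Connections("TextFileSource")    ' hypothetical connection name
        oConn.ConnectionProperties("Row Delimiter").Value = _
            ResolveDelimiter(DTSGlobalVariables("gv_RowDelimiter").Value)
        oConn.ConnectionProperties("Column Delimiter").Value = _
            ResolveDelimiter(DTSGlobalVariables("gv_ColDelimiter").Value)    ' gv_ColDelimiter is hypothetical
        Main = DTSTaskExecResult_Success
    End Function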

  • Glad to see you found a way to do it!  Never really had to change the delim before!!!  Thank goodness.

  • I had a similar situation when I worked for a previous employer, but I approached the problem in a different manner.

    We had about 60 different customers that provided us data extracts that we were to import so that the data could be audited by the company auditors.

    When I started, there were literally 60 different DTS packages that were set up to handle each customer data extract. After reviewing all of the data extracts, I boiled the various formats down to 10. I created and tested 10 DTS packages that included error checking and handling processes. In some cases, I had to work with the customer to tweak their data extract processes to eliminate data anomalies that seemed to exist in their extracts. So, at this point, I had about 60 data extracts that fell into 10 different formats, each with its own new DTS package.

    For each customer, a file directory was created on a file server that the database system could read from.

    The next thing I did was create a table that identified each customer, the folder where their extracts were stored, and which format they were to use. I then created a log table to track imports.

    Finally, I created a "main importer" procedure that would do the following for each customer each night after backups:

    1) Check each customer folder for a new file that was not in the import log table.

    If a new file was found, load the filename into the log along with the customer ID it belongs to, and mark it as found, but not yet as imported.

    2) After checking all folders for new files, start importing the files that are not marked as imported in the log. I used xp_cmdshell to execute the appropriate DTS package based on what was defined in the customer info table, and provided the filename as a parameter based on what was in the log. A cursor was required to make this work effectively (a rough sketch of both steps follows below).

    If the DTS package imported and processed the data into production successfully, the log would be marked as imported successfully, along with when it completed. A failure would result in an error log file being created and no change to the import log. Note that the data was always imported into a "holding table" first. If the DTS package failed, only the holding table data would be incomplete and the production tables would be untouched. If the DTS package loaded the holding table successfully, a series of data integrity checks would take place. If the data passed muster, it would be loaded into the production tables. The holding table was always cleared at the end so it was ready for the next import.
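
    Just to make the flow concrete, here is a rough sketch of those two steps. I actually did this as a T-SQL proc driving xp_cmdshell; the same idea in VBScript, with made-up table, column, and package names, looks something like this:

    Option Explicit

    Dim oConn, oFso, oShell, rsCust, rsFile, oFolder, oFile, sCmd

    Set oFso   = CreateObject("Scripting.FileSystemObject")
    Set oShell = CreateObject("WScript.Shell")
    Set oConn  = CreateObject("ADODB.Connection")
    oConn.Open "Provider=SQLOLEDB;Data Source=MYSERVER;Initial Catalog=ImportDB;Integrated Security=SSPI;"

    ' Step 1: scan each customer folder and log any file we have not seen yet.
    Set rsCust = oConn.Execute("SELECT CustomerID, FolderPath FROM CustomerInfo")
    Do Until rsCust.EOF
        Set oFolder = oFso.GetFolder(rsCust("FolderPath"))
        For Each oFile In oFolder.Files
            oConn.Execute "INSERT INTO ImportLog (CustomerID, FileName, Imported) " & _
                "SELECT " & rsCust("CustomerID") & ", '" & oFile.Name & "', 0 " & _
                "WHERE NOT EXISTS (SELECT 1 FROM ImportLog WHERE FileName = '" & oFile.Name & "')"
        Next
        rsCust.MoveNext
    Loop

    ' Step 2: run the matching DTS package for every file not yet marked as imported.
    Set rsFile = oConn.Execute( _
        "SELECT l.LogID, l.FileName, c.FolderPath, c.PackageName " & _
        "FROM ImportLog l JOIN CustomerInfo c ON c.CustomerID = l.CustomerID " & _
        "WHERE l.Imported = 0")
    Do Until rsFile.EOF
        sCmd = "DTSRun /S MYSERVER /E /N """ & rsFile("PackageName") & """" & _
               " /A gv_FileName:8=""" & rsFile("FolderPath") & "\" & rsFile("FileName") & """"
        ' Run DTSRun hidden and wait; a zero exit code is treated as success.
        If oShell.Run(sCmd, 0, True) = 0 Then
            oConn.Execute "UPDATE ImportLog SET Imported = 1, ImportedOn = GETDATE() WHERE LogID = " & rsFile("LogID")
        End If
        rsFile.MoveNext
    Loop

    oConn.Close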

    The main import proc also had a log table just for its own execution tracking. During its processing, it would record things like each step, when it began and ended, and the number of records in the destination table before and after each step. This helped track down any import processing problems.

    The end result from all of this work was an "import engine" that was, for the most part, automated. It only needed to be managed to address new customers, departing customers, file format changes, and the occasional data anomaly. As data extracts were received, they would be placed in the appropriate folders. The company managers would get a report each morning on what company data was imported, how many records were imported for each, and various other details for tracking incoming data.

    To be honest with you, the system worked so well that I moved on to another employer to find another challenge. 🙂

    Just thought you may want to think in this direction for your EDI import processing.

    - Al


    "I will not be taken alive!" - S. Hussein
