Odd Behavior for Bulk Insert Task

  • I'm working on a DTS package with a Bulk Insert task. The table has one numeric column with a precision of 14 and a scale of zero. The numeric column does not allow null values. The data file has many rows with 0000000000000 for the numeric column. Here is a sample row so you can see what I'm talking about:

    13,0000000000000,051,65733,2004-10-09,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189

    When I execute the package, I get the following error message:

    Bulk Insert fails. Column is too long in the data file for row 1, column 2. Make sure the field terminator and row terminator are specified correctly.

    Yet, if I simply change the last digit in that column to 1 for those rows, the Bulk Insert completes successfully. Here is a sample of the changed row for your reference:

    13,0000000000001,051,65733,2004-10-09,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189,01,0000189
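
    In case it's useful, here is a cut-down sketch of the setup. The names below are made up and the table is trimmed to the first two fields of the file; the real DTS task points at the full table and data file:

    -- Hypothetical, cut-down version of the target table (the real one has more columns)
    CREATE TABLE dbo.LoadTarget (
        Col1 int            NOT NULL,
        Col2 numeric(14, 0) NOT NULL  -- the column that rejects 0000000000000
    );

    -- Roughly what the DTS Bulk Insert task executes against the file
    BULK INSERT dbo.LoadTarget
    FROM 'C:\data\sample.txt'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');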

    Any ideas as to why this would be happening?

    Thanks.

    “If you're not outraged at the media, you haven't been paying attention.”

  • I'm not 100% certain here, but I believe that a value of all zeros (0000000000000) may be treated as null by the db for a column of type numeric, hence the errors you're getting.

    Have you tried a simple INSERT statement with the value of 0000....? Do you get the same error?
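
    Something like this, with your real table and column names in place of my made-up ones (and any other required columns filled in):

    -- Quick test (names are placeholders): does a plain INSERT take the all-zeros value?
    INSERT INTO dbo.YourTable (NumericCol)
    VALUES ('0000000000000');  -- the value exactly as it appears in the data file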

    If not, then it looks like the transformation may be nulling the value during the insert. You may need to transform inside an ActiveX task. That way you can use something like If IsNull(column) Then column = 0 as part of the script to force the value in.
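
    If you would rather stay set-based than write the script, the same guard can be done in T-SQL: bulk load the raw field into a varchar staging column first, then convert with a zero default. Again, the names are made up:

    -- Set-based alternative to the ActiveX guard (staging table and column names are hypothetical)
    INSERT INTO dbo.YourTable (NumericCol)
    SELECT CAST(ISNULL(NULLIF(RTRIM(RawCol), ''), '0') AS numeric(14, 0))
    FROM dbo.Staging;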


    ------------------------------
    The Users are always right - when I'm not wrong!

  • Yes, I successfully inserted the exact same data with a standard INSERT statement.  I decided to abandon the Bulk Insert approach for this load.  I need to do other transformations to the data anyway, so I don't think I would gain much by using a Bulk Insert.

    “If you're not outraged at the media, you haven't been paying attention.”

  • Glad to hear you have a working solution. Definitely very strange behaviour. Obviously the DTS package treats the insert differently, but I cannot understand how or why.


    ------------------------------
    The Users are always right - when I'm not wrong!
