CSV Import Fail

  • Hi everyone

    I am importing a CSV into SQL Server 2019 and getting the error below. I took a closer look at the field in question and noticed that there are NULLs in the column, so I updated the table definition to allow NULLs. Re-imported and it still failed. The field stores small integers (less than 10), so I used the INT datatype. The CSV calls the field "{exchange", but I don't know how SQL Server handles { in a column name, so the table stores it as "EXCHANGE" and I manually updated the mapping in the import wizard. Re-imported and it still failed. Any idea what is going on here?

    - Executing (Error)

    Messages

    Error 0xc02020c5: Data Flow Task 1: Data conversion failed while converting column "{exchange" (86) to column "{exchange" (231). The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.".

    (SQL Server Import and Export Wizard)

    Error 0xc0209029: Data Flow Task 1: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "Data Conversion 0 - 0.Outputs[Data Conversion Output].Columns[{exchange]" failed because error code 0xC020907F occurred, and the error row disposition on "Data Conversion 0 - 0.Outputs[Data Conversion Output].Columns[{exchange]" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.

    (SQL Server Import and Export Wizard)

    Error 0xc0047022: Data Flow Task 1: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Data Conversion 0 - 0" (147) failed with error code 0xC0209029 while processing input "Data Conversion Input" (148). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.

    (SQL Server Import and Export Wizard)

    Error 0xc02020c4: Data Flow Task 1: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

    (SQL Server Import and Export Wizard)

    Error 0xc0047038: Data Flow Task 1: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on Source - UnderlyingOptionsTradesCalcs_2021-03-16_csv returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.

    (SQL Server Import and Export Wizard)

    • This topic was modified 2 years, 10 months ago by  water490.
  • One more thing...

    I use the Tasks -> Import Data wizard and don't change any of the options (other than manually mapping certain fields from the CSV to the import table's columns).

  • This means that SQL Server thinks your target field is not large enough for the data being imported.

    "...The value could not be converted because of a potential loss of data...."

    Also, I have often made the first step an import into a SQL table consisting entirely of large VARCHAR fields. Once the data is in SQL, it is easier to look for data problems, quirks, anomalies, etc. A second step then pushes the data from the VARCHAR table into the final target table.
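    That two-step staging idea can be sketched in a few lines. A minimal Python illustration of the second step (the column name and sample values are made up, not from the actual file) that flags staged rows an INT conversion would reject:

```python
# Hypothetical staged rows, as they would sit in the all-VARCHAR
# staging table: everything is a string, "" where the CSV had ",,".
rows = [
    {"EXCHANGE": "5"},
    {"EXCHANGE": ""},      # the ",," case
    {"EXCHANGE": "N/A"},   # junk that an INT column would reject
]

def to_int_or_none(raw):
    """Second step: treat '' as NULL; everything else must be an integer."""
    if raw == "":
        return None
    return int(raw)  # raises ValueError on non-integer text

# Collect the indexes of rows that would fail the INT conversion.
bad = []
for i, r in enumerate(rows):
    try:
        to_int_or_none(r["EXCHANGE"])
    except ValueError:
        bad.append(i)

print(bad)  # indexes of problem rows to inspect before the final load
```

    Inspecting the flagged rows first is the whole point of the staging step: the load never fails, and the problem values are queryable.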

  • The field only has two values: 5 and NULL. I think INT is big enough to store it. I'll try the VARCHAR approach... thanks for the idea.

  • water490 wrote:

    The field only has two values: 5 and NULL. I think INT is big enough to store it. I'll try the VARCHAR approach... thanks for the idea.

    Apologies for checking the obvious, but by NULL, which of the following do you mean?

    a) ,,

    b) ,"",

    c) ,NULL,

    d) something else

    If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.
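    For what it's worth, a CSV parser treats cases (a) and (b) identically. A quick Python check (the surrounding field values are made up for illustration):

```python
import csv
import io

# Three one-row CSVs matching cases (a), (b) and (c) above.
cases = {
    "a": "x,,y",        # ,,     -> unquoted empty field
    "b": 'x,"",y',      # ,"",   -> quoted empty field
    "c": "x,NULL,y",    # ,NULL, -> the literal four-character text NULL
}

# Parse each one-row CSV and grab the middle field.
parsed = {k: next(csv.reader(io.StringIO(v)))[1] for k, v in cases.items()}
print(parsed)
```

    Cases (a) and (b) both come through as an empty string; only case (c) carries actual text, which is why the distinction matters for the import.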

  • Good question. That is hard to tell. The data comes in the form of an Excel-generated CSV file. The cell is definitely empty. Is there a way to know for sure what is in that cell?

  • CSV files are not Excel files. Open the file in Notepad to see the unadulterated data.

    If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.

  • I am seeing this:

    ,-0.0143,3965.54,3986.34,0,,,,,

    How do I address this? The VARCHAR approach suggested earlier, or a different one?

  • Those are not INTs!! Change the target datatype to DECIMAL and you should have more success.

    If you haven't even tried to resolve your issue, please don't expect the hard-working volunteers here to waste their time providing links to answers which you could easily have found yourself.

  • water490 wrote:

    I am seeing this:

    ,-0.0143,3965.54,3986.34,0,,,,,

    How do I address this? The VARCHAR approach suggested earlier, or a different one?

    These are not integers

    -0.0143

    3965.54

    If you convert these to integers, you will get 0 and 3965.  The decimal portion will be truncated. That is what is causing your error.

    Is this financial (money) data? Can you determine the maximum number of decimal places and size the column appropriately?

    Be careful with the varchar approach. Converting to different datatypes, especially dates, may change the data.

     

     

    Michael L John
    If you assassinate a DBA, would you pull a trigger?
    To properly post on a forum:
    http://www.sqlservercentral.com/articles/61537/
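    The truncation Michael describes is easy to reproduce. A quick Python illustration using the sample values from the thread (Python's int() likewise truncates toward zero, mirroring SQL Server's decimal-to-int conversion):

```python
# The two non-integer sample values from the CSV row in the thread.
values = ["-0.0143", "3965.54"]

# Parse as floats, then convert to int: the fractional part is
# truncated toward zero, exactly the "loss of data" the wizard flags.
as_ints = [int(float(v)) for v in values]
print(as_ints)
```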

  • They are definitely INTs...

    The field only has two values: 5 and NULL/empty.

    The example I gave (,-0.0143,3965.54,3986.34,0,,,,,) is the full record; the problem areas are the ,, ones. Excel shows nothing, but when I look at the CSV I see ,,

    If I set the field to allow NULLs, then shouldn't it accept ,, ?

    You gave the choice of (a), (b), or (c). Is there a way to cover all of these cases?

  • water490 wrote:

    They are definitely INTs...

    The field only has two values: 5 and NULL/empty.

    The example I gave (,-0.0143,3965.54,3986.34,0,,,,,) is the full record; the problem areas are the ,, ones. Excel shows nothing, but when I look at the CSV I see ,,

    If I set the field to allow NULLs, then shouldn't it accept ,, ?

    You gave the choice of (a), (b), or (c). Is there a way to cover all of these cases?

    Like Phil said, a .CSV is not an Excel file. Use a text editor to actually view the data.

    Also, I just created a .csv file with values like 123, 123.456, and ,,,,

    The import wizard picked these up as varchar(50) columns. The "empty" (,,,,) columns were converted to empty strings, not NULLs.

     

    Michael L John
    If you assassinate a DBA, would you pull a trigger?
    To properly post on a forum:
    http://www.sqlservercentral.com/articles/61537/
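    Since the wizard lands those empty fields as empty strings rather than NULLs, the second step has to map '' to NULL before converting to INT. A minimal Python sketch of that mapping (the column name is an assumption):

```python
# Simulated staging rows as the wizard loads them: all strings,
# with "" (not NULL) wherever the CSV had ",,".
staged = [{"EXCHANGE": "5"}, {"EXCHANGE": ""}]

def finalize(row):
    """Map the empty string to None (NULL) and convert the rest to int."""
    raw = row["EXCHANGE"]
    row["EXCHANGE"] = None if raw == "" else int(raw)
    return row

final = [finalize(dict(r)) for r in staged]
print(final)
```

    In T-SQL the equivalent second step is typically NULLIF(EXCHANGE, '') wrapped in TRY_CAST(... AS INT) when pushing from the staging table to the final table.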

  • VARCHAR did the job. Thank you!
