Data flow task (OLE DB) does not fetch all the stored procedure data

  • I have a package, deployed to production and in daily use, that imports email items. The source database is SQL Server 2008 R2 and the destination database is SQL Server 2012. The data transferred between the two databases has been extracted from email files (Outlook, Lotus Notes, etc.). The problem occurs with a table containing Lotus Notes email data and, very simply stated, is that the data flow task fetches only a limited number of records (generally 714), while the stored procedure that is the source of the data flow task returns 14,793 records when run in SSMS. The data flow task generates no error.

    I have tried several workarounds, all to no avail:

    1. DTS was attempted. The DTS package generates this message: "Found 43 unknown column type conversion(s). You are only allowed to save the package."

    2. Created a view using the stored procedure's SQL with the parameter hard-coded in the view and used that as the data flow source (a rough sketch is at the end of this post). Same result: 714 records fetched.

    3. Inserted the data generated by the stored procedure into a table and used that as the data flow source: 714 records fetched.

    4. Deleted all the data from the table made in the third attempt except for the 6 NOT NULL columns and used that as the data flow source. Surprise! 1,207 records fetched.

    Interestingly enough, the total number of records fetched by the data flow task matches the number of records returned to the first (and only) data viewer grid.

    My thinking is that there is some sort of special character that is causing the data flow task to abort prematurely. However, I have no evidence that this is the case, nor have I come across anything like it while searching for similar problems on this and other forums. And for attempt #4, nearly all of the data is gone: 5 integer columns and 1 bit remain. At this point, I have no clue.

    Any insight would be appreciated.
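
    For reference, here is roughly what the view in workaround 2 looked like. The object and column names below are made up for illustration; the real procedure selects far more columns.

        -- Sketch of workaround 2: wrap the stored procedure's SELECT in a view with
        -- the parameter value hard-coded, then point the OLE DB source at the view.
        CREATE VIEW dbo.vw_LotusNotesEmail_Import
        AS
        SELECT e.EmailId,
               e.SenderAddress,
               e.Subject,
               e.SentDate,
               e.BodyText
        FROM dbo.LotusNotesEmail AS e
        WHERE e.ImportBatchId = 42;    -- the stored procedure's parameter, hard-coded
        GO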

  • Just a thought, and obviously you wouldn't do this in production, but assuming you have taken a backup and restored it to a Test/Dev database:

    What happens if you delete row 715 in the source? Have you tried viewing the source data in a text editor to check for strange characters? (I don't know whether that is possible, as I'm not familiar with Lotus Notes.)

    Regards

    Lempster

  • I have tried deleting row 715 and that gets me a little further along, but not a lot further. That again makes me think that something in the column data is the problem. However, searching column data for strange characters has not turned up anything interesting.
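
    For anyone who wants to run a similar check, a query along these lines (the table and column names are placeholders, not the real ones) flags rows whose text contains ASCII control characters other than tab, CR, and LF:

        -- Find rows containing control characters (excluding tab/LF/CR) in a text column.
        SELECT EmailId, BodyText
        FROM dbo.LotusNotesEmail
        WHERE PATINDEX('%[' + CHAR(1) + '-' + CHAR(8) + CHAR(11) + CHAR(12)
                       + CHAR(14) + '-' + CHAR(31) + ']%',
                       BodyText COLLATE Latin1_General_BIN) > 0;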

  • Do you get any errors in the data flow? Or just 700 records as if everything was ok?

    Does the view show all the rows when you just query it in SSMS?

    Another step to try: get the stored procedure to load a table on the source DB, and see whether all the rows will go in, and whether a data flow can grab them from there.
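
    A minimal sketch of that idea, with made-up object names (note that INSERT ... EXEC requires the target table's columns to match the procedure's result set exactly):

        -- Load the procedure's output into a table on the source DB, then point
        -- the data flow's OLE DB source at the table instead of the procedure.
        CREATE TABLE dbo.LotusNotesEmail_Dump
        (
            EmailId       int           NOT NULL,
            SenderAddress nvarchar(256) NULL,
            Subject       nvarchar(512) NULL,
            SentDate      datetime      NULL,
            BodyText      nvarchar(max) NULL
        );

        INSERT INTO dbo.LotusNotesEmail_Dump (EmailId, SenderAddress, Subject, SentDate, BodyText)
        EXEC dbo.usp_GetLotusNotesEmails @ImportBatchId = 42;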

  • No errors occur on the data flow task, and the stored procedure always returns 14,000+ rows when run from SSMS. I have also dumped the stored procedure's output into a table and tried to use that as the source for the data flow task, to no avail: still unable to get more than a few hundred rows into the data flow and the data viewer.

  • Here is a quick update: I deleted everything in the package except the source and destination, and the entire data set transferred without a problem. Back in the original package, the source is followed by a Lookup, and the Lookup errors because there are no matches in the first 714 records and it is configured to fail when there is no match. There are only 14 matches in the entire ~14,000 records, so perhaps configuring the Lookup to fail on no match is the source of the problem. Does there need to be a match in each of the data buffers passed through the Lookup? That I will have to investigate, but it seems the Lookup should not stop the rest of the data from being loaded just because there is no match in the first buffer. There are lots of configuration settings on the Lookup and I have tried most of them. I will touch base again if anything illuminating turns up.
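
    As a sanity check, a query along these lines (hypothetical names again) is one way to confirm how few source rows have a partner in the Lookup's reference table:

        -- Roughly how the Lookup's match rate can be verified outside SSIS;
        -- only 14 of the ~14,000 rows match in this case.
        SELECT COUNT(*) AS MatchedRows
        FROM dbo.LotusNotesEmail_Dump AS s
        INNER JOIN dbo.EmailContactMap AS m
            ON s.SenderAddress = m.SenderAddress;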

  • Lookup is fussy.

    First, realize it's case sensitive. UPPER everything to make sure you're matching like to like.

    Next, you have to tell it to ignore failures if you want NULLs to be passed in the lookup column when it doesn't have a match. You do that in the error controls.
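
    In T-SQL terms, 'Ignore failure' makes the Lookup behave roughly like a LEFT JOIN. A sketch with made-up names:

        -- Unmatched rows keep flowing through the data flow, with NULL in the
        -- columns added by the Lookup (the same shape as this LEFT JOIN).
        SELECT s.EmailId,
               s.SenderAddress,
               m.ContactId                 -- NULL when there is no match
        FROM dbo.LotusNotesEmail_Dump AS s
        LEFT JOIN dbo.EmailContactMap AS m
            ON UPPER(s.SenderAddress) = UPPER(m.SenderAddress);  -- UPPER both sides: like to like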


    - Craig Farrell

    Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.


    Twitter: @AnyWayDBA

  • Yes, everything works as it should when I set the Lookup to ignore failures: all the data gets loaded and I get my 14 matches. Now I need to figure out another way to fail the package if no matches are found. That shouldn't be too hard, I hope.
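
    One way to do that might be an Execute SQL Task after the data flow that raises an error when the batch just loaded contains no matched rows; a rough sketch with made-up names:

        -- Rough sketch: run after the data flow and let RAISERROR (severity 16)
        -- fail the task, and therefore the package, when no rows matched.
        DECLARE @ImportBatchId int = 42;   -- the batch just loaded

        IF NOT EXISTS (SELECT 1
                       FROM dbo.ImportedEmail
                       WHERE ImportBatchId = @ImportBatchId
                         AND ContactId IS NOT NULL)
            RAISERROR('No Lookup matches were found in this import batch.', 16, 1);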
