Multicast help

  • I've got a data flow with a script component that generates some new GUIDs for me. These GUIDs are then used in my inserts.

    For example, I have three tables: A, B, and C.

    Each table uses a GUID as its primary key.

    In my script component I create the following GUIDs:

    tableAGuid

    tableBGuid

    tableCGuid

    After my script component there is a Multicast with three paths, one for each table.
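
    The script component itself does roughly this (simplified, and shown in C# syntax just to sketch it; the UserComponent / Input0Buffer plumbing is generated for you by the Script Component designer, and my real column names differ slightly):

        using System;

        public class ScriptMain : UserComponent
        {
            // Runs once per row; the three GUID columns are added on the
            // synchronous output, so every multicast path sees the same
            // values for the same source row.
            public override void Input0_ProcessInputRow(Input0Buffer Row)
            {
                Row.tableAGuid = Guid.NewGuid();
                Row.tableBGuid = Guid.NewGuid();
                Row.tableCGuid = Guid.NewGuid();
            }
        }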

    I believe my problem is that Table B has a foreign key constraint to Table A, so when I run the package I get the following error:

    Table B = AccountResponsibleParty

    Table A = PersonRawUnencrypted

    [AccountResponibleParty [4079]] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E2F. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E2F Description: "The statement has been terminated.". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E2F Description: "The INSERT statement conflicted with the FOREIGN KEY constraint "FK_AccountResponsibleParty_Person". The conflict occurred in database "PRM", table "dbo.PersonRawUnencrypted", column 'PersonUid'.".

    How do I insert into both of these tables in the data flow at the same time?

    I can't do it after I insert into the SQL Server Destination, since there is no output from the destination, so I need the inserts to happen simultaneously.

    Joe

  • Hi Joe,

    I was trying to achieve something similar recently and decided it probably wasn't possible without breaking the relationship between the tables, as the data won't be committed until after the data flow has finished, and SQL Server won't allow that loss of referential integrity unless you specify it in the table design. I also found that SSIS and BIDS got very flaky when I had more than 3 or 4 destinations per data flow, although I have a feeling that might be down to my machine as much as SSIS :S

    The way I did it in the end was to pull the next primary key for each table (in my case I did (max(ID) + 1) + counter from a script task) and store all the keys and data in a flat import table. That way I can use multiple data flows to load the normalised data, and it gave me a simple data flow at the top of the process for designing new imports while the rest of the package stays the same. Another option would be to use a Raw File destination to pass data between data flows in a similar way. I have also known people to use stored procedures to do similar things, but I can't go into any great detail about that.
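
    Just to sketch the (max(ID) + 1) + counter bit, it looked roughly like this (C# syntax, and the variable and column names below are made up for the example; the starting value was seeded into a package variable beforehand from a MAX() query against the target table):

        public class ScriptMain : UserComponent
        {
            // Next surrogate key to hand out for this load.
            private int nextPersonId;

            public override void PreExecute()
            {
                base.PreExecute();
                // User::MaxPersonId is a read-only package variable holding
                // the current MAX key from the target table at the start of the run.
                nextPersonId = Variables.MaxPersonId + 1;
            }

            public override void Input0_ProcessInputRow(Input0Buffer Row)
            {
                // The generated key travels with the rest of the row into the
                // flat import table, so the later data flows can load parent
                // and child tables in the right order without breaking the FK.
                Row.PersonId = nextPersonId;
                nextPersonId++;
            }
        }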

    Hope that's helpful.

