SELECT Statement using Flat File Contents as Criteria

  • I am trying to use bcp with a SELECT statement to extract data from one database/server so it can be brought in sync with another database/server. (There is no way to get the two SQL Servers to talk to one another.)

    The problem is with the WHERE clause. I have a list of over 3,000 "field records" I need to use as part of the criteria. Normally, I would just use an inner select statement for this, but since the SQL Servers cannot talk to one another, I need to specify all 3,000 criteria literally. Hmmm. Sounds confusing, even to me!

    Basically, I need to do this:

    bcp "select * from mydb.dbo.mytable where myfield in (select myfield from ) " queryout...

    I was hoping to export the items into a flat file, then try to do my query above against the flat file list, but I cannot find any documentation or examples as to how to do this. Again, it unfortunately needs to be done via bcp and I have no ability to get access to that remote server (another dba will have to run my bcp script).

    bcp "select * from mydb.dbo.mytable where myfield in (select from ) " queryout...

    Argue for your limitations, and sure enough they're yours (Richard Bach, Illusions)

  • I'm not sure I understand.

    First, 3,000 isn't much. I'd just run them all, suck them in on your side, and discard what you don't need.

    Second, where are the criteria for the inner statement coming from? I'm not sure what you're asking here.

  • Steve,

    Thanks for the reply. I figured this would be a bit confusing; the "inner" select statement needs to contain all 3,000 items. Basically, the query needs to be

    bcp "select * from mydb.dbo.mytable where myfield in ('criteria1', 'criteria2', 'criteria3', 'criteria4' .... 'criteria 2998', 'criteria 2999', 'criteria3000') " queryout ...

    I was hoping I could create a flat file that contained all 3000 criteria and just point to the file for my "inner select" statement.

    Also, those same criteria need to be used on 8 different tables...

    bcp "select * from mydb.dbo.mytable where ...

    bcp "select * from mydb.dbo.mytable1 where ...

    bcp "select * from mydb.dbo.mytable2 where ...

    I am also selecting specific fields from each of the 8 tables, so select * cannot be used (just wanted to simplify the example). I could use a date parameter instead of 3000 individual items, but then I will be duplicating data and moving (after compression/zip) over 200MB of data for just one month (over 3x the data I really need).

    Does this help? 🙂

    Argue for your limitations, and sure enough they're yours (Richard Bach, Illusions)

  • You could generate insert statements of your data and run those against your other DBs.

    Try:

    select 'insert into temp_table_name values (''' + replace(column_value, '''', '''''') + ''')' from source_table

    (The REPLACE doubles any embedded single quotes so the generated inserts don't break.) You will also need to address any data type conversions needed to handle your data.

    Then dump these insert statements into a script with the create table statement. Run that against the other DBs. Then you only have to do a simple join or subquery with this temp table.
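    The generate-and-run idea above can be scripted on the source side. Here is a minimal Python sketch of it (my own, not from the thread; the table name, column name, and varchar(100) type are placeholder assumptions) that emits one ready-to-run script for the remote DBA: a CREATE TABLE, one INSERT per criterion with embedded quotes escaped, and a reminder to drop the table afterwards:

    ```python
    # Sketch of the generate-insert-statements approach (hypothetical names).
    # Builds a single script: create the temp table, insert every criterion
    # (doubling embedded single quotes), then note the cleanup step.

    def build_insert_script(values, table="temp_table_name", column="column_value"):
        lines = ["create table {} ({} varchar(100));".format(table, column)]
        for v in values:
            lines.append(
                "insert into {} values ('{}');".format(table, v.replace("'", "''"))
            )
        lines.append("-- drop table {} when the extract is done".format(table))
        return "\n".join(lines)
    ```

    Feed it the 3,000 criteria (one per line from your flat file) and hand the resulting script over; the remote side then only needs a simple join or subquery against the temp table.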

  • Tony,

    Thanks for the input! I did consider creating a temp table (perhaps #temp or ##temp) to store the 3,000+ values, running the bcp queries, then dropping the table.

    However, we (the company I work for) do not own the other database; one of our clients does. The directive "from the top" is to minimize any intrusion into their database and any work the other DBA has to do.

    That said, I think I might have to fall back on the temp table option if I cannot figure out another way to get the data from a flat file into a query. 🙂

    Argue for your limitations, and sure enough they're yours (Richard Bach, Illusions)

  • I have 500 customers running their own SQL Server with our product database installed on their servers. The DBs are theirs, and they have to do all of the maintenance, backups, etc. Because these databases are not on our network and we have no access to them, we have to build scripts and either have our consultants run them or have the customer's DBA or application administrator run them. We just make sure that any temporary objects we create (tables, procs, functions, views, etc.) are deleted at the end of the script.

    Because we cannot get backups of the DBs in all cases, sometimes we have the customers run scripts that generate insert statements so we can get pieces of the DB back in our hands for analysis (reports don't always cut it, as you always have another question after looking at the first). It works for us and our customers. It lets us get the job done with minimal disruption and no legal headaches over non-disclosure agreements. Since all 500 customers are law firms, we are always careful so we do not get sued.

    Another option may be SSIS. If you put your IN clause data into a delimited file, you could use SSIS to load that data into memory and run a routine to delete, dump to file, etc., for any data that matches each item in your delimited file. I don't have any examples or further details; you would have to play around and see what you could get working. I'm sort of a newbie to SSIS in SQL 2005, but it is extremely easy to develop and test with. I've been building a DW with it, and it is sweet.

    The final option I have at this point would be to create a script that uses a cursor to dynamically build your statement with the large IN clause. It would have to loop through your IN clause data and append it to the statement, then at the end print the statement and save it to a script. I've automated stuff like this via batch files: basically, script one creates the statements and saves them to script two, then the batch file runs script two. BTW, I forget what the IN clause limitation is for SQL Server, but Oracle's is 2,000 items. Older versions of SQL Server also had small limits, but I think 2000 and 2005 support up to 4,000 items. Our products break oversized IN clauses up into multiple IN clauses within the statement, OR-ing them together when we have more items than the DB limit.
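    The dynamic-build-with-chunking idea can be sketched outside T-SQL as well. Here is a minimal Python sketch (my own, not from the thread; the column name and 1,000-item chunk size are assumptions) that takes the criteria from a flat file, splits them into chunks under any per-IN-list limit, and builds a WHERE clause of IN lists OR-ed together, ready to paste into a bcp command:

    ```python
    # Sketch: build "col IN (...) OR col IN (...)" from a list of string
    # values, chunked to stay under a per-IN-list limit (hypothetical 1000).

    def build_where(values, column, chunk_size=1000):
        def quote(v):
            # Escape embedded single quotes for a SQL string literal.
            return "'" + v.replace("'", "''") + "'"

        chunks = [values[i:i + chunk_size] for i in range(0, len(values), chunk_size)]
        parts = [
            "{} IN ({})".format(column, ", ".join(quote(v) for v in chunk))
            for chunk in chunks
        ]
        return " OR ".join(parts)
    ```

    Usage would be something like reading one criterion per line from the flat file ("criteria.txt" is a made-up name), then printing the eight bcp commands with the generated clause substituted in:

    ```python
    # with open("criteria.txt") as f:
    #     values = [line.strip() for line in f if line.strip()]
    # where = build_where(values, "myfield")
    # print('bcp "select * from mydb.dbo.mytable where {}" queryout ...'.format(where))
    ```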

    Lots of options; the question is which one is right for you.

    -Tony
