Flat File Destination

  • Hi there,

    I have a table in SQL Server with 4 million rows in it. I want to export this table to a text file so I can zip the txt file, copy it to the FTP site, and then import it into a different SQL Server instance. So far I have tried exporting the 4 million rows to a txt file, and this is what I did.

    Import and Export Wizard

    Choose a Destination

    Flat File Destination

    File name : CustomerFile.txt (I created a blank txt file on the server)

    Format : Delimited

    Text Qualifier : None

    Configure Flat File Destination

    Source table or view : Customers

    Row Delimiter : {CR}{LF}

    Column Delimiter : Comma {,}

    When I ran the wizard, the export completed fine. But when I tried to open the file (CustomerFile.txt) to see the data, I got an error:

    "Not enough space to open the file." I looked at my disk and there was plenty of free space. Is it something with the text file? Was it not able to handle 4 million rows? If not, what file extension should I save the data under? Please advise.

  • I got this figured out. I was trying to open the file in Notepad, which is the default viewer for txt files, and Notepad cannot open a file that big; that's why I was getting the storage error. I opened it in WordPad and it opens fine. But when I tried to import the data from this txt file into SQL Server (into a new table created during the import through the Import and Export Wizard), I got some truncation errors. There are about 65 columns in the file; do I need to change the data type of each column? Also, in the Import and Export Wizard I was not able to change the data type of each column. Any ideas how to make this work?

    If you need to see the errors, I can re-run the import and show you those error messages. Let me know.

    Thanks.

  • How would I handle fields that have a comma in them? Right now, if a field contains "abc co, inc", the output shows "abc co" and "inc" as two separate fields. This is what's causing the problem. How would I handle this in the Import and Export Wizard?

  • Hi

    Not the exact answer to your question, but did you try doing this with BCP? It's much faster, and you can export/import in native mode without all those problems.
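
    For reference, a BCP round trip could look roughly like the sketch below. The server, database, and table names (SourceServer, DestServer, SalesDB, dbo.Customers) are placeholders, and the commands assume the SQL Server client tools are installed:

    ```shell
    # Export the table in native format (-n); no delimiters or text qualifiers
    # are involved, so embedded commas cause no trouble (-T = trusted connection):
    bcp SalesDB.dbo.Customers out Customers.dat -n -S SourceServer -T

    # Zip Customers.dat, move it over FTP, and unzip it on the other side. Then
    # import on the destination server; the target table must already exist:
    bcp SalesDB.dbo.Customers in Customers.dat -n -S DestServer -T
    ```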

    Greets

    Flo

  • You could just create a new database and select the data into a table in it. Then detach that database, zip up its files, re-attach them on the other server, and insert the data from this temporary database.

  • As for your question "how do I deal with commas in a name?" — that's a common thing with names.

    The answer is to put a " character in the Text Qualifier field if you want to use a comma-delimited file. Commas turn up a lot in real data, so they are not the greatest delimiter in general, but using " as the text qualifier should help quite a bit. Another good strategy is to forgo commas and just use a tab character as your column delimiter, since tabs are significantly less common in people data (though not unheard of).
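
    To see what the qualifier buys you, here is a quick sketch (the file name sample.csv is made up). A qualifier-aware parser keeps the quoted comma inside the field:

    ```shell
    # One row where "abc co, inc" is wrapped in the " text qualifier:
    printf '%s\n' '1,"abc co, inc",NY' > sample.csv

    # Parse with a qualifier-aware reader (Python's csv module) and count fields;
    # the embedded comma stays inside the field, so we get 3 columns, not 4:
    python3 -c "import csv; row = next(csv.reader(open('sample.csv'))); print(len(row), '|', row[1])"
    # prints: 3 | abc co, inc
    ```

    Without the qualifier, a naive split on commas would produce 4 fields and shift every column after the company name, which is exactly the truncation/misalignment you saw.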

    I also agree that BCP might offer you a good result.

  • As a side note, for viewing very large files, there is a shareware utility called "V" by Charles Prineas that I have used for years. I have never encountered a file it wouldn't open to view, even 6GB+ mainframe dumps. Also, for viewing and even manipulating large CSV or other delimited files, I use "CSVed" by Sam Franke. These tools have made flat file analysis a lot easier.

  • I exported the data as tab-delimited instead of CSV, and this time the import worked just fine. I haven't tried BCP yet and would like to know more about it. For now, I just imported the file into a table in SQL Server and everything came out fine. I will look into it more. Thanks to everyone for chipping in on this — it was very helpful.
