File sizes: how do I get a single 145GB file into 5-10GB files?

  • Are there any other utilities to export the data, shrink the files, create new files, and re-import it?

    BCP keeps erroring out with time-outs, and once because of memory. I am trying to make smaller, more manageable files.

    With Oracle, I have done 100+GB exports, modified tablespaces and datafiles to spread them out across disks, etc. I am looking for an alternative way to manage databases with 100+GB data files. I have 4 user databases running on SQL Server 2000 Enterprise SP4, quad CPU, 4GB RAM. Each user database has a single data file that ranges from 110GB to 145GB.

    Thanks for any suggestions.

    Joseph

  • If you are encountering time-out or memory issues with bcp, you can try the -b switch, which commits the data in smaller batches. Otherwise, write your scripts to import the data in batches. SSIS would also help you with moving data around.
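
    For example, something along these lines (server, database, table, and path names are placeholders, not from your setup):

    bcp MyDB.dbo.BigTable out D:\Export\BigTable.dat -n -S MyServer -T
    bcp MyDB.dbo.BigTable in D:\Export\BigTable.dat -n -S MyServer -T -b 50000

    The -b switch only matters on the import side: each batch of 50,000 rows is committed as its own transaction, so a failure doesn't roll back the whole load and the log never has to hold the entire file at once.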

    Paul

  • Can I use SSIS on a 2000 DB for this? I do have SSIS set up on another server.

    Will running SSIS from another server cause network traffic overhead? Like zipping a file that sits on a server from your PC?

    Thanks for the suggestion.

    Joseph

  • For sure! You wouldn't be able to deploy an SSIS package on SS2000, but for your purpose you don't need to. All you need to do is use the 'SSIS Import and Export Wizard...' - you can find it in the 'Project' menu.

    HTH

    Paul

  • Joseph,

    Were your DBs created with files that size, or have they grown? If you want smaller data files, you can easily create a new filegroup and then rebuild a table's clustered index on it to move the data to a new file. Then you can shrink the main file.
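
    Something like this (database, table, index, and path names are illustrative; it also assumes dbo.BigTable has a clustered index on ID - if that index backs a PRIMARY KEY constraint, the definition here has to match the constraint exactly or DROP_EXISTING will refuse it):

    -- Add a new filegroup with a modestly sized file
    ALTER DATABASE MyDB ADD FILEGROUP FG2

    ALTER DATABASE MyDB ADD FILE
        (NAME = MyDB_FG2_1, FILENAME = 'E:\Data\MyDB_FG2_1.ndf',
         SIZE = 10GB, FILEGROWTH = 1GB)
    TO FILEGROUP FG2

    -- Rebuilding the clustered index on FG2 moves the table's data with it
    CREATE UNIQUE CLUSTERED INDEX IX_BigTable_ID
        ON dbo.BigTable (ID)
        WITH DROP_EXISTING
        ON FG2

    -- Reclaim space in the original data file (target size in MB)
    DBCC SHRINKFILE (MyDB_Data, 10240)

    Bear in mind the shrink can take a long time on a file this size and will fragment whatever is left in the file.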

  • They started off smaller, but grew quickly (or so I was told). I am walking into the middle of things and trying to organize and lay out standards. Most things here are set to "unlimited" file sizes, so as long as disk was available, no one worried about it.

    I will try moving the indexes to a new filegroup, and then try running SSIS.

    Thanks,

    Joseph

  • Joseph, my earlier post may have confused you a little here. Basically, there are two options with respect to using files and filegroups (rough sketches follow the list):

    1. Have one filegroup with several files, each appropriately sized. In this instance, you can move the data simply by recreating the clustered index on each affected table. This process spreads the data proportionally across the files in that filegroup. No need to use SSIS in this case. You will need to make sure that enough space is available for the files in the filegroup.

    2. Have one file per filegroup, each appropriately sized. In this instance, you will need to change the default filegroup in sequence as you copy the data across. Each filegroup will have a table with the same schema but a different name - this is because a table cannot span more than one filegroup. You combine the data using a partitioned view (SS2K) or table partitioning (SQL Server 2005). You will need an alternative approach to moving the data, such as SSIS, in this case.
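
    Here is what each looks like, with placeholder names throughout (MyDB, the Orders tables, the filegroups, and the paths are all illustrative):

    -- (1) Several modest files in a single filegroup; rebuilding a table's
    -- clustered index, as in the earlier post, then spreads its data
    -- proportionally across the files
    ALTER DATABASE MyDB ADD FILEGROUP FG_DATA

    ALTER DATABASE MyDB ADD FILE
        (NAME = MyDB_Data1, FILENAME = 'E:\Data\MyDB_Data1.ndf', SIZE = 10GB),
        (NAME = MyDB_Data2, FILENAME = 'F:\Data\MyDB_Data2.ndf', SIZE = 10GB)
    TO FILEGROUP FG_DATA

    -- (2) One table per filegroup (FG_2005 and FG_2006 created beforehand),
    -- combined by a partitioned view. The CHECK constraints on the
    -- partitioning column are what let SS2K route queries to the right table.
    CREATE TABLE dbo.Orders_2005 (
        OrderID   int      NOT NULL,
        OrderDate datetime NOT NULL
            CHECK (OrderDate >= '20050101' AND OrderDate < '20060101'),
        CONSTRAINT PK_Orders_2005 PRIMARY KEY (OrderID, OrderDate)
    ) ON FG_2005

    CREATE TABLE dbo.Orders_2006 (
        OrderID   int      NOT NULL,
        OrderDate datetime NOT NULL
            CHECK (OrderDate >= '20060101' AND OrderDate < '20070101'),
        CONSTRAINT PK_Orders_2006 PRIMARY KEY (OrderID, OrderDate)
    ) ON FG_2006
    GO

    CREATE VIEW dbo.Orders
    AS
    SELECT OrderID, OrderDate FROM dbo.Orders_2005
    UNION ALL
    SELECT OrderID, OrderDate FROM dbo.Orders_2006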

    In your case, it looks like option (1) as suggested by Steve is the way to go.

    Paul 
