database size more than 4 GB

  • We have a database here whose size is more than 4 GB. One table takes 2 GB and the rest of the tables are pretty small. The table that takes 2 GB has reserved space of more than 3 GB. We have already shrunk the log file to 1000 KB, but the database is still more than 4 GB. Whatever query we run comes up with the error 'Could not allocate space'. We have tried many commands to shrink the database, but no success. I guess in Express the limit per database is 4 GB, right? Since no other table in that database comes close to 2 GB, we are assuming it has to do with the reserved space that's taking more than 3 GB. Is there a way to shrink that so the database size gets shrunk too? We are out of options. Thanks in advance.

  • How large is the data file versus the transaction log file? The stored procedure sp_helpfile can assist you in determining this.

    I take it that you have run sp_spaceused against the database and individually against the 2GB table. Can you share the output of these procedures as well?
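    For reference, the checks described above can be run like this (the database and table names are placeholders; substitute your own):

    ```sql
    -- Run in the context of the database in question
    USE YourDatabase;   -- hypothetical name
    GO

    -- File sizes: data file versus transaction log file
    EXEC sp_helpfile;
    GO

    -- Space used by the whole database
    EXEC sp_spaceused;
    GO

    -- Space used by the large table alone (rows, reserved, data, index_size, unused)
    EXEC sp_spaceused 'YourLargeTable';   -- hypothetical name
    GO
    ```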

    Ken

  • After running sp_helpfile on that database, this is what I got.

    Data file size - 4193216 KB; maxsize - Unlimited; growth - 1024KB

    Log file size - 504 KB; maxsize - 2147483648 KB; growth - 0%

    After running exec sp_spaceused on that table:

    rows - 3913660

    reserved - 4137664 KB

    data - 2311496 KB

    index_size - 1825024 KB

    unused - 1144 KB

  • You are correct that SQL Server 2005 Express has a size limit of 4GB on databases (data file and transaction log file space combined).

    You may want to consider SQL Server 2005 Workgroup Edition if you are using the instance for production purposes. It has no limit on database size. SQL Server 2005 Developer Edition would be an option if this is a dev or test instance.

    Ken

  • Thanks Ken! So right now we can't just shrink the database then? Why is that reserved space so big on that table?

  • You do not have any free space in the data file to facilitate shrinking it with DBCC SHRINKFILE. Would it be possible to move the large table to another database and then reference it from the original database using a view? Please note that this should be viewed as a short-term workaround. If this database is supporting an application directly, you will definitely want to perform some regression testing of the application against this proposed configuration.

    The view in the original database would have the same name as the 2GB table. The actual data would reside in the new database and table (a separate set of data and transaction log files). You will want to ensure that your host has the additional disk capacity for the data and the necessary backups.
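    A minimal sketch of that cross-database view approach (all names are hypothetical):

    ```sql
    -- In the new database: copy the data over
    USE BigTableDB;
    GO
    SELECT * INTO dbo.LargeTable
    FROM OriginalDB.dbo.LargeTable;
    GO

    -- In the original database: drop the table and replace it with a
    -- view of the same name pointing at the new location
    USE OriginalDB;
    GO
    DROP TABLE dbo.LargeTable;
    GO
    CREATE VIEW dbo.LargeTable AS
        SELECT * FROM BigTableDB.dbo.LargeTable;
    GO
    ```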

    Prior to using a cross database view you may want to consider an archiving strategy for older rows in the 2GB table in the original database. Maybe you could move the older rows to a separate database for archiving, or delete them entirely? Try to get a full backup prior to doing any archiving, cleanup, or migration of the data.

    Ken

  • Thanks a lot Ken for your idea. I will give that a shot. Thanks!

  • If you look online you can find Small Business Server 2003 for a reasonable price. The Premium edition comes with 175 users, but most importantly it comes with Exchange and SQL Server. The SQL Server databases can grow as needed. The only restriction is that you can only run one per network.

    Kind regards,
    Gift Peddie

  • I am afraid the workaround method I posted will not work. I believe your 2GB table is actually 4GB in size (index space + data space). If you are confined to SQL Server Express, you will need to move a portion of the table to another database. Typically this means moving rows that are older than a certain point in time. This will be a bit easier if you have auditing columns defined on your table (creation or modification dates, or timestamps), or you could use transaction effective dates.
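    An archiving sketch along those lines, assuming the table has a ModifiedDate audit column (the column, table, and database names are all placeholders):

    ```sql
    -- Copy rows older than the cutoff into the archive database, then
    -- delete them from the original in batches to limit log growth.
    DECLARE @cutoff datetime;
    SET @cutoff = DATEADD(year, -1, GETDATE());

    INSERT INTO ArchiveDB.dbo.LargeTable
    SELECT * FROM OriginalDB.dbo.LargeTable
    WHERE ModifiedDate < @cutoff;

    -- Batched delete; repeat until no rows are affected
    DELETE TOP (10000) FROM OriginalDB.dbo.LargeTable
    WHERE ModifiedDate < @cutoff;
    ```

    Take a full backup first, as noted above, and verify the row counts match before deleting anything.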

    Ken

  • Good advice, Ken Garrett! 😎

    Apart from solving the 4GB problem, the real question should be:

    Is it considered normal that a single table can consume 2GB in a size-restricted db system?

    - Why does this table consume 2GB? (Can this table be truncated or partially "cleaned out"?)

    - Do you rebuild indexes on a regular basis (to optimize space consumption, and hence prevent this issue)?

    You could also use Ken's workaround to export the data to a worker db, truncate the original table, and import it again from the worker db.

    See what space it consumes after that operation.

    * Play it safe and start with a full db backup!

    * Be sure no application has access to the db when you perform this operation!

    Johan

    Learn to play, play to learn !

    Don't drive faster than your guardian angel can fly ...
    but keeping both feet on the ground won't get you anywhere :w00t:

    - How to post Performance Problems
    - How to post data/code to get the best help

    - How to prevent a sore throat after hours of presenting ppt

    press F1 for solution, press shift+F1 for urgent solution 😀

    Need a bit of Powershell? How about this

    Who am I ? Sometimes this is me but most of the time this is me

  • Thanks to all of you guys. We were able to solve it. We exported that 2GB table to a new database, then removed it from the original database. After removing the table, we shrank the database, and that got the size down to 55000 KB. What happened was that we get this huge data set every month. I then delete the existing table and import the new one in its place. Every time I was deleting and importing data, I wasn't shrinking the database; that's why the data size kept growing. But from now on, I know that whenever I delete this huge table, I have to make sure to shrink the database before importing new data of the same size.
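    For anyone following the same route, the shrink step looks roughly like this (the database and logical file names are placeholders):

    ```sql
    -- After dropping or emptying the large table, release the unused space
    USE YourDatabase;
    GO
    -- Shrink just the data file, leaving a target of ~50 MB
    DBCC SHRINKFILE (YourDataFileLogicalName, 50);
    GO
    -- Or shrink every file in the database at once
    DBCC SHRINKDATABASE (YourDatabase);
    GO
    ```

    The logical file name is the `name` column reported by sp_helpfile.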

    Thank You.

  • Instead of doing the delete, try out your re-load scenario using:

    truncate table yourtable

    This is the fastest way of emptying a table, and it will reuse the space.

    Regarding the log file: only the fact that the table has been truncated is logged, not each deleted row.

    One downside: you cannot truncate a parent table (one referenced by a foreign key).

    If you only import that table, do some extraction, and then never use it again, maybe you would be better off letting it reside in the second db, so it no longer jeopardizes your original database.
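    The monthly reload cycle then becomes a sketch like this (table name is a placeholder):

    ```sql
    -- Empty the table by deallocating its pages; fast, minimally logged,
    -- and the freed space is reused by the next import
    TRUNCATE TABLE dbo.MonthlyData;

    -- ...then run the import of the new month's data into dbo.MonthlyData
    ```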

    Johan


  • I think the table that takes up 2GB on its own needs to be looked into. The table design should be reviewed from a design as well as a performance point of view. :)

    The following things can be tried (of course, only if you want to continue with SQL CE Compact Edition):

    1. If that large table is ever-growing, then a periodical clean-up is required. Normally, as the size of the SQL CE .sdf file goes beyond 2 GB, performance issues creep into the picture.

    2. If the table is static, go for a separate SQL CE database and store some static tables of the application there.

    I mean the following:

    For the application

    *) Consider making two DBs, one containing master or static tables and the other containing transaction or growing tables.

    *) From the application, create two separate connection strings to access the two DBs.

    *) On an operation like application opening, cache the master data in the app layer, and if you require joins between two tables from different DBs, do them inside the application code.

    *****) If your application is an always-connected one accepting a large volume of data, go for SQL Server 2005 and don't rely on SQL CE Compact Edition. :D
