Database Size - Archiving Data

We are currently running a live database of 2.6 GB and it is growing at an alarming rate. This has begun to create problems for us, both in the time taken to back up the server and in the speed at which the system runs day to day. The live data consists of the last two financial years only. Is it normal for the database to keep growing like this? Can data be archived? Does anyone run Navision on its own server? Help would be much appreciated.

Generally speaking, I would not say that it's an alarming rate of database growth. You could look into running the Compression batches found under Periodic Activities - but remember to make a complete backup first. To speed up Navision, you could also take a look at your cache/commitcache settings - they may not be set optimally. We don't have any customers who are NOT running Navision on a separate Navision server.

Lars Strøm Valsted
Head of Project and Analysis
Columbus IT Partner A/S
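PS - the cache and commitcache settings are passed on the command line when the Navision database server is started. A rough example only (the server name, database path, and exact parameter spellings here are from memory, so check the manual for your version; cache is given in KB):

```
server servername=NAVSRV, database=c:\navision\database.fdb, cache=800000, commitcache=yes
```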

Hi
You can free up a lot of space by optimizing the tables. Whenever you take a backup and restore it again, your tables are optimized automatically, but you can also do it manually by choosing "optimize" in Database, Information, Tables. Note that this operation requires some amount of free space.

Hi Cathryn, what exactly is an alarming rate? Of course it is normal to have a growing database - every day that you create new data → your database grows. I would say that 2.6 GB in two years is not very much, but that depends on your business. If you want to use the compression batches, please contact your NSC and run them together; these batches are complex tasks and have a huge impact on the quality of the resulting data.

About performance problems:

- run a dedicated database server
- have enough cache (Navision can handle up to 1 GB of cache)
- check whether commitcache is activated
- do NOT use RAID5
- put 2, 3 or more database files on different physical hard disks
- use really fast hard disks
- show your users how to use keys before making a selection :wink:

Backing up with the internal backup tool is OK for small databases. With a size of 2.6 GB I would shut down the database server task at night, make a copy of the database files and start the server task again (a sketch of this follows below). Then perform a dbtest=min on the copy before you save the copies on a streamer. But better plan the backup strategy together with your NSC. Backing up larger databases with the internal tool takes a lot of time - and if you ever have to restore your data, it takes much more time. So think about the other way of backing up.

Data can be archived, for example by using a third-party tool with a Navision link (Easy, Saperion).

Merry Christmas, Richard
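PS - here is that nightly copy job sketched as a small Python script. It is only a sketch: the service name, the file paths, and the fin.exe/dbtest command line are assumptions, so adapt them to your own installation.

```python
"""Nightly cold backup of a Navision database, as described above.

A minimal sketch: service name, paths, and the dbtest invocation
are assumptions -- adjust them to your installation.
"""
import shutil
import subprocess
from datetime import date
from pathlib import Path

SERVICE = "Navision Database Server"            # hypothetical service name
DB_FILES = [Path(r"D:\Navision\database1.fdb"), # hypothetical database parts
            Path(r"E:\Navision\database2.fdb")]
BACKUP_DIR = Path(r"F:\Backup") / date.today().isoformat()

def main() -> None:
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)

    # Stop the server task so the database files are closed and consistent.
    subprocess.run(["net", "stop", SERVICE], check=True)
    try:
        for db in DB_FILES:
            shutil.copy2(db, BACKUP_DIR / db.name)  # plain file copy
    finally:
        # Always restart the server, even if the copy failed.
        subprocess.run(["net", "start", SERVICE], check=True)

    # Verify the COPY, not the live database. The exact client command
    # line (fin.exe with a dbtest parameter) is an assumption; check the
    # documentation for your version.
    subprocess.run(
        [r"C:\Navision\fin.exe",
         f"database={BACKUP_DIR / DB_FILES[0].name}, dbtest=min"],
        check=True,
    )

if __name__ == "__main__":
    main()
```

Only after the dbtest on the copy succeeds would you save the copies to the streamer.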

If you optimize your tables you will gain quite a lot of space, that's true. But to stay at the smaller size, you would have to do it over and over again, which I would not recommend. And if you have more than one database part, optimizing your tables will result in lower performance than before.

Richard