Restoring backups: reading/extracting archives three times?

Hello!
I am testing a complete restore of my backups using bzip2, and gosh is it slow… It’s about 35 GB of bz2 archives, and it takes about 3 hours before I can even select what to restore. Then it reads all the archives again, which takes another 3 hours, and only then does it start extracting them, which also takes a few hours.
Isn’t there any way to save some time on this? Isn’t the .info file supposed to make restores faster? Maybe it’ll run faster if I restore via the command line?
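
In case it matters, the command-line restore I have in mind is something like this (flags quoted from memory, so they’d need checking against virtualmin restore-domain --help; the path is just a placeholder):

  # restore every domain and every feature from a single backup file
  virtualmin restore-domain --source /mnt/backup.tar.bz2 --all-domains --all-features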

lol, this is kind of frustrating… After waiting basically the whole day for Virtualmin to finally start extracting the archives, I got this error:

Starting restore of 171 domains from local file /mnt …

Extracting backup archive files …
… extraction of /mnt/27-02-2011 failed!

cat: /mnt/27-02-2011: Is a directory
/bin/tar: This does not look like a tar archive
/bin/tar: Exiting with failure status due to previous errors
Backup does not contain virtual server information for missing domain axxxxxxx.com.

… failed! See the progress output above for the reason why.

I can quite agree with this one… Virtualmin really should improve the performance of backup archives. Unpacking large archives twice is just unnecessary; the metainfo should be stored outside the main archive.

I tried switching to ZIP for this reason, since it has a central index and doesn’t need full decompression to read the metainfo, but that failed because ZIP apparently does not offer an incremental compression option, and the coders sadly were not willing to change the way incremental backups are created accordingly. (And I don’t really wish to create a full backup every night.)
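
To illustrate the difference: listing a ZIP only has to read the central directory at the end of the file, whereas listing a tar.bz2 has to decompress the whole stream first (archive names here are just examples):

  unzip -l backup.zip        # reads the central directory, returns almost instantly
  tar -tjf backup.tar.bz2    # decompresses the entire archive just to list its contents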

So, unfortunately a double-fail here. :frowning:

A possible solution might be backing up home directory contents, which usually constitute by far the largest chunk of data, by some other means that doesn’t require double unpacking.

Maybe I should switch to .tar.gz instead of .tar.bz2? It won’t have multicore extraction (I have 8 cores), but it might be faster anyway?
BTW, it looks like restoring via the command line is faster, as it saves those 1-2 extra archive openings…
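
Before switching, I might first test a parallel decompressor; as far as I know, lbzip2 can decompress even ordinary single-stream .bz2 files on all cores, and GNU tar can call it directly (archive name and target path are just examples; lbzip2 needs to be installed):

  mkdir -p /tmp/restore-test
  # time the extraction with lbzip2 doing the decompression on all cores
  time tar --use-compress-program=lbzip2 -xf backup.tar.bz2 -C /tmp/restore-test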

Howdy,

Yeah, unfortunately, handling restores with large archives can be slow.

One thing you could do is what Locutus suggested – if you handle homedir data a different way, that could really speed things up.

Personally, I back up homedir data separately using rsync, and use the Virtualmin backup just for all the other data.
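
The rsync side of it is roughly this (paths and the remote host are just examples, adjust to your setup):

  # mirror all home directories to the backup host, deleting files removed locally
  rsync -a --delete /home/ backupserver:/backups/home/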

-Eric

Actually, Virtualmin did improve this by introducing .info files. But unfortunately there is a bug and it doesn’t work :slight_smile:

Yeah, I didn’t see any improvement since those .info files were introduced :stuck_out_tongue:
Anyway, I will do as Eric has suggested and copy the homes with rsync.
For the rest, I will create a single tar.gz backup archive containing all domains, settings, and DBs, and restore it via the command line.
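
The backup half of that plan should be roughly this (flags from memory, I’ll double-check them against virtualmin backup-domain --help; the destination path is just a placeholder):

  # one archive with all domains, all features, plus the Virtualmin settings
  virtualmin backup-domain --dest /mnt/full-backup.tar.gz --all-domains --all-features --all-virtualmin

…and then point the restore-domain command from earlier in the thread at that single archive.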

What do you do about the log files? Do you need to rsync them as well?