Virtual Server backup of huge site

I have a huge site on my server (roughly 52 GB of content, static files and databases combined). The backup process works, but when I tried to use it to migrate to a new server, I couldn’t wait the hours and hours it was going to take to restore. After around 3 hours I aborted it and migrated the site manually.

I tried backing up the site without public_html so I could migrate the static content by hand, but it wouldn’t let me do that either. What I ended up doing was moving the static content out of the server’s public_html folder, running the backup with an empty public_html, restoring it, and then migrating the public_html content manually.

Is there any way I can back up the Virtual Server without all the public_html content, and deal with backups of that content manually?

Since you have control of the site, log in and cut the public_html directory, paste it under another directory outside of that domain’s home directory, back up and restore, then move public_html over separately.
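A minimal sketch of those steps, assuming a typical layout where the domain’s home contains public_html (the paths here are hypothetical stand-ins so the commands can run anywhere):

```shell
# Hypothetical stand-in paths -- a real server would use /home/<domain>
DOMAIN_HOME=$(mktemp -d)/mydomain       # the virtual server's home directory
HOLDING=$(mktemp -d)/holding            # a directory outside that home
mkdir -p "$DOMAIN_HOME/public_html" "$HOLDING"
touch "$DOMAIN_HOME/public_html/index.html"   # demo content

# Cut public_html out of the domain's home before running the backup...
mv "$DOMAIN_HOME/public_html" "$HOLDING/public_html"
# ...and leave an empty public_html behind so the restore has a place to land
mkdir "$DOMAIN_HOME/public_html"
```

After the backup and restore finish, reverse the mv to put the static content back (or transfer it separately to the new server).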

If you have SSH access, you could transfer it from one server to the other directly. You wouldn’t have to babysit the migration the whole time, and it wouldn’t take as long.

log in
cd public_html
tar -cpzf mydomain-25-07-09.tar.gz .

(Archiving . rather than * makes sure hidden files like .htaccess are included.)

login to your other server
cd public_html
wget http://old-servers-domain/mydomain-25-07-09.tar.gz

tar -xzf mydomain-25-07-09.tar.gz
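Before extracting on the new server, it’s cheap to sanity-check the download first. The sketch below builds a tiny stand-in archive (the filename mirrors the one above) just to show the two checks:

```shell
WORK=$(mktemp -d)                      # scratch dir standing in for public_html
mkdir "$WORK/site"
echo '<html></html>' > "$WORK/site/index.html"
tar -cpzf "$WORK/mydomain-25-07-09.tar.gz" -C "$WORK/site" .

# 1. Verify the gzip stream wasn't truncated by an interrupted download
gzip -t "$WORK/mydomain-25-07-09.tar.gz"
# 2. List the archive's members without extracting anything
tar -tzf "$WORK/mydomain-25-07-09.tar.gz" > "$WORK/manifest.txt"
```

If gzip -t exits non-zero, the transfer was cut short and the wget should be repeated before extracting.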

That’s what I did. But that doesn’t help for scheduled backups.

When scheduling the backup, in the Features and Settings section of the backup screen, there’s a list of what actually should be backed up.

By default, Virtualmin grabs everything… but using that list, you could uncheck “Server’s home directory and web pages” – and at that point, everything but the files in $HOME would be backed up.
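The same selection can be scripted for cron with Virtualmin’s command-line API. This is a sketch based on the `virtualmin backup-domain` program; the feature names (`web`, `dns`, `mysql`) and the idea that omitting the home-directory feature skips $HOME are assumptions to verify against `virtualmin backup-domain --help` on your own install:

```shell
# Hypothetical domain and destination -- substitute your own.
# Naming features explicitly (instead of --all-features) leaves out the
# home-directory feature, so public_html is not included in the backup.
virtualmin backup-domain \
  --domain example.com \
  --dest /backup/example.com.tar.gz \
  --feature web --feature dns --feature mysql
```

Run from cron, that gives a scheduled backup of everything except the home directory, leaving public_html to whatever manual process you prefer.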