Inquiry About Backup Impact on vHost Functionality and Server

I have a couple of questions regarding the backup process for virtual hosts (vhosts) and would appreciate your assistance:
1. Does running a backup on a vhost have the potential to cause it to stop working?
2. If the size of a vhost is very large, such as 500 GB, could this create issues on the server?
Thank you for your guidance on these matters.

It can, especially if you have very little free space and a big virtual server to back up (or many virtual servers).

Regarding item 2 (one question per topic, please!): still assuming you have enough space, memory, and power, it will have some impact, and that is for you to determine. If it's a production virtual server, beware that timing could be critical.

It might have been good for the OP to add what OS etc. they are using .. just saying :grinning_face:

Could you please provide more explanation on this?

You are prompted when you first make a post, and a nice copy function is provided for you in the dashboard. Reading about such things is a good idea before asking a question, as it helps others to help you.


Amazing what you can find using the search facility (when you have exhausted that, you can reference a post asking for further clarity).

Sorry about being curt, but I've a flight to catch :upside_down_face:


Not if you have sufficient resources. It might slow things down, as backing up is resource intensive: compressing files is CPU and memory intensive, and reading and writing a bunch of data is disk intensive.
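If the CPU and disk contention is a concern, one common mitigation (a sketch, not something Virtualmin does for you out of the box) is to run the backup job at low scheduling priority with `nice` and, on Linux, low I/O priority with `ionice`. The paths below are hypothetical demo values:

```shell
# Sketch: run a backup at low CPU and disk priority so it competes
# less with production services. SRC/DEST are hypothetical examples.
SRC=/tmp/demo_src
DEST=/tmp/demo_backup.tar.gz
mkdir -p "$SRC"
printf 'site data\n' > "$SRC/index.html"

# nice -n 19 lowers CPU scheduling priority; ionice -c 3 ("idle" class,
# Linux only) lets the kernel service this I/O when the disk is free.
if command -v ionice >/dev/null 2>&1; then
  nice -n 19 ionice -c 3 tar -czf "$DEST" -C "$SRC" .
else
  nice -n 19 tar -czf "$DEST" -C "$SRC" .
fi

# List the archive contents to confirm it was written.
tar -tzf "$DEST"
```

The trade-off is that an "idle"-priority backup can take noticeably longer on a busy server, which matters if you have a tight backup window.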

You’ll only know for sure by testing, which you should do before the system is in production use. Backups are among the first things you should configure and test, and then worry about all the other stuff. Obviously, for a proper test you need all your data in place, but before you flip the switch (DNS) to make it the production server, do some testing with backups to make sure you’re comfortable with it and your server is up to the task.

More data is more resource intensive, generally speaking. But, otherwise, no, it should be fine, as far as I know. Some types of backups have size limits that would require the backup to be split up. I think in most cases Virtualmin will do that for you automatically. Again, you want to test all that before you’re depending on the server. Practicing restores and testing the backups is also worth doing before going into production.
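For scripted testing, one approach (assuming Virtualmin's command-line API is available; the domain name and destination path below are hypothetical examples) is to back up a single domain and then verify the archive is readable:

```shell
# Sketch: back up one domain via Virtualmin's CLI, then sanity-check
# the archive. Domain and paths are hypothetical, not from this thread.
if command -v virtualmin >/dev/null 2>&1; then
  virtualmin backup-domain --domain example.com \
    --all-features --newformat --dest /backup/example.com.tar.gz

  # Verify the archive is readable before relying on it.
  tar -tzf /backup/example.com.tar.gz >/dev/null && echo "archive OK"
else
  echo "virtualmin CLI not found; skipping demo"
fi
```

A full test would go further and actually restore to a scratch server (`virtualmin restore-domain`), since a backup you've never restored is an untested backup.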


One of the curious things I have found these days, while transferring domains from one server to another, is that both backup and transfer generate the backup file using gzip, and with top I see it pegs one CPU core at 100%, but only one core; it is single-threaded. Is there another method that can use all the cores of the server to speed up the process?

Other than that, it does not affect the use of the server; logically things will go slower, but it does not stop any service.


There are parallel gzip implementations (e.g. pigz or pgzip), and I guess they’d probably work without any changes in Virtualmin, as I think they accept the same arguments as gzip. I think you can configure the path for gzip in Virtualmin somewhere…however, you probably don’t want backups slamming all of your cores (though it would get done faster).
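A minimal sketch of that idea, assuming pigz is installed (the script falls back to plain gzip if it isn't). pigz's `-p` option caps the number of worker threads, which addresses the "don't slam all cores" concern, and its output remains ordinary gzip format:

```shell
# Sketch: use pigz (parallel gzip) if present, else fall back to gzip.
GZ=$(command -v pigz || command -v gzip)

printf 'hello backup\n' > /tmp/demo.txt

# pigz accepts the same basic flags as gzip; the pigz-only -p flag
# limits worker threads so compression doesn't monopolise every core.
case "$GZ" in
  *pigz) "$GZ" -p 2 -f /tmp/demo.txt ;;
  *)     "$GZ" -f /tmp/demo.txt ;;
esac

# The result is standard gzip format, readable by plain gunzip.
gunzip -c /tmp/demo.txt.gz
```

Because the output is byte-compatible gzip, a pigz-compressed backup can still be restored on a server that only has regular gzip.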


Is there no option in gzip itself to pass the number of cores to use as a parameter?

This topic was automatically closed 8 days after the last reply. New replies are no longer allowed.