Hello,
I have an issue on two different virtual servers running the latest version of Virtualmin, so I thought I would mention it here.
Operating system Ubuntu Linux 10.04.1 x64
Virtualmin version 3.90.gpl GPL
Webmin version 1.570
For some reason, backups initiated by server owners to their local browser do not work.
I set the Backup Destination to “Download in browser” and select the “Single archive file” backup format.
However, on both servers, using any server owner account (different accounts, different sites), I get the following error:
Backup failed : Failed to open /tmp/.webmin/643293_1_backup.cgi.tar.gz.dom for writing : Bad file descriptor
This is logged in the webmin logs as well.
However, in the /tmp/.webmin directory I do see the following files:
643293_1_backup.cgi.tar.gz and 643293_1_backup.cgi.tar.gz.info
So it seems some files are created there fine, yet the error complains about a “.tar.gz.dom” file.
Backups to a local directory work fine; the problem is only with download to browser.
The same backup, initiated as a download to the browser by the server admin, works fine.
My Workaround: ******* USE AT OWN RISK!
I have looked at the /usr/share/webmin/virtual-server/backup.cgi file, and I don’t see any mention of .dom files (for instance, where permissions are set).
So I modified the code as follows (changes on lines 137 and 150; surrounding code included for clarity):
if ($dests[0] eq "download:") {
	# Special case .. we backup to a temp file and output in the browser
	$temp = &transname().($config{'compression'} == 0 ? ".tar.gz" :
			      $config{'compression'} == 1 ? ".tar.bz2" : ".tar");
	foreach $t ($temp, $temp.".info") {
		&open_tempfile(TEMP, ">$t", 0, 1);
		&close_tempfile(TEMP);
		&set_ownership_permissions($doms[0]->{'uid'}, $doms[0]->{'gid'},
					   0700, $t);
		}
	&set_all_null_print();
	($ok, $size) = &backup_domains([ $temp ], \@doms, \@do_features,
				       $in{'fmt'}, $in{'errors'}, \%options,
				       $in{'fmt'} == 2, \@vbs, $in{'mkdir'},
				       $in{'onebyone'}, $cbmode == 2, undef,
				       $in{'increment'});
	&cleanup_backup_limits(0, 1);
	unlink($temp.".info");
	&run_post_actions();

To:
if ($dests[0] eq "download:") {
	# Special case .. we backup to a temp file and output in the browser
	$temp = &transname().($config{'compression'} == 0 ? ".tar.gz" :
			      $config{'compression'} == 1 ? ".tar.bz2" : ".tar");
	foreach $t ($temp, $temp.".info", $temp.".dom") {
		&open_tempfile(TEMP, ">$t", 0, 1);
		&close_tempfile(TEMP);
		&set_ownership_permissions($doms[0]->{'uid'}, $doms[0]->{'gid'},
					   0700, $t);
		}
	&set_all_null_print();
	($ok, $size) = &backup_domains([ $temp ], \@doms, \@do_features,
				       $in{'fmt'}, $in{'errors'}, \%options,
				       $in{'fmt'} == 2, \@vbs, $in{'mkdir'},
				       $in{'onebyone'}, $cbmode == 2, undef,
				       $in{'increment'});
	&cleanup_backup_limits(0, 1);
	unlink($temp.".info");
	unlink($temp.".dom");
	&run_post_actions();

That is, I added the “.dom” file to the foreach loop so it is pre-created with the right ownership and permissions, and added an unlink() so it is cleaned up afterwards.
Backup to browser seems to work fine.
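For anyone wanting to see the pattern the workaround relies on in isolation, here is a minimal sketch in plain shell (not Webmin code; the directory and file names are illustrative): every companion file the backup will write, including the .dom file, is pre-created with restrictive permissions up front, so a later, less-privileged process only writes to files that already exist, and the companions are removed afterwards.

```shell
#!/bin/sh
# Illustrative sketch only: pre-create the archive and its companion
# files (as the patched foreach loop does), then clean the companions up
# afterwards (as the added unlink() calls do).
TMPDIR=$(mktemp -d)
TEMP="$TMPDIR/backup.tar.gz"

# Pre-create the archive plus its .info and .dom companion files.
for t in "$TEMP" "$TEMP.info" "$TEMP.dom"; do
    : > "$t"          # create the file empty
    chmod 0700 "$t"   # restrict it to the owning user
done

ls "$TMPDIR"          # backup.tar.gz, backup.tar.gz.dom, backup.tar.gz.info

# Remove the companion files once the backup is done.
rm -f "$TEMP.info" "$TEMP.dom"
ls "$TMPDIR"          # only backup.tar.gz remains
rm -rf "$TMPDIR"
```

In backup.cgi the same idea additionally uses &set_ownership_permissions() to hand the pre-created files to the domain owner's uid/gid, which is what lets the server owner's process write to them.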
Perhaps this should be looked at and a fix added to the next version, if it is an issue for everyone.
Thanks!