Backup errors: "gzip failed!" and "TAR failed"

Hi, I am relatively new to Virtualmin and in the process of transferring sites over to a VPS (CentOS 6.3).

Things are generally going well, so I wanted to start taking backups. But I'm getting "…gzip failed!" when the Virtualmin backup gets to compressing the MySQL database.

So I thought I'd try a Webmin config backup, but that errors with "… failed! TAR failed".

I can’t find any more detailed log.

I have checked that gzip and tar are available at the command line on the server.
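A quick way to go beyond "the binaries exist" is to round-trip a small test archive by hand, which exercises the same tar + gzip path the backup uses (the file names here are arbitrary, not anything Virtualmin itself creates):

```shell
# Confirm the tools are on the PATH.
which tar gzip

# Round-trip a tiny archive in /tmp to prove tar + gzip work end to end.
echo "test" > /tmp/backup-test.txt
tar -czf /tmp/backup-test.tar.gz -C /tmp backup-test.txt
tar -tzf /tmp/backup-test.tar.gz    # should list backup-test.txt

# Clean up.
rm -f /tmp/backup-test.txt /tmp/backup-test.tar.gz
```

If this succeeds as root but the panel's backup still fails, the problem is more likely environment, disk space, or resource limits than the tools themselves.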

I have searched the forums but can't find a specific answer to this issue.

It must be a newbie setup issue, as everything is basically a new install.

Your help is appreciated.

Howdy,

Those normally work with no problem on a new system.

What does “df -h” show?

Also, what kind of VPS are you using? Is it by chance OpenVZ? If so, can you paste in the contents of /proc/user_beancounters?
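The checks Eric suggests can be run in one go; a full filesystem (or exhausted inodes) is a classic cause of gzip/tar failures, and on OpenVZ a nonzero `failcnt` column in the beancounters means a container resource limit was hit (the fallback message below is mine, not Virtualmin output):

```shell
# Free disk space and free inodes; either one hitting 100% breaks archiving.
df -h
df -i

# OpenVZ containers only: look for nonzero values in the failcnt column,
# which indicate a resource limit (often privvmpages) was exceeded.
cat /proc/user_beancounters 2>/dev/null || echo "not an OpenVZ container"
```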

-Eric

In answer to your queries:

df -h

Filesystem      Size  Used Avail Use% Mounted on
/dev/xvda1       79G   12G   64G  16% /
tmpfs           466M  172K  466M   1% /dev/shm

The system apparently uses Miniserver VM® technology; there is no user_beancounters in /proc.


Given that tar and gzip work at the command line, I'm guessing at permission issues. I'm logged in as root at the Webmin/Virtualmin level, so I'd assume that is all good.

OK, so when I ran the backups for the virtual servers, I set the destination to a local directory in /root and ticked 'create directory'.

Virtualmin created the directory but failed on the gzip.

Based on my ‘permissions’ suspicion, I just set the new directory to 777 and the backups are now running.

The directory was:
drwxr-xr-x 2 root root 4096 Feb 20 20:04 backups-as-transferred

So that suggests the backup process is not running as root. But that can't be right, as the backup files are getting created as root now that the directory is set to 777.

e.g.
-rw-r--r-- 1 root root 5981 Feb 21 11:28 roisites.com.tar.gz.dom
-rw-r--r-- 1 root root  315 Feb 21 11:28 roisites.com.tar.gz.info
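A sketch of how to narrow the permissions question down, using the directory name from the post (adjust the path to match your setup). If the backup really does run as root, root-owned 700 should be enough, so 777 working while 755 fails points at something other than simple ownership:

```shell
# Show owner, group, and octal mode of the backup destination.
DIR=/root/backups-as-transferred
stat -c '%U:%G %a %n' "$DIR"

# Once the real cause is found, tighten the mode back down:
# chown root:root "$DIR" && chmod 700 "$DIR"
```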


With the "TAR failed" on Webmin config backups, I have tried the local directory (777) and also tried FTP, yet the tar still fails, so I have to assume tar is doing something in a working directory where the permissions don't make sense either.


Does any of this help get to the bottom of what is going on?

With the tar, on Webmin config backup, I can make it work if I don't select all modules.

Selecting all modules makes the tar fail.

It isn't a specific module: I have selected the top half, the bottom half, and every combination within reason, and the tar works up to a certain number of modules, but selecting all of them fails. The files aren't big or that numerous, so I suspect it is something to do with the length of the command line that is generated by selecting them all. That is a guess, but is there an OS limit, or an internal array limit?
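The command-line-length guess is checkable: Linux does cap the combined size of a process's arguments and environment (ARG_MAX), and an over-long tar invocation would fail with "Argument list too long". Whether Webmin actually hits this is an assumption, but the limit itself is easy to read:

```shell
# Maximum combined length of argv + environment for a single exec(), in bytes.
getconf ARG_MAX

# GNU xargs reports the argument-length limits it will actually use.
xargs --show-limits < /dev/null 2>&1 | head -n 5
```

If ARG_MAX is small (128 KB on older kernels) and the all-modules tar command approaches it, that would match the "works up to a certain amount" symptom.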

Is it abnormal to select all modules to back up, or is it more normal to select one at a time into different files? If so, that is a bit tedious, isn't it?

My two cents' worth: get an account at adrive.com (they have WebDAV) and back up there.

Then use crontab instead.
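A minimal sketch of that approach, assuming the Virtualmin command-line client is installed (the schedule, flags, and destination path below are illustrative, not taken from this thread; uploading the result to a WebDAV account would be a separate step):

```crontab
# Run as root's crontab (crontab -e). Backs up every domain at 02:30 daily
# to a local directory; adjust the destination to suit.
30 2 * * * /usr/sbin/virtualmin backup-domain --all-domains --all-features --dest /root/backups/
```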

I tried to do a recovery and got TAR failed.

So I went back to test a backup/recovery and still got "gzip failed", even though I had changed the permissions of the folder I was using.

So something is wrong, but this is pretty much an 'out of the box' setup.

Can anyone suggest ways to debug this and get it going?

OK, I tested on my standalone PC with CentOS 6.3 and it worked.

So I thought I'd bring both machines up to exactly the same level, and noticed that there was an update to Webmin (even though I only installed it last week). Stopped Webmin, updated, started it again, and the problem was gone.

I suspect a memory resource issue: I have a 1 GB VPS and was flying close to the wire. Not now; it seems to have a bit more headroom.
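For anyone hitting the same thing, the memory theory can be checked directly; on a small VPS a compression step can die under memory pressure, and the kernel usually logs any out-of-memory kills (the fallback message below is mine):

```shell
# Memory and swap headroom in megabytes.
free -m

# Any OOM kills are normally visible in the kernel log.
dmesg | grep -i "out of memory" || echo "no OOM kills logged"
```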

Fixed.