Restore failing on some domains, others work - kindly need your help

I had to upgrade a regular RHEL x64 server, so I backed up all 10 domains; they were neatly saved as domain.com.tar.gz.
I had backed up the MySQL files separately, so the problem is only with the HTML backups.

I used the following command to move them:

scp -rp *.com.tar.gz root@2ndserver:

After the 1st server was upgraded to Linux 5.x, I used the same command to move the files
from the second server back to the 1st.
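
In hindsight, one way to catch a bad copy early is to checksum the archives on each hop. A minimal sketch, with the hostname and destination as placeholders:

# on the source server, record a checksum for each archive
md5sum *.com.tar.gz > backups.md5
# copy the archives together with the checksum file
scp -p *.com.tar.gz backups.md5 root@2ndserver:
# on the destination, confirm the copies match
md5sum -c backups.md5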

7 domains restored flawlessly, with no problems.

BUT 2 domains of similar sizes failed on the same server.
I just need to restore the HTML files of these remaining 2 domains.

  1. RESTORE ON VIRTUALMIN
    I got the following errors:

Restore failed : The specified source is not a Virtualmin backup : Not a valid tar or tar.gz file

After several moves, I tried decompressing the tar.gz file and re-compressing the
*.com_dir file that is inside *.com.tar.gz.

I then tried the restore in Virtualmin and then in the Webmin Filesystem module:

tar: x.com.tar.gz: Cannot open: No such file or directory
tar: Error is not recoverable: exiting now
tar: Child returned status 2
tar: Error exit delayed from previous errors

Restore failed! See output above.

  2. On Webmin Filesystem:

restoring x.com.tar.gz -p -z …

ERROR:
gzip: stdin: invalid compressed data--crc error
gzip: stdin: invalid compressed data--length error
tar: Skipping to next header
tar: Child returned status 1
tar: Error exit delayed from previous errors

Restore failed! See output above.

  3. Then I tried to restore with just the *.com_dir file that is inside *.com.tar.gz:

tar: Skipping to next header
tar: Archive contains obsolescent base-64 headers
tar: Error exit delayed from previous errors

Restore failed! See output above.

Kindly advise me how I can get the restore of these domains to work.

Howdy,

It sounds like a few of those backup archives may be corrupted.

Do you still have your live server somewhere? That is, can you re-generate the backups?

If so, I’d start there… re-creating the backups that aren’t working may resolve the problems you’re seeing now.

If not – what you’d probably need to do is look at those archives, and determine if they’re recoverable. If you run the command “file” against them, what does it say? For example, “file domain.tld.tar.gz” – does it recognize it as gzip or tar data? Also, how large is the file?
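
A fuller integrity check might look something like this (substituting the real filename):

file domain.tld.tar.gz
# test the gzip stream end-to-end without extracting anything
gzip -t domain.tld.tar.gz
# list the tar contents; errors here point at damage inside the archive
tar -tzf domain.tld.tar.gz > /dev/null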

-Eric

Dear Eric

Besides the markets and the servers, one is also fighting the icy air, but all the rest is fine :) I hope you are doing fine.

I wish I had done some testing on the original server, but I trust Webmin and Virtualmin blindly, and they usually work fine. After all, I was able to restore 7 domains instantly.
I do have the “original backup” copied on server 2, which is RHEL 5.x, and I also copied it from there
back to the new server 1 (Linux 5.x).
Just for reference, the original server 1 where I made the backups ran Linux 4.x. Could that be the cause?

Here is the output:
file *.net.tar.gz: gzip compressed data, from Unix, last modified: Tue Jan 18 19:19:36 2011

They are fairly large files.
One further hopeful aspect:

I have the .net_dir file that came out of the .gz. Is there any way to use this file for the restore?

The domain archive *.net.tar.gz decompressed to a *.net_dir file of size 1,831,987,200 bytes;
the original *.net.tar.gz is 292,880,865 bytes.

When I do tar -xf or tar -xvzf:
gzip: stdin: invalid compressed data--crc error
gzip: stdin: invalid compressed data--length error
tar: Skipping to next header
tar: Child returned status 1
tar: Error exit delayed from previous errors

file *.com.tar.gz: gzip compressed data, from Unix, last modified: Tue Jan 18 19:18:58 2011

I did:
tar -xf *.com.tar.gz

gzip: stdin: invalid compressed data--crc error
gzip: stdin: invalid compressed data--length error
tar: Skipping to next header
tar: Child returned status 1
tar: Error exit delayed from previous errors

and the .com_dir file was extracted.
Files:
-rw-r--r-- 1 root root 561879040 Jan 18 19:06 *.com_dir
ORIGINAL backup: -rw-r--r-- 1 root root 95298665 Jan 21 14:44 *.com.tar.gz
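
So what seems to happen is that tar extracts everything up to the corrupt block and then gives up. Doing the two steps by hand looks roughly like this (the filenames are placeholders):

# zcat writes everything up to the bad block, then exits with an error
zcat domain.com.tar.gz > salvage.tar
# extract whatever complete entries tar can still parse
tar -xvf salvage.tar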

I will now go to the 2nd server, try the above there, and report back.

Second domain

Hello again.

I did:

tar -xf *.net_dir

and got the following error:

tar: Skipping to next header
tar: Archive contains obsolescent base-64 headers
tar: Error exit delayed

Now it did restore just a few files, maybe 2%.
Is there any way out?

A miracle! I went back to the 2nd server where I had “stored” the backups of the 1st one.

Here, in the same location where the file was, without copying it anywhere, I did:

tar xvzf *.com.tar.gz

_dir
_dns
_virtualmin
_web etc.

After that I did:

tar -xf *.com_dir

and it extracted all the files; obviously the file ownership has changed. I think I can now move these files to the 1st server.
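
To fix the ownership after moving the files over, something along these lines should work (the user, group, and path are placeholders for the actual domain owner and home directory):

# reapply the domain owner's user and group to the restored files
chown -R exampleuser:exampleuser /home/exampleuser/public_html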

I will now try the other, bigger file.

I am posting this for the benefit of anyone else in the same situation.
As per my last post, I was able to extract the HTML files of one domain. I used the Webmin Filesystem backup, and a .gz was created. When I copied it with scp -rp to the other server and tried the restore, I got a similar error as above:

tar: Skipping to next header
tar: Archive contains obsolescent base-64 headers
gzip: stdin: invalid compressed data--crc error
gzip: stdin: invalid compressed data--length error
tar: Child returned status 1
tar: Error exit delayed from previous errors

Could this have something to do with 32/64-bit? But both servers have similar setups.
I will now try to copy the data as plain files, since moving the compressed file didn't work.
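
For the plain-file copy, rsync may be a safer choice than scp, since it can compare files by checksum; a sketch, with placeholder paths and hostname:

# -a preserves permissions and ownership, -c compares files by checksum
rsync -avc /home/exampleuser/public_html/ root@server1:/home/exampleuser/public_html/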

If you are backing up and moving to another server, make sure both have similar setups.
If not, perhaps there should be another way, or a fix for this issue.

I am sure there must be a way out. In my case I was able to back up, copy to the other server, then copy back to the original server and restore, but only for some domains. This world is very bizarre :)

OK, as in the post above, I was able to extract the files from the smaller archive. The other, larger file is still giving errors.

I did tar xvzf and got *.net_dir

file *.net_dir
*.net_dir: POSIX tar archive

tar xf *.net_dir

tar: Skipping to next header
tar: Archive contains obsolescent base-64 headers
tar: Error exit delayed from previous errors
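
At this point it may help to see how much of the archive is still readable before extracting. For example (the filename is a placeholder):

# list what tar can still parse, logging the errors separately
tar -tvf example.net_dir > listing.txt 2> errors.txt
# GNU cpio reads tar format too and may be worth trying as a second reader
cpio -itv -H tar < example.net_dir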

After a search I found a Perl script, which I ran, but it just read through all the content files and did not give me any clue:

http://aplawrence.com/Bofcusm/2646.html

You can check the Perl script at the link, or I can send or attach it. Just chmod 755 and run it. I did, but nothing came out, or maybe I didn't understand it.

This .gz file is about 292 MB and the .net_dir file is about 1.8 GB.
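
Another tool that is sometimes suggested for damaged gzip streams is the gzip Recovery Toolkit (gzrt), which tries to skip over corrupt blocks; assuming it is installed, the usage is roughly as follows (filenames are placeholders):

# attempt to recover the decompressed data past the damaged blocks
gzrecover -o salvage.tar domain.net.tar.gz
# then extract whatever tar can still read
tar -xvf salvage.tar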

If you have any solutions, please do write to me. I shall be very grateful for any help.

I just want to say that Mr. Jamie Cameron and the other friendly people of Webmin and Virtualmin are wonderful.

I don't know what I would do without them.

Can someone tell me whether they have had the .com_dir file come out of a backup.com.tar.gz? How can one work around this to extract the backed-up files?

Howdy,

Sorry, I’m not really sure about the errors you’re seeing above. It sounds like the backup does have some corruption in it. It also sounds like, with enough expertise and tweaking, it’s possible you could gain access to at least some of the data in that archive.

But I’m unfortunately not really familiar with how to do all that, and I don’t have much advice I can offer for your situation :-)

Perhaps someone else who has run into that can offer some thoughts though!

-Eric

Thank you, Eric. I hope someone can help. I have searched, but there is not much of a solution out there.

The gunzipped file *.net_dir is perhaps a tar archive without the .tar extension.

It is very strange indeed. I hope someone sees this message and can give some hints.