New Virtualmin virtual-server module version 3.30

Hi all,
<p>
I’ve just rolled out the latest Virtualmin virtual-server module, version 3.30, to all repositories. It has numerous bugfixes and usability enhancements, along with one major feature enhancement.
<p>
This version includes support for backups to Amazon’s S3 storage service, which provides an extremely reliable and reasonably priced remote backup solution. This feature requires the Perl XML::Simple module; installation of this module is covered below.
<p>
Changes since 3.29:
<p>
[list]

[]The cache file used by the lookup-domain.pl program to determine if a mailbox is close to its disk quota is automatically flushed when a user’s or domain’s quota is changed, which increases the speed at which such changes are detected.[/]

[]When renaming a virtual server, an option is available to rename any mailboxes in the domain that contain the old server name.[/]

[]A city or locality name can be entered when generating a certificate.[/]

[]Added an option to use Spanish to the Joomla script installer.[/]

[]Changed the ‘PHP Options’ page to ‘Website Options’, and added a field for enabling log writing via a program (to protect against a missing ~/logs directory).[/]

[]When restoring template backups, existing templates are no longer deleted. This makes copying templates to new servers easier.[/]

[]Added checkboxes and a button on the Server Templates page to delete several at once. [/]

[]Fixed the osCommerce script installer, so that the admin module works.[/]

[]Virtual server backups can now be made to Amazon’s S3 service, which provides online storage (at a price). Similarly, restores can be made from the same service. Before you can use this feature, you must sign up for an account with S3 and get an access key and secret key.[/]

[]Each reseller can have an IP address specified on which virtual servers with shared-address websites under his ownership will be set up. All DNS records in those servers’ domains will use that IP, which allows resellers to appear to have a dedicated server for their customer domains.[/]

[]The change IP address page can now modify the IP of name-based servers, if more than one possibility is available (such as a reseller’s IP). Similarly, the modify-domain.pl program now takes a --shared-ip option to do the same thing.[/]

[/list]
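For those wondering what the access key and secret key are actually used for: S3 authenticates each request by signing a canonical request string with your secret key. This is a minimal illustrative sketch of that legacy signing scheme (not Virtualmin’s code; the secret key and string-to-sign below are placeholder examples):

```shell
# Sketch of S3's legacy request signing: HMAC-SHA1 over a canonical
# "string to sign", base64-encoded. Key and request are placeholders.
secret='uV3F3YluFJax1cknvbcGwgjvx4QpvB+leU8dUj2o'
string_to_sign='GET


Tue, 27 Mar 2007 19:36:42 +0000
/awsexamplebucket/photo.jpg'
# The resulting signature goes into the request's Authorization header.
sig=$(printf '%s' "$string_to_sign" | openssl dgst -sha1 -hmac "$secret" -binary | base64)
printf '%s\n' "$sig"
```

This is only to show why both keys matter: the access key identifies the account, while the secret key never leaves your machine and is only used to produce the signature.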

<p>
To upgrade on Red Hat based systems:
<p>
yum update wbm-virtual-server
<p>
On SUSE systems:
<p>
yast -i wbm-virtual-server
<p>
On Mandriva systems:
<p>
urpmi.update -a<br>
urpmi wbm-virtual-server
<p>
And on Debian and Ubuntu systems:
<p>
apt-get update<br>
apt-get install webmin-virtual-server
<p>
If you plan to use the Amazon S3 backup feature, you will also need the Perl XML::Simple module. It is provided in the system repositories of most supported systems; for those that lack it, I will package it and roll it into our repository over the next couple of days.
<p>
To install on Fedora 5/6 and CentOS 4:
<p>
yum install perl-XML-Simple
<p>
To install on Red Hat Enterprise Linux 4:
<p>
up2date perl-XML-Simple
<p>
To install on OpenSUSE 10.0:
<p>
yast -i perl-XML-Simple
<p>
To install on Mandriva:
<p>
urpmi perl-XML-Simple
<p>
And on Debian/Ubuntu:
<p>
apt-get install libxml-simple-perl
<p>
Fedora versions below 5 and CentOS/RHEL below 4 do not include this module. I’ll package it shortly and add it to our repositories for these systems.
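After installing, a quick sanity check (not part of the official instructions) is to ask Perl to load the module; it prints “installed” if XML::Simple is available and “missing” otherwise:

```shell
# Check whether Perl can load XML::Simple; prints "installed" or "missing".
perl -MXML::Simple -e 'print "installed\n"' 2>/dev/null || echo missing
```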
<p>
Please let us know of any problems, by filing a bug in the bug tracker.

I tried the Amazon S3 backup this evening and hit a bit of a roadblock. I installed libxml-simple-perl fine and reran the backup, but was then prompted with:

Backup failed : The Perl module S3::AWSAuthConnection needed to communicate with Amazon’s S3 service is not installed

I’ve searched and cannot seem to come up with the apt package name for this. Also, are there any other packages that might pop up after installing this one that might save some time and another post? :slight_smile:

I updated from 3.29 to 3.30 this evening on RHEL4
The update caused apache to hang. Stopping/starting httpd failed, and the server required a reboot before web pages were accessible again.

[Sun Dec 10 19:38:09 2006] [notice] caught SIGTERM, shutting down
[Sun Dec 10 19:38:10 2006] [notice] suEXEC mechanism enabled (wrapper: /usr/sbin/suexec)
[Sun Dec 10 19:38:11 2006] [notice] Digest: generating secret for digest authentication …
[Sun Dec 10 19:38:11 2006] [notice] Digest: done
[Sun Dec 10 19:38:11 2006] [notice] LDAP: Built with OpenLDAP LDAP SDK
[Sun Dec 10 19:38:11 2006] [notice] LDAP: SSL support unavailable
[Sun Dec 10 19:38:11 2006] [error] (28)No space left on device: Cannot create SSLMutex
Configuration Failed
[Sun Dec 10 19:39:39 2006] [crit] (28)No space left on device: mod_rewrite: could not create rewrite_log_lock
Configuration Failed

Space is not an issue

Hey Tom,

I’ll look into it. I thought only XML::Simple was needed. :wink:

I’ll plan to add it to the repository soon, if it’s not already in the Debian universe.

Hey Jeff,

[Sun Dec 10 19:38:11 2006] [error] (28)No space left on device: Cannot create SSLMutex
Configuration Failed
[Sun Dec 10 19:39:39 2006] [crit] (28)No space left on device: mod_rewrite: could not create rewrite_log_lock
Configuration Failed

Space is not an issue

Space almost certainly is the issue, but it’s probably not where you expect it to be. It has to do with the SysV IPC shared memory and semaphores on the system. You can run ipcs to see the existing segments. You may be able to free some up by restarting the processes that are holding them (or figuring out what is eating more than its fair share and fixing it). Feel free to post the output of that command in the Help! forums, and I’ll try to help you figure out what we need to do to fix it.

It could also be that you don’t have /dev/shm mounted for some reason. I would think you’d get a different error, though.

Oh, yeah, I’ll point out that this has nothing to do with the Virtualmin version–Virtualmin doesn’t use any of this memory. On my systems it is mostly used by apache and postgresql.
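To illustrate the kind of cleanup ipcs enables, here is a dry-run sketch that extracts apache-owned semaphore IDs from ipcs-style output and prints the ipcrm commands that would remove them. The sample lines are made up; in practice you would pipe real output (`ipcs -s | awk ...`) and double-check the IDs before actually running ipcrm:

```shell
# Dry run: from 'ipcs -s'-style lines, print the ipcrm commands that
# would free semaphore arrays owned by apache. Sample data is made up.
sample='0x00000000 65536 apache 600 1
0x00000000 98305 apache 600 1
0x00000000 131072 root 600 1'
printf '%s\n' "$sample" | awk '$3 == "apache" {print "ipcrm -s " $2}'
# prints:
# ipcrm -s 65536
# ipcrm -s 98305
```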

Joe, here are the results of ipcs

# ipcs

------ Shared Memory Segments --------
key shmid owner perms bytes nattch status
0x00000000 131072 root 600 33554432 22 dest
0x00000000 163841 root 600 524288 22 dest

------ Semaphore Arrays --------
key semid owner perms nsems
0x00000000 65536 apache 600 1
0x00000000 98305 apache 600 1
0x00000000 262146 apache 600 1
0x00000000 294915 apache 600 1
0x00000000 458756 apache 600 1
0x00000000 491525 apache 600 1
0x00000000 524294 apache 600 1
0x00000000 557063 apache 600 1

------ Message Queues --------
key msqid owner perms used-bytes messages

# free -m
total used free shared buffers cached
Mem: 4953 3121 1831 0 194 1852
-/+ buffers/cache: 1073 3879
Swap: 4094 0 4094

I’m getting an error when trying to backup using s3…

Starting backup of 1 domains to Amazon S3 bucket chris …
HTTP/1.0 500 Perl execution failed Server: MiniServ/0.01 Date: Sun, 4 Mar 2007 15:22:15 GMT Content-type: text/html Connection: close

Error - Perl execution failed

syntax error at line 1, column 0, byte 0 at /usr/lib/perl5/vendor_perl/5.8.8/i386-linux-thread-multi/XML/Parser.pm line 187

I’m running Fedora 5 and have installed the xml::simple as described above.

Any thoughts?

Chris

I’m getting the same error under centos4

I also have xml::simple installed.

Starting backup of 15 domains to Amazon S3 bucket virtualmin1359 …

HTTP/1.0 500 Perl execution failed Server: MiniServ/0.01 Date: Sun, 18 Mar 2007 09:21:35 GMT Content-type: text/html Connection: close
Error - Perl execution failed

syntax error at line 1, column 0, byte 0 at /usr/lib/perl5/site_perl/5.8.5/i386-linux-thread-multi/XML/Parser.pm line 187

John

Hi,

Has anyone experienced the above error besides me and Chris?

Or can anyone offer suggestions as to corrective steps I should try?

My keys are OK, since I can access my S3 account with the S3 Firefox extension and other tools, so I’m wondering what I can do to get this working.

This would be the most fantastic way to back up the server.

Thanks,
John Wolgamot

Hey John,

Yeah, there’s all kinds of stupid with S3 backups. We’re still not installing all of the dependencies, and the remaining missing deps aren’t even visible–they’re hidden inside the response from the server.

I’m fixing them first thing tomorrow. I’ll ping this thread when it’s easily resolvable.

I’ve been wanting to use S3 ever since I started using Virtualmin. I just tried the above on my Ubuntu system, and all I keep getting is "Backup failed : Missing or invalid S3 secret key". I even regenerated my key (which was an OK thing to do for me, because I have NEVER been able to use it for anything yet.) :frowning:

It really taxes my little DSL circuit every Sunday morning when I get a gigabyte download to my home just to keep backups current on my Virtualmin box. Any help with this would be very much appreciated.

Tom Beattie & Bryan Smith

UPDATE: I feel pretty stupid now, but for those of you having problems with this: copy and paste your ID and secret key into Notepad or similar and make sure you don’t have a space on the tail of either one. That had been the case for me ever since the Perl module got fixed. (Thanks guys!) As of this writing, at least, it’s happy enough that it’s now making the archives to upload to S3.
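The trailing-space gotcha described above is easy to check for from a shell. A small sketch (the key value is a made-up placeholder):

```shell
# A pasted key with a stray trailing space; strip trailing whitespace
# before using it. Brackets make any leftover space visible.
key='AKIAEXAMPLEKEY   '
trimmed=$(printf '%s' "$key" | sed 's/[[:space:]]*$//')
printf '[%s]\n' "$trimmed"   # prints: [AKIAEXAMPLEKEY]
```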

Hi,

I did a test restore months ago to make sure restoring from Amazon worked, and it worked great.

For some reason I just got this error when trying to restore a server.

Error - Perl execution failed
File does not exist: at S3/ListBucketResponse.pm line 26

Can anyone point me in the right direction?

Thanks ahead,
Wolga

Thanks Joe,

Looking forward to using S3.

I do nightly scheduled ssh backups to another virtualmin server I use for experiments which resides on a 156/34 DSL connection.

Recently I started backing up using my home cable 800/80 connection using cygwin to receive the backups. Anyway… both backups are on residential grade internet connections.

Since my house cable connection is 800/80 I can backup from the VPS virtualmin server to cygwin quickly so that’s nice.

A backup or restore from the amazon S3 backup would happen at around 800 per second either direction.

A restore from my house would be pushed out at 80 so to restore 3, 4, or 5 gigs of server would take a while.

To restore from the DSL connection at a 156 down / 34 up from my experimental virtualmin server would be nerve wracking. ;-(

Although I haven’t had to restore :sunglasses: yet… I know the day will come so I am hoping I can do it from amazon S3

HUMBLE REQUEST

Out of curiosity… After the :smiley: "all kinds of stupid" S3 issues are worked out in the virtual server backups would it be hard to offer S3 backups under webmin/system/filesystem backup area?

Also, would it be possible to build in a field to list directories and files you would like to exclude from a backup or scheduled backup?

Hey John,

Thanks for the informative update! I knew I’d get that bastard figured out eventually…it was a missing dependency, but one that only showed up inside of the error message returned from Amazon…so nobody ever saw the details. Again, just a stupid bit of me not realizing which packages were actually needed.

A pleasant surprise. Performance is unbelievable.

Thanks so much for the S3 feature that is now complete and working perfectly.

I yum-updated the day before your major update notification on the home page, and I noticed that the S3 support had been completed.

I’ve run several tests and all have worked.

Here is a rundown for anyone interested.

I have 2 servers: a VPS-Link-5 server at www.vpslink.com, and a test server to play with and to use for backups. Both are running CentOS and Virtualmin Pro.

I did a backup of all 14 of my v-servers on my VPS to S3 and then a restore from S3 to my physical test server.

Once Virtualmin Pro packaged them up, it transferred them to S3 at the rate of 5,250.0 KB/s… OMG

I wanted to use the nightly scheduled backup method using a date and number so I used the following bucket filename:

test_%m_%d_%Y_%u

I entered my S3 access key and secret key

Checked:
Do strftime-style time substitutions on file or directory name

Checked:
One file per server, in new format

Checked:
Create destination directory?

Checked:
Halt the backup immediately (Action on error)

It placed all virtual servers in a bucket named:
test_04_19_2007_4
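The bucket name above is just the pattern expanded with standard strftime substitutions, which you can preview from a shell before scheduling the backup:

```shell
# Expand the bucket-name pattern from the post with strftime-style
# substitutions: %m month, %d day, %Y year, %u weekday (1-7).
date +test_%m_%d_%Y_%u
```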

Thanks once again to Jamie and Joe for this fabulous feature.


Same issue on Debian Jessie now when upgrading from Wheezy. How can we fix this? Will the patch be included in the next upgrade of Webmin/Virtualmin?