advice on local dev - remote publish


I manage many highly “volatile” (frequently changing) websites that require manual editing with Virtualmin. Creating local users with the same UIDs as on the server, so I can mount the public_html directories over NFS to access and edit the files, has become unmanageable.

Is there a way for me to develop locally under my Ubuntu (10.04.1 LTS) user account and remote-publish using the virtual server users’ credentials? (The server runs Virtualmin Pro under CentOS 5.4.)

Any help will be very much appreciated.

Best regards.


Could you maybe add some more details to what you’re trying to achieve? I’m afraid I didn’t fully understand the problem here. :slight_smile:


I may have missed some points.

I am trying to do web development on many, many sites every day. All of this data needs manual editing for one reason or another.

The fact is that under normal circumstances on Linux, you can only access files on an NFS share with the right permissions if you have the same user/group on both systems.

This forces me to log out and back in as many different users throughout the day.

Add to that the fact that UIDs and GIDs are difficult to keep in sync between my dedicated server and my work box (different OSes, different setups). This has become unmanageable.

I need a solution to develop locally on my computer, whatever the website, and to publish to my Virtualmin Pro webserver. All this from my local account.



What about using SCP or FTP, or maybe Samba, to upload the changed files to the live site? With those protocols, you can either specify or configure which live-server-local user (ID) is used for which remote-access username.
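A minimal sketch of the SCP route. Both ends are local directories here so the example runs as-is; in real use the destination would be the live server, with all names below being placeholders:

```shell
# Sketch of an scp-based publish. Both ends are local directories so this
# runs as-is; for the real thing, replace the destination with the live
# server, e.g. domainuser@server.example.com:/home/domainuser/public_html/
mkdir -p local_edit/public_html live_copy/public_html
echo '<h1>updated</h1>' > local_edit/public_html/index.html
scp -r local_edit/public_html/. live_copy/public_html/
```

Since scp logs in as the domain’s own user, the uploaded files end up owned by that user on the server side, which is what Virtualmin expects.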


Is there anything simpler, like semi-automation between a local Virtualmin server and a remote Virtualmin Pro?



What about rsync over SSH with the user’s credentials, as seen here:

Do you think that after creating the xyz domain on my local system, whatever the local user, rsync’ing as the remote user will work?

Provided that the credentials for the remote user match Virtualmin’s credentials for that user.

Your thoughts on this?



Well, there’s the “Cluster Copy Files” function; I suppose that’s as close as you can get to a Virtualmin-built-in file synchronization. But I have my doubts that it will do exactly what you’re looking for. For example, it seems that function does not preserve file date, owner and access mode. (It’s meant for different things than file sync.)

A file sync solution might be using rdiff-backup.

Question: your nick, Locutus, it’s a reference to Locutus of Borg from Star Trek: The Next Generation, isn’t it?

Yeah indeed you don’t even need rdiff-backup. rsync alone should suffice. Just did a little test:

 rsync -t -p -E -r /home/domainowner/public_html remoteuser@remotehost:/path/to/destination/

will recursively copy changed files in the source to the destination, under username “remoteuser”, preserving file dates, executability and permissions. I suppose it can’t get any easier than that. :wink:

@Duke: Yepp that’s right, it is. :slight_smile: Locutus of Borg, assimilation in progress. :wink:

As for credentials: rsync will prompt for the remote user’s password, or you can use SSH parameters to provide a password or key file.
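For the key-file route, one common setup (host name, user and key path here are all illustrative) is an alias in ~/.ssh/config on the workstation:

```
# ~/.ssh/config on the workstation; all names are placeholders
Host vmin-prod
    HostName server.example.com
    User remoteuser
    IdentityFile ~/.ssh/publish_key
```

With the matching public key installed in the remote user’s ~/.ssh/authorized_keys on the server, something like `rsync -t -p -E -r /local/site/ vmin-prod:public_html/` then runs without any password prompt.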

Thanks for the testing.

I too am testing. Here is what I found:

I brought files over with rsync with (mind the trailing slash on the source dir: rsync behaves differently depending on whether it’s there or not):

rsync -vzatpe ssh user@domain.tld:/home/remote_path/public_html /home/local_path/domain.tld/

Bringing the files over at 500 KB/s took about a minute for 25 MB.

Changing nothing and resyncing right away in the same direction took a few seconds. Normal, since nothing had changed. The other way took longer, but that’s because of my slow upload speed.

I changed 1 small file somewhere…
Resynced with:
rsync -vzatpe ssh /home/local_path/domain.tld user@domain.tld:/home/remote_path/public_html/

The rsync in the other direction took a long time at 80 KB/s, and it re-uploaded all the files. AND NOT EVEN THE ONE I MODIFIED!!!

Not wanting to waste my time again, I tested this with a dry run:
rsync -nvuzatpe ssh /home/local_path/domain.tld user@domain.tld:/home/remote_path/public_html/

It showed smaller send/receive data, so I tested again with (without the dry run):
rsync -vuzatpe ssh /home/local_path/domain.tld user@domain.tld:/home/remote_path/public_html/
Only to end up with my modified file still present locally but not remotely.

Tried again with:
rsync --size-only -vuzatpe ssh /home/local_path/domain.tld user@domain.tld:/home/remote_path/public_html/
to no avail… My modified file is still not on the remote server.

Tried multiple switches and ended up with this:
rsync --delete --size-only -vuzr -e ssh /home/local_path/domain.tld user@domain.tld:/home/remote_path/public_html/

It still doesn’t work.

The only file that I modified is not re-uploaded.

I am able to re-upload everything, but with my 80 KB/s connection, I would prefer that just the modified files be uploaded.

Any idea?



OK, I’ve got it…

I’ve got it to sync only modified files. But it also updates some other files I haven’t updated locally.

Is it possible it’s because Apache touches the files on the server? Thus rsync thinks they are modified because they’re not identical to the local copy, and therefore sends those again?
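That would be consistent with rsync’s default quick check, which compares only file size and modification time, so a file the server merely touched looks changed even though its content is identical. (Note also that -u/--update skips any file that is newer on the receiving side, which could explain a modified local file never being uploaded.) Comparing by content with --checksum sidesteps the touched-file problem; a runnable local sketch, with two local directories standing in for the two machines:

```shell
# Two local directories stand in for the workstation (site_src) and server (site_dst).
mkdir -p site_src site_dst
echo 'v1' > site_src/page.html
rsync -rt site_src/ site_dst/    # initial sync
sleep 1
touch site_dst/page.html         # simulate the server "touching" the file
# The default size+mtime quick check would flag page.html as changed again;
# -c (--checksum) compares contents instead, so no file data is re-sent:
rsync -rtc --itemize-changes site_src/ site_dst/
```

The --checksum run prints at most an attribute tweak (an itemize line starting with “.”), not a transfer (a line starting with “>”). The trade-off is that -c reads every file on both ends to compute checksums, which can be slow on large trees.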


I’m sorry, I can’t reproduce your problems…

Why all those options? Especially since “-a” expands to a whole bunch of other options…

I tested a scenario similar to yours. On a “test machine” I created a directory /test/html. I pulled in a copy of a domain’s public_html:

 rsync -rtp user@host:/home/domain/public_html/ /test/html

Took about 5 seconds. (Both machines are VMs on the same host, and it was about 72 MB to copy.) Then I modified a file in /test/html and added two new ones. Then:

rsync -rtp /test/html/ user@host:/home/domain/public_html/

Took about half a second, and all changed/added files were transmitted correctly.

Yes… but with one problem. On the sync back to where the files originally were, I had to remove the last dir in the command, otherwise all files were sent into a subdirectory bearing the same name as the source dir.

And the -a option is a nuisance when the system times (file timestamps) are different for some reason.

Here are the commands that worked:

fetch the original files from the prod server:
rsync -e ssh -avz user@domain.tld:/home/remote_path/public_html/catalog /home/local_path/domain_catalog/

put the modified files back:
rsync -e ssh -avz --exclude "catalog/system/cache" /home/local_path/domain_catalog/catalog user@domain.tld:/home/remote_path/public_html/

See how I had to remove the “catalog” dir in the “put back” command? If I don’t, it creates another catalog folder inside the original catalog folder and re-uploads all the files.
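For the record, that’s rsync’s documented trailing-slash rule: without a trailing slash, the source directory itself is copied into the destination; with one, only its contents are. A runnable local demonstration:

```shell
mkdir -p src/catalog dest_a dest_b
echo 'x' > src/catalog/index.php
# No trailing slash: the "catalog" directory itself lands inside dest_a
rsync -rtp src/catalog dest_a/     # -> dest_a/catalog/index.php
# Trailing slash: only the contents of catalog land in dest_b
rsync -rtp src/catalog/ dest_b/    # -> dest_b/index.php
```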

Do you see the same thing with your tests?