Prevent server's FQDN from being indexed by Google

Hi,

We have Virtualmin running on an Ubuntu 12.04 instance on AWS.
We use Route 53 for DNS management.
We have a FQDN for our instance (e.g. server.domain.com) which has an A Record pointing to the instance’s public IP address.
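For context, the record itself is just a plain A record; in zone-file notation it would look something like the line below (the TTL and IP address here are placeholders, with the IP taken from the documentation range):

    server.domain.com.    300    IN    A    203.0.113.10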

This all seems to be working fine, but the problem is that Google has indexed ‘server.domain.com’, and it serves our first (alphabetically) Virtualmin virtual server, which happens to be a site with thousands of pages!

This is causing duplicate content for that domain, and is also a problem for our company site ‘domain.com’.

Is this set up correctly, or is there some way we can prevent this?

Should we actually create a virtual server for ‘server.domain.com’?

Thanks

Howdy,

What I might suggest doing is choosing a preferred default website for your server.

Anytime a domain or IP address points to your server but doesn’t have a Virtual Server set up, the server will fall back to the default Virtual Server.

And the simplest thing you could do there is just pick a default Virtual Server that you want seen in those cases… perhaps your company’s website?

You can set a Virtual Server as the default by going into Server Configuration -> Website Options, and setting “Default website for IP address”.
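For anyone curious what that option does at the Apache level: when a request’s hostname matches no ServerName or ServerAlias, Apache serves the first VirtualHost defined for that IP and port, and the “Default website for IP address” setting arranges for your chosen site to be that fallback. A rough sketch of the result, in the Apache 2.2 syntax that Ubuntu 12.04 ships (domain names and paths below are placeholders):

    # Being the first VirtualHost for *:80, this site is what Apache serves for
    # any hostname with no VirtualHost of its own -- e.g. server.domain.com
    <VirtualHost *:80>
        ServerName domain.com
        ServerAlias www.domain.com
        DocumentRoot /home/domain/public_html
    </VirtualHost>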

-Eric

Hi, wouldn’t it be possible to prevent Google from indexing that site with robots.txt?

@helpmin: I think he doesn’t want to prevent the site from being indexed in general, but only the “copy” Google reaches via the server’s FQDN. I’d go with Eric’s suggestion here!

If there is no useful existing default website, just create a new one, use that as the default, and put a single page like “nothing here” into it.
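If you do set up such a placeholder, helpmin’s robots.txt idea can be folded into it without touching the real sites: give only the placeholder a robots.txt that disallows everything, so just the copy reached via the server’s FQDN is closed to crawlers. A minimal sketch in Apache 2.2 syntax (what Ubuntu 12.04 ships), with placeholder names and paths:

    # Placeholder default site: first VirtualHost for *:80, so it catches
    # server.domain.com and any other name without its own virtual server
    <VirtualHost *:80>
        ServerName server.domain.com
        DocumentRoot /var/www/default
    </VirtualHost>

    # In /var/www/default, keep just two files:
    #   index.html  -- a single "nothing here" page
    #   robots.txt  -- containing:
    #       User-agent: *
    #       Disallow: /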

Thank you all for your responses. Sorry for the delay in responding; I had forgotten to subscribe to this post.

What we’ve ended up doing is what I suggested in my first post: creating a virtual server for ‘server.domain.com’, with an Apache website enabled for it. We’ve also set it as the default domain for the server.

Thanks for the tip on that, Eric! The default was indeed set to the first alphabetical domain, as that would have been the first virtual server we added.

‘server.domain.com’ now returns a 403, so Google should drop those pages from its index over time and we shouldn’t have any more issues moving forward.
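For reference, in case anyone else ends up here: a blanket 403 like that can be produced with a deny-all directory block inside the FQDN’s virtual host. The sketch below uses the Apache 2.2 directives that ship with Ubuntu 12.04 (on Apache 2.4 the equivalent is ‘Require all denied’); the paths are placeholders rather than our actual configuration:

    <VirtualHost *:80>
        ServerName server.domain.com
        DocumentRoot /home/server/public_html

        <Directory /home/server/public_html>
            # Refuse every request so this hostname always answers 403 Forbidden
            Order allow,deny
            Deny from all
        </Directory>
    </VirtualHost>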

Thanks again.