[support] different robots.txt files for one Drupal installation?

dondi_2006 dondi_2006 at libero.it
Mon Jun 12 07:51:37 UTC 2006


Greetings,

I'm setting up several independent websites:

www.myfamilywebsite.net
www.myhobbieswebsite.org
etc etc

all running off one Drupal installation, placing settings,
modules, themes etc. in the proper sites/ subfolders.

Fact is, I want some of those domains indexed by search
engines, and others (for example myfamilywebsite.net)
NOT to be indexed.

How/what do I set up to make sure that myfamilywebsite.net
sends a robots.txt file DISALLOWING robot spidering, while
the other domains serve a *different* (indexing-friendly)
robots.txt?
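(The deny-all file itself is not the problem; if I understand
the robots.txt convention correctly, it is just the standard
two lines:

    User-agent: *
    Disallow: /

while the other sites could keep Drupal's stock robots.txt.
It's the serving of a *per-domain* file that I can't figure
out.)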

I understand that this is at least partly an HTTP server
thing, but (apart from knowing how it should be done at
that level): is such a requirement compatible with one
Drupal installation? Or will I have to install and maintain
one Drupal for each domain?
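For the server-level part, here is my guess (assuming Apache
with mod_rewrite enabled; "robots-family.txt" is a made-up
filename I would drop next to the stock robots.txt in the
Drupal root). In the root .htaccess, something like:

    RewriteEngine On
    # For the family site only, answer requests for robots.txt
    # with the deny-all file instead of the shared one
    RewriteCond %{HTTP_HOST} ^(www\.)?myfamilywebsite\.net$ [NC]
    RewriteRule ^robots\.txt$ robots-family.txt [L]

But I don't know whether rules like this play nicely with the
rewrite rules Drupal's own .htaccess already ships with, hence
the question.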

TIA,
O.


