Greetings,
I'm setting up several independent websites:
www.myfamilywebsite.net, www.myhobbieswebsite.org, etc.,
all off a single Drupal installation, placing settings, modules, themes, and so on in the proper sites/ subfolders.
The thing is, I want some of those domains indexed by search engines, and others (for example, myfamilywebsite.net) NOT to be indexed.
How/what do I set up to make sure that myfamilywebsite.net serves a robots.txt file DISALLOWING robot spidering, while the other domains serve a *different* (indexing-friendly) robots.txt?
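My guess is that something like this in Apache's .htaccess could do it, rewriting /robots.txt to a different file only for the no-index domain (the file name robots-disallow.txt is just something I made up, and I haven't tested this):

```apacheconf
# Serve a per-domain robots.txt: requests for /robots.txt on
# myfamilywebsite.net are rewritten to a disallow-everything file;
# all other domains fall through to the normal robots.txt.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?myfamilywebsite\.net$ [NC]
RewriteRule ^robots\.txt$ robots-disallow.txt [L]
```

where robots-disallow.txt would contain:

```
User-agent: *
Disallow: /
```

Is that the right approach, or is there a Drupal-side way to do it per site?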
I understand that this is at least partly an HTTP server thing, but (apart from knowing how it should be done at that level): is such a requirement compatible with a single Drupal installation? Or will I have to install and maintain one Drupal per domain?
TIA, O.