[support] different robots.txt files for one Drupal installation?

perke perke11 at gmail.com
Mon Jun 12 13:24:17 UTC 2006


I believe this could be helpful:

http://drupal.org/node/53579

Use this module when you are running multiple Drupal sites from a single
code base (multisite) and you need a different robots.txt file for each one.
The module generates the robots.txt file dynamically and lets you edit it,
per site, from the web UI.
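
For reference, the files themselves differ by a single character. This is
plain robots.txt syntax, nothing module-specific: the site you want kept
out of the indexes would serve

    User-agent: *
    Disallow: /

while the sites you do want indexed would serve something like

    User-agent: *
    Disallow:

(an empty Disallow permits everything). If you would rather solve this at
the web-server level instead of with a module, see the Apache sketch after
the quoted message below.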

On 6/12/06, dondi_2006 <dondi_2006 at libero.it> wrote:
>
> Greetings,
>
> I'm setting up several independent websites:
>
> www.myfamilywebsite.net
> www.myhobbieswebsite.org
> etc etc
>
> all off one Drupal installation, placing settings,
> modules, themes, etc. in the proper sites/ subfolders.
>
> The thing is, I want some of those domains indexed by search
> engines, and others (for example myfamilywebsite.net) NOT
> to be indexed.
>
> What do I set up to make sure that myfamilywebsite.net serves a
> robots.txt file DISALLOWING robot spidering,
> while the other domains serve a *different* (indexing-friendly) robots.txt?
>
> I understand that this is at least partly an HTTP server
> thing, but (apart from knowing how it should be done at
> that level): is such a requirement compatible with one
> Drupal installation? Or will I have to install and maintain one
> Drupal installation for each domain?
>
> TIA,
> O.
>
> --
> [ Drupal support list | http://lists.drupal.org/ ]
>
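
On the HTTP-server angle raised in the quoted message: yes, this can also
be done in Apache alone, with a single shared Drupal installation. A minimal
sketch, assuming Apache with mod_rewrite enabled and the rules placed in the
.htaccess in the Drupal root, above Drupal's own rewrite rules; the
sites/myfamilywebsite.net/robots.txt path is just Drupal's usual multisite
folder convention, used here as an illustration of where you might keep a
static per-site file:

    # Serve a per-site robots.txt for this one domain;
    # all other domains fall through to the shared robots.txt
    # in the Drupal root.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?myfamilywebsite\.net$ [NC]
    RewriteRule ^robots\.txt$ sites/myfamilywebsite.net/robots.txt [L]

Repeat the condition/rule pair for each domain that needs its own file.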

