Robots.txt files in an Enterprise Multi Site implementation


I have seen the EpiGoogleSiteMaps module and am going to look at it further after posting this.

However, in the meantime I was wondering if anyone has any neat solutions for generating the multiple robots.txt files one needs in an Enterprise Multi-Site solution.


Pat Long

#26215 Nov 25, 2008 22:06
  • Måns H
    Member since: 2006
    Any luck with this? Having the same issue...
    #26750 Dec 18, 2008 12:15
  • Johan Björnfot
    Member since: 2004

    One idea (not tested, but it should work in theory at least... :-) is to create a "special" VirtualPathProvider for that purpose and register it only for the virtual path "~/Robots.txt". In the implementation of that provider you decide how to return the file; one option is to read it from a common storage (accessible from all sites in an enterprise scenario) such as a database, or from property values on an EPiServer page called e.g. "Robots".

    Then there is no need to have any physical Robots.txt file at all.
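
    A rough sketch of what such a provider might look like (untested; the class names and the inline lookup comment are assumptions, not something from an actual implementation):

    ```csharp
    using System.IO;
    using System.Text;
    using System.Web;
    using System.Web.Hosting;

    // Serves ~/Robots.txt from site-specific content instead of a physical file.
    public class RobotsVirtualPathProvider : VirtualPathProvider
    {
        private static bool IsRobotsPath(string virtualPath)
        {
            return string.Equals(VirtualPathUtility.ToAppRelative(virtualPath),
                                 "~/Robots.txt",
                                 System.StringComparison.OrdinalIgnoreCase);
        }

        public override bool FileExists(string virtualPath)
        {
            return IsRobotsPath(virtualPath) || Previous.FileExists(virtualPath);
        }

        public override VirtualFile GetFile(string virtualPath)
        {
            return IsRobotsPath(virtualPath)
                ? new RobotsVirtualFile(virtualPath)
                : Previous.GetFile(virtualPath);
        }

        private class RobotsVirtualFile : VirtualFile
        {
            public RobotsVirtualFile(string virtualPath) : base(virtualPath) { }

            public override Stream Open()
            {
                // Here you would look up the content for the current site,
                // e.g. from a shared database table keyed on host name, or
                // from a property on an EPiServer "Robots" page.
                string content = "User-agent: *\r\nDisallow:";
                return new MemoryStream(Encoding.UTF8.GetBytes(content));
            }
        }
    }
    ```

    The provider would then be registered once at application start, e.g. in Global.asax: `HostingEnvironment.RegisterVirtualPathProvider(new RobotsVirtualPathProvider());`.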

    #26757 Dec 18, 2008 17:11
  • Måns H
    Member since: 2006
    Ah, thanks, that's one idea, but I solved it using an HttpHandler that checks which site we're on and serves the correct file accordingly.
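    A minimal sketch of that approach (the per-host file naming convention `Robots.{host}.txt` and the fallback file name are assumptions for illustration, not the poster's actual code):

    ```csharp
    using System.IO;
    using System.Web;

    // Mapped to the Robots.txt path in web.config; picks a physical file
    // based on which site (host name) the request came in on.
    public class RobotsHttpHandler : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            string host = context.Request.Url.Host.ToLowerInvariant();

            // Convention: one file per site, e.g. Robots.www.example.com.txt,
            // with Robots.default.txt as a fallback.
            string path = context.Server.MapPath("~/Robots." + host + ".txt");
            if (!File.Exists(path))
            {
                path = context.Server.MapPath("~/Robots.default.txt");
            }

            context.Response.ContentType = "text/plain";
            context.Response.WriteFile(path);
        }
    }
    ```

    The handler would be wired up in web.config, e.g. `<add verb="GET" path="Robots.txt" type="RobotsHttpHandler" />` under `httpHandlers` (classic pipeline) or the equivalent entry under `system.webServer/handlers` in IIS7 integrated mode.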
    #26773 Dec 19, 2008 10:58