WHS web server - robots.txt - should default to opt out?

  • Question

  • Like many of you, I have signed up to livenode.com using WHS, opened my router, etc. However, in doing so I have been thinking about robots crawling my little site and indexing, well, basically nothing. Which brings up the thought: wouldn't it be best if the WHS web server came with a robots.txt in the root folder containing:
    User-agent: *
    Disallow: /

    So that search engines simply ignored the site entirely?

    Sunday, April 22, 2007 4:40 PM

All replies

  • Well, if there are no links from other web sites to your WHS Remote Access site, you shouldn't have much of a problem. And even if there are links, only two pages can be indexed without logging in. And they don't exactly have a lot of content. :)
    Sunday, April 22, 2007 5:00 PM
  • There is not much content, but wouldn't that cause your homepage to show up in search indexes? Which could give someone the ability to hit your homepage and try to get in? I'm sure it's unlikely, but would it hurt to have a robots.txt file with a disallow-all?
    Monday, April 23, 2007 2:53 AM
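  • For anyone who wants to confirm that the two-line disallow-all file really blocks every well-behaved crawler from every path, here's a small sketch using Python's standard-library robots.txt parser (the user agent names and the sample path are just illustrative, not anything WHS-specific):

    ```python
    from urllib import robotparser

    # Build a parser and feed it the disallow-all rules directly,
    # instead of fetching a live robots.txt over HTTP.
    rp = robotparser.RobotFileParser()
    rp.parse("User-agent: *\nDisallow: /\n".splitlines())

    # Every user agent is refused for every path.
    print(rp.can_fetch("Googlebot", "/"))           # False
    print(rp.can_fetch("*", "/home/default.aspx"))  # False
    ```

    Of course this only keeps out crawlers that honor robots.txt; it isn't an access control and does nothing against someone deliberately probing the login page.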