robots.txt

Answers


    Hi Abyzn,

    The first is correct; the second would need a / after it to allow a robot to crawl your whole site. The first example states "all user agents are disallowed nowhere", i.e. allowed everywhere. It is a matter of preference, but the Allow directive is usually used to open access to a certain sub-section of your domain or to call out specific crawlers.
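
    For reference, a minimal sketch of the two variants, assuming standard robots.txt semantics (the # comments are mine, not part of the original question):

    # An empty Disallow blocks nothing, so the whole site may be crawled
    User-agent: *
    Disallow:

    # Allow needs a / after it to explicitly permit the whole site
    User-agent: *
    Allow: /

    Combining the two directives, an example would be: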

    User-agent: *
    Disallow: /

    User-agent: MSNbot
    Allow: /

    Sitemap: http://www.sikandarmachines.com/sitemapspal.xml

    This states that no user agent may crawl your domain from the root level, but then specifies an exception for MSNbot, which can crawl your entire site. Once a bot is declared by name, it ignores all directives in the * block, so you will need to repeat any directives you want the named bot to follow.
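
    As a sketch of that repetition rule (the /private/ path is hypothetical, chosen just for illustration):

    # All bots: stay out of /private/
    User-agent: *
    Disallow: /private/

    # MSNbot ignores the * block above, so /private/ must be repeated here
    User-agent: MSNbot
    Disallow: /private/
    Allow: /

    Without the repeated Disallow line, MSNbot would be free to crawl /private/ as well.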

    Hope this helps,

    Brett

    Thursday, November 20, 2008 9:21 PM