Hi,
Your validator currently reports errors on a robots.txt file as basic as this one:
Code Block
User-agent: Googlebot
User-agent: Slurp
User-agent: msnbot
Disallow: /cgi-bin/
Sitemap: http://www.example.com/sitemap.xml
The errors reported are:
Code Block
Line #2: User-agent: Slurp
Error: 'user-agent' tag should be followed by a different tag.
**************************************************
Line #3: User-agent: msnbot
Error: 'user-agent' tag should be followed by a different tag.
In fact, it does not understand a block that applies to several agents, and instead wants this:
Code Block
User-agent: Googlebot
Disallow: /cgi-bin/
User-agent: Slurp
Disallow: /cgi-bin/
User-agent: msnbot
Disallow: /cgi-bin/
Sitemap: http://www.example.com/sitemap.xml
But if we refer to the link you give as a reference,
"A Standard for Robot Exclusion":
http://www.robotstxt.org/orig.html, we can read:
"The record starts with one or more User-agent lines, followed by one or more Disallow lines..."
The first example is accepted by Google's "Analyze robots.txt" tool and by the other validators I used.
Hope this small tool will soon be useful...
Oukiva