Which robots.txt directives does FAST support?

  • Question

  • We want to create a robots.txt file for our SharePoint web application to specify which locations should be crawled by any crawler and which should not.
    'User-Agent' and 'Disallow' are the only directives included in the basic robots.txt standard. However, some search engines, such as Google, support extension directives like 'Allow'. These extensions can make a robots.txt file easier to write and more effective.
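
    For example, a minimal robots.txt combining the basic directives with the 'Allow' extension might look like this (the paths below are hypothetical placeholders, and 'Allow' only takes effect for crawlers that implement the extension):

        User-Agent: *
        Disallow: /sites/internal/
        Allow: /sites/internal/public/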

    Can anyone tell us which robots.txt directives FAST supports? Does it support 'Allow', 'Request-rate', 'Crawl-Delay', and 'Visit-Time'?
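
    For reference, the extension directives we are asking about are typically written as shown below (the values are illustrative only):

        Crawl-Delay: 10          # wait 10 seconds between successive requests
        Request-rate: 1/5        # fetch at most 1 page every 5 seconds
        Visit-Time: 0600-0845    # crawl only between 06:00 and 08:45 UT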

    Thanks in advance,
    Mahavir

    .net programmer
    Tuesday, December 30, 2008 9:30 AM

All replies

  • Off-topic, as this is not a FAST support forum.

    Moving to Off-Topic
    Tuesday, December 30, 2008 2:31 PM
  • Along with the specific question, there was also a request to "redirect me to some references where I can find FAST specifications specifically for SharePoint".

    I think people will still reply if they have the requested information.

    Thanks,
    Mahavir
    .net programmer
    Wednesday, December 31, 2008 7:06 AM