robots.txt file question - not working right

  • Question

  • I have a robots.txt file at the following URL, and when I run a test, I insert the entire URL and get the following message:

     

    Line #1: http://www.villabellemer.com/robots.txt
    Error: 'user-agent' -  should be the first tag.

     

    The content of the robots.txt file is:

     

    User-agent: *
    Disallow: /Scripts/
    Disallow: /ScriptLibrary/
    Disallow: /searchresultpages/
     
    So the user-agent is the first tag.
     
    Please, can someone help me figure this out?
     
    batche
    Friday, January 4, 2008 2:17 PM

Answers

  • I will look into this. I will file a bug and see if there is an issue with the processing of the robots.txt file. Thanks for bringing this to my attention.

     

    Jeremiah

     

    Friday, January 4, 2008 6:37 PM

All replies

  • What do you get if you paste the contents of the robots.txt file into the validation window?

    I think you are supposed to paste the contents of the file, rather than the link to the robots.txt file.


    So you would paste this into the purple "validate robots.txt" window:

    User-agent: *
    Disallow: /Scripts/
    Disallow: /ScriptLibrary/
    Disallow: /searchresultpages/


    Rather than this:


    http://www.villabellemer.com/robots.txt
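
    For what it's worth, here is a minimal sketch (in Python, purely an illustration; this is not MSN's actual validator code, and the function name is made up) of why pasting the URL triggers that exact message: a validator like this reads the box line by line, and the first non-blank line has to be a User-agent directive, which a URL is not.

    def first_directive_error(robots_text):
        """Return an error message if the first real line is not 'User-agent', else None."""
        for lineno, raw in enumerate(robots_text.splitlines(), start=1):
            line = raw.split("#", 1)[0].strip()   # ignore comments and surrounding whitespace
            if not line:
                continue                          # skip blank lines
            if line.lower().startswith("user-agent"):
                return None                       # first directive is User-agent, as required
            return "Line #%d: %s\nError: 'user-agent' -  should be the first tag." % (lineno, raw)
        return None

    print(first_directive_error("http://www.villabellemer.com/robots.txt"))  # reproduces the error
    print(first_directive_error("User-agent: *\nDisallow: /Scripts/"))       # None: passes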


    Wednesday, January 9, 2008 3:07 AM
  • That didn't work out. MSN was supposed to look into it but hasn't gotten back to me. It still shows that the last time they crawled my site at http://www.villabellemer.com was December 7. HELP!

    Saturday, January 12, 2008 10:47 PM
  • I might be of some help, even though I'm having problems with the validator myself. I don't intend to sound rude, but you are not using the validator correctly. The validator doesn't go and read the robots.txt file that you have online; it reads and interprets what you have placed in the box. From the warning you are getting and the info you provided, it appears that all you put in was the address of your robots.txt file. What you need to place in the box is the contents of your robots.txt file. If you don't mind, I have taken a look at your robots.txt file, and, no offense, but the way you have it is wrong. It should look like this:

    # robots.txt for http://www.villabellemer.com/

    User-agent: *
    Disallow: /Scripts/
    Disallow: /ScriptLibrary/
    Disallow: /searchresultpages


    If you copy and paste this into the validator, the warnings will be gone. This txt file keeps all robots off of every web page you have whose path starts with "scripts": by your first disallow, the second will not be needed, since the first will cover both and any other script. After you validate it, cut and paste it into your robots.txt file and reload it. (There is a quick way to sanity-check what it blocks, sketched below.)
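
    If you want to check what a given robots.txt actually blocks before uploading it, here is a small sketch using Python's standard-library parser (my assumption: it follows the usual prefix-matching rules, which may not match MSN's crawler exactly; the page URLs are made-up examples):

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.parse([
        "# robots.txt for http://www.villabellemer.com/",
        "User-agent: *",
        "Disallow: /Scripts/",
        "Disallow: /searchresultpages",
    ])

    # can_fetch() reports whether the given user agent may crawl the URL.
    print(rp.can_fetch("*", "http://www.villabellemer.com/Scripts/menu.js"))      # False: blocked
    print(rp.can_fetch("*", "http://www.villabellemer.com/searchresultpages/x"))  # False: blocked
    print(rp.can_fetch("*", "http://www.villabellemer.com/index.html"))           # True: allowed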
    Saturday, January 12, 2008 11:26 PM
  • You are not being rude at all - you are helping me - and for that I am grateful. It looks like I should redo it to say:

     

    # robots.txt for http://www.villabellemer.com/

    User-agent: *
    Disallow: /Scripts/
    Disallow: /searchresultpages

    Disallow: /robots.txt

     

    ########

     

    I did a Google search for st. martin villa belle mer and I get the robots.txt file as one of the results! What do you do to prevent that?

    Saturday, January 12, 2008 11:56 PM
  • Yes, but you don't need the #######, and sometimes they do get indexed anyway. You can ask to have them removed from the index, though. All you should have to do is paste this into your robots.txt file after you delete what you have on there.

    # robots.txt for http://www.villabellemer.com/

    User-agent: *
    Disallow: /Scripts/
    Disallow: /searchresultpages

    Disallow: /robots.txt



    You should now be all set.
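
    As a quick double-check of the final file (same standard-library sketch as above, an illustration only; whether an engine then drops robots.txt from its results is still up to the engine):

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.parse([
        "# robots.txt for http://www.villabellemer.com/",
        "User-agent: *",
        "Disallow: /Scripts/",
        "Disallow: /searchresultpages",
        "Disallow: /robots.txt",
    ])
    print(rp.can_fetch("*", "http://www.villabellemer.com/robots.txt"))  # False: disallowed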


    Sunday, January 13, 2008 1:01 AM
  • Jeremiah,

     

    You said you would look into this. I've followed ALL the recommendations, to no avail. Would you post the findings from the bug you filed, so we can see if there is an issue? Thanks, batche

    Wednesday, January 23, 2008 12:52 AM