What exactly is this new crawler hoping to achieve here?

  • Question

  • My question concerns the new crawler that fakes a referer.

    OK, so I know the general point is to detect cloaking. I understand that. But my question would be: what is this hoping to achieve long term? It is a short-term solution at best, and will end up creating a better, stronger breed of cloakers. Everyone I've talked to so far has updated their cloaking units and seems to be getting quite adept at blocking it. Is there a future, follow-up plan? Or is this just going to be here forever, polluting logs and probably not stopping most cloakers?

    Thanks for your time.

    Thursday, December 6, 2007 3:34 AM
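
    The referer-based check the question describes can be sketched roughly as follows: fetch the same page twice, once plainly and once claiming to arrive from a search results page, and flag it if the two responses differ. This is only an illustration of the general technique; the function names and the sample referer URL are assumptions, not the actual Live Search crawler.

    ```python
    # Rough sketch of a referer-based cloaking check: request the same page
    # with and without a search-engine Referer header and compare responses.
    # All names and the sample referer URL are illustrative assumptions,
    # not the actual Live Search implementation.
    import urllib.request

    def build_request(url, referer=None):
        """Build a request, optionally claiming to arrive from `referer`."""
        req = urllib.request.Request(url)
        if referer:
            req.add_header("Referer", referer)
        return req

    def looks_cloaked(url, fake_referer="http://search.live.com/results.aspx?q=example"):
        """Return True if the 'search visitor' response differs from the
        plain response -- the signature a referer-cloaking page would show."""
        with urllib.request.urlopen(build_request(url)) as r:
            plain = r.read()
        with urllib.request.urlopen(build_request(url, fake_referer)) as r:
            via_search = r.read()
        return plain != via_search
    ```

    Note that a naive byte-for-byte comparison also flags ordinary dynamic content (rotating ads, timestamps), so a real checker would need to normalize responses before comparing.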

Answers

  • That is a fair point. The tool you are referring to is one of several ways we detect cloaking; we will continue to evolve all of our methods to minimize the footprint and keep cloaking spam to a minimum in our index.

    regards,
    nate
    Wednesday, December 12, 2007 4:30 AM

All replies

  • Ah yes, I'm sure that it will continue evolving. And for that matter, I don't support the idea of completely eliminating this crawler. Google actually does the exact same thing, just more selectively (they spam out http://www.google.com/search?q=abc), and has not garnered nearly the same attention.

    Are there plans to at least refine this software so it leaves less of a footprint? I'll say this straight out: I am a blackhat SEO (with some whitehat tendencies). With that in mind, I can think of several ways to modify it so that it would create a less noticeable footprint while maintaining nearly the same effectiveness. Knowing what I do about the people who work on the Live Search team, I have to conclude that they have probably considered many of the same possibilities, so it seems odd that a more selective approach has not been deployed.

    If you'd like to talk more about this (though I doubt anything I have to say is especially needed, but hey, it might help), feel free to contact me at the e-mail address I use here, but @gmail.com.


    But yeah, getting back to the question. Will a more selective approach be adopted?

    And thank you for the response by the way, I appreciate it.

    Thursday, December 13, 2007 9:58 PM
  • This constant faking of the referer is a little disruptive to the weblogs. It is a bore to explain away someone's enthusiasm, so I guess I need to do some clever trickery in the stats package to compensate if I want this problem solved.

    For the month of March '08:
    614 http://search.live.com/results.aspx?q=samekeyword&mrt=en-us&FORM=LIVSOP

    I would not mind if the site were actually on the first page of results, but it is not, and is in fact nowhere near the first page (it is on page 12). It is a keyword I dearly want the website to be on the first page for.

    Mind you, the number of hits with the false referer is a lot less than the number of bot visits from live.com.

    PS I do not do cloaking etc.

    PPS I like this post editing toolbar - is it available anywhere?

    Monday, April 14, 2008 11:33 AM
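
    The "clever trickery in the stats package" could be as simple as a post-processing filter over the access log. A minimal sketch, assuming an Apache combined log format and a hand-maintained set of query terms the site is known not to rank for (so any hit claiming such a referer must be the checker, not a real searcher) -- both of those are assumptions, not something from the thread's actual stats package:

    ```python
    # Sketch of a stats-package filter: drop log hits whose referer is a
    # Live Search results URL for a query the site does not rank for.
    # The combined log format and the hand-picked query list are assumptions.
    import re
    from urllib.parse import urlsplit, parse_qs

    COMBINED = re.compile(
        r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<request>[^"]*)" '
        r'\d+ \S+ "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
    )

    def is_checker_hit(line, suspicious_queries):
        """True if the hit claims a Live Search referer whose q= term is in
        `suspicious_queries` -- terms the site does not actually rank for,
        so no genuine visitor could arrive that way."""
        m = COMBINED.match(line)
        if not m:
            return False
        ref = m.group("referer")
        if "search.live.com/results.aspx" not in ref:
            return False
        q = parse_qs(urlsplit(ref).query).get("q", [""])[0]
        return q in suspicious_queries

    def filter_log(lines, suspicious_queries):
        """Keep only hits that look like genuine visits."""
        return [ln for ln in lines if not is_checker_hit(ln, suspicious_queries)]
    ```

    The weakness of this rule is the same one the poster hints at: it needs per-site knowledge of which queries are implausible, so it has to be maintained by hand.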