These tools are the problem... here's the proof

  • Question

  • Our site www.promomanagers.com was manually added to my tools section, and a valid XML file was placed in the directories at the end of last month (a minimal sitemap sketch follows this post). On 11/27 MSNBot indexed one page and left. It's been back since, but it never hit more than a few files a day and never indexed more than that one page. Yesterday I gave up and deleted the site from the live tools.

     

    Today I checked again. In 12 hours MSNBot has hit the site 18 times and visited 74 unique pages. Its behavior is not consistent with the other two search engines.

     

    It's the tools for some of us.

    Sunday, December 23, 2007 4:20 PM
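
    For context, the "valid XML file" mentioned above is presumably a sitemap. A minimal sketch following the sitemaps.org 0.9 protocol (only the domain comes from this thread; the path, date, and entry are illustrative placeholders):

        <?xml version="1.0" encoding="UTF-8"?>
        <!-- Minimal sitemap per the sitemaps.org 0.9 protocol; the date and
             the single entry below are illustrative placeholders. -->
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url>
            <loc>http://www.promomanagers.com/</loc>
            <lastmod>2007-11-27</lastmod>
          </url>
        </urlset>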


All replies

  • Correction: I meant to say it IS consistent with the others. It would be an incredible coincidence for this to be anything but the tools, which is what I suspected all along.

     

    From 12/1 to 12/22 the site was visited 49 times for a total of 93 pages, all of which were the home page or sitemap.

     

    I deleted the site from the tools section late yesterday. Within a few hours there were 20 visits for 47 pages last evening, and there have been 18 visits for 74 pages this morning. From the moment I deleted the site until now (about 16 hours), there have been 38 visits for 121 pages, versus pre-deletion totals of 29 visits for 46 pages over the previous 21.5 days (the arithmetic is worked in the sketch after this post).

     

    Coincidence? I highly doubt it. Will Microsoft please take a look at either the tools or the assignment of XML sitemaps/authorization files? Something is clearly messed up.

     


    Sunday, December 23, 2007 4:40 PM
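
    A worked version of the comparison above, using only the figures quoted in that post (the 25x figure in the next post averages over a longer window, so the numbers differ):

        # Crawl-rate arithmetic from the figures quoted above.
        pre_pages, pre_days = 46, 21.5      # pre-deletion: 46 pages over 21.5 days
        post_pages, post_hours = 121, 16.0  # post-deletion: 121 pages over ~16 hours

        pre_rate = pre_pages / (pre_days * 24)  # ~0.09 pages/hour
        post_rate = post_pages / post_hours     # ~7.56 pages/hour

        print(f"pre-deletion:  {pre_rate:.2f} pages/hour")
        print(f"post-deletion: {post_rate:.2f} pages/hour")
        print(f"increase:      ~{post_rate / pre_rate:.0f}x")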
  • Update: six more visits for another 50 pages, for a three-day total of approximately 50 visits for ~175 pages. That's about a 25x increase over the previous two months' average daily visiting/page checking. This all occurred within hours of deleting the site from the tools.

     

    My bet is that the livesearchauth.xml files or the file keys we are given are somehow not valid, causing all sorts of problems we never know about, because the live tools don't tell us the status (a quick self-check sketch follows this post).

     

    As soon as it stopped checking for that file, it started indexing the site.

     


    Monday, December 24, 2007 3:16 PM
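
    If the suspicion is that the auth file is unreachable or malformed, that much can be checked without the tools. A minimal sketch in Python, assuming the filename livesearchauth.xml quoted above (substitute whatever filename the tools actually assigned):

        # Sketch: confirm the auth file is served and parses as well-formed XML.
        # The URL assumes the filename quoted in this thread.
        import urllib.request
        import xml.etree.ElementTree as ET

        URL = "http://www.promomanagers.com/livesearchauth.xml"

        with urllib.request.urlopen(URL) as resp:
            print("HTTP status:", resp.status)
            print("Content-Type:", resp.headers.get("Content-Type"))
            body = resp.read()

        root = ET.fromstring(body)  # raises ParseError if the XML is malformed
        print("Root element:", root.tag)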
  • I appreciate your tenacity in working with the tools to get indexed. Let me first say that there is no connection between the auth code and your index rate, nor between the tools and your index rate. We are working constantly on improving the crawler, and the webmaster team has been supplying details to the crawl team about issues as they come up. The tools are not directly connected to the crawler.

     

    The tool that checks for the auth code is the only tool that recognizes the auth code. MSNBot would see the auth code as just another piece of metadata (a sketch of such a file follows this post).

     

    Jeremiah

     

    Wednesday, December 26, 2007 4:40 AM
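
    For reference, the verification file under discussion is, by all accounts in this thread, just a small XML file carrying the account key, with no meaning to the crawler itself. A sketch of what such a file plausibly looks like (the filename comes from this thread; the structure and key below are assumptions for illustration, not a confirmed schema):

        <?xml version="1.0"?>
        <!-- Hypothetical livesearchauth.xml: the structure is assumed and the
             key below is purely illustrative; use the one the tools issued. -->
        <users>
          <user>0123456789ABCDEF0123456789ABCDEF</user>
        </users>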
  • Jeremiah, the inability of MSN to index the site takes food off our plates, so it isn't tenacity; it's survival. What may seem inconsequential to you has hurt many of us dearly. For some, the changes MSN made to the indexing of legitimate sites could cost people their livelihoods.

     

    We submitted to the spam people to see if that is part of the problem. The bot visited and indexed close to 200 pages, yet still nothing has changed in the live SERPs. Advertising specific pages with adCenter didn't get those pages even a sniff either.

     

    Something is blocking some sites from getting indexed. At least in our case it seems to have to do with the name of the site. As of this morning, searching on our name still has MSN returning the name split with a space, which tells me something is triggering it not to register the full name. I can't register an email account with MSN/Live/Hotmail using the name either, as those are "protected" terms... it certainly makes us wonder.

     

    Within an hour of deleting the site from Live Tools, the bot was indexing more than it ever had. The timing seems pretty odd. I would have thought that approval of the XML file would trigger the bot to browse and disapproval would trigger it not to. What you are saying is that the XML file really has no purpose outside of the tools, which I understand.

     

    The site we run is built on your premier ASP.NET 2.0. If the crawler can't index a structure built with ASP.NET best practices... what site is it intended to access? One written in a competing technology?

     

    What's a job for you is what feeds our families. The non-answers, intentional or not, aren't helping any of us. I haven't seen one site posted here with deindexing issues that was a bad site or irrelevant; all are sites I'd shop at or visit. If the intention of the new algorithm was to block cheaters, it seems to have had the opposite effect, judging from what I see in the live SERPs.

     

    Thanks

    Wednesday, December 26, 2007 3:28 PM
  • Jeremiah, as of this morning you still rank ads we ran on Google higher than some real pages, such as business.com and other high-ranking listings. That's really, really weird. Is the algorithm actually using paid ads on the Google network as inbound links? When searching for promomanagers, you still rank "promo manager" higher on a clubjobs site that was last updated 15 months ago.

     

    So clubjobs is more relevant than business.com for a business?

     


    Wednesday, December 26, 2007 3:40 PM