Google Sitemaps Sucks
I have been trying to make use of the Google Sitemaps feature because I have noticed that my site is not being updated on Google. You can still find links that haven’t been active in months, but you can’t find new pages that were created in the last month or so.
I discovered several things:
At some point, Google changed the name of their spider bot without publicizing it widely enough that I ever heard about it.
When interacting with the Sitemaps page at Google, they apparently rely on cached content (robots.txt) to display error messages. So if you get an error about your robots.txt file and then fix it, you have no way of knowing when it’s worth trying again.
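For reference, a minimal robots.txt that explicitly lets Google’s crawler in (it currently identifies itself with the user-agent "Googlebot") looks something like this — the /private/ path is just a made-up example:

```
# Allow Google's crawler everywhere (an empty Disallow means "allow all")
User-agent: Googlebot
Disallow:

# Everyone else stays out of one hypothetical directory
User-agent: *
Disallow: /private/
```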
When you create a file, sitemap.xml for example, Google claims it doesn’t exist or some such nonsense — even though any idiot with a web browser can read the file.
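As a sanity check, you can at least confirm the file is well-formed on your end. A minimal sketch in Python, using a made-up example.com sitemap in the sitemaps.org format (this only checks the XML, not anything about what Google does with it):

```python
import xml.etree.ElementTree as ET

# A minimal sitemap.xml per the sitemaps.org protocol (hypothetical URL).
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-11-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>"""

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def check_sitemap(xml_text):
    """Parse sitemap XML and return the <loc> URLs it declares.

    Raises xml.etree.ElementTree.ParseError if the XML is malformed,
    or ValueError if the root element isn't <urlset>.
    """
    root = ET.fromstring(xml_text)
    if root.tag != NS + "urlset":
        raise ValueError("root element is not <urlset>")
    return [url.findtext(NS + "loc") for url in root.findall(NS + "url")]

print(check_sitemap(SITEMAP))  # → ['http://www.example.com/']
```

If that runs cleanly and the file is readable in a browser, the problem is on Google’s side of the fence, not yours.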
When you have an interactive tool like this, it has to INTERACT with the frickin’ site….
I like searching with Google, but I am increasingly less impressed with it from the webmaster perspective…. It shouldn’t be this damn hard to submit a text file to the top-ranked search engine out there. And I wish they would stop returning links to all these pages from my old site that haven’t existed for months….