Tuesday, October 17, 2006

New (Useless) Bells and Whistles for Google Webmasters

When Google announced the launch of Google Webmaster Central, I said it was silly. I wasn't referring to the webmaster tools section of Webmaster Central, but now that part is getting silly too. Today, they announced new features like Googlebot activity reports and Crawl rate control. Pictured below is one of the Googlebot activity reports for my firm's site.

Number of pages crawled per day:
[Chart: Googlebot pages crawled]

I tried the Crawl rate control (in the Tools section on the Diagnostic tab). At first, it recommended keeping the Normal setting. I went back a few minutes later and it said, "We've detected that Googlebot is limiting the rate at which it crawls pages on your site to ensure it doesn't use too much of your server's resources. If your server can handle additional Googlebot traffic, we recommend that you choose Faster below." I've set it to Faster and will see what impact that has.

Do webmasters need a graph showing Googlebot activity? Do we need a throttle control for the crawler? I don't think so. While I commend Google for engaging webmasters more meaningfully than in the past, none of this seems particularly useful. I'd rather know precisely which pages have been crawled than look at "these cool charts" of aggregate Googlebot activity. That information is already readily available in a webmaster's own web server log file. My firm's free spider tracking tool will show which pages Googlebot has visited, the time of each visit and the HTTP status code (see demo of Googlebot visit tracking). IMHO, that's an ugly UI but useful information, and I prefer it to a pretty UI with useless information. Now, if the Googlebot activity chart shows a flatline or odd spikes, that could be useful. IOW, it's only useful in extremely bad situations. It's essentially binary information: either Googlebot crawling is good (1) or bad (0). The chart is superfluous.
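
If you'd rather skip third-party tools entirely, the same report can be pulled straight from the log file. Here's a minimal sketch in Python, assuming an Apache combined-format access log; the "access.log" path is a placeholder, and matching on the user-agent string alone is naive, since anyone can claim to be Googlebot (a thorough check would verify visitors with a reverse DNS lookup).

```python
import re

# Apache combined log format:
# host ident user [time] "request" status bytes "referer" "user-agent"
LOG_LINE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+)[^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_visits(log_path):
    """Yield (time, status, path) for each request claiming a Googlebot user-agent."""
    with open(log_path) as log:
        for line in log:
            match = LOG_LINE.match(line)
            if match and "Googlebot" in match.group("agent"):
                yield match.group("time"), match.group("status"), match.group("path")

if __name__ == "__main__":
    # "access.log" is a placeholder; point this at your server's actual log.
    for time, status, path in googlebot_visits("access.log"):
        print(time, status, path)
```

Each output line gives the timestamp, status code and path of a Googlebot request, which is the same information the demo above shows, minus the pretty UI.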

What webmasters really need to know is the status *after* the crawl takes place. What is done with the information? Are there any problems with the content or the structure of the site that Googlebot crawled? Are there problems with internal or external linking? How about updating the "link:" utility in the search results? That'd be more useful. How about a chart showing incoming and outgoing links for all crawled pages on a given site? Honestly, I think Google's trying to show that it's making an effort. Does anybody find these new bells and whistles useful?
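
To be fair, the outgoing half of that link chart is something any webmaster could approximate today by crawling his own site. Here's a rough sketch, assuming Python and nothing beyond the standard library; the example.com start URL and the page cap are placeholders. The incoming half, links from other people's sites, is exactly what only Google's index can provide, which is why it belongs in their tools rather than ours.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collect href values from anchor tags on a single page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def link_report(start_url, max_pages=50):
    """Crawl one site breadth-first, printing internal vs. outgoing link counts per page."""
    site = urlparse(start_url).netloc
    seen, queue = {start_url}, deque([start_url])
    while queue and len(seen) <= max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url).read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to fetch
        parser = LinkCollector()
        parser.feed(html)
        internal = external = 0
        for href in parser.links:
            absolute = urljoin(url, href)
            if urlparse(absolute).netloc == site:
                internal += 1
                if absolute not in seen:
                    seen.add(absolute)
                    queue.append(absolute)
            else:
                external += 1
        print(f"{url}: {internal} internal links, {external} outgoing links")

if __name__ == "__main__":
    # Placeholder; substitute your own site's home page.
    link_report("http://www.example.com/")
```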

Reading my blog, you'd probably get the impression that I'm not a big fan of Google. On the contrary, I really enjoy search engine marketing work, and working with Google AdWords in particular. However, I've been quite irritated with Google's lack of support for advertisers. As such, I'm a bit perplexed as to why Google would devote resources to supporting webmasters (the free listings) when it hasn't yet built an organization to support advertisers (the paid listings). Even Webmaster Central itself is a halfhearted attempt at customer service. Consider the last line in the Google Webmaster Central Blog entry:
As always, we hope you find these updates useful and look forward to hearing what you think.
That link isn't an email address, and it doesn't send you to a contact page. No, it sends you to the Google Webmaster Help Google Group, a mess full of trolls, spammers and misinformed, self-proclaimed SEO experts. Reminds me of Mos Eisley.

