Friday, January 26, 2007

Why is Google Blogging About Robots.txt?

Not quite sure why this post, Controlling how search engines access and index your website, is on the main Google Blog instead of the Google Webmaster Central Blog. Anyway, the post is about the robots.txt file. It's worth validating a robots.txt file any time you create or update one; this is not a file you want errors in. Here's the robots.txt validator I've been using lately.
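To make the idea concrete, here's a minimal sketch of the kind of check a validator performs, using Python's standard urllib.robotparser module (in the Python 2 of that era the same module was just called robotparser). The bot names and paths below are made-up examples, not anything from Google's post:

from urllib.robotparser import RobotFileParser

# A tiny robots.txt: let Googlebot fetch everything, keep every other
# crawler out of /private/. (Illustrative rules only.)
rules = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Spot-check a few URLs the way a validator would.
print(parser.can_fetch("Googlebot", "http://example.com/private/page.html"))     # True
print(parser.can_fetch("SomeOtherBot", "http://example.com/private/page.html"))  # False
print(parser.can_fetch("SomeOtherBot", "http://example.com/index.html"))         # True

A real validator also flags syntax problems (misspelled directives, stray whitespace, rules before any User-agent line) that a parser like this would silently ignore, which is exactly why it's worth running one after every edit.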


1 Comment:

John said...

How would you keep the robots.txt from being crawled and indexed? (or the sitemaps file?)
http://www.google.com/search?q=inurl%3A%22robots.txt%22
http://www.google.com/search?q=inurl%3A%22sitemap.xml%22
:-)

5:57 PM  
