November 06, 2003

A simple robots.txt

If you ever take the time to look at your web server access logs, you might be surprised to see tons of 404 errors for a file called robots.txt. While you could spend the time to make an elaborate file, why not spend 30 seconds making a very simple one that will reduce the number of 404 errors and make search engine spiders that much happier to visit your site?

A generic robots.txt file that welcomes all robots and denies none would look like this:


User-agent: *
Disallow:

More examples can be found at clockwatchers.com.
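If you want to double-check what a robots.txt file actually permits, here is a quick sketch using Python's standard-library urllib.robotparser (the example URL is just a placeholder):

```python
# Sketch: verifying the permissive robots.txt above with Python's
# standard-library urllib.robotparser.
from urllib.robotparser import RobotFileParser

# The two-line file from the post: all robots welcome, nothing denied.
rules = "User-agent: *\nDisallow:\n"

parser = RobotFileParser()
parser.parse(rules.splitlines())

# An empty Disallow line means every path is allowed for every robot.
print(parser.can_fetch("*", "http://example.com/any/page.html"))  # True
```

The same parser can be pointed at a live site with set_url() and read(), which is handy when testing a more elaborate file.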

Posted by mark at November 6, 2003 10:27 AM