Sometimes there may be places on your website where you do not want the content to be crawled and indexed. A great tool for these areas is the robots.txt file. For example, you could have a customer-specific section which is not password protected and which does not appear in your site's navigation. While this area may not contain sensitive information, you may not want the public stumbling onto it either. The robots.txt file essentially tells the engines: "do not crawl this page." Keep in mind that robots.txt is a request to crawlers, not a hard block on indexing; a blocked page can still show up in search results if other sites link to it. Another example would be a print-targeted version of a web page which omits the normal graphics, rich media files, and other bandwidth-heavy features. If the search engines were to crawl both versions, your site could run into duplicate content problems, as this page is basically a stripped-down copy (for a printer) of another. Once again, the robots.txt file will keep the search engines away from this page.
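A minimal robots.txt covering both of these examples might look like the following sketch. The /customers/ and /print/ paths are placeholders for illustration; substitute the actual directories on your own site:

```text
# Rules below apply to all crawlers
User-agent: *

# Keep the unlinked customer section out of crawls
Disallow: /customers/

# Keep printer-friendly copies from competing with the originals
Disallow: /print/
```

Each Disallow line blocks everything under that path, so grouping printer-friendly pages into one directory makes them easy to exclude with a single rule.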
The placement of your robots.txt file is very important. It must sit in the root directory of your domain (so it is served at /robots.txt), because that is the one location search engines check before crawling. A robots.txt file buried deeper in your site will simply never be found or obeyed.
For the highest security concerns, I recommend not using the robots.txt file at all. The file is publicly readable, so listing a private directory there actually advertises its location. Instead, create a username and password protected area. This gives you a much greater chance of keeping the bad guys out and the good guys in!
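As a rough sketch of what password protection looks like in practice, here is an Apache configuration fragment using HTTP Basic Auth. The "Customer Area" label and the .htpasswd path are placeholders; adjust them for your own server, and note that other web servers configure this differently:

```text
# Require a valid login for this directory (e.g. in .htaccess or a <Directory> block)
AuthType Basic
AuthName "Customer Area"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

The .htpasswd file itself is created with the htpasswd utility that ships with Apache, and should be stored outside your public web root.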
About the Author: Rhett DeMille, the owner of PalmettoSoft, is a leading search engine optimization consultant located in the Charlotte NC and Charleston SC areas.