You can easily create a Robots.txt file by setting up a free Google Webmaster Tools account. After creating your Google Webmaster Tools account, select the “crawler access” option under the “site configuration” option on the menu bar. Once you’re there, you can select “generate robots.txt” and set up a simple Robots.txt file.
After creating the Robots.txt file, select the “block” option under “action” and then specify the robots that you want to block under “User-agent.” After that, simply type the directories that you want to block under “directories and files.” As you do this, be sure to leave the “http://www.yoursite.com” part of each URL off. For example, suppose you want to block the following pages:
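For illustration, here are two examples (these paths are hypothetical):

```
http://www.yoursite.com/cgi-bin/
http://www.yoursite.com/private/
```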
You would type the following into the “directories and files” field in Google Webmaster Tools:
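Assuming, hypothetically, that the pages to block were /cgi-bin/ and /private/, the entries (with the domain portion left off) would be:

```
/cgi-bin/
/private/
```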
After adding these rules for all robots and clicking “add rule,” you would end up with a Robots.txt file that looks like this:
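Assuming, for illustration, that the blocked directories were the hypothetical /cgi-bin/ and /private/, the generated file would look something like:

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
Allow: /
```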
Notice here that you have a default “Allow” command, which is useful if you want to make an exception and allow one robot to access a page that you have blocked for all other robots with a “Disallow” command.
By placing the command:

User-agent: Googlebot
Allow: /images/

below the disallow command, you’d be allowing ONLY the Googlebot to access the images directory of your site. Once you’ve specified which pages and files you want to block, click the “download” option to download your Robots.txt file.
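If you want to double-check the effect of rules like these before uploading, Python’s standard urllib.robotparser module can evaluate a robots.txt file locally. The sketch below assumes a file that blocks the /images/ directory for every robot but allows it for Googlebot; the site domain and file path are placeholders:

```python
# Sanity-check robots.txt rules locally with Python's standard library.
# The rules below block /images/ for all robots, then allow it for Googlebot.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /images/

User-agent: Googlebot
Allow: /images/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Googlebot matches its own "User-agent: Googlebot" group, so it may crawl:
print(parser.can_fetch("Googlebot", "http://www.yoursite.com/images/photo.jpg"))
# Any other robot falls back to the "*" group and is blocked:
print(parser.can_fetch("OtherBot", "http://www.yoursite.com/images/photo.jpg"))
```

Because a robot uses the most specific User-agent group that matches it, the Googlebot group takes precedence over the “*” group, so only Googlebot may fetch from /images/.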
Installing Your Robots.txt File
Once you have your Robots.txt file, you can upload it to the main (www) root directory of your website. You can do this using an FTP program like FileZilla. The other option is to hire a web programmer to create and install your Robots.txt file for you by letting them know which pages you want blocked. If you choose this option, a good web programmer can complete the job in less than an hour.