
What Is the Importance of a Robots.txt File in SEO?



The robots.txt file is important in SEO because it instructs search engines which pages of your website they may crawl and index, and which they may not. For instance, if you specify in your robots.txt file that you don't want search engines to access your sales page, they won't crawl it. Restricting search engines from accessing certain pages on your site matters both for privacy and for SEO.

How Robots.txt Works 

When a user searches for keywords, the search engines send small programs called "spiders" or "robots" across the World Wide Web to bring back information that can be indexed in the search results. Using a "Disallow" directive, you can stop these spiders from accessing pages you don't want them to index. For example, to restrict your sales page, enter this robots.txt command:

User-agent: *

Disallow: /sales

The "User-agent:" line specifies which robot you want to block, and could also read as follows:

User-agent: Googlebot

This command would block only Google's robot, while other robots could still access the page.
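You can check how crawlers will read these rules with Python's standard-library robots.txt parser. This is a minimal sketch: the rules and the example.com URLs are illustrative assumptions, not a real site.

```python
# Sketch: testing robots.txt rules with Python's standard library.
from urllib.robotparser import RobotFileParser

# Illustrative rules: block only Googlebot from /sales.
rules = """\
User-agent: Googlebot
Disallow: /sales
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot is blocked from the sales page...
print(parser.can_fetch("Googlebot", "https://example.com/sales"))  # False
# ...but a crawler with a different name is still allowed.
print(parser.can_fetch("Bingbot", "https://example.com/sales"))    # True
# Googlebot can still reach pages that are not disallowed.
print(parser.can_fetch("Googlebot", "https://example.com/blog"))   # True
```

In practice you would point `RobotFileParser` at your live file with `set_url()` and `read()`; parsing an inline string, as here, is just a convenient way to test rules before publishing them.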

Your robots.txt file must be located in the root directory of your site, for example: https://www.example.com/robots.txt

Why Should You Block Some Pages

If you have duplicate pages on your site, you should block one of them; otherwise the duplicate content can hurt your SEO.

The second reason is when you have a page on your site that you don't want users to reach unless they take a specific action. For example, if your site has a "thank you" page that users reach only after entering their email address, you probably don't want people to find that page through a Google search.
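For a case like that, the rule follows the same pattern as above. The `/thank-you/` path here is an assumption; use whatever URL your own thank-you page lives at:

```
User-agent: *
Disallow: /thank-you/
```

Keep in mind that robots.txt only asks well-behaved crawlers not to visit the page; anyone who already knows the URL can still open it directly.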

Another reason to block pages or files is to protect private files on your site, such as your cgi-bin directory, and to keep your bandwidth from being used up by robots indexing your image files:

User-agent: *

Disallow: /images/

Disallow: /cgi-bin/

So now you know that by including a few commands in your robots.txt file, you can stop search engine spiders from indexing some of your pages.


Mr. Kumar MS is a Digital Marketing Speaker & Trainer who provides tips on measuring the success of digital marketing. His excellent knowledge will help you become an Internet marketing expert. He is the founder and mentor of the National Institute of Digital Marketing, Bangalore, and has trained more than 4,000 professionals in digital marketing techniques. He can be reached at