
Wednesday, October 31, 2012

Robots

What is a Robots.txt file?

A robots.txt is a file placed on your server that prevents Search Engine Spiders from crawling certain sections of your site, or your entire site.

The robots.txt file is easy to create, as it is just a plain text file you can write in Notepad. It is often used by illegal sites to stay out of search listings and survive on the Internet, but it has some valid advantages as well.
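
As a rough illustration, here is what a minimal robots.txt might look like; the directory names, bot name, and sitemap URL below are hypothetical placeholders:

    # Block every crawler from the drafts area
    User-agent: *
    Disallow: /drafts/

    # Block one particular bot from the whole site
    User-agent: BadBot
    Disallow: /

    # Point well-behaved bots at the sitemap
    Sitemap: https://www.example.com/sitemap.xml

The file lives at the root of the site (e.g. www.example.com/robots.txt), and each User-agent block applies to the crawlers whose name matches it.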

Here are some reasons why people use a robots.txt file, and why you might want to use one:

 1. You don’t want your unfinished site to appear in search results before it is ready.
 2. It can discourage bots that harvest email addresses for spammers, though malicious scrapers often ignore robots.txt.
 3. You have sensitive content that you do not wish to have indexed. Porn sites come under this category.
 4. You can exclude specific pages from the search engine listings, which can be a useful advantage.
 5. It can also act as an invitation for bots to crawl your site, since well-behaved crawlers check the file before fetching anything (see the sketch after this list).
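
To make that last point concrete, here is a minimal sketch of how a well-behaved crawler consults robots.txt before fetching a page, using Python’s standard-library urllib.robotparser; the site URL and user-agent name are hypothetical placeholders:

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the site's robots.txt
    rp = RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()

    # Ask whether this user agent may fetch a given page
    if rp.can_fetch("MyCrawler", "https://www.example.com/drafts/page.html"):
        print("Allowed to crawl this page")
    else:
        print("Disallowed by robots.txt")

Only crawlers that perform a check like this are controlled by robots.txt; anything that skips it is unaffected, which is why the file is a convention rather than a security measure.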
