
The Robots Exclusion Protocol provides instructions that tell search engine bots how to crawl your site. With these directives, webmasters can easily control which pages of a website search engine crawlers may access. By making simple changes to the robots.txt file, webmasters can allow or disallow crawling of certain pages by search engines such as Google, Yahoo, and Bing, and improve the site's SEO.

What is the use of a robots.txt file?
If you don't want Google or other search engines to access certain content on your site, a robots.txt file is very useful. For instance, webmasters can disallow crawling of the contact page, search results pages, deleted pages, 404 error pages, content you don't want to show up, or older pages that don't add enough value for visitors coming from search engines.

What does a simple robots.txt file look like?

User-agent: *
Disallow: /
User-agent: specifies which robot the rule applies to (e.g. Googlebot).
Disallow: specifies the URL path you want to block (e.g. /private-doc.html).
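You can check how these two directives behave using Python's standard urllib.robotparser module. A minimal sketch, assuming a hypothetical file that blocks a single private page for all bots (the domain example.com is illustrative):

```python
from urllib import robotparser

# Hypothetical robots.txt: block one private page for every crawler.
rules = """\
User-agent: *
Disallow: /private-doc.html
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The blocked page is not fetchable; every other page still is.
blocked = rp.can_fetch("Googlebot", "https://example.com/private-doc.html")
allowed = rp.can_fetch("Googlebot", "https://example.com/index.html")
print(blocked, allowed)  # False True
```

Here `can_fetch(user_agent, url)` answers the same question a crawler asks before requesting a page.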
If you want to block your blog's or website's images from Googlebot-Image, you can do so with the following code:
User-agent: Googlebot-Image
Disallow: /
If you want to block only a specific image on your website, you can do it with the following code:
User-agent: Googlebot-Image
Disallow: /images/flower.jpg
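A quick way to confirm that this rule blocks only the one image, and only for Google's image crawler, is again Python's standard urllib.robotparser. A sketch with an illustrative domain (note that robotparser's user-agent matching is a simplification of how Google actually picks a group):

```python
from urllib import robotparser

# Hypothetical rules: block a single image for Googlebot-Image only.
rules = """\
User-agent: Googlebot-Image
Disallow: /images/flower.jpg
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot-Image may not fetch the named image; other images are fine,
# and other crawlers are unaffected.
image_bot_blocked = rp.can_fetch("Googlebot-Image", "https://example.com/images/flower.jpg")
image_bot_other = rp.can_fetch("Googlebot-Image", "https://example.com/images/rose.jpg")
regular_bot = rp.can_fetch("Googlebot", "https://example.com/images/flower.jpg")
print(image_bot_blocked, image_bot_other, regular_bot)  # False True True
```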
To allow the site to be crawled by all robots, use the following code:
User-agent: *
Disallow:
Block The Entire Site From Crawling
To disallow all site content from being crawled by robots, use the following code:
User-agent: *
Disallow: /
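The difference between the allow-everything file (an empty `Disallow:` value) and the block-everything file (`Disallow: /`) can be seen side by side with Python's standard urllib.robotparser. A sketch; the helper function and domain are only for illustration:

```python
from urllib import robotparser

def can_fetch(rules, url, agent="Googlebot"):
    """Parse a robots.txt string and ask whether `agent` may fetch `url`."""
    rp = robotparser.RobotFileParser()
    rp.parse(rules.splitlines())
    return rp.can_fetch(agent, url)

allow_all = "User-agent: *\nDisallow:\n"    # empty value: nothing is blocked
block_all = "User-agent: *\nDisallow: /\n"  # "/" is a prefix of every path

ok = can_fetch(allow_all, "https://example.com/page.html")
blocked = can_fetch(block_all, "https://example.com/page.html")
print(ok, blocked)  # True False
```

Because `Disallow` rules are path prefixes, the single character `/` matches every URL on the site, while an empty value matches nothing.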
The '*' symbol in the User-agent field has a special meaning: it applies to any robot. A Disallow entry should always begin with a forward slash ( / ). If you don't know how to customize your robots.txt, please let us know. We appreciate your feedback and suggestions as well.