
Saturday, 20 October 2018

How To Create Custom Robots.txt For Blogger Website

In this post I will show you how to create a custom robots.txt file for your Blogger website. In the previous post, I discussed How To Set Up Custom Robots Header Tag For Blogger. The custom robots.txt file and custom robots header tags are among the most important factors for search engine optimization. Today we will discuss custom robots.txt for a Blogger blog in detail.


What is Robots.txt?

Robots.txt is a set of guidelines or rules that tells bot crawlers, such as Googlebot, how to crawl and index your website or blog for the search results. The robots.txt file for a Blogger website is different from that of a regular website.
NOTE:
Incorrect use of this feature can result in your blog being ignored by search engines, so make sure you set it up correctly before enabling it.
Custom robots.txt is a way for you to tell search engines which pages you want crawled on your blog and which pages you don't. Crawling means that crawlers, like Googlebot, go through your content and index it so that people can find it when they search for it.

Default Custom Robots.txt For Blogger Blog

User-agent: Mediapartners-Google
Disallow:

User-agent: *   
Disallow: /search
Allow: /

Sitemap: http://www.yoursite.com/sitemap.xml
Don't forget to change yoursite.com to your Blogger website's domain.
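
If you want to see how crawlers will interpret these rules before you rely on them, you can test them locally with Python's built-in urllib.robotparser module. This is only a quick sketch; yoursite.com is a placeholder and the sample URLs are made up to show which paths the rules allow and block.

# Quick local test of the default Blogger robots.txt rules (placeholder domain)
from urllib.robotparser import RobotFileParser

rules = """
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)  # feed the rules to the parser line by line

# The homepage and ordinary post URLs are allowed for all crawlers...
print(parser.can_fetch("*", "http://www.yoursite.com/"))                      # True
print(parser.can_fetch("*", "http://www.yoursite.com/2018/10/my-post.html"))  # True
# ...but anything under /search (label and search pages) is blocked.
print(parser.can_fetch("*", "http://www.yoursite.com/search/label/SEO"))      # False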

Explanation of the above code

User-agent: Mediapartners-Google
Disallow:

Mediapartners-Google is the user agent of the Google AdSense robot. This rule lets it scan your pages so that Google AdSense advertisements better match the content of your pages and serve more relevant ads on your blog. Whether your blog uses Google AdSense or not, simply leave it as it is.
User-agent: *
The asterisk is a wildcard, so the rules that follow apply to all search engine spiders, robots and crawlers. You don't need to worry about this line; it is not going to hurt your website.
Disallow: /search
This tells search engine spiders, robots and crawlers not to index any page whose URL contains /search right after the domain name, for example label and search-result pages such as http://www.yoursite.com/search/label/SEO.
Allow: /
This code refers to the homepage and means that web crawlers are free to crawl and index our website or blog's homepage, along with every other page that is not disallowed.
Sitemap: http://www.yoursite.com/sitemap.xml
This code points to the sitemap of our blog. By adding the sitemap we optimize our blog's crawl rate: whenever web crawlers scan our robots.txt file, they find the path to our sitemap, where all the links to our published posts are listed. The sitemap simply tells search engine spiders, robots and crawlers to crawl and index every new blog post.
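
If you are curious about what the crawlers actually find when they follow that path, the short sketch below fetches a Blogger sitemap and prints the URLs listed in it. It is only an illustration: example.blogspot.com is a placeholder, it assumes the blog is public, and it uses nothing beyond the Python standard library.

# Fetch a Blogger sitemap and print the URLs it lists (placeholder domain)
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://example.blogspot.com/sitemap.xml"  # replace with your blog
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as response:
    root = ET.fromstring(response.read())

# The root sitemap may list post URLs directly or point to sub-sitemaps;
# either way, every <loc> element holds a URL the crawlers will follow.
for loc in root.findall(".//sm:loc", NS):
    print(loc.text)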

How to Add Custom Robots.txt in Blogger

  • Go to your blogger dashboard.
  • Navigate to Settings >> Search Preferences >> Crawlers and indexing.
  • Click Custom robots.txt >> Edit >> Yes.
  • Now paste your robots.txt file code in the box.
  • Click on Save Changes button.
You are done!

Finally, you can check whether the change has been made to your robots.txt file. Open a new tab in your browser, type http://www.example.com/robots.txt (replace example.com with your URL) and press Enter. You will see the changes you made.
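
If you prefer to check it from a script instead of the browser, the few lines below fetch the live robots.txt and confirm that the Sitemap line was saved. Again, example.com is just a placeholder for your own domain.

# Fetch the live robots.txt and confirm the custom rules are in place (placeholder domain)
import urllib.request

with urllib.request.urlopen("http://www.example.com/robots.txt") as response:
    robots_txt = response.read().decode("utf-8")

print(robots_txt)
print("Sitemap line present:", "Sitemap:" in robots_txt)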

That's all you need to know about custom robots.txt. I hope the information in this post has helped you.


Please don't forget to share and subscribe.

