How to Set Up a Custom Robots.txt in Blogger [Custom Robots.txt]



The custom robots.txt method for Blogger [Blogspot] is the latest blogging tutorial we will discuss this time. In a previous post we discussed Setting Custom Robots Header Tags for Blogger, which I reviewed in full; in this tutorial we will discuss the robots.txt settings.


If you have read that post, you will already be aware of how important these robot settings are for ranking on search pages (Google, Yahoo, Bing, Yandex, etc.). Now we will walk through, in full, how to set up a custom robots.txt for Blogger/Blogspot.


What is Robots.txt?

Robots.txt is a text file containing a few simple lines of code. The file is stored on the website or blog's server and instructs web crawlers how to crawl and index the blog for search results.


This means we can restrict web crawlers from certain pages on your blog: through robots.txt, specific pages or posts can be excluded so that unnecessary pages, such as your blog's label pages, archive pages, or other unimportant pages, are not indexed by search engines.


Always remember that search bots/crawlers read the robots.txt file first, before crawling any web page. That is why we need to configure a custom robots.txt for Blogger, so the file can be used to full effect.


Every blog hosted on blogger has a default robots.txt file that looks something like this:

User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /

Sitemap: https://niadzgn.blogspot.com/sitemap.xml


Explanation of the Contents of the Robots.txt File

Now let's look at each line in the file above and explain what it does, so we can better understand how the robots.txt settings work and make them more SEO-friendly.


User-agent: Mediapartners-Google

This code is for the Google AdSense robot, which helps it serve better ads on your blog. Whether or not you use Google AdSense on your blog, you can keep this code; it will not affect the indexing of posts and pages on your blog.
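The effect of this group can be checked locally with Python's standard `urllib.robotparser` module. This is just a sketch: the rules string below mirrors the Mediapartners-Google group and the `*` group from the default file, and the domain is the example blog used in this tutorial.

```python
from urllib.robotparser import RobotFileParser

# Rules mirroring the default Blogger robots.txt discussed above.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# An empty "Disallow:" means the AdSense crawler may fetch everything,
# even the /search pages that are blocked for all other bots.
print(rp.can_fetch("Mediapartners-Google",
                   "https://niadzgn.blogspot.com/search/label/SEO"))  # True
print(rp.can_fetch("*",
                   "https://niadzgn.blogspot.com/search/label/SEO"))  # False
```

This shows why the Mediapartners-Google group does not interfere with the rules that apply to ordinary search crawlers: each bot only follows the group that matches its user agent.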


User-agent: *

This applies to all robots, indicated by the asterisk (*). By default, all crawler robots are allowed to index our website. The rules for what may and may not be crawled are covered below.


Disallow: /search

This means that links containing the keyword search immediately after the domain name will be ignored. See the example below, which is the link to a label page named SEO.

http://www.niadzgn.com/search/label/SEO


If we remove Disallow: /search from the code above, crawlers will access our entire blog and index and crawl all content and web pages. Be careful when setting up this custom Blogger robots.txt, because mistakes can lead to pages being reported as indexed even though they are blocked by robots.txt.
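To see the effect of the Disallow: /search rule, we can test a couple of URLs against it with Python's built-in `urllib.robotparser`. This is a quick local sketch; the domain and label page are just the examples from above.

```python
from urllib.robotparser import RobotFileParser

# The "*" group from the default Blogger robots.txt.
rules = """\
User-agent: *
Disallow: /search
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Label/search pages are blocked, but the homepage stays crawlable.
print(rp.can_fetch("Googlebot", "http://www.niadzgn.com/search/label/SEO"))  # False
print(rp.can_fetch("Googlebot", "http://www.niadzgn.com/"))                  # True
```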


Allow: /


This refers to the homepage and means web crawlers can crawl and index our blog's homepage, so everything under your main page will be accessible.


Disallow Specific Post

Now suppose we want to exclude a certain post from indexing; then we can add the line below to the code.

Disallow: /yyyy/mm/post-url.html


Here yyyy and mm refer to the year and month of publication, respectively. For example, if we published a post in March 2016, we would use the format below.

Disallow: /2016/03/post-url.html

Or

Disallow: /2022/12/how-to-custom-robotstxt-blogger-custom.html


Disallow Specific Pages

If we need to block a specific page, we can use the same method as above: just copy the page URL and remove the blog address from it, which will look something like this:

Disallow: /p/page-url.html

Or, if there is more than one page that you don't want crawled, it could look like this:

Disallow: /p/ekstensi.html
Disallow: /p/egegffegd.html
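As a sanity check, the post and page rules above can be tested the same way with Python's `urllib.robotparser`. This is a sketch using the example paths from this section; the other post URL is a made-up path just to show that unlisted posts remain crawlable.

```python
from urllib.robotparser import RobotFileParser

# Rules combining the specific-post and specific-page examples above.
rules = """\
User-agent: *
Disallow: /2016/03/post-url.html
Disallow: /p/ekstensi.html
Disallow: /p/egegffegd.html
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The listed page is blocked; a post that is not listed is still allowed.
print(rp.can_fetch("*", "https://niadzgn.blogspot.com/p/ekstensi.html"))     # False
print(rp.can_fetch("*", "https://niadzgn.blogspot.com/2020/01/other.html"))  # True
```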


Sitemap: https://niadzgn.blogspot.com/sitemap.xml

This code refers to our blog's sitemap. By adding the sitemap link here, we help optimize how our blog is crawled.
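The sitemap directive can also be read back with `urllib.robotparser` via its site_maps() method (available in Python 3.8+), which is a handy way to confirm the line was parsed. A sketch using the example sitemap URL from this tutorial:

```python
from urllib.robotparser import RobotFileParser

# The default Blogger rules including the Sitemap line.
rules = """\
User-agent: *
Disallow: /search
Allow: /

Sitemap: https://niadzgn.blogspot.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# site_maps() returns the list of sitemap URLs found in the file.
print(rp.site_maps())  # ['https://niadzgn.blogspot.com/sitemap.xml']
```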



Add a Custom Robots.txt in Blogger [Default]

Now for the main part of this tutorial: how to add a custom robots.txt in Blogger. Below are the steps to add it.


Please log in to your Blogger account. Go to Settings ➤ Search Preferences ➤ Crawlers and indexing ➤ Custom robots.txt ➤ Edit

Next, in the Edit column, select Yes



A Safe and SEO-Friendly Custom Robots.txt

For installation, please enter the following code.

User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /

Sitemap: https://niadzgn.blogspot.com/sitemap.xml



Please replace the blog address above with your own, then click "Save changes".


How to Check Robots.txt File?

You can check this file on your blog by adding /robots.txt to the end of your blog URL in a web browser. For example:

https://www.niadzgn.com/robots.txt


After you visit the URL of the robots.txt file created earlier, you will see all the code you used in your custom Blogger robots.txt file. As an example, here is the file from the blog you are currently reading. Check out the image below.



Conclusion About Custom Robots.TXT

That completes today's tutorial on how to add a custom robots.txt file in Blogger. I have tried my best to make this tutorial as simple and informative as possible, but if you still have any doubts or questions, feel free to ask in the comments section below.


That concludes this tutorial on custom robots.txt settings for Blogger [Blogspot]. If you found this post useful, please share it so it can help others. Thank you.

