How to Fix Robots Txt File In Blogger and WordPress


Hey, hope you are doing well in this quarantine. If you have a website hosted on Blogger or WordPress, you may have received a robots.txt error notification for it by email.

If you got this kind of mail, you need to fix it first, because it can cause critical SEO issues that negatively impact your ranking and traffic.

 

As you can see in the screenshot, the Coverage section of Google Search Console lists errors such as robots.txt issues, crawler issues, 404 errors, and 5xx errors. Today we will discuss the robots.txt error.
 
In this post, you will learn what a robots.txt file is, why it is important, why the robots.txt error shows up, why you need the file, and how to fix it in both Blogger and WordPress.
 
If you want to fix this, read this article to the end, and I will show you how to fix the robots.txt file in Blogger and WordPress step by step with screenshots. Let's start.

What is Robots.txt

Robots.txt is a plain text file made of words and symbols. It resides in the root directory of your website and gives search engine crawlers instructions about which pages they may crawl and index.
 
When a crawler visits your website, the first thing it does is check this file. It then reads your site's labels and URLs to decide which pages to index.
 

Importance of the robots.txt file

The robots.txt file is important because it guides crawling and indexing of your website, and so helps the SEO and ranking of your articles.
 
Robots.txt helps crawlers find the public pages of your website, so they can be crawled and added to the search engine's index.
 
If you have a big website, crawling and indexing can be a very resource-intensive process. Crawlers from various search engines will try to crawl and index the whole website, which can create serious problems.
 

Why the robots.txt error shows in Google Search Console

The robots.txt error shows up in the Google Search Console Coverage report when we do not keep the robots.txt file updated on hosted sites such as Blogger and WordPress.
 
In the case of Blogger, we add labels while publishing a post. If you add the same labels to a single post, you can get this type of problem.
 
In the case of WordPress, we use tags and categories; if you use the same tags repeatedly, you can get this type of problem.
 
When you use the same labels, tags, and categories on a single article, the crawler finds several labels pointing to the same article while indexing, and you get a duplicate content error. It is very important to fix this because it will impact your SEO.
 
 
Two important things you should know about robots.txt
  1. The rules you add to robots.txt are only directives. Well-behaved search engine crawlers obey and follow them, but the file cannot enforce anything by itself.
  2. If you block a page in robots.txt, that page will eventually be removed from, or never appear in, search results.

How robots.txt works

A robots.txt file has a simple keyword structure. You can use these predefined keyword-and-value combinations:

Allow, Disallow, Crawl-delay, Sitemap, and User-agent
 
Allow:- Grants access to a specific file or folder on your website, even when its parent directory is disallowed.

For example, if you disallow access to a photos directory, you can still use Allow to open up a blog subfolder located inside it.
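As a sketch of that rule, assuming a hypothetical /photos/ directory with a /photos/blog/ subfolder you still want crawled:

```text
User-agent: *
Disallow: /photos/
Allow: /photos/blog/
```

Here everything under /photos/ is blocked except the /photos/blog/ subfolder, which the Allow line re-opens.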
User-agent: *
Disallow: /search
Disallow: /category/
Disallow: /tag/
Allow: /
User-agent:- Tells a specific crawler which directives apply to it. Use * to cover all crawlers, or name a specific crawler, as shown below.
 
User-agent: * — applies to all crawlers.
User-agent: Googlebot — applies to Googlebot only.
 
Disallow:- A directive that instructs the user agent not to crawl a given URL or directory.
 
Sitemap:- A directive understood by the major search engines, including Google. It is very helpful for specifying the location of your XML sitemap file.

Search engines will be able to find your XML sitemap because of it. It is usually added at the bottom of the robots.txt file:
 
Sitemap: https://example.com/sitemap.xml
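To check how these directives behave, you can parse a rules block with Python's standard-library urllib.robotparser. This is a minimal sketch using the example rules shown above; the domain and post URL are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Example rules from above (assumed; replace with your live robots.txt)
rules = """\
User-agent: *
Disallow: /search
Disallow: /category/
Disallow: /tag/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# /search is disallowed, while an ordinary post URL is allowed
print(parser.can_fetch("*", "https://example.com/search"))   # False
print(parser.can_fetch("*", "https://example.com/my-post"))  # True
```

This is handy for verifying that a new robots.txt does not accidentally block pages you want indexed, before you upload it.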
 

How to Fix Robots.txt

Fix Robots Txt File In Blogger

First, log in to the Blogger account whose site has the robots.txt error and click the Settings of your blog.
Follow along with the screenshot step by step.
Settings > Crawlers and indexing
At the bottom of Crawlers and indexing you will find Custom robots.txt. First, turn on 1 Enable custom robots.txt. Then go to Google, generate a custom robots.txt file for your blog, and include your Blogger sitemap in it.
Copy the generated code into the field marked by arrow 2 and save the section.
Now enable Custom robots header tags and activate the buttons as shown in the screenshots below.
In Home page tags:
all — enabled
noodp — enabled
 
In Archive and search page tags:
noindex — enabled
noodp — enabled
In Post and page tags:
all — enabled
noodp — enabled

 

Now click Save and close Blogger.
Next, open Google Search Console in your browser and click Coverage.
In the screenshot, the arrow marks the error "Indexed, though blocked by robots.txt"; your screen should show it too. Click on this error.

 

After clicking the error, your screen will look like the screenshot below. Click Start validation, and the error will be removed from Coverage after a few days. It is usually fixed within 15 days; once it is fixed, you will receive an email from Google Search Console.

Fix Robots Txt File In WordPress

In WordPress, you need to install one plugin: Yoast SEO.
After installing it, open Yoast SEO; your screen will look like this:
Now you can see three options on the screen: Category, Tag, and Format. Change them as shown in the pictures:
Category   ➡️ No
Tag           ➡️ No
Format     ➡️ Disable
After setting these options, save the settings; the errors will be fixed after a few days.

Note:- Make sure you save all of these settings. If you do not, it can impact your SEO, so follow this guide to the robots.txt file in Blogger and WordPress carefully.

Thanks for reading. If anything is unclear, please drop a comment in the comment section about the robots.txt file in Blogger and WordPress, and we will help you within 8 hours.
 
 