Today, we’re going to learn how to optimize robots.txt for WordPress and why it is so important for your website’s search engine rankings. This article also covers editing the robots.txt file and submitting it to Google.
What Is a WordPress Robots.txt File?
A robots.txt file is a plain text file on your website’s hosting server that lets you restrict search engines from accessing specific files or folders. You can stop Google’s or other search engines’ bots from crawling specific pages on your website. Have a look at the example below to see what a robots.txt file looks like:
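Since the original screenshot is not reproduced here, the sketch below shows what a typical WordPress robots.txt might contain. These particular rules mirror the defaults WordPress generates, but your own file may differ:

User-agent: *                      # the asterisk addresses all bots
Disallow: /wp-admin/               # keep bots out of the admin area
Allow: /wp-admin/admin-ajax.php    # except the AJAX endpoint, which front-end features rely on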
You may be wondering how blocking unwanted pages on your website would help your search engine rankings or SEO. The more pages your website has, the more Google has to crawl. There is no point in letting Google’s bots crawl unwanted pages like admin pages or back-end folders: we don’t want them indexed in Google, so there is no need to let bots crawl such parts of the website.
For example, pages like tags, categories, or archives, which are usually not that important, can be blocked via the robots.txt file, as these low-quality pages consume your website’s crawl budget (the number of pages Googlebot crawls and indexes on a website within a given timeframe).
Crawl Budget: “Prioritizing what to crawl, when, and how much resource the server hosting the site can allocate to crawling is more important for bigger sites, or those that auto-generate pages based on URL parameters,” said Google’s Gary Illyes.
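As an illustration, here is a hedged sketch of rules that block such archive pages. The paths assume the default WordPress URL bases (“tag” and “category”); if you have changed these in your permalink settings, adjust the rules accordingly:

User-agent: *           # applies to every bot
Disallow: /tag/         # tag archive pages
Disallow: /category/    # category archive pages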
You need to be careful to edit this file correctly, as a mistake here can hurt your website’s SEO.
How to Optimize the Robots.txt File?
You can find the robots.txt file in your website’s root folder; it is also visible in a browser at yoursite.com/robots.txt. You need to connect to your hosting server via FTP or cPanel to access it. It is an ordinary text file that you can open with Notepad or any other text editor. If you can’t find a file named robots.txt in your root folder, you can simply create a new one by saving a text file named robots.txt.
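If you do end up creating the file from scratch, a minimal sketch that blocks nothing is a safe starting point until you add your own rules:

User-agent: *    # applies to all bots
Disallow:        # an empty value means nothing is blocked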
How to Create the Perfect Robots.txt File?
The robots.txt format is very simple. The first line names a user agent, i.e. the search engine bot the rules are addressed to (e.g. Googlebot or Bingbot); an asterisk (*) addresses all bots at once. The lines that follow then “Allow” or “Disallow” access to parts of your website.
Have a look at this example (an illustrative sketch; adapt the paths to your own site):
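User-agent: *                    # first line: which bot the rules apply to
Allow: /wp-content/uploads/      # allow crawling of uploaded media
Disallow: /wp-content/plugins/   # disallow crawling of plugin files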
There are certain pages that do not need to be added to the robots file, such as the login page, registration page, or admin directory, because WordPress already adds a noindex tag to them. However, I would always recommend disallowing readme.html, as anyone can use this file to find out your WordPress version. Blocking it can also help protect you from automated malicious attacks.
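A minimal sketch of that recommendation, assuming readme.html sits at the WordPress root as it does by default:

User-agent: *
Disallow: /readme.html    # stops compliant bots from fetching the version file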
Submitting the Robots.txt File to Google
Once you’ve edited or created your robots file, you can submit it to Google using Google Search Console.
Test your file using Google’s robots.txt testing tool before you submit it.
Now you know how to optimize your robots.txt file for better SEO. Contact us if you face any problems creating this file. Cheers!