How to Optimize Your WordPress Robots.txt for SEO

One of our readers recently asked for advice on how to improve SEO by optimizing the robots.txt file.

The Robots.txt file instructs search engines on how to crawl your website, making it an extremely effective SEO tool.

In this article, we’ll show you how to create a perfect robots.txt file for SEO.

What is the purpose of the robots.txt file?

Robots.txt is a text file that allows website owners to tell search engine bots how to crawl and index their pages.

It’s usually kept in your website’s root directory, also known as the main folder. The following is the basic format for a robots.txt file:

User-agent: [user-agent name]
Disallow: [URL string not to be crawled]
User-agent: [user-agent name]
Allow: [URL string to be crawled]
Sitemap: [URL of your XML Sitemap]

You can add multiple sitemaps and have multiple lines of instructions to allow or disallow specific URLs. If you don’t specify whether or not a URL is allowed to be crawled, search engine bots will assume that it is.
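For instance, a file with separate rule groups, one for a specific crawler and one for everyone else, plus two sitemaps might look like the sketch below (Bingbot and the paths here are just placeholders):

# Rules for one specific crawler (Bingbot is only an illustration)
User-agent: Bingbot
Disallow: /private/

# Rules for every other crawler
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap1.xml
Sitemap: https://example.com/sitemap2.xml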

Next, here’s a more concrete example of a robots.txt file for a WordPress site:

User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Sitemap: https://example.com/sitemap_index.xml

In the robots.txt example above, we’ve allowed search engines to crawl and index files in our WordPress uploads folder.

We’ve then disallowed search bots from crawling and indexing the plugins folder and the WordPress admin folder.

Finally, we’ve included the location of our XML sitemap.

Is a Robots.txt File Necessary for Your WordPress Site?

Search engines will still crawl and index your website even if you don’t have a robots.txt file. However, you won’t be able to tell them which pages or folders they shouldn’t crawl.

When you first start a blog and don’t have much content, this won’t have much of an impact.

However, as your website grows and you add more content, you’ll probably want more control over how your site is crawled and indexed.

Here’s why.

Search bots have a crawl quota (often called a crawl budget) for each website.

This means that during a crawl session, they will crawl a certain number of pages. If they don’t finish crawling all of the pages on your site in one session, they will return and resume crawling in the next.

This can cause a delay in the indexing of your website.

By disallowing unnecessary pages like your WordPress admin pages, plugin files, and themes folder, you save that crawl quota for the pages that matter.

This helps search engines crawl and index more pages on your site as quickly as possible.

You can also use the robots.txt file when you want to stop search engines from crawling a post or page on your website.

It’s not the safest way to hide content from the general public, though: a blocked URL can still show up in search results if other sites link to it, so use a noindex tag when a page must stay out of search entirely.
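For example, if you wanted to ask bots not to crawl a hypothetical thank-you page, the rule would look something like this (the path is a placeholder):

User-agent: *
Disallow: /thank-you/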

What Does a Perfect Robots.txt File Look Like?

Many popular blogs use a very simple robots.txt file. Its contents may vary depending on the needs of the specific site:

User-agent: *
Disallow:
Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml

This robots.txt file provides a link to the website’s XML sitemaps and allows all bots to index all content.

The following rules in the robots.txt file are recommended for WordPress sites:

User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/
Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml

This tells search bots to crawl and index all WordPress images and files in the uploads folder. It also blocks them from crawling the WordPress admin area, the readme file, and cloaked affiliate links.

By including sitemaps in your robots.txt file, you make it easy for search bots to find all of your pages.

Let’s look at how to make a robots.txt file in WordPress now that you know what an ideal robots.txt file looks like.

How Do You Create a Robots.txt File in WordPress?

In WordPress, there are two ways to make a robots.txt file. You can select the method that best suits your needs.

Method 1: Using All in One SEO to edit the Robots.txt file

All in One SEO, also known as AIOSEO, is the most popular WordPress SEO plugin, with over 2 million websites using it.

It’s simple to use and includes a robots.txt generator.

You can see our step-by-step guide on how to install a WordPress plugin if you don’t already have the AIOSEO plugin installed.

This feature is also available in the free version of AIOSEO.

You can use the plugin to create and edit your robots.txt file directly from your WordPress admin area once it’s installed and activated.

To edit your robots.txt file, go to All in One SEO » Tools.

All in One SEO includes a file editor for the robots.txt file.

To begin, switch the ‘Enable Custom Robots.txt’ toggle to blue to turn on the editing option.

With this option enabled, you can use WordPress to create a custom robots.txt file.

In the ‘Robots.txt Preview’ section at the bottom of your screen, All in One SEO will display your existing robots.txt file.

This preview will show the default rules that WordPress generates.

These default rules tell search engines not to crawl your core WordPress files, while still allowing bots to index all other content and giving them a link to your site’s XML sitemaps.
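For reference, the virtual robots.txt file that WordPress generates on its own typically looks something like this (on WordPress 5.5 and later, it also includes a Sitemap line pointing at the built-in wp-sitemap.xml):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/wp-sitemap.xml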

Now you can improve your robots.txt for SEO by adding your own custom rules.

In the ‘User Agent’ field, type a user agent to add a rule. When you use a *, the rule will be applied to all user agents.

Then, choose whether you want to ‘Allow’ or ‘Disallow’ search engines to crawl it.

Next, enter the filename or directory path in the ‘Directory Path’ field.

The rule will be applied to your robots.txt file automatically. Click the ‘Add Rule’ button to add another rule.

We recommend adding rules until you achieve the robots.txt format we discussed earlier.

This is how your custom rules will appear.

Don’t forget to click the ‘Save Changes’ button when you’re finished to save your changes.

Method 2: Using FTP to manually edit the Robots.txt file

To edit the robots.txt file with this method, you’ll need an FTP client.

Using an FTP client, connect to your WordPress hosting account.

Once inside, look for the robots.txt file in the root folder of your website.

If you don’t see one, your site probably doesn’t have a robots.txt file.

You can just go ahead and make one in that case.

Because robots.txt is a plain text file, you can download it to your computer and edit it with any plain text editor, such as Notepad or TextEdit.
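If your site doesn’t have one yet and you’re creating the file from scratch, the sketch below, which follows the format we recommended earlier, is a reasonable starting point; replace the placeholder sitemap URL with your own:

User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-admin/
Disallow: /readme.html

Sitemap: https://example.com/sitemap_index.xml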

You can then upload it back to the root folder of your website after saving your changes.

What Is the Best Way to Test Your Robots.txt File?

Once you’ve finished your robots.txt file, it’s always a good idea to test it with a robots.txt tester tool.

There are many robots.txt tester tools available, but we recommend using the one found within Google Search Console.

You’ll need to link your website to Google Search Console first. See our guide on how to add your WordPress site to Google Search Console if you haven’t already.

Then you can use Google Search Console’s robots.txt testing tool.

Simply choose your property from the dropdown menu.

The tool will automatically fetch your website’s robots.txt file and highlight any errors or warnings it finds.

Final Thoughts

The goal of optimizing your robots.txt file is to keep search engines from crawling pages that aren’t public, such as pages in your wp-plugins folder or your WordPress admin folder.

Some SEO experts claim that blocking WordPress category, tag, and archive pages will improve crawl rate, resulting in faster indexing and higher rankings.

This isn’t correct, and it also violates Google’s webmaster guidelines.

To create a robots.txt file for your website, we recommend using the above robots.txt format.

We hope you found this article helpful in learning how to optimize the robots.txt file in WordPress for SEO.
