WordPress and Robots.txt File: Examples and Best Practices [2020 Edition]

Written by Richard Cummings | Date Updated: October 21, 2020

Question: I use WordPress and I am wondering if I should create a robots.txt file. I have read in various places that I should and other places that WordPress creates its own robots.txt file. What’s the real story? –Tim, Madison, Wisconsin


WordPress and The Robots.txt File: What’s Best For You

Tim, yet another great question. What should you do about the robots.txt file when you use WordPress?

There are two answers to this question. The first is the short, quick answer and the second is long and involved…you will hear experts discuss the WordPress robots.txt file ad nauseam.

So, let’s get to the quick answer first and then we’ll look at the “long answer” and inundate you with links where experts discuss this issue until they’re blue in the face.

WordPress and The Robots.Txt File: The Default Virtual Robots.Txt File

Tim, the quick answer is this:

You do not need to create a robots.txt file because WordPress automatically creates a virtual robots.txt for you.

To view this file, you can visit yoursite.com/robots.txt (substituting your actual domain for yoursite).

The file should look something like this:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/


The first line of this file, the “User-agent” line, is a bot declaration. The * indicates that the rules apply to all search bots (like Google, Yahoo, etc.). By default, everything will be crawled and indexed except the paths disallowed in the lines below.

The second and third lines of this file tell the user agents (in this case, all of them) not to crawl these specific WordPress directories because they provide no added content.

Finally, a “Sitemap” line informs the bots about the location of your sitemap file. This line is generally considered beneficial and is worth including in your robots.txt file. If you use the Google XML Sitemaps plugin (which you should), this line will be included for you, along with the line break after the last Disallow line.
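For illustration, here is what the default file looks like with a Sitemap line added (the sitemap URL shown is a placeholder; yours will depend on your domain and plugin settings):

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

Sitemap: http://yoursite.com/sitemap.xml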

What If You Don’t See the Robots.Txt File?

Tim, some people have this problem so I thought I’d provide an answer to it as well.

If you do not see the virtual robots.txt file that WordPress should have created, it may be that you are using an outdated version of WordPress OR the virtual robots.txt file may have been preempted by a plugin.

In this case, you can easily create your own robots.txt file. Using the WordPress sample above, simply copy and paste the information into a text file, name it robots.txt, and then upload it to your root directory. Obviously, if your file includes a Sitemap line, you want to change yoursite to your actual website URL.

Create Your Own Robots.Txt: Examples

There may be instances where you want other directories (directories that perhaps exist outside of your WordPress environment), to also be disallowed because you do not want them appearing in the search results.

This is also easy to do by creating your own robots.txt. If you have a subdirectory of your website that you do not want the bots to include, you would simply add a line like this:

Disallow: /thisdirectory/

Be sure to add the trailing “/”! If you leave it out, the bots will skip anything whose path merely begins with “/thisdirectory”, so, for example, /thisdirectory-old/ and /thisdirectory.html would be blocked as well.
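If you want to sanity-check this behavior yourself, Python’s standard-library urllib.robotparser evaluates robots.txt rules with the same prefix matching. A quick sketch (the directory names are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Two rule sets that differ only in the trailing slash.
RULES_NO_SLASH = """
User-agent: *
Disallow: /thisdirectory
""".splitlines()

RULES_WITH_SLASH = """
User-agent: *
Disallow: /thisdirectory/
""".splitlines()

def allowed(rules, path):
    """Return True if the given path may be fetched under these rules."""
    rp = RobotFileParser()
    rp.parse(rules)
    rp.modified()  # mark the rules as loaded so can_fetch() evaluates them
    return rp.can_fetch("*", path)

# Without the trailing slash, any path beginning with /thisdirectory is blocked:
print(allowed(RULES_NO_SLASH, "/thisdirectory-old/page.html"))    # False
# With the trailing slash, only paths inside /thisdirectory/ are blocked:
print(allowed(RULES_WITH_SLASH, "/thisdirectory-old/page.html"))  # True
print(allowed(RULES_WITH_SLASH, "/thisdirectory/page.html"))      # False
```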

Will Your Robots.txt Overwrite the WordPress Virtual Robots.Txt?

Yes. If you upload your own robots.txt file, you will see that it is now the active file by visiting yoursite.com/robots.txt.

An Example Robots.Txt to Upload

Tim, if you do not need to disallow any other files, the WordPress virtual robots.txt file will be fine.

However, if you do not see a virtual robots.txt or need to create one manually to exclude other subdirectories on your website, use these lines below as a template:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /subdirdontindex1/
Disallow: /subdirdontindex2/
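Before uploading a hand-made robots.txt, it is worth verifying that it blocks only what you intend, and in particular that it does not disallow your whole site. Here is a minimal sketch using Python’s standard-library urllib.robotparser against the template above (the subdirectory names are the placeholders from the template; the sample paths are hypothetical):

```python
from urllib.robotparser import RobotFileParser

TEMPLATE = """
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /subdirdontindex1/
Disallow: /subdirdontindex2/
""".splitlines()

rp = RobotFileParser()
rp.parse(TEMPLATE)
rp.modified()  # mark the rules as loaded so can_fetch() evaluates them

# The disallowed directories should be blocked...
print(rp.can_fetch("*", "/wp-admin/admin.php"))     # False
print(rp.can_fetch("*", "/subdirdontindex1/page"))  # False
# ...while the homepage and regular posts stay crawlable.
print(rp.can_fetch("*", "/"))                       # True
print(rp.can_fetch("*", "/2020/10/a-blog-post/"))   # True
```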


Appendix: WordPress Robots.Txt File–The Exhaustive Discussion

For most of you, I believe that the strategies mentioned above are sufficient. This is because the “final answer” has not been declared with 100% certainty and, barring a major mistake like inadvertently disallowing your whole site (it’s been done!), your robots.txt file should be fine.

However, an exhaustive discussion about WordPress and robots.txt is ongoing.

Here are several links that discuss the matter with a quick summary of each link:

Search Engine Optimization for WordPress: This is the site’s discussion of SEO for WordPress, and here they present a recommended robots.txt file. You certainly can use their recommended robots.txt file, but it has been criticized in several articles for disallowing too much.

WordPress Robots.txt Guide – What It Is and How to Use It: Take a deep-dive into the robots.txt file with this detailed post by

How to Optimize Your WordPress Robots.txt for SEO: This site is all about WordPress, and in this article they devote their attention to the robots.txt file with many solid examples of what you should and should not do.

WordPress robots.txt Example: One author, for whom I have great respect because he is the creator of the best WordPress SEO plugin, says that the WordPress recommendations are too restrictive. He says that the only line that should be in your robots.txt is “User-agent: *”.

WordPress Needs a Default robots.txt File and More…: Did you know WordPress has an ideas section? Well, they do, and one of the ideas is to provide a default robots.txt file. I like this idea (though it really exists with the virtual robots.txt), and it would allow us all to be content with just one answer.

Robots.txt Creation Tools: If you are uncomfortable creating your own robots.txt file, this link has tools that will assist you.

Richard Cummings

Director of SEO, Social Media, and Web Content Development at The SEO System
Richard Cummings has been practicing online marketing for many years and has set up and optimized hundreds of WordPress sites. He founded The SEO System to provide SEO, social media, and online marketing services and software to businesses.