Robots.txt Generator

Select your crawling preferences to generate a fully optimized robots.txt file.





Why You Should Use a Robots.txt File Generator


The more you know about how search engines work, the more you can tweak your website to your advantage and improve your SEO. One aspect of SEO that not many people know about is the robots.txt file. The name might sound confusing or technical, but you don’t need to be an SEO expert to understand and use a robots.txt file.

Quickly Create or Edit a Robots.txt File For Any Website


What Is a Robots.txt File?

Also called the robots exclusion protocol or standard, a robots.txt file is a text file on your website that tells Google and other search engines whether they may:

  • Access the entire website
  • Access only certain pages of the website
  • Index the website

Search engines check the instructions within the robots.txt file before they start crawling a website and its content. A robots.txt file is useful if you don’t want certain parts of your website to be searchable, like Thank You pages or pages with confidential or legal information.

To check whether your website already has a robots.txt file, go to the address bar in your browser and add /robots.txt to your domain name. The URL should be: http://www.yourdomainname.com/robots.txt

You can also log into your hosting account, go to the file management interface and check the root directory.

If your website already has a robots.txt file, there are some additions you can make to further help optimize your SEO. If you can’t find a robots.txt file, you can create one – it’s very easy.
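
If you would rather check from a script than a browser, here is a minimal Python sketch using only the standard library; the domain is a placeholder to replace with your own:

from urllib.request import urlopen
from urllib.error import HTTPError

url = "https://www.yourdomainname.com/robots.txt"  # placeholder domain

try:
    with urlopen(url) as resp:
        # The file exists; print its current rules.
        print(resp.read().decode("utf-8"))
except HTTPError as err:
    if err.code == 404:
        print("No robots.txt found; you can create one.")
    else:
        raise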

How to Create a Robots.txt File

If you’re a Windows user, you can use Notepad to create the file. For Mac users, the TextEdit program works just fine. You just need a blank .txt file. Don’t use programs like MS Word for this task, as they may cause encoding issues. Name the file “robots.txt” and save it.

The robots.txt file will now be empty; you have to add the instructions you want, which is what we’re about to see. When you are done with the instructions, upload the robots.txt file to the root of your website using an FTP client like FileZilla or your hosting provider’s file manager. Note that if you have subdomains, you should create a robots.txt file for each subdomain.
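
If your host supports FTP, the upload can also be scripted. Below is a minimal Python sketch using the standard ftplib module; the host, credentials and web-root path are placeholders for illustration and will differ for your hosting provider:

from ftplib import FTP

# Placeholder connection details; use your hosting provider's values.
with FTP("ftp.yourdomainname.com") as ftp:
    ftp.login(user="username", passwd="password")
    ftp.cwd("/public_html")  # the web root; the exact path varies by host
    with open("robots.txt", "rb") as f:
        ftp.storbinary("STOR robots.txt", f)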

Now let’s see what kind of instructions you can give to robots through your robots.txt file.

If you want all robots to access everything on your website, then your robots.txt file should look like this:

User-agent: *
Disallow:

Basically, the robots.txt file here disallows nothing, or in other words, is allowing everything to be crawled. The asterisk next to “User-agent” means that the instruction below applies to all types of robots.

On the other hand, if you don’t want robots to access anything, simply add the forward slash symbol like this:

User-agent: *
Disallow: /

Note that one extra character can render the instruction ineffective, so be careful when editing your robots.txt file.
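
One way to catch such mistakes is to test your rules with Python’s built-in robots.txt parser before uploading. A minimal sketch, using the two files above and a placeholder URL:

from urllib.robotparser import RobotFileParser

# Allow everything: an empty Disallow value matches no paths.
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow:"])
print(rp.can_fetch("*", "https://www.yourdomainname.com/any-page"))  # True

# Block everything: the lone "/" matches every path.
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])
print(rp.can_fetch("*", "https://www.yourdomainname.com/any-page"))  # False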

In case you want to block access for a specific Google crawler, like the one that crawls images, you can write this:

User-agent: Googlebot-Image
Disallow: /

Or if you want to block access to a certain type of file, like PDFs, write this:

User-agent: *
Allow: /
# Disallowed File Types
Disallow: /*.pdf$

Keep in mind that path matching is case-sensitive, so this rule blocks /report.pdf but not /report.PDF. Also note that the * and $ wildcards are extensions honored by major crawlers like Google and Bing; they are not part of the original robots exclusion standard.

If you want to block access to a directory within your website, for example, the admin directory, write this:

User-agent: *
Disallow: /admin

If you want to block a specific page, simply type its path (the part of the URL after the domain name):

User-agent: *
Disallow: /page-url
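
Both directory and page rules work by prefix matching: Disallow: /admin also blocks /admin/settings and even /administrator. You can verify this behavior with Python’s built-in parser; the rules and URLs below are hypothetical:

from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt blocking the admin directory and one page.
rules = """\
User-agent: *
Disallow: /admin
Disallow: /page-url
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://www.yourdomainname.com/admin/settings"))  # False
print(rp.can_fetch("*", "https://www.yourdomainname.com/administrator"))   # False
print(rp.can_fetch("*", "https://www.yourdomainname.com/blog"))            # True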

And if you don’t want Google to index a page, note that the old Noindex: directive in robots.txt is no longer supported; Google stopped honoring it in September 2019. Instead, add a robots meta tag to the page itself:

<meta name="robots" content="noindex">

or send a noindex X-Robots-Tag HTTP header. For either to work, the page must not be blocked in robots.txt, or crawlers will never see the instruction.

If you’re not sure what indexing means, it’s simply the process that makes a page part of web searches.

Lastly, for big websites that are frequently updated with new content, it’s possible to set up a delay timer to prevent servers from being overloaded with crawlers coming to check for new content. In a case like this, you could add the following directive:

User-agent: *
Crawl-delay: 120

Thus all robots (except Googlebot, which ignores this directive) will wait 120 seconds between successive requests, preventing many robots from hitting your server too quickly.
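
Python’s parser can read this value back as well (available since Python 3.6), which makes a quick sanity check easy:

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse(["User-agent: *", "Crawl-delay: 120"])
print(rp.crawl_delay("*"))  # 120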

There are other kinds of directives you can add, but these are the most important to know.

The Importance of Robots.txt for SEO

The reason it’s recommended to use a robots.txt file is that without one, your website can be subject to too many third-party crawlers trying to access its content, which can lead to slower loading times and sometimes even server errors. Loading speed affects the experience of website visitors, many of whom will leave your site if it doesn’t load quickly.

As shown above, a robots.txt file gives you several options:

  • You want to point search engines to your most important pages
  • You want search engines to ignore duplicate pages, like pages formatted for printing
  • You don’t want certain content on your website to be searchable (documents, images, etc.)

That’s why it’s important to know exactly what you put within your robots.txt file, so that it enhances your SEO rather than compromising it. A robots.txt file with the wrong directives can cause huge issues and possibly prevent pages from showing up in the search results.

Another benefit of using a robots.txt file is that you can easily direct crawlers to your website’s sitemap by adding this directive:

Sitemap: http://www.yourdomainname.com/sitemap.xml
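
Python’s parser exposes this directive too (available since Python 3.8); a quick sketch with the placeholder domain from above:

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse(["Sitemap: http://www.yourdomainname.com/sitemap.xml"])
print(rp.site_maps())  # ['http://www.yourdomainname.com/sitemap.xml']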
                        



The Benefits of Using a Robots.txt Generator


Our robots.txt file generator is an online tool that allows you to quickly create robots.txt files for your website. You can either open and edit an existing file or create a new one using the output of our generator. Depending on your preferences, you have the option to easily pick which types of crawlers (Google, Yahoo, Baidu, Alexa, etc.) to allow or disallow. Likewise, you can add other directives like crawl delay with only a few clicks, instead of typing everything from scratch.

If you are looking to use a robots.txt file on your website, check out our online robots.txt file generator. You can easily set up any directive you want and generate a text file that you can use right away to improve your SEO.
