Robots Text Generator Tool

Quickly generate a robots.txt for your site.



Compare functionality

To see a side-by-side comparison of how your site currently handles search bots versus how the proposed new robots.txt will handle them, type or paste your domain URL (or the URL of any page on your site) into the text box, and then click Compare.


When search engines crawl a site, they first look for a robots.txt file at the domain root. If one is found, they read its list of directives to see which directories and files, if any, are blocked from crawling. You can create this file with a robots.txt generator; Google and other search engines then use it to determine which pages on your site should be excluded from crawling. In other words, the file created by a robots.txt generator is roughly the opposite of a sitemap, which indicates which pages to include.
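For example, a minimal robots.txt with a single directive might look like the following (the /private/ directory is just a placeholder):

    User-agent: *
    Disallow: /private/

This tells every crawler not to crawl anything under /private/, while leaving the rest of the site open.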

The robots.txt generator

With the robots.txt generator you can easily create a new robots.txt file for your site or edit an existing one. To upload an existing file and pre-populate the tool, type or paste your root domain URL in the top text box and click Upload. Use the tool to build directives with either Allow or Disallow rules (Allow is the default; click to change it) for user agents (use * for all bots, or click to select just one) and for specified content on your site. Click Add directive to add the new directive to the list. To edit an existing directive, click Remove directive, and then create a new one.
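As a sketch of what the tool produces, a group of directives applying to all user agents might look like this (the paths below are placeholders):

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /tmp/
    Allow: /tmp/public-page.html

Disallow blocks a directory or file from crawling, and Allow carves out an exception within an otherwise blocked path.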

Create custom user agent directives

The robots.txt generator lets you write rules for Google and several other search engine crawlers individually. To specify alternative directives for one crawler, click the User Agent list box (showing * by default) and select the bot. When you click Add directive, a custom section is added to the list containing all of the generic directives plus the new custom directive. To change a generic Disallow directive into an Allow directive for the custom user agent, create a new Allow directive for that user agent and content; the matching Disallow directive is then removed for that custom user agent.
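For instance, assuming a generic Disallow rule like the one below, a custom Googlebot section that re-allows the same directory might look like this (the paths are placeholders):

    User-agent: *
    Disallow: /archive/

    User-agent: Googlebot
    Allow: /archive/

Because a crawler follows only the most specific user-agent group that matches it, Googlebot obeys the second group and may crawl /archive/, while all other bots continue to obey the first.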

To learn more about robots.txt directives, see The Ultimate Guide to Blocking Your Content in Search.

You can also add a link to your XML Sitemap file. Type or paste the full URL of the XML Sitemap in the XML Sitemap text box, then click Update to add the directive to the robots.txt file list.
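The resulting line is a standard Sitemap directive, for example (example.com is a placeholder domain):

    Sitemap: https://www.example.com/sitemap.xml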

When done, click Export to save your new robots.txt file, then use FTP to upload it to the domain root of your site. With this file in place, Google and any other search engines you specified will know which pages and directories of your site should not appear in search results.
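Once uploaded, the file should be reachable at the root of your domain, for example https://www.example.com/robots.txt (example.com being a placeholder). Crawlers look for robots.txt only at the domain root, so a copy placed in a subdirectory will be ignored.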



If you like this tool, please Plus it, Like it, Tweet it, or best yet, link to it - Jim