Multipart robots.txt editor


This plugin needs more documentation!

You can edit your robots.txt and add remote content to it.
This is useful, for example, when you run several sites and want to share a centralized robots.txt.


  • Include or exclude WordPress’ own robots.txt (core function)
  • Include or exclude plugins’ – e.g. sitemap plugins’ – output to robots.txt (filter output)
  • Include or exclude a remote text file (the common part)
  • Include or exclude custom records from the settings page (the site specific part)
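As an illustration, a robots.txt assembled from the four parts above might look like this (the URLs and rules below are made-up placeholders, not output of the plugin):

```
# Core WordPress part
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

# Plugin output (e.g. a sitemap plugin)
Sitemap: https://example.com/sitemap.xml

# Remote common part, shared across sites
Disallow: /internal/

# Site-specific custom records
Disallow: /staging/
```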

Where is robots.txt?

WordPress handles robots.txt as a virtual URL – just the same way as posts and pages.

So when you browse to it, WordPress generates robots.txt on the fly.
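As a minimal sketch of how that virtual generation can be extended (this is not the plugin’s actual code, only the core mechanism it builds on), records can be appended through WordPress’ own `robots_txt` filter:

```php
// Sketch only: append extra records to the virtual robots.txt.
// 'robots_txt' is the core WordPress filter applied in do_robots().
add_filter( 'robots_txt', function ( $output, $public ) {
    // $public reflects the "discourage search engines" setting.
    $output .= "\n# Site-specific records\n";
    $output .= "Disallow: /private/\n";
    return $output;
}, 10, 2 );
```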


  • add more description here
  • add a video too
  • add an admin notice for subdir installs (robots.txt is useless in a subdir)
  • ‘At least one “Disallow” field must be present in the robots.txt file.’ – check for that
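On the ‘Disallow’ point above: a robots.txt record needs at least one Disallow field to be valid, and an empty one allows everything:

```
User-agent: *
Disallow:
```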


Development of this plugin takes place on GitHub.


This section describes how to install the plugin and get it working.

  1. Upload the content of the ZIP file to the /wp-content/plugins/ directory
  2. Activate the plugin through the ‘Plugins’ menu in WordPress

Frequently Asked Questions

Where is robots.txt? Why isn’t it generated in the WordPress root directory?

WordPress handles robots.txt as a virtual URL – just the same way as posts and pages.

So when you browse to it, WordPress generates robots.txt on the fly.

How often will the remote text file be downloaded?

Every 24 hours, and when you press the Save Changes button on the settings page.
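The refresh behaviour described above could be sketched with WordPress transients; the function and cache-key names below are illustrative, not the plugin’s actual code:

```php
// Illustrative sketch of a 24-hour remote-file cache using
// standard WordPress APIs ('myprefix_' names are made up).
function myprefix_get_remote_part( $url ) {
    $cached = get_transient( 'myprefix_remote_part' );
    if ( false !== $cached ) {
        return $cached; // Served from cache within 24 hours.
    }
    $response = wp_remote_get( $url );
    if ( is_wp_error( $response ) ) {
        return ''; // Download failed; append nothing.
    }
    $body = wp_remote_retrieve_body( $response );
    set_transient( 'myprefix_remote_part', $body, DAY_IN_SECONDS );
    return $body;
}
// Pressing "Save Changes" would then call something like
// delete_transient( 'myprefix_remote_part' ); to force a re-download.
```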


Even after deleting, robots.txt is still showing

Hello, why is your robots.txt file still showing even after I deleted your plugin and files? I created my own robots.txt file and uploaded it, but your generated file is still showing even though I deleted everything. Why?

Doesn’t always work

Good idea, good interface. Too bad it doesn't work. My WP is in the root, "discourage search engines" is disabled, there is no file named robots.txt, permalinks are enabled in .htaccess, and the browser cache for robots.txt was reset. Perfect conditions, and it just doesn't work. UPD: it does work if I provide it with a different URL. Maybe my local dev machine has an imperfect HTTPS certificate or something (Google Chrome shows it green). The plugin's code is big and complicated, and yet it dies silently if something goes wrong. This is not a piece of code I'd want to use. Changing from 1 star to 3 stars. It still doesn't have the {inline_replace_with_hostname} functionality I've been looking for.

Contributors and developers

“Multipart robots.txt editor” is open-source software. The following people have contributed to this plugin.


Translate “Multipart robots.txt editor” into your language.

Interested in development?

Browse the code, check out the SVN repository, or subscribe to the development log by RSS.

Changelog


  • Fixed: after Shiny Updates (AJAX plugin actions) it was not possible to uninstall



  • Added explanation about robots.txt file – NO code change


  • Googlebot needs CSS and JS files
  • Introducing semver


  • Fixed some serious PHP Notices, sorry


  • Initial release