# Managing robots.txt

{! backend/administration/CLI_tasks/general_cli_task_info.include !}

## Generate a new robots.txt file and add it to the static directory

The `robots.txt` that ships by default is permissive. It allows well-behaved search engines to index all of your instance's URIs.

If you want to generate a restrictive `robots.txt`, you can run the following mix task. The generated `robots.txt` will be written in your instance static directory.

=== "OTP"

    ```sh
    ./bin/pleroma_ctl robots_txt disallow_all
    ```

=== "From Source"

    ```sh
    mix pleroma.robots_txt disallow_all
    ```
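For reference, a fully restrictive `robots.txt` of the kind this task produces blocks every user agent from every path. The exact contents written by the task may differ slightly, but a standard disallow-all file looks like:

```
User-Agent: *
Disallow: /
```

The file is served at `https://<your-instance>/robots.txt`, so after running the task you can fetch that URL to verify the new rules are in place.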