Robots

I use django-robots for managing the robots.txt file in my Django apps.

TL;DR

# Install requirements in your virtual env
pip install django-robots
# settings.py
INSTALLED_APPS = [
    ...
    'django.contrib.sites',
    'robots',
    ...
]

SITE_ID = 1

TEMPLATES = [
    {
        ...
        'APP_DIRS': True,  # set to True in the default startproject settings
        ...
    },
]


ROBOTS_CACHE_TIMEOUT = 60 * 60 * 24  # Cache timeout in seconds (1 day)
# urls.py
from django.urls import include, re_path

urlpatterns = [
    ...
    re_path(r'^robots\.txt', include('robots.urls')),
    ...
]
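
To sanity-check the wiring, the view can be requested from the Django shell (python manage.py shell). A quick sketch, assuming the settings and URL pattern above are in place:

# Quick sanity check, run inside the Django shell.
# Assumes the robots.urls include above is wired up.
from django.test import Client

response = Client().get('/robots.txt')
print(response.status_code)       # expect 200
print(response.content.decode())  # the rendered robots.txt body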

What

The above setup will install django-robots with its default settings. If a sitemap view exists (for example, the one set up in the Sitemaps post), it should automatically be picked up and referenced in the robots.txt output. From here on, rules can be defined in the Django Admin panel and will automatically be exposed at /robots.txt.
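
Rules do not have to come from the admin; they can also be created programmatically. A minimal sketch, assuming django-robots' Rule and Url models:

# A sketch of defining a rule outside the admin, assuming django-robots'
# Rule and Url models. Run inside the Django shell.
from django.contrib.sites.models import Site
from robots.models import Rule, Url

admin_url, _ = Url.objects.get_or_create(pattern='/admin/')
rule = Rule.objects.create(robot='*')     # applies to every user agent
rule.disallowed.add(admin_url)            # renders as "Disallow: /admin/"
rule.sites.add(Site.objects.get(pk=1))    # the site from SITE_ID = 1

With that rule saved, the /robots.txt output should include something like:

User-agent: *
Disallow: /admin/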

For customizing robots.txt further, see the app's documentation at Robots Docs.