django serving robots.txt efficiently

Yes — robots.txt should not be served by Django if the file is static; let the front-end web server handle it directly. Try something like this in your Nginx config file:

location = /robots.txt {
    alias /path/to/static/robots.txt;
}

The `=` modifier makes this an exact match, so Nginx stops searching other locations as soon as it matches. See here for more info: https://nginx.org/en/docs/http/ngx_http_core_module.html#alias

The same approach applies to the favicon.ico file if you have one.
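For example, a favicon block would look like this (the `access_log off;` line is an optional extra to skip logging these frequent requests):

```nginx
location = /favicon.ico {
    alias /path/to/static/favicon.ico;
    access_log off;
}
```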

The equivalent code for Apache config is:

Alias /robots.txt /path/to/static/robots.txt
