How to customize robots.txt

Discourse does not ship a static file named "robots.txt". Instead, it answers requests for /robots.txt dynamically.
So how can you edit a file that does not exist?
There is a simple way that avoids diving into Discourse itself: intercept requests for /robots.txt with Nginx:

server {
    listen 80;

    # Serve a static file for this one path instead of proxying it to Discourse.
    location = /robots.txt {
        root /var/www/res/discourse;
        access_log off;
        expires max;
    }

    # Everything else is proxied to the Discourse upstream.
    location / {
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $http_host;
        proxy_redirect off;
        proxy_pass http://discourse_forum_ru;
    }
}

Set root to a directory of your own, upload your custom robots.txt there, and reload Nginx.
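A minimal sketch of that last step, assuming a placeholder directory (/tmp/robots-demo stands in for whatever path your root directive points at, and the Disallow rule is just an example):

```shell
# /tmp/robots-demo is a placeholder; use the directory from your "root" directive.
mkdir -p /tmp/robots-demo

# Write the custom robots.txt that Nginx will serve for GET /robots.txt.
cat > /tmp/robots-demo/robots.txt <<'EOF'
User-agent: *
Disallow: /admin/
EOF

# Nginx will now return this file verbatim instead of the Discourse-generated one.
cat /tmp/robots-demo/robots.txt
```

After reloading Nginx, a request like curl http://your-forum/robots.txt should return exactly this file.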