How to add robots.txt

Enonic version: 6.5.4

Hi,

I have an application which is published.
Is there any way to add a robots.txt to the application?

You could create a controller mapping to handle robots.txt:
http://xp.readthedocs.io/en/6.5/developer/site/mappings/index.html

Add this in site.xml:

<mappings>
  <mapping controller="/site/robots/robots.js">
    <pattern>/portal/.*/robots\.txt</pattern>
  </mapping>
</mappings>

Create /site/robots/robots.js file in the app:

var ioLib = require('/lib/xp/io');

exports.get = function (req) {
    // Read the robots.txt bundled in the app's resources
    var robotsFile = ioLib.getResource('/site/robots/robots.txt').getStream();
    var robotsText = ioLib.readText(robotsFile);

    return {
        contentType: 'text/plain',
        body: robotsText
    };
};

Then create your robots.txt file in the same app: /site/robots/robots.txt
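For reference, a minimal robots.txt that allows all crawlers might look like this (adjust the rules to your own site's needs):

```txt
User-agent: *
Disallow:
```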

This will work in 6.6, but unfortunately there is a bug in 6.5 that requires a content to exist at the path for the mapping to work. So in 6.5 you also have to create a content named "robots.txt" directly under your Site content. It can be any content type, even one that is not publicly accessible, like a folder.

Hope this helps. We are planning to release 6.6 next week.


Thank you,

In 6.5 I did it the following way:

mapping.r.host = localhost
mapping.r.source = /robots.txt
mapping.r.target = /PATH_TO_SERVICE/robots-service

mapping.f.host = localhost
mapping.f.source = /favicon.ico
mapping.f.target = /PATH_TO_SERVICE/favicon-service


I smell a demand for an app that can do this… or perhaps some modifications to one of the existing SEO apps that are on the Enonic Market? Just a friendly reminder to anyone who has some free time and inspiration on their hands :wink:
