Externalizing disallow or allow URLs in Robots.txt using Adobe ACS Commons RobotsServlet

Level 2

I am using com.adobe.acs.commons.wcm.impl.RobotsServlet to generate a robots.txt file containing sitemap, allow, and disallow directives. I want to externalize the URLs using externalizer.domain, but the domain is only applied to the sitemap directives. Is there a way to add the domain to the allow and disallow directives as well?

2 Replies

Community Advisor

Usually, for my own implementations, I store the robots.txt file inside the DAM, so my SEO authors have full control over what they make publicly available in robots.txt. Use the robots.content.property.path property to point the servlet at that content.

See: Robots.txt Generator (adobe-consulting-services.github.io)
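
A minimal sketch of what that OSGi configuration could look like (e.g. a file such as com.adobe.acs.commons.wcm.impl.RobotsServlet~mysite.cfg.json). The resource type and the DAM asset path are illustrative assumptions, not values from the original post; robots.content.property.path is the documented property, and for a DAM asset the binary content typically lives under the original rendition's jcr:data:

```json
{
  "sling.servlet.resourceTypes": ["myapp/components/structure/page"],
  "robots.content.property.path": "/content/dam/myapp/seo/robots.txt/jcr:content/renditions/original/jcr:content/jcr:data"
}
```

With this approach the servlet serves the DAM file verbatim, so authors can write absolute URLs (including the domain) directly into the allow and disallow lines themselves, sidestepping the limitation that externalizer.domain only applies to sitemap directives.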