I am using com.adobe.acs.commons.wcm.impl.RobotsServlet to generate a robots.txt file containing sitemap, allow, and disallow directives. I want to externalize the URLs using externalizer.domain, but the domain is only being applied to the sitemap directives. Is there a way to add the domain to the allow and disallow directives as well?
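To illustrate the behavior being described, the generated output looks roughly like this (the domain and paths below are placeholders, not taken from the original post): the Sitemap directive is run through the Externalizer, while Allow/Disallow stay as repository paths.

```
User-agent: *
Allow: /content/mysite/us/en/
Disallow: /content/mysite/us/en/internal/
Sitemap: https://www.example.com/sitemap.xml
```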
Usually, for my own implementations, I store the robots.txt file in the DAM, so the SEO authors have full control over what they make publicly available through robots.txt. Use the robots.content.property.path property of the RobotsServlet configuration to point the servlet at that file's content; when it is set, the servlet serves that property's value as the robots.txt response instead of generating the directives itself.
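A minimal sketch of such a configuration, e.g. in a file named com.adobe.acs.commons.wcm.impl.RobotsServlet~mysite.cfg.json (the factory name suffix, resource type, and DAM asset path are assumptions; adjust them to your project). For a plain-text asset uploaded to the DAM, the raw content typically lives in the jcr:data property of the original rendition:

```
{
  "sling.servlet.resourceTypes": ["mysite/components/structure/page"],
  "robots.content.property.path": "/content/dam/mysite/seo/robots.txt/jcr:content/renditions/original/jcr:content/jcr:data"
}
```

With robots.content.property.path set, the SEO team can edit the file directly in Assets, and the entire content, including any fully qualified allow and disallow URLs, is under their control rather than being assembled by the servlet.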
How does that work? Can you explain in more detail?