Level 2
October 17, 2022
Question

Externalizing disallow or allow URLs in Robots.txt using Adobe ACS Commons RobotsServlet

  • October 17, 2022
  • 1 reply
  • 790 views

I am using com.adobe.acs.commons.wcm.impl.RobotsServlet to create a robots.txt file that contains sitemap, allow, and disallow URLs. I want to externalize the URLs using externalizer.domain, but the domain is only being applied to the sitemap directives. Is there a way to add the domain to the allow and disallow directives as well?
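
For illustration (the paths and domain below are hypothetical, not from this thread), the generated file in this situation would look something like the following, with only the Sitemap line externalized:

User-agent: *
Allow: /content/mysite/en
Disallow: /content/mysite/en/internal
Sitemap: https://www.example.com/sitemap.xml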


1 reply

BrianKasingli
Community Advisor and Adobe Champion
October 17, 2022

Usually, for my own implementations, I store the robots.txt file inside the DAM so my SEO authors have full control over what they want to make publicly available in robots.txt. Use the robots.content.property.path property to point the servlet at that content; see the sketch below and the documentation here:

Robots.txt Generator (adobe-consulting-services.github.io)
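
A minimal sketch of what that OSGi configuration might look like, assuming the content is uploaded as a plain-text DAM asset. The config file name, resource type, and asset path below are placeholders, not values from this thread:

com.adobe.acs.commons.wcm.impl.RobotsServlet~mysite.cfg.json

{
  // Bind the generator to the site's page resource type so that
  // requests like /content/mysite/en.robots.txt are handled by it
  "sling.servlet.resourceTypes": ["mysite/components/structure/page"],

  // Path to a property holding the entire robots.txt content.
  // When set, the servlet serves that text instead of generating
  // directives; here it points at the binary of a plain-text asset
  // uploaded to the DAM (placeholder path).
  "robots.content.property.path": "/content/dam/mysite/seo/robots.txt/jcr:content/renditions/original/jcr:content/jcr:data"
}

Because the stored text is returned as-is, authors can write absolute URLs (or any other directives) directly in the DAM file, which sidesteps the limitation of externalizer.domain only being applied to the sitemap directives.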

Level 2
October 18, 2022

How does it work? Can you explain more?