Externalizing disallow or allow URLs in Robots.txt using Adobe ACS Commons RobotsServlet

Level 2

I am using com.adobe.acs.commons.wcm.impl.RobotsServlet to generate a robots.txt file that contains sitemap, allow, and disallow directives. I want to externalize the URLs using externalizer.domain, but the domain is only applied to the sitemap directives. Is there a way to add the domain to the allow and disallow directives as well?
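For reference, my OSGi configuration looks roughly like the following sketch (the resource type, content paths, and Externalizer domain name are placeholders; the property names are the ones documented for the ACS Commons Robots.txt Generator):

{
  "sling.servlet.resourceTypes": ["my-site/components/structure/page"],
  "user.agent.directives": ["User-agent: *"],
  "allow.directives": ["/content/my-site/en/public"],
  "disallow.directives": ["/content/my-site/en/private"],
  "sitemap.directives": ["/content/my-site/en"],
  "externalizer.domain": "publish"
}

With this configuration, the Sitemap line is written with the full host from the publish Externalizer domain, while the Allow and Disallow lines come out as mapped paths only.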

2 Replies

Community Advisor and Adobe Champion

Usually, for my own implementations, I store the robots.txt file in the DAM, so my SEO authors have full control over what they want to make publicly available through robots.txt. Use the robots.content.property.path property; see the documentation:

Robots.txt Generator (adobe-consulting-services.github.io)
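A minimal sketch of that setup, assuming the file is uploaded as a DAM asset at /content/dam/my-site/robots.txt (a hypothetical path) and pointing robots.content.property.path at the binary of its original rendition:

{
  "sling.servlet.resourceTypes": ["my-site/components/structure/page"],
  "robots.content.property.path": "/content/dam/my-site/robots.txt/jcr:content/renditions/original/jcr:content/jcr:data"
}

As I read the documentation, when robots.content.property.path is set the servlet serves the value of that property verbatim, so the directive-based properties are not needed and authors can update robots.txt simply by re-uploading the asset.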

Level 2

How does it work? Can you explain more?