Still getting the error below in the dispatcher log and a 404 from the browser, with both the full path and the shortened path.
"GET /content/dam/test-project/robots.txt" - 0ms [publishfarm/-] [actionblocked]
Rewrite rule applied below:
RewriteRule ^/robots.txt$ /content/dam/test-project/robots.txt [NC,PT,L]
Filter rule applied below:
/0072 { /type "allow" /url "/content/dam/test-project/robots.txt"}
Hi @pradeepdubey82
You said it works on your local SDK. Does it work with a direct publish request (http://localhost:4503/robots.txt), with the local dispatcher (http://localhost/robots.txt), or with the local dispatcher plus the actual host mapped in /etc/hosts, using the same URL that is requested from your CDN (http://<website.com>/robots.txt)?
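For the third check, a minimal sketch of the host mapping (website.com is a placeholder for your real hostname; the local dispatcher is assumed to listen on port 80):

# /etc/hosts - point the real host at the local dispatcher
127.0.0.1    website.com

A request to http://website.com/robots.txt from that machine then goes through the same per-host dispatcher configuration your CDN would hit.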
Do not forget to check publish access while you are not logged in to the publisher; it might be an issue with anonymous-user access restrictions!
If the first does not work locally (i.e. it works only with /content/dam/...), then your Sling resolver mapping configuration is not correct (or you may need to change the order of the rules); use the jcrresolver console (Sling Resource Resolver in the Felix web console) to check the configuration and fix it.
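For reference, a minimal sketch of such a mapping as nodes under /etc/map/http (host and node names are placeholders, assuming port 80; verify the resulting entries in the jcrresolver console):

/etc/map/http/website.com.80             [sling:Mapping]   (host + port segment)
/etc/map/http/website.com.80/robots.txt  [sling:Mapping]
    sling:internalRedirect = "/content/dam/test-project/robots.txt"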
If the first works but the second does not, then the dispatcher allow (filter) or rewrite rules are wrong. If the first and second work but the third does not, the issue is still in the dispatcher, in the per-host configuration files.
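Given that your log shows [actionblocked], it is a filter rule that blocks the rewritten path. Dispatcher filters are evaluated top-down and the last matching rule wins, so the specific allow has to come after any broader deny that also matches the path; a sketch (the deny here is hypothetical, standing in for whatever your farm already contains):

# last matching rule wins, so the allow must come after the broader deny
/0050 { /type "deny"  /url "/content/dam/*" }
/0072 { /type "allow" /url "/content/dam/test-project/robots.txt" }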
If all three work, run the same checks against your remote environment: direct access to the publisher (https://adobecqms...:4503/robots.txt) will tell you whether something is wrong in AEM itself / in the access rules for the anonymous user, or whether it is still the CDN / dispatcher.
If it is still the dispatcher, try increasing the log level for the rewrite (and maybe access) logs on the dispatcher and check the logs again; there should be a visible explanation of why the request was blocked.
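A minimal sketch, assuming Apache 2.4 with the dispatcher module (put the directives wherever your setup keeps the vhost and dispatcher module configuration):

# mod_rewrite tracing for the vhost
LogLevel warn rewrite:trace3
# dispatcher module verbosity: 0 = error ... 3 = debug
DispatcherLogLevel 3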
Btw, if you have the rule
RewriteRule ^/robots.txt$ /content/dam/test-project/robots.txt [NC,PT,L]
then you don't need to update the mappings in the resolver configuration; http://localhost:4503/robots.txt won't work, but requests through the dispatcher will work anyway.
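For completeness, a sketch of that rule in its vhost context (the host name is a placeholder). mod_rewrite runs before the dispatcher handler, so both the filter check and the request forwarded to publish use the rewritten /content/dam/test-project/robots.txt path, which is exactly the URL shown as [actionblocked] in your log, so the filter must allow that full path:

<VirtualHost *:80>
    # host name is a placeholder
    ServerName website.com
    RewriteEngine On
    # the rewrite happens in httpd, so dispatcher and publish only see the full DAM path
    RewriteRule ^/robots\.txt$ /content/dam/test-project/robots.txt [NC,PT,L]
</VirtualHost>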