Solved

Prevent Robots.txt file from getting downloaded

  • August 24, 2022
  • 2 replies
  • 3001 views

My robots.txt file is stored under /content/dam and I have added rewrite rules for it in my dispatcher. I have also updated the robots.txt path in the Content Disposition config under excluded paths. Yet the robots.txt file gets downloaded instead of being displayed in the browser.
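
For reference, a dispatcher rewrite of this kind typically looks something like the following (the exact DAM path here is illustrative):

# Map the public /robots.txt URL to the robots.txt asset stored in the DAM
RewriteRule ^/robots\.txt$ /content/dam/site/robots.txt [PT,L]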

 

In the Content Disposition config: if I add the full path to robots.txt, it fails to load in the browser. If I add /content/dam*:text/plain, it works.
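
The config in question is the Apache Sling Content Disposition Filter (OSGi PID org.apache.sling.security.impl.ContentDispositionFilter). A rough sketch of the two kinds of entries described above might look like this, with illustrative paths and the property names as I understand them from the standard Sling filter:

{
  "sling.content.disposition.paths": [ "/content/dam*:text/plain" ],
  "sling.content.disposition.excluded.paths": [ "/content/dam/site/robots.txt" ]
}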

But the entry above applies to all text/plain files under /content/dam, not just robots.txt. So I made a change in the dispatcher rewrite rules file instead:

# Serve the DAM robots.txt inline as plain text instead of as a download
<LocationMatch "/content/dam/site/robots.txt">
    ForceType text/plain
    Header set Content-Disposition inline
</LocationMatch>
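
A quick way to verify the response headers outside the browser (the hostname below is a placeholder) is:

curl -I https://www.example.com/content/dam/site/robots.txt
# The response should show Content-Type: text/plain and Content-Disposition: inline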

Best answer by Sachin_Arora_


2 replies

Sachin_Arora_
Community Advisor
Accepted solution
August 24, 2022

The issue looks more like browser cache. I followed the same steps and it works fine after clearing the cache.

Please clear your browser cache or try a different browser.

Screenshot of the configuration working fine for me.

 

 

BrianKasingli
Community Advisor and Adobe Champion
August 24, 2022

Level 6
August 25, 2022

We are not using a custom servlet.

BrianKasingli
Community Advisor and Adobe Champion
August 25, 2022