I am trying to create a servlet that serves a robots.txt. The servlet handling this request is mapped to the website page resource type.
When I visit the page as "/content/website.robots.txt" I get a 404 error and the servlet's doGet is never called.
When I change "sling.servlet.extensions" to xml and visit "/content/website.robots.xml", doGet is invoked and the servlet works.
I am using AEM 6.2 with OSGi 6.0.0.
Is it impossible to map to the .txt extension, or am I doing something wrong?
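For context, a minimal sketch of the kind of registration involved, assuming OSGi Declarative Services annotations; the resource type, selector, and class name are placeholders, not the poster's actual code:

```java
import java.io.IOException;

import javax.servlet.Servlet;
import javax.servlet.ServletException;

import org.apache.sling.api.SlingHttpServletRequest;
import org.apache.sling.api.SlingHttpServletResponse;
import org.apache.sling.api.servlets.SlingSafeMethodsServlet;
import org.osgi.service.component.annotations.Component;

// Hypothetical sketch: resource type and selector are placeholders.
@Component(service = Servlet.class,
        property = {
                "sling.servlet.resourceTypes=website/components/structure/websitepage",
                "sling.servlet.selectors=robots",
                "sling.servlet.extensions=txt",
                "sling.servlet.methods=GET"
        })
public class RobotsTxtServlet extends SlingSafeMethodsServlet {

    @Override
    protected void doGet(SlingHttpServletRequest request,
                         SlingHttpServletResponse response)
            throws ServletException, IOException {
        // Write a trivial robots.txt body as plain text.
        response.setContentType("text/plain");
        response.getWriter().println("User-agent: *");
        response.getWriter().println("Disallow:");
    }
}
```

This fragment only registers with the Sling framework; it is not runnable standalone.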
Turn on the 'Enable Plain Text' option of the 'Apache Sling GET Servlet' in the Web Console (ConfigMgr), since you are using METHOD_GET, and test whether it works.
This unfortunately just makes it possible for the default GET servlet to handle the request. However, I would like my own servlet to handle it.
It seems to me that you are using the Felix SCR annotations. In that case you need to add the
@Service()
annotation to the class definition as well (no additional parameters required). That should do the trick.
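A sketch of what that suggestion would look like, assuming the Felix SCR annotations; the class name and property values are placeholders:

```java
import javax.servlet.Servlet;

import org.apache.felix.scr.annotations.Component;
import org.apache.felix.scr.annotations.Properties;
import org.apache.felix.scr.annotations.Property;
import org.apache.felix.scr.annotations.Service;
import org.apache.sling.api.servlets.SlingSafeMethodsServlet;

// Hypothetical SCR-annotated class header: @Service is what actually
// registers the component as a Servlet with the service registry.
@Component
@Service(Servlet.class)
@Properties({
    @Property(name = "sling.servlet.resourceTypes",
              value = "website/components/structure/websitepage"),
    @Property(name = "sling.servlet.extensions", value = "txt")
})
public class RobotsTxtServlet extends SlingSafeMethodsServlet {
    // doGet(...) unchanged
}
```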
I'm actually using the OSGi @Component annotation; the Felix SCR annotations are not available in the AEM 6.2 archetype I'm using.
Hm, correct... I wonder why I was thinking that you are using the SCR ones ... :-/
Can you post the generated XML files in target/classes/OSGI-INF for this class?
And by the way, the resource type should be just "website/components/structure/websitepage"; the resource type is normally never the full path to the component.
Thanks for the tip, I will change the resource type to the short form.
Thanks for your answer, Arun Patidar. At this point I don't really want to switch the whole thing to a Sling filter, so I will just map to .xml and rewrite from .txt to .xml in Apache.
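That Apache rewrite could look roughly like this; the content path is a placeholder for the actual site structure:

```apache
RewriteEngine On
# Serve the .txt URL from the servlet mapped to the .xml extension
RewriteRule ^/content/website\.robots\.txt$ /content/website.robots.xml [PT,L]
```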
Thanks for your help everyone!
Oh OK, sorry, I misread; I thought you were using a filter.
To me the URL should be /content/page/_jcr_content.robots.txt
This actually works. Does this mean that all requests to the page node are internally redirected to _jcr_content for HTML and XML calls, but not for other extensions?
Just add a node /apps/cq/Page/Page.robots.txt.jsp with the content
<%@include file="proxy.jsp" %>
and it should work as well for /content/page.robots.txt
Or change your resourceType; here is an example: acs-aem-commons/SiteMapServlet.java at master · Adobe-Consulting-Services/acs-aem-commons · GitHub
Nice responses Joerg and Feike!