Hi all,

I was going through the following URL for AEM 6.4 SEO: https://helpx.adobe.com/in/experience-manager/6-4/managing/using/seo-and-url-management.html

1. The robots.txt file blocks crawling of any content that should not be indexed, while the XML sitemap makes it easier for search engines to crawl your content; robots.txt also has a reference to the XML sitemap. What is the relationship between robots.txt and the XML sitemap? Or are they independent of each other, serving separate purposes?
2. For SEO, how do we implement compliance indicators for the character length of the page title and meta description?
3. For SEO, how do we implement SEO keyword checker functionality?

Appreciate your support.

Thanks,
Rama.
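For context, the reference from robots.txt to the sitemap mentioned above is typically a single `Sitemap:` line. A minimal sketch is below; the paths and URL are placeholders, not taken from any real site:

```
# Block crawling of content that should not be indexed
User-agent: *
Disallow: /private/
Disallow: /internal/

# Point crawlers at the XML sitemap (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml
```

So the two files serve separate purposes (exclusion vs. discovery), and the `Sitemap:` line is simply a convenient place to advertise the sitemap's location to crawlers.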
Hi edubey,

Thanks for responding.

1. Both robots.txt and the XML sitemap look to be doing essentially the same thing: both help search engines crawl the site. Could you kindly elaborate on the difference between them and why we need both?
2. Compliance indicators for the character length of the page title and meta description: is it something like, if the character length exceeds the limit, the system should say so and show it in red?
3. SEO keyword checker functionality: what is this functionality? How is it taken care of while the content author creates content?

Appreciate your support.

Thanks,
Rama.
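To illustrate what a compliance indicator for character length could look like, here is a minimal, AEM-independent sketch. The limits (60 for the title, 160 for the meta description) are common SEO guidelines, not fixed standards, and the function name is hypothetical:

```python
# Suggested limits -- common SEO guidelines, not hard rules.
TITLE_MAX = 60
META_DESCRIPTION_MAX = 160

def compliance(title: str, meta_description: str) -> dict:
    """Flag whether each field is within its suggested character limit.

    A UI layer could render a field in red whenever its flag is False.
    """
    return {
        "title_ok": len(title) <= TITLE_MAX,
        "meta_description_ok": len(meta_description) <= META_DESCRIPTION_MAX,
    }

# A 200-character description exceeds the 160-character guideline,
# so it would be flagged (e.g. shown in red by the authoring UI).
result = compliance("Short page title", "x" * 200)
print(result)
```

In an AEM authoring dialog, the same check would typically run client-side as the author types, turning the field indicator red when the flag is false.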