Description - Implementing the Analytics CNAME involves creating a subdomain on our domain, which is then used to set cookies in a first-party context. However, during SEO crawling, these subdomains are scanned as well. To prevent Google from crawling them, we need to adjust the robots.txt file served from the subdomain. As end users, we lack the ability to do this ourselves and need Analytics Engineering to handle it on our behalf.
Why is this feature important to you -
Search engines like Google allocate a crawl budget to each domain, which determines the time and resources dedicated to indexing a website. If crawlers expend that budget on redundant URLs, such as CNAME subdomains, vital content on our site may go unnoticed during indexing. This risks a loss of organic traffic if crawlers neglect new or updated content while focusing on less important areas.
How would you like the feature to work - We would like Analytics Product Engineering to support updates to the robots.txt file served from the CNAME subdomain.
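As an illustration, the requested change could be as simple as serving a robots.txt at the root of the CNAME subdomain that disallows all crawling (the hostname below is a hypothetical example, not our actual subdomain):

```text
# Served at https://analytics.example.com/robots.txt
# Blocks all well-behaved crawlers from the entire subdomain
User-agent: *
Disallow: /
```

This keeps crawlers focused on the primary domain while leaving the subdomain fully functional for first-party cookie placement, since robots.txt only affects crawling, not normal user traffic.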
Current Behavior - Analytics Product Engineering does not currently support changes to the robots.txt file. This leads to wasted spend on SEO configuration workarounds and unnecessary consumption of our crawl budget.