Description - As a user, we would like Adobe to provide us access to the robots.txt file of our analytics subdomain so we can prevent search engines from crawling it.
Why is this feature important to you - Googlebot is crawling resources on our Adobe-hosted analytics subdomain millions of times a week. This wastes our crawl budget, resulting in less traffic and revenue for our eCommerce sites.
How would you like the feature to work - We would like the robots.txt set to disallow all crawling. Note that the standard way to block everything is "Disallow: /" (a bare "*" in the path is a non-standard wildcard):
User-agent: *
Disallow: /
Current Behaviour - The robots.txt file is set to allow all (an empty Disallow directive permits all crawling):
User-agent: *
Disallow:
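As a sanity check, the effect of the current and desired rules can be compared with Python's standard urllib.robotparser; the hostname and path below are placeholders, not the real analytics subdomain:

```python
from urllib.robotparser import RobotFileParser

# Current rules: an empty Disallow value allows all crawling.
CURRENT_RULES = """User-agent: *
Disallow:
"""

# Desired rules: "Disallow: /" blocks every path for every agent.
DESIRED_RULES = """User-agent: *
Disallow: /
"""

def allows(rules: str, agent: str, url: str) -> bool:
    """Return True if the given robots.txt rules permit the agent to fetch the URL."""
    parser = RobotFileParser()
    parser.parse(rules.splitlines())
    return parser.can_fetch(agent, url)

# Placeholder URL standing in for a resource on the analytics subdomain.
url = "https://analytics.example.com/b/ss/report"
print(allows(CURRENT_RULES, "Googlebot", url))  # True - Googlebot may crawl
print(allows(DESIRED_RULES, "Googlebot", url))  # False - crawling is blocked
```

This confirms the single-character change from an empty Disallow to "Disallow: /" is what flips the subdomain from fully crawlable to fully blocked.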