Hello Team,
We maintain a Global Report Suite along with multiple Virtual Report Suites derived from the Global one. While we can exclude bot-identified events or traffic at the Virtual Report Suite level, this functionality is not available in the Global Report Suite unless we manually apply a workspace segment (filtering based on specific dimensions).
Our goal is to ensure data consistency and alignment between the Global and Virtual Report Suites to accommodate diverse business requirements.
Has anyone else encountered this situation and identified an effective way to manage it?
Hi @EliasA ,
You can use Adobe Analytics' built-in features, such as Exclude by IP and Bot Rules, to exclude unwanted traffic at the global report suite level, which in turn ensures all VRSs have consistent data.
These tools, however, only apply to new incoming data; data that has already been collected in the report suite won't be affected. For historical data, you may want to check with Adobe support whether they can reprocess your data based on the bot identifiers you provide. It might be possible through VISTA rules, but it will incur a cost.
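If reprocessing isn't an option, another way to handle historical data is to clean a hit-level export (e.g. an Adobe Data Feed) yourself before downstream analysis. A minimal sketch, assuming tab-delimited rows with a `user_agent` column; the bot patterns shown are illustrative placeholders, not Adobe's actual IAB bot list:

```python
import csv
import io
import re

# Illustrative bot patterns -- a real cleanup would use the IAB/ABC
# international spiders list or your own bot identifiers.
BOT_PATTERNS = re.compile(r"bot|crawler|spider|slurp", re.IGNORECASE)

def filter_bot_hits(rows):
    """Yield only rows whose user_agent does not match a bot pattern."""
    for row in rows:
        if not BOT_PATTERNS.search(row.get("user_agent", "")):
            yield row

# Tiny in-memory stand-in for a hit-level Data Feed export.
sample = io.StringIO(
    "hit_id\tuser_agent\n"
    "1\tMozilla/5.0 (Windows NT 10.0)\n"
    "2\tGooglebot/2.1 (+http://www.google.com/bot.html)\n"
    "3\tMozilla/5.0 (iPhone; CPU iPhone OS 16_0)\n"
)
reader = csv.DictReader(sample, delimiter="\t")
clean = list(filter_bot_hits(reader))
print([r["hit_id"] for r in clean])  # the Googlebot hit is dropped
```

This only cleans the exported copy, of course; the numbers inside the report suite itself stay as collected.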
Cheers!
Yep, as @Harveer_SinghGi1 said, there are actual Bot Filtering rules for your suites... however, these only work going forward and not backwards... whereas an updated segment will apply to all the data retroactively in your VRS.
I assume your current setup is something like:
Global Suite
Subset 1 of Global Suite (which also includes a "clean bots" segment)
Subset 2 of Global Suite (which also includes a "clean bots" segment)
etc
Have you considered just creating another virtual suite called "Global Suite (Clean)" (the only segment that would be applied would be your bot filtering rule, but other than that, this would be a full suite of data) and hiding the real Global Suite from everyone except admins?
This would mean having to update all your main reports to use the new "clean" suite, but it would keep the visible Global Suite in line with all the rest of your VRSs.
I did this a few years ago in my org... less for bot filtering, but when we had issues with inflated data (due to bad deployments), I could add the logic to "right size" the data into my "clean data" segment. Both my visible Global Suite and all the sub-VRSs have the "clean data" segment applied, but I still have access to the actual main suite (along with a few other admins).
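Once a "Global Suite (Clean)" VRS is in place, it's worth a periodic spot check that it actually reconciles with the hidden raw suite. A minimal sketch, assuming you pull the same metric from the raw suite, the bot segment, and the clean VRS (e.g. via scheduled Workspace exports); all numbers below are made up for illustration:

```python
# Daily visit counts pulled from each source (hypothetical figures).
raw_visits = {"2024-05-01": 12500, "2024-05-02": 13100}
bot_visits = {"2024-05-01": 900, "2024-05-02": 1150}
clean_visits = {"2024-05-01": 11600, "2024-05-02": 11950}

def reconcile(raw, bots, clean):
    """Return the dates where clean != raw - bots, i.e. misalignment."""
    return [d for d in raw if raw[d] - bots.get(d, 0) != clean.get(d)]

mismatches = reconcile(raw_visits, bot_visits, clean_visits)
print(mismatches or "clean VRS reconciles with raw suite")
```

A non-empty result flags days where the clean VRS has drifted from raw-minus-bots, e.g. after someone edits the bot segment.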
Thank you very much, Jennifer. This was a great suggestion and a good way forward.