
Retroactively Remove Bad Data from Bots


Level 1

My site had an influx of bot visits from April 26th to June 6th. I was able to create a new bot rule to exclude that user agent going forward.


However, is there a way to retroactively exclude/remove this data?

2 Replies


Adobe Champion

Hi @mp03,

For the data that has already come in, you will need to use a segment to exclude it.

If you're using a Virtual Report Suite (VRS), you can add a segment to it that excludes the user agents you've identified as bots, and that will apply to all historical data. If you're not using a VRS, you would need to add the segment to your Workspace projects manually.
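If you manage segments programmatically through the Adobe Analytics 2.0 Segments API rather than the UI, the exclusion segment might look roughly like the sketch below. The definition schema shown, the `rsid`, and `variables/evar10` (as a dimension holding the user agent) are all assumptions for illustration — check the Segments API reference against your own report suite.

```typescript
// Sketch of a bot-exclusion segment definition in the general shape used by
// the Adobe Analytics 2.0 Segments API. Schema details are an assumption;
// "variables/evar10" is a hypothetical eVar capturing the user agent.
const botExclusionSegment = {
  name: "Exclude known bot user agents",
  rsid: "myreportsuite", // hypothetical report suite ID
  definition: {
    version: [1, 0, 0],
    func: "segment",
    container: {
      func: "container",
      context: "hits", // exclude at the hit level
      pred: {
        func: "without", // exclude hits matching the inner predicate
        pred: {
          func: "contains",
          val: { func: "attr", name: "variables/evar10" },
          str: "BadBot", // hypothetical bot user-agent substring
        },
      },
    },
  },
};
```

Attaching a segment like this to the VRS definition is what makes the exclusion apply retroactively to everything built on that VRS.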


Community Advisor

In our company, I have what I call the "Clean" Suite, which is a Virtual Suite that all reports are built off of. 


Then I use segments to exclude garbage, like double tracking or bot traffic. I can either fold a new condition into my "Clean" segment, or, if it's complex, create a separate segment and add it to my Clean Suite(s).


This way I can retroactively add exclusions that automatically apply to all reports, and our users don't have to manually add new segments to all their projects.


Now, I see that you are using the user agent to identify that bot data; if you aren't tracking the user agent in an eVar, it might be hard to create a segment for it at this point...

Hopefully there is something else you can use to identify this traffic in a segment. And if you adopt the approach I described above, it will be a lot of work to transition people to a "clean" virtual suite, but it sets you up for future issues like this... it still may be worth it.
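If you do start capturing the user agent in an eVar for future incidents, the implementation side is small. Here is a minimal sketch, assuming an AppMeasurement-style setup; `eVar10` is a hypothetical slot (pick an unused eVar in your report suite), and the 255-character truncation reflects the commonly cited eVar length limit — verify it for your configuration.

```typescript
// Sketch: prepare a user-agent value for a hypothetical eVar (e.g. s.eVar10)
// so future hits carry a dimension you can segment bot traffic on.
function userAgentForEvar(ua: string, maxLen = 255): string {
  // eVars are commonly cited as holding up to 255 characters; truncate to fit.
  return ua.slice(0, maxLen);
}

// In an AppMeasurement implementation this would run in page code or in
// doPlugins, roughly: s.eVar10 = userAgentForEvar(navigator.userAgent);
```

With the user agent in an eVar, the same "contains" segment condition works for both the Clean Suite and ad hoc Workspace analysis.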