
SOLVED

Analysis Workspace - Average landing page visits


Level 1

I am seeing an influx of bot traffic. What's a good way to filter this out of my reporting (average landing page visits per day) in Workspace?

1 Accepted Solution


Correct answer by
Community Advisor and Adobe Champion

Bots can be hard... If you do a Data Warehouse export, you can look at the IP addresses associated with the content... if there are one or two prevalent IPs from those spammers, you can easily add a Custom Bot Rule (or rules) based on those IPs to exclude them from future tracking.
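
For example, here's a rough sketch of how you could tally the most prevalent IPs from that export (this assumes a CSV export that includes an IP address column; the file and column names here are placeholders, so adjust them to match whatever you included in your export):

```python
# Illustrative sketch: count the most prevalent IPs in a Data Warehouse export.
# Assumes a CSV export with an "IP" column; the file and column names are
# placeholders, so adjust them to match your export definition.
import csv
from collections import Counter

ip_counts = Counter()
with open("dw_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        ip_counts[row["IP"]] += 1

# The top few offenders are good candidates for Custom Bot Rules.
for ip, hits in ip_counts.most_common(10):
    print(f"{ip}\t{hits}")
```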


Admin > Report Suites > (choose your suite) > Edit Settings > General > Bot Rules

Bot Rules can be built based on User Agent, IP Address or IP Range...
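
Conceptually, each rule is just a match on one of those fields. Here's a purely illustrative sketch of the matching logic (this is not Adobe's actual implementation, and the blocked values are made up):

```python
# Conceptual sketch of bot-rule matching: a hit is discarded when its
# user agent or IP address matches a rule. All values are hypothetical.
from ipaddress import ip_address, ip_network

BLOCKED_AGENT_SUBSTRINGS = ["badbot", "scrapy"]    # user-agent rules
BLOCKED_IPS = {ip_address("203.0.113.7")}          # single-IP rules
BLOCKED_RANGES = [ip_network("198.51.100.0/24")]   # IP-range rules

def is_bot(user_agent: str, ip: str) -> bool:
    ua = user_agent.lower()
    if any(s in ua for s in BLOCKED_AGENT_SUBSTRINGS):
        return True
    addr = ip_address(ip)
    return addr in BLOCKED_IPS or any(addr in rng for rng in BLOCKED_RANGES)

print(is_bot("Mozilla/5.0", "198.51.100.42"))  # True: inside the blocked range
```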


However, if those rules aren't enough, there's another trick you can use...

If you can create a segment that reliably identifies this bot traffic, you can take things a step further. One of the things I did in my implementation was to create what I call my "Clean Suite"... this is a virtual report suite that uses my "clean data" segment. In that segment I add exclusions to remove data I can identify as non-real traffic, or hits generated when developers push bad code that causes infinite redirects for certain browsers/devices, etc. (basically, anything I don't want included in any reporting).
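
To make that concrete, here's a hedged sketch of the kind of exclusion logic such a "clean data" segment encodes, expressed as a filter over a hit-level export (the real segment is built in the Analytics segment builder; the column names and values here are made up for illustration):

```python
# Illustrative only: the exclusion logic of a "clean data" segment,
# expressed as a pandas filter over a hit-level export. Column names
# and values are hypothetical; the real segment lives in Adobe Analytics.
import pandas as pd

hits = pd.read_csv("hit_level_export.csv")

known_bad_ips = {"203.0.113.7", "198.51.100.42"}  # identified spammers
is_bad_ip = hits["ip"].isin(known_bad_ips)

# Example of a non-bot exclusion: a bad deploy that trapped one browser
# in an infinite redirect loop.
is_redirect_loop = (hits["browser"] == "LegacyBrowser 9") & \
                   hits["page_url"].str.contains("/redirect", na=False)

clean = hits[~(is_bad_ip | is_redirect_loop)]
print(f"kept {len(clean)} of {len(hits)} hits")
```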


I instruct all users to build their reports using the "Clean Suite" when they are looking at the network, but I also apply my "clean data" segment to ALL my virtual suites.


It's a bit of a lift up front, since you will have to update all your existing reports, but the nice thing is that once you have this set up, any exclusions you add when you notice new issues apply retroactively...


This doesn't take the place of the bot rules (I still suggest keeping those rules in place, as you will be able to see the excluded traffic in your bot reports, which should be added to Workspace before Reports is sunset in December).


Basically, this solution tries to attack the issue from multiple directions.
