
SOLVED

how do you handle bots / bad traffic?


Level 6

Hi all,

I have a question: what do you do when you detect an IP, user agent, old browser, etc. that generates something like 10,000 visits per month with no conversions?

We sometimes see this: no campaign code, the visits may just bounce, the traffic looks non-human, but it isn't filtered out automatically.

We currently have a segment to filter out "bad traffic". However, the segment keeps growing, so I am looking for alternatives.

Regards

Stefanie

1 Accepted Solution


Correct answer by
Level 10

Stefanie, just out of curiosity, have you tried looking up the IPs that generate the most bad traffic to get insights about networks, namespaces, etc.? Sometimes the traffic originates not only from "bad" bots but also from QA bots that your IT department may use on purpose to monitor website availability and vulnerabilities. In my experience, the detection rate of the automatic bot filters for this type of traffic is low. Analyzing by IP helps you discover that quickly.
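The IP analysis suggested above can be sketched in a few lines of Python against a hit-level export. This is only an illustration: the column names (`ip`, `orders`) are hypothetical placeholders, not the real Data Warehouse schema.

```python
from collections import Counter

def top_ips(hits, n=5):
    """Return the n IPs with the most hits and zero conversions.

    `hits` is a list of dicts with hypothetical keys "ip" and "orders";
    any IP that ever converted is excluded from the suspect list.
    """
    visits = Counter()
    converting = set()
    for hit in hits:
        visits[hit["ip"]] += 1
        if hit.get("orders", 0) > 0:
            converting.add(hit["ip"])
    # Keep only IPs that never converted, sorted by visit count descending.
    suspects = {ip: c for ip, c in visits.items() if ip not in converting}
    return sorted(suspects.items(), key=lambda kv: -kv[1])[:n]
```

Feeding the top suspects into a reverse-DNS or WHOIS lookup then quickly shows whether they belong to monitoring services, cloud providers, or your own IT's QA infrastructure.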


4 Replies



Level 6

Hi,

In the meantime I have found some ideas relating to this topic:

https://forums.adobe.com/ideas/10104

https://forums.adobe.com/ideas/9553

https://forums.adobe.com/message/11111848#11111848

https://theblog.adobe.com/advance-bot-filtering-powers/

I noticed that some people create a virtual report suite. I'm just wondering whether this is practical or has disadvantages. It would therefore be interesting to hear whether people generally filter out only a little, or what else they do. Our goal is to get a clear view of the conversion-rate trend.

Our filter has grown historically and now contains about 20 old browsers, 10 user agents, 9 uncommon monitor resolutions, and 7 original referring domains. Most of them cause a relatively small amount of traffic, e.g. 2,000 visits per month. IPs might be added to the segment in the future.

@Andrey: our IT testing system is already filtered out, but thanks for your suggestion, because we haven't checked networks, namespaces, etc. yet :-)


Community Advisor

Just a word about Virtual Report Suites (VRS): we changed access so that business users only get data through the VRS. Unless someone is an analyst, they only get access to filtered data. In my opinion there are at least two advantages:

1) Filtering bot traffic: a VRS can apply more than one segment to filter bad traffic. Whenever we detect something, we just update our bot segment (or add a new one). Every Workspace project (and scheduled PDF) based on that VRS is updated immediately; no internal communication is needed and no projects have to be updated one by one.

2) Component curation: in a VRS you can select which dimensions (props/eVars) and events should be visible. This makes it easy for new users to start working with project templates, since they aren't confronted with a long list of items in the left rail.

(Almost) no advantage comes without a drawback: a VRS takes some time to set up (and maintain), but I believe it's worth it in the long run.


Community Advisor

Hi Stefanie,

Bot detection is really a pain, especially for retail sites. Adobe should come up with a better out-of-the-box or engineering solution; the current bot-removal process (essentially unchanged since 2011-12) just doesn't seem to work against smart bots. None of the solutions mentioned in the various threads seems to be foolproof.

Thanks!

Asheesh