How do you handle bots / bad traffic?

stefanies325986
Level 6

02-08-2019

Hi all,

I have a question: what do you guys do when you detect an IP/user agent/old browser etc. that delivers like 10,000 visits per month with no conversions?

We sometimes see traffic like that: no campaign code, usually just bouncing, clearly non-human, but not filtered out automatically.

We currently have a segment to filter out "bad traffic". However, the segment gets larger and larger, so I am looking for alternatives.
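The detection heuristic described above (one source delivering lots of visits with zero conversions) can be sketched in plain Python. This is only an illustration; the field names and thresholds are hypothetical, not Adobe's schema:

```python
from collections import defaultdict

# Hypothetical hit records; field names are illustrative only.
hits = [
    {"ip": "203.0.113.7",  "user_agent": "OldBot/1.0",  "converted": False},
    {"ip": "203.0.113.7",  "user_agent": "OldBot/1.0",  "converted": False},
    {"ip": "198.51.100.2", "user_agent": "Mozilla/5.0", "converted": True},
]

def suspicious_sources(hits, min_visits=2):
    """Flag (ip, user_agent) pairs with many visits and zero conversions."""
    stats = defaultdict(lambda: {"visits": 0, "conversions": 0})
    for h in hits:
        key = (h["ip"], h["user_agent"])
        stats[key]["visits"] += 1
        stats[key]["conversions"] += h["converted"]  # bool counts as 0/1
    return [key for key, s in stats.items()
            if s["visits"] >= min_visits and s["conversions"] == 0]

print(suspicious_sources(hits))  # [('203.0.113.7', 'OldBot/1.0')]
```

In practice the same aggregation could run over a data feed export, so suspicious sources are flagged before the segment has to grow.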

Regards

Stefanie

Accepted Solutions (1)

Andrey_Osadchuk
MVP

02-08-2019

Stefanie, just out of curiosity, have you tried to look up the IPs that generate the most bad traffic, to get insights about networks, namespaces, etc.? Sometimes the traffic originates not only from "bad" bots, but also from QA bots your IT department may use on purpose to monitor website availability and vulnerabilities. In my experience, the detection rate of the automatic bot filters for this type of traffic is low. Analysis by IP helps you discover that quickly.
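The IP lookup suggested above can be automated with a best-effort reverse DNS query. A minimal sketch using only the Python standard library (the example IPs are documentation placeholders, not real offenders):

```python
import ipaddress
import socket

def reverse_dns(ip):
    """Best-effort reverse DNS lookup for a single IP address.

    Returns the resolved hostname, "unknown" when no PTR record resolves,
    or "invalid" when the input is not an IP address at all.
    """
    try:
        ipaddress.ip_address(ip)  # reject non-IP input without a network call
    except ValueError:
        return "invalid"
    try:
        return socket.gethostbyaddr(ip)[0]
    except OSError:  # covers socket.herror and socket.gaierror
        return "unknown"

# Example usage over the noisiest IPs from a report export (needs network):
# for ip in ["192.0.2.10", "198.51.100.23"]:
#     print(ip, "->", reverse_dns(ip))
```

Hostnames often reveal the owning network (hosting provider, monitoring service, corporate proxy) and make it obvious whether the traffic is a QA bot or a genuine visitor pool.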

Answers (3)

Asheesh_P
MVP

04-08-2019

Hi Stefanie,

Bot detection is really a pain, especially for retail sites. Adobe should come up with a better out-of-the-box or engineering solution; the current bot removal process (largely unchanged since 2011-12) just does not seem to work against smart bots. None of the solutions mentioned in the various threads seems to be foolproof.

Thanks!

Asheesh

ursboller
MVP

02-08-2019

Just a word about "Virtual Report Suites" (VRS): we changed access so that business users only get data through the VRS. That means unless someone is an analyst, they only see filtered data. In my opinion there are at least two advantages:

1) Filtering bot traffic: any VRS can have more than one segment filtering bad traffic. Whenever we detect something, we just update our bot segment (or add a new one). The result is that every Workspace project (and scheduled PDF) depending on that VRS is updated immediately! No internal communication needed, no need to update projects one by one...

2) Component curation: in a VRS you can select which dimensions (props/eVars) and events should be visible. This makes it easy for new users to start working from project templates, since they aren't faced with a long list of items in the left rail.

(Almost) no advantage without a drawback: it takes some time to set up (and maintain), but I believe it's worth it in the long run.
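The key property described in point 1 — update the segment once, and every view built on the suite changes — can be illustrated with a rough analogy in plain Python (this is not Adobe's API, just a model of the idea):

```python
# A virtual report suite modeled as a shared list of filter segments;
# every report built on it picks up changes to that list automatically.

bot_segments = [lambda hit: hit["user_agent"] != "OldBot/1.0"]

def virtual_view(hits):
    """Return only hits that pass every segment currently in bot_segments."""
    return [h for h in hits if all(seg(h) for seg in bot_segments)]

hits = [
    {"user_agent": "Mozilla/5.0", "ip": "198.51.100.2"},
    {"user_agent": "OldBot/1.0",  "ip": "203.0.113.7"},
    {"user_agent": "ScanBot/2.1", "ip": "203.0.113.8"},
]

print(len(virtual_view(hits)))  # 2: OldBot is filtered out

# Adding one more segment updates every view built on the suite at once.
bot_segments.append(lambda hit: not hit["user_agent"].startswith("ScanBot"))
print(len(virtual_view(hits)))  # 1: ScanBot is now filtered out as well
```

Every consumer of `virtual_view` sees the new filter immediately, which is exactly the maintenance advantage of centralizing the bot segments in the VRS rather than in each project.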

stefanies325986
Level 6

02-08-2019

Hi,

In the meantime I have found some ideas relating to the topic:

https://forums.adobe.com/ideas/10104

https://forums.adobe.com/ideas/9553

https://forums.adobe.com/message/11111848#11111848

https://theblog.adobe.com/advance-bot-filtering-powers/

I noticed that some people create a virtual report suite. I'm just wondering whether this is practical or has disadvantages. It would therefore be interesting to hear how much people generally filter out and what approach they take. Our goal is to have a clear view of the conversion rate trend.

Our filter has grown historically: it now contains about 20 old browsers, 10 user agents, 9 uncommon monitor resolutions and 7 original referring domains. Most of them cause a relatively small amount of traffic, i.e. around 2,000 visits per month. IPs might be added to the segment in the future.
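The criteria listed above (old browsers, user agents, resolutions, referring domains) amount to a blocklist check. A minimal sketch of what that segment expresses, with purely hypothetical blocklist entries and field names:

```python
# Illustrative blocklists only; real entries would come from the segment.
BAD_BROWSERS = {"MSIE 6.0", "MSIE 7.0"}
BAD_USER_AGENTS = {"OldBot/1.0"}
BAD_RESOLUTIONS = {"0x0", "1x1"}
BAD_REFERRERS = {"spam-referrer.example"}

def is_bad_traffic(hit):
    """Return True when any blocklist rule matches the hit."""
    return (hit.get("browser") in BAD_BROWSERS
            or hit.get("user_agent") in BAD_USER_AGENTS
            or hit.get("resolution") in BAD_RESOLUTIONS
            or hit.get("referrer") in BAD_REFERRERS)

print(is_bad_traffic({"user_agent": "OldBot/1.0"}))   # True
print(is_bad_traffic({"browser": "Chrome", "resolution": "1920x1080"}))  # False
```

Keeping the lists in one place like this is the same maintenance concern as the growing segment: each new bot adds an entry, which is why centralizing the rules (e.g. in a VRS segment) pays off.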

@Andrey: the IT testing system is already filtered out, but thanks for your suggestion, because we haven't checked networks, namespaces etc. 🙂