SOLVED

Bot Filtering using User Agent


Level 2

Hello Community, 

 

While analyzing site traffic at our company, I noticed that traffic from some user agents has a bounce rate of 100%, and that Unique Visitors, Visits, and Page Views are all the same number. Can I conclude that this is bot traffic based on these metrics, or is there anything else I should look at before concluding that these user agents are bots?

[Screenshot: vallus_0-1668187465812.png]
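(As a quick programmatic check, something like this pandas sketch can flag the pattern; the CSV layout and column names, and a bounce rate stored as a 0-1 fraction, are assumptions:)

```python
import pandas as pd

# Hypothetical CSV export of the user agent report; column names and a
# bounce rate expressed as a 0-1 fraction are assumptions.
df = pd.read_csv("user_agent_report.csv")

# The bot-like signature described above: 100% bounce rate with identical
# Unique Visitors, Visits, and Page Views.
suspicious = df[
    (df["bounce_rate"] == 1.0)
    & (df["unique_visitors"] == df["visits"])
    & (df["visits"] == df["page_views"])
]

print(suspicious[["user_agent", "unique_visitors", "visits", "page_views"]])
```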

 

 


6 Replies


Community Advisor

Those certainly look suspicious... given values like that, I imagine they would be candidates for exclusion on our end...

I would also look at the city the traffic is coming from... unless you are an international company, if the location is somewhere strange that you don't expect traffic from, it's probably safe to add an exclusion rule... we do that frequently, and if we lose a little real traffic as a result, we accept the risk.
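For example, from a hit-level Data Warehouse export you could eyeball the geo distribution with a short pandas sketch (the file name and column name are assumptions):

```python
import pandas as pd

# Hypothetical hit-level Data Warehouse export; the geo column name is an assumption.
df = pd.read_csv("dw_export.csv")

# Hits per city, largest first; unexpected locations near the top of the
# list are candidates for an exclusion rule.
print(df["geo_city"].value_counts().head(20))
```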


Employee Advisor

@vallus It does look like bot traffic, but it is better to check the IP addresses of these hits; you can check whether any of them belong to your internal IP ranges.

 

You can get IP addresses through Data Warehouse or Data Feeds. Check what these IP addresses are, and if you think they are suspicious, you can add them to bot filters or IP exclusion rules.
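As a quick way to test IPs from such an export against your internal ranges, here is a minimal sketch using Python's standard ipaddress module (the CIDR ranges and sample IPs are placeholders):

```python
import ipaddress

# Placeholder internal corporate ranges; replace with your own CIDRs.
INTERNAL_RANGES = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def is_internal(ip_string):
    """Return True if the address falls inside one of the internal ranges."""
    ip = ipaddress.ip_address(ip_string)
    return any(ip in net for net in INTERNAL_RANGES)

# Example: classify IPs pulled from a Data Warehouse export or data feed.
for ip in ["192.168.1.50", "203.0.113.7"]:
    print(ip, "internal" if is_internal(ip) else "external")
```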

Below are the docs on IP exclusion and bot filtering in Adobe Analytics:

IP Exclusion:

https://experienceleague.adobe.com/docs/analytics/admin/admin-tools/exclude-ip.html?lang=en

 

Bot Filters:

https://experienceleague.adobe.com/docs/analytics/admin/admin-tools/bot-removal/bot-rules.html?lang=...

 

Hope this helps

 

 


Level 2

@VaniBhemarasetty @Jennifer_Dungan Thank you both very much for your great solutions! Much appreciated!

 

Sorry if this is a bit of a basic question: how do we export IPs from Data Warehouse? Would you mind sharing any documentation on how to do it, please?

 


Correct answer by
Community Advisor

IP addresses (unless you've obfuscated them) are available in raw data and Data Warehouse exports; they are not available in Workspace, Reporting, etc.

 

When you build a Data Warehouse export, IP Address is a standard dimension available to you:

 

[Screenshot: Jennifer_Dungan_0-1668443648711.png]
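Once the export lands, a short sketch like this (CSV layout and column names are assumptions) can show whether a flagged user agent sits behind one stable IP or many, which matters for choosing between IP exclusion and a bot rule:

```python
import pandas as pd

# Hypothetical Data Warehouse export; column names are assumptions.
df = pd.read_csv("dw_export.csv")

# Count distinct IPs behind each user agent. An agent hitting from one
# stable IP is easy to exclude by IP; one rotating across many IPs is
# usually better handled with a user-agent-based bot rule.
ip_spread = df.groupby("user_agent")["ip_address"].nunique().sort_values()
print(ip_spread.head(15))
```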

 

 

You can then create custom exclusion rules (in this case, I would use Custom Bot rules, so that you can at least monitor ongoing traffic via the bot reports)

 

[Screenshot: Jennifer_Dungan_1-1668443745222.png]

[Screenshot: Jennifer_Dungan_2-1668443802508.png]

[Screenshot: Jennifer_Dungan_3-1668443999681.png]

While you can exclude by user agent, and that works well if it's a very distinctive user agent (for instance, we use the Screaming Frog SEO tool to check our site, and we could exclude it by user agent if we wanted to), it is probably safer to exclude by IP address, so long as the identified traffic keeps coming from the same IP. You do have to watch out, though... a lot of bots funnel through cloud services like AWS, so their IPs can change.
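One way to check for that last case: AWS publishes its current IP ranges as JSON, so you can test whether a suspicious IP falls inside them (a minimal sketch; the sample IP is just an illustration):

```python
import ipaddress
import json
import urllib.request

# AWS publishes its current IP ranges at this URL.
AWS_RANGES_URL = "https://ip-ranges.amazonaws.com/ip-ranges.json"

with urllib.request.urlopen(AWS_RANGES_URL) as resp:
    prefixes = json.load(resp)["prefixes"]  # IPv4 prefixes

aws_networks = [ipaddress.ip_network(p["ip_prefix"]) for p in prefixes]

def is_aws(ip_string):
    """Return True if the IPv4 address falls inside a published AWS range."""
    ip = ipaddress.ip_address(ip_string)
    return any(ip in net for net in aws_networks)

# Traffic from AWS ranges is a hint that an IP-based exclusion may not
# stay accurate for long.
print(is_aws("203.0.113.7"))  # sample IP, for illustration only
```

If the flagged IPs land inside cloud ranges like these, a custom bot rule keyed on the user agent is usually the more durable exclusion.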