Let's face it - bots are getting smarter, and Bot Filtering needs to get smarter too. Bots are getting better at rotating user agents, IP addresses, domains and more between visits. They now even add delays between page views to look more like normal visitors. And don't forget, they're running JavaScript too.
Adobe needs to expand Bot Filtering to use look-alike modelling to discover visits with abnormally high page views.
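To make the ask concrete, here's a minimal sketch of the kind of detection I mean: flag visits whose page-view counts sit far above the typical visit. The sample data, the median/MAD method, and the cutoff are all illustrative assumptions on my part, not anything Adobe actually does today.

```python
import statistics

# Hypothetical visit-level data: visit ID -> page views in that visit.
visits = {
    "v1": 4, "v2": 7, "v3": 5, "v4": 6, "v5": 310,  # v5 looks bot-like
    "v6": 3, "v7": 8, "v8": 5, "v9": 6, "v10": 4,
}

# Flag visits whose page-view count sits far above the typical visit,
# using a simple median + MAD outlier rule (threshold is illustrative).
counts = list(visits.values())
median = statistics.median(counts)
mad = statistics.median(abs(c - median) for c in counts) or 1

suspects = {
    vid: pv for vid, pv in visits.items()
    if (pv - median) / mad > 10  # "abnormally high" cutoff, tune to taste
}
print(suspects)  # {'v5': 310}
```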
Similar to this Idea: http://ideas.omniture.com/t5/Idea-Exchange-for-Adobe/IAB-Bot-List-Segment-in-Analytics/idi-p/11708
we need a way to apply 'Bot Segments' to reports so we can see exactly how our traffic numbers would change before committing them as filtering rules.
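Building on the sketch above, previewing a candidate bot rule could be as simple as reporting headline metrics with and without the flagged visits. Again, the report() helper and the numbers are hypothetical - the point is the before/after comparison we'd want to see in the product.

```python
# Preview what a candidate bot rule would do to headline metrics
# before committing it, using the suspect visits flagged above.
def report(visits, excluded=frozenset()):
    kept = {vid: pv for vid, pv in visits.items() if vid not in excluded}
    return {"visits": len(kept), "page_views": sum(kept.values())}

before = report(visits)
after = report(visits, excluded=set(suspects))
print(before)  # {'visits': 10, 'page_views': 358}
print(after)   # {'visits': 9, 'page_views': 48}
```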
Right now the industry is losing too much data accuracy to rely solely on a black box of IP- and user-agent-based filtering.