In a large organisation with a secondary report suite, the low traffic bucket has far-reaching implications for data integrity. For example, a primary report suite may never hit the low traffic limit and so can accurately report on items such as URLs and page names. However, a secondary report suite receives many more unique values from the other primary suites feeding into it, and we commonly find that items end up bundled in the low traffic bucket there.
This creates significant reporting discrepancies between the two report suites. Depending on the type of query and which report suite an analyst is looking at, they can get very different results, or simply find data missing, when comparing the two.
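To make the discrepancy concrete, here is a minimal sketch of how unique-value bucketing produces different results for the same pages. The limit of 5 is purely illustrative (real report suites allow hundreds of thousands of monthly uniques), and the simulated behaviour is an assumption: once the unique-value cap is reached, any new value is rolled into a single "Low Traffic" line item.

```python
from collections import Counter

LOW_TRAFFIC_LIMIT = 5  # illustrative only; real limits are far higher

def report(hits, limit=LOW_TRAFFIC_LIMIT):
    """Count hits per value, collapsing any value first seen after the
    unique-value limit is reached into a single 'Low Traffic' bucket."""
    seen = set()
    counts = Counter()
    for value in hits:
        if value in seen or len(seen) < limit:
            seen.add(value)
            counts[value] += 1
        else:
            counts["Low Traffic"] += 1
    return counts

# A primary suite sees 5 unique pages: under the limit, fully reported.
primary = ["/home", "/a", "/b", "/c", "/d"] * 3
# A secondary suite also receives pages from other primary suites,
# pushing it past the limit.
secondary = primary + ["/x", "/y", "/z"] * 2

print(report(primary))    # every URL reported individually
print(report(secondary))  # /x, /y, /z vanish into 'Low Traffic'
```

The same pages `/x`, `/y`, `/z` are fully reportable in their own primary suite, but invisible in the secondary suite, which is exactly the cross-suite inconsistency described above.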
As you can imagine, consistency of data across primary and secondary report suites is key to establishing trust in the data and driving its use throughout the organisation.
I believe that for larger implementations the low traffic bucket undermines the entire implementation.
My suggestion to Adobe is to remove the low traffic bucket from all reports and products, and to report on the full data set regardless of size.