Hi,
I was reviewing the documentation (as seen in the image below) and came across a point that raised a question for me. According to the document, with a 5-second gap between events, the system would process about 17,280 events in a day (12 events per minute × 60 minutes × 24 hours). Given this volume, would that still be considered large-scale or big data by your platform’s standards?
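For reference, here is the back-of-the-envelope calculation behind that number (a minimal sketch; the 5-second gap is the figure from the doc):

```python
# Rough daily-volume estimate based on the 5-second gap mentioned in the doc.
GAP_SECONDS = 5                                 # minimum gap between events (from the doc)

events_per_minute = 60 // GAP_SECONDS           # 12 events per minute
events_per_day = events_per_minute * 60 * 24    # 12 * 60 * 24 = 17,280 events per day

print(f"{events_per_day:,} events per day")     # -> 17,280 events per day
```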
Thanks,
Hi @HảoHo,
can you please add a little more context, e.g. which document you are referring to?
Hi @bjoern__koth,
I am referring to the FAQ in the Segmentation Service Guide.
I haven’t had a chance to try Adobe Experience Platform yet; I’ve only read the documentation to evaluate whether it meets our requirements. I’m looking to achieve streaming segmentation that can handle a large event volume, such as 10,000 events per second. However, based on the documentation, it seems there must be a 5-second gap between events to avoid them being flagged as bot-generated. I may have misunderstood this, so could you confirm this and provide more details? Thanks!
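To show how our target load compares with the example from the documentation, here is a rough sketch (assuming a sustained 10,000 events per second, purely for illustration):

```python
# Compare the doc's example rate (one event every 5 seconds) with our
# hypothetical target ingestion rate, both expressed per day.
DOC_EXAMPLE_EVENTS_PER_DAY = 12 * 60 * 24         # 17,280 events per day (from the doc example)
TARGET_EVENTS_PER_SECOND = 10_000                 # our requirement (assumption)

target_events_per_day = TARGET_EVENTS_PER_SECOND * 60 * 60 * 24   # 864,000,000 per day

print(f"doc example: {DOC_EXAMPLE_EVENTS_PER_DAY:,} events/day")
print(f"our target:  {target_events_per_day:,} events/day")
```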