Hi - I'm hoping to gather details on a scenario where an audience with Batch evaluation apparently streams audience data into a Streaming destination.
More specifically, even though Batch evaluation for the audience itself only occurs every 24 hours, the dataflows for the streaming destination (e.g. Facebook Custom Audiences) continue to process records on an hourly basis.
Can anyone provide more information on how this is possible?
Thanks in advance.
Hi @AdamKi1
I believe you might be thinking of this incorrectly.
Audiences that are batch evaluated are fed by datasets sourced from batch-ingested data (think Amazon S3).
Audiences that are streaming evaluated are fed by datasets sourced from near-real-time data (think Amazon Kinesis, or the streaming HTTP API -- see the sketch below).
You can still export audiences that evaluate via the global evaluation to streaming/edge destinations, but there are guardrails behind that:
Default guardrails for data activation | Adobe Experience Platform
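To make the streaming side concrete, here's roughly what pushing a single record into a streaming-fed dataset looks like via the HTTP API. This is a minimal sketch, not production code; the inlet, schema, dataset, and org IDs are placeholders you'd swap for your own.

```python
# Stream one record into an AEP dataset through a streaming HTTP API inlet.
# All IDs below are placeholders -- substitute values from your own sandbox.
import requests

INLET_ID = "YOUR_DATA_INLET_ID"  # from your HTTP API source connection
ENDPOINT = f"https://dcs.adobedc.net/collection/{INLET_ID}?syncValidation=true"

SCHEMA_REF = {
    "id": "https://ns.adobe.com/YOUR_TENANT/schemas/YOUR_SCHEMA_ID",
    "contentType": "application/vnd.adobe.xed-full+json;version=1",
}

payload = {
    "header": {
        "schemaRef": SCHEMA_REF,
        "imsOrgId": "YOUR_IMS_ORG@AdobeOrg",
        "datasetId": "YOUR_DATASET_ID",
        "source": {"name": "streaming-poc"},
    },
    "body": {
        "xdmMeta": {"schemaRef": SCHEMA_REF},
        "xdmEntity": {
            "personalEmail": {"address": "test@example.com"},
        },
    },
}

resp = requests.post(ENDPOINT, json=payload, timeout=10)
resp.raise_for_status()
print(resp.json())  # the inlet acknowledges the record synchronously
```

Records ingested this way typically reach the Real-Time Customer Profile within minutes, which is what makes streaming activation possible in the first place.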
Thanks for your response, David!
This is sort of an odd case: the dataset that's driving the audience is fed via streaming (HTTP API), and the audience segmentation is defined on Adobe Analytics data, yet the only option for evaluating the audience itself is Batch(?) -- one way to verify that is sketched at the end of this post. It's also entirely possible that a single batch source is feeding into the same dataset...
However, that might explain what I'm seeing here -- even though the audience itself is on a Batch evaluation, the primary source dataset is fed via streaming, so the resulting data still 'streams' to the streaming destination.
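If anyone wants to verify which evaluation methods an audience actually supports, the segment definition exposes an evaluationInfo block you can pull from the Segmentation Service API. A minimal sketch, assuming a valid Platform access token; the segment ID and credential headers are placeholders:

```python
# Fetch a segment definition and print its supported evaluation methods.
# All credentials and IDs below are placeholders.
import requests

SEGMENT_ID = "YOUR_SEGMENT_DEFINITION_ID"
headers = {
    "Authorization": "Bearer YOUR_ACCESS_TOKEN",
    "x-api-key": "YOUR_API_KEY",
    "x-gw-ims-org-id": "YOUR_IMS_ORG@AdobeOrg",
    "x-sandbox-name": "prod",
}

resp = requests.get(
    f"https://platform.adobe.io/data/core/ups/segment/definitions/{SEGMENT_ID}",
    headers=headers,
    timeout=10,
)
resp.raise_for_status()

# evaluationInfo shows which methods apply, e.g.
# {"batch": {"enabled": true}, "continuous": {"enabled": false}, ...}
print(resp.json().get("evaluationInfo"))
```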
"However, that might explain what I'm seeing here -- even though the audience itself is on a Batch evaluation, the primary source dataset is fed via streaming, so the resulting data still 'streams' to the streaming destination."
The above statement makes sense to me. If the ingested (streaming) data is making its way into RT-CDP in less than 15 minutes, the streaming records will carry the freshest data to the streaming destination.
I have not explored this myself. If publishing audiences to a streaming destination reads from the Real-Time Customer Profile, it will carry the refreshed data; but if it is sourced from the profile snapshot (data lake), it will be delayed until the snapshot has completed successfully.
You could do a quick POC on this -- see the sketch below -- and let us know what the platform behavior is.
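For example, one half of that POC could be measuring ingestion-to-profile latency: stream a test record (as in the earlier ingestion sketch) and poll the Profile Access API until it shows up. A rough sketch, with placeholder credentials, assuming the test record carries an email identity:

```python
# Poll the Profile Access API to see how quickly a streamed test record
# becomes visible in the Real-Time Customer Profile. Placeholders throughout.
import time
import requests

headers = {
    "Authorization": "Bearer YOUR_ACCESS_TOKEN",
    "x-api-key": "YOUR_API_KEY",
    "x-gw-ims-org-id": "YOUR_IMS_ORG@AdobeOrg",
    "x-sandbox-name": "prod",
}
params = {
    "schema.name": "_xdm.context.profile",
    "entityId": "test@example.com",  # identity of the streamed test record
    "entityIdNS": "email",
}

start = time.time()
while time.time() - start < 900:  # give up after 15 minutes
    resp = requests.get(
        "https://platform.adobe.io/data/core/ups/access/entities",
        headers=headers,
        params=params,
        timeout=10,
    )
    if resp.status_code == 200 and resp.json():
        print(f"Profile visible after {time.time() - start:.0f}s")
        break
    time.sleep(30)  # not found yet; try again shortly
else:
    print("Record not visible in profile within 15 minutes")
```

The second half -- when the record actually reaches the destination -- you'd have to observe on the destination side (e.g. in the Facebook audience itself).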
@AdamKi1 Did you find the suggestion helpful? Please let us know if you require more information. Otherwise, please mark the answer as correct for posterity. If you've discovered a solution yourself, we would appreciate it if you could share it with the community. Thank you!