Hi @akwankl, You can try creating another scheduled job only for the segment you want to run after the batch data is ingested. This way the segment is evaluated twice: once by the global evaluation job and once by your schedule. I am not 100% sure whether we can have more than one schedule. @Kum...
Hi @trojan_horse, One way to handle this is to model the matching data as lookup data in AEP. You ingest the code in the actual dataset, and the profile references the value in the lookup schema through a schema attribute (a relationship). Since you are going to mark the code as the primary identity in the lookup s...
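To make the lookup pattern above more concrete, here is a rough sketch of the two Schema Registry descriptors involved: one marking the code field as the primary identity of the lookup schema, and one relating the profile schema's attribute to it. All schema URLs, tenant namespaces, and field names (`_tenant/productCode`, `_tenant/code`, `CodeNamespace`) are illustrative placeholders, not from the original thread; check the Schema Registry API docs for the exact descriptor fields in your environment.

```json
[
  {
    "@type": "xdm:descriptorIdentity",
    "xdm:sourceSchema": "https://ns.adobe.com/tenant/schemas/lookupSchemaId",
    "xdm:sourceVersion": 1,
    "xdm:sourceProperty": "/_tenant/code",
    "xdm:namespace": "CodeNamespace",
    "xdm:isPrimary": true
  },
  {
    "@type": "xdm:descriptorOneToOne",
    "xdm:sourceSchema": "https://ns.adobe.com/tenant/schemas/profileSchemaId",
    "xdm:sourceVersion": 1,
    "xdm:sourceProperty": "/_tenant/productCode",
    "xdm:destinationSchema": "https://ns.adobe.com/tenant/schemas/lookupSchemaId",
    "xdm:destinationVersion": 1
  }
]
```

With the relationship in place, segmentation and the profile UI can resolve the code on the profile record against the matching row in the lookup dataset.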
Hi @MichaelBa4, I believe the reason for making compression mandatory is to avoid overuse of DLZ space. If everyone started exporting uncompressed datasets into the DLZ, it might soon run out of space. Regards, Vinod
Hi @TylerKrause, I believe you are using the "Dataset Export to cloud storage" option? Its limitation is that it cannot export data older than 365 days. I would recommend working around this in AEP by creating a new dataset without enabling it for profile. This will act as a temp dataset for ex...
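As a rough sketch of the copy step behind this workaround, assuming you use Query Service and hypothetical table names (`original_dataset`, `temp_export_dataset`), a CTAS query could stage the older rows into the temp dataset before you point the export at it; the exact date filter depends on your schema's timestamp field:

```sql
-- Hypothetical dataset names; temp_export_dataset is not profile-enabled.
-- Copy rows older than 365 days into the staging dataset for export.
CREATE TABLE temp_export_dataset AS
  SELECT *
  FROM original_dataset
  WHERE timestamp < date_sub(current_date(), 365);
```

Because the temp dataset's batches are newly created, the export sees them as recent data even though the underlying records are older.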
Hi @akwankl, You need to understand the segment evaluation process. Streaming evaluation happens only if the source data is streaming. If you have ingested data through batch, the segments are evaluated during the global segment evaluation time. Also, in your case the Daily Segment...
Hi @DavidSlaw1, Regardless of the segment evaluation type, every segment gets evaluated once a day during the segment batch job execution. Even if your segment has qualified for streaming or edge evaluation, it is evaluated in that category only if the source data qualifies as streaming or ...