Hi, I have a question: if I ingest data using batch ingestion and, to avoid unwanted data and errors before profiling, I don't enable the dataset for Profile, and then enable it for Profile a few days after the data ingestion, how long does it take for the profiling of that data to complete? Or is ...
Does the customer have to buy an add-on in order to have Customer AI and look-alike audiences? Are these bought in packages, as explained here? https://helpx.adobe.com/legal/product-descriptions/real-time-customer-data-platform-b2c-edition-prime-and-ultimate-packages.html
We are migrating from AppMeasurement (using the Analytics Source Connector in RTCDP) to the Web SDK (Datastream), and part of that conversion is switching our RTCDP from relying on the Analytics Source Connector to Datastreams flowing directly into AEP. We want to cut over to the datastream to push into the same datase...
Hi there, as we all know, there is now a new ruleset for streaming segment evaluation. I have a few queries about it; can someone help me clarify? Query 1: What if I have 5 attributes in my streaming segment, and 2 of them are batch attributes while the other 3 are streaming attributes? Now...
Hi! We are trying to configure a new destination with a mapping in which one column, named "event", is set equal to a certain string such as "XYZ" in the destination export file. Is there any way we can update this value based on the audience, or would this require setting up a new destination for each audienc...
Hi, when I create a segment with only an event condition, it does not allow me to create it as a streaming segment. The condition for the segment is: event type name equals Itemurchased. What could be the reason for this?
Hi Adobe, we have a source connection to Azure Blob, and the dataflow initially failed with the error "failed to copy data from source." Since then, the dataflow has been running every hour. Although there are files present in the Azure Blob container, the dataflow runs without picking up those files, but the st...