I'm planning to send consent preferences via the Web SDK when an account is created on a website. I want to set preferences at the channel level, but on my RTCDP profile I'm seeing a path that is specific to the ECID. How can I set consent at the email level, rather than something that is dev...
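For reference, this is roughly how the preferences go out today. A sketch using the documented Web SDK setConsent command; the alloy instance name and the exact Adobe 2.0 consent payload are assumptions about my setup:

```js
// Sketch: sending consent on account creation via the Web SDK.
// Assumes the standalone "alloy" instance and the Adobe 2.0 consent standard.
alloy("setConsent", {
  consent: [
    {
      standard: "Adobe",
      version: "2.0",
      value: {
        // Overall collect preference; channel-level (e.g. email) granularity
        // is exactly what I'm trying to figure out.
        collect: { val: "y" },
      },
    },
  ],
}).catch((err) => console.error("setConsent failed:", err));
```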
There are CDP data ingestion guardrails that describe what size of data can be ingested versus dropped, but we can't find any document that clearly explains how to measure these during the solutioning or implementation phase. Adobe Support's answers are always about profile richness, which is dataLake size dev...
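One way to approximate this during solutioning, absent an Adobe-provided method, is to measure representative sample records yourself and compare them against the published guardrail numbers. A minimal sketch (the sample payloads are made up; take the actual limits from the guardrails documentation):

```js
// Sketch (Node.js): estimate per-record payload sizes during solutioning.
// Replace sampleRecords with realistic payloads from your source systems.
const sampleRecords = [
  { _id: "1", email: "a@example.com", purchases: [{ sku: "X1", qty: 2 }] },
  { _id: "2", email: "b@example.com", purchases: [] },
];

const bytes = (obj) => Buffer.byteLength(JSON.stringify(obj), "utf8");

const sizes = sampleRecords.map(bytes);
const avg = sizes.reduce((a, b) => a + b, 0) / sizes.length;

console.log(`avg record: ${avg.toFixed(0)} B, max record: ${Math.max(...sizes)} B`);
console.log(`~size of 1M records: ${((avg * 1e6) / 2 ** 20).toFixed(1)} MiB`);
```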
I see these options:
1. There are connectors like SFTP / BigQuery.
2. Is there any API that can send the data further? Which API?
3. Maybe an external tool can access the specific profile dataset and get this data? Which API?
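On options 2 and 3, the kind of call I had in mind is below. A sketch using the documented Real-Time Customer Profile Access entities endpoint (Node 18+; the credentials, sandbox name, and identity values are placeholders):

```js
// Sketch: look up a single profile via the Profile Access API.
const params = new URLSearchParams({
  "schema.name": "_xdm.context.profile",
  entityId: "janedoe@example.com", // hypothetical identity value
  entityIdNS: "email",             // identity namespace for that value
});

const res = await fetch(
  `https://platform.adobe.io/data/core/ups/access/entities?${params}`,
  {
    headers: {
      Authorization: `Bearer ${process.env.ACCESS_TOKEN}`,
      "x-api-key": process.env.API_KEY,
      "x-gw-ims-org-id": process.env.ORG_ID,
      "x-sandbox-name": "prod", // placeholder sandbox
    },
  }
);
console.log(await res.json());
```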
The requirement is to create a datastream to push the event data using AEP Event Forwarding services, but the website doesn't have any Launch container (neither dev nor prod) implemented on the page. @brian_au
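Note that a Launch container isn't strictly required to feed a datastream; the Web SDK also ships as a standalone library. A sketch, assuming alloy.js has already been loaded on the page via its CDN script tag (the datastream ID and org ID are placeholders):

```js
// Sketch: standalone Web SDK, no Launch property involved.
alloy("configure", {
  edgeConfigId: "ebebf826-a01f-4458-8cec-ef61de241c93", // hypothetical datastream ID
  orgId: "ADB3LETTERSANDNUMBERS@AdobeOrg",              // hypothetical IMS org ID
});

// Send a page-view event into the datastream (and on to Event Forwarding).
alloy("sendEvent", {
  xdm: {
    eventType: "web.webpagedetails.pageViews",
    web: { webPageDetails: { name: "home" } },
  },
});
```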
I created a package that includes a schema in Sandbox A. After resetting the sandbox (which wiped all resources, including the schema, but not the package), I re-imported the same package in Sandbox A and, surprisingly, the schema was re-created successfully. However, the package only references the sch...
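To check whether the re-created schema is literally the same resource or a new one that merely shares the name, I compared $id values before and after re-import via the Schema Registry API. A sketch (Node 18+; the meta:altId and credentials are placeholders):

```js
// Sketch: fetch a tenant schema and inspect its $id / version.
const altId = "_tenant.schemas.abc123"; // hypothetical meta:altId
const res = await fetch(
  `https://platform.adobe.io/data/foundation/schemaregistry/tenant/schemas/${encodeURIComponent(altId)}`,
  {
    headers: {
      Authorization: `Bearer ${process.env.ACCESS_TOKEN}`,
      "x-api-key": process.env.API_KEY,
      "x-gw-ims-org-id": process.env.ORG_ID,
      "x-sandbox-name": "sandbox-a", // placeholder
      Accept: "application/vnd.adobe.xed+json; version=1",
    },
  }
);
const schema = await res.json();
console.log(schema["$id"], schema.version);
```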
This may be a repetitive question, but I can't find the exact solution to this issue. I'm currently using the code event.value = util.printd("name mm/dd/yyyy", new Date); for a stamp that is meant to sign off my work documents, but the stamp isn't updating to the new date automatically. I've had to go...
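For comparison, the calculation script I would expect to work is below. A sketch only: "name" is not a util.printd date token (the embedded "m" may even be parsed as a month), so the format string should contain date tokens only, and the script still needs to sit in the custom calculation event of a field on a properly set-up dynamic stamp for it to refresh when the stamp is placed:

```js
// Acrobat JavaScript: custom calculation script on the stamp's text field.
// Prints the current date each time the field recalculates.
event.value = util.printd("mm/dd/yyyy", new Date());
```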
Hi everyone, in my current Adobe setup I have AEP, Event Forwarding, and Adobe Data Collection. My use case requires me to send data from the Web SDK through an Event Forwarding property to Google Analytics. For this particular use case, I need to generate a new Google client if it doesn't exist and use th...
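The rough shape of what I'm attempting, written as plain Node-style code rather than the exact Event Forwarding custom-code surface (the GA4 Measurement Protocol endpoint is documented; the measurement ID, API secret, incoming-event shape, and client-ID fallback format are placeholders or assumptions):

```js
// Sketch: forward an event to GA4, minting a client_id when none was captured.
const MEASUREMENT_ID = "G-XXXXXXXXXX"; // placeholder
const API_SECRET = "YOUR_API_SECRET";  // placeholder

// Stand-in for the event arriving from the Web SDK / datastream.
const incomingEvent = { gaClientId: null, url: "https://example.com/" };

// GA client IDs are conventionally "<random>.<unix timestamp>".
const clientId =
  incomingEvent.gaClientId ??
  `${Math.floor(Math.random() * 1e10)}.${Math.floor(Date.now() / 1000)}`;

await fetch(
  `https://www.google-analytics.com/mp/collect?measurement_id=${MEASUREMENT_ID}&api_secret=${API_SECRET}`,
  {
    method: "POST",
    body: JSON.stringify({
      client_id: clientId,
      events: [{ name: "page_view", params: { page_location: incomingEvent.url } }],
    }),
  }
);
```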
Hi Community, I have to replace a complete array with the new array values on every batch ingestion, but on every batch ingestion the new items get appended to this array in the target dataset instead of replacing it. I have used upsert_array_replace in a calculated field, and my dataset is profile and upsert enable...
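To make the symptom concrete, here is the difference between what I expect and what I actually observe, as illustrative JavaScript only (the field names are made up):

```js
// What the profile attribute holds before ingestion, and what the batch sends.
const existing = { favorites: ["red", "blue"] };
const incoming = { favorites: ["green"] };

// Expected with upsert_array_replace: the array is replaced wholesale.
const expected = { ...existing, favorites: incoming.favorites };
// => { favorites: ["green"] }

// Observed: the new items are appended to the existing array.
const observed = {
  ...existing,
  favorites: [...existing.favorites, ...incoming.favorites],
};
// => { favorites: ["red", "blue", "green"] }

console.log({ expected, observed });
```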
I have an existing destination set up in Adobe Experience Platform where audience data is being exported. I would like to update the existing field mapping (e.g., map a new schema attribute or modify the transformation logic) using the API, without recreating the destination or dataflow.
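The route I was considering is the read-then-PATCH pattern on the Flow Service API: fetch the dataflow to capture its etag, then send a JSON Patch against it. A sketch (Node 18+); the flow ID, mapping ID, and especially the transformation path are assumptions, so inspect the flow's actual JSON before patching:

```js
// Sketch: update a destination dataflow's mapping without recreating it.
const FLOW_ID = "your-flow-id"; // placeholder
const base = "https://platform.adobe.io/data/foundation/flowservice";
const headers = {
  Authorization: `Bearer ${process.env.ACCESS_TOKEN}`,
  "x-api-key": process.env.API_KEY,
  "x-gw-ims-org-id": process.env.ORG_ID,
  "x-sandbox-name": "prod",
  "Content-Type": "application/json",
};

// 1) Read the flow to get its current version (etag).
const flowRes = await fetch(`${base}/flows/${FLOW_ID}`, { headers });
const etag = flowRes.headers.get("etag");

// 2) PATCH the transformation that carries the mapping reference.
await fetch(`${base}/flows/${FLOW_ID}`, {
  method: "PATCH",
  headers: { ...headers, "If-Match": etag },
  body: JSON.stringify([
    {
      op: "replace",
      path: "/transformations/0/params/profileMapping", // assumed path
      value: { mappingId: "NEW_MAPPING_ID", mappingVersion: 0 },
    },
  ]),
});
```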
Is there any way to find out which records failed after a partial batch ingestion in the Adobe Experience Platform UI? I don't have access to the API. One record failed when I ingested 100K records, and I want to find out what the issue is with that failed record.
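For completeness, if API access ever does become available, the documented route is the failed-records endpoint of the Data Access API. A sketch (Node 18+; the batch ID and credentials are placeholders):

```js
// Sketch: list the failed records of a partially ingested batch.
const BATCH_ID = "your-batch-id"; // placeholder
const res = await fetch(
  `https://platform.adobe.io/data/foundation/export/batches/${BATCH_ID}/failed`,
  {
    headers: {
      Authorization: `Bearer ${process.env.ACCESS_TOKEN}`,
      "x-api-key": process.env.API_KEY,
      "x-gw-ims-org-id": process.env.ORG_ID,
      "x-sandbox-name": "prod", // placeholder
    },
  }
);
console.log(await res.json());
```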