Hi,
I'm working on incremental dataset exports (150 days historical + daily incremental) to Destinations.
Quick points I want clarity on:
1. Incremental export file names: Is it possible to customize filenames with timestamps? How are filenames changed or updated between incremental runs?
2. Suppose we've already configured a destination—if the dataset schema later changes (e.g., 10 new fields are added), how does this affect exports? Will existing dataflows break or adapt automatically?
3. Error handling: What is Adobe's approach when the destination or source system is unavailable or experiences failures? If a dataset export fails due to an Adobe runtime failure, will Adobe retry the failed exports?
4. Sample Export: How can we export one day’s data sample (via Query Service, API, or filtered Dataflow) to analyze the file size, nested structure, and format before the full export?
Would appreciate insights or recommendations!
Hi @JathiRatnalu - Please find the responses below for the mentioned queries.
1. File name - Per the documentation, the default file name is randomly generated to ensure that exported file names are unique. Also, customizing the file name for dataset exports is not currently supported.
2. Update on the schema - Honestly, I haven't come across this scenario. Logically, the incremental load should incorporate all the newly added fields along with the ingested data.
3. Error handling - Retrying the file export happens automatically whenever a system failure occurs. However, you should periodically monitor the export dataflow and act accordingly.
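As a rough sketch of what "periodically monitor the dataflow" could look like programmatically, the snippet below assembles a Flow Service API request for listing a dataflow's run statuses. The endpoint path, header names, and all placeholder values are assumptions based on my reading of the Flow Service API, so verify them against the official reference before use.

```python
# Sketch (assumption-laden): poll dataflow run statuses via the
# Flow Service API so failed export runs can be spotted and acted on.
# The endpoint, header names, and placeholder values below should be
# verified against the official Flow Service API reference.

FLOW_SERVICE_BASE = "https://platform.adobe.io/data/foundation/flowservice"

def build_runs_request(flow_id: str, access_token: str,
                       api_key: str, org_id: str, sandbox: str):
    """Return (url, headers) for listing runs of a single dataflow."""
    url = f"{FLOW_SERVICE_BASE}/runs?property=flowId=={flow_id}"
    headers = {
        "Authorization": f"Bearer {access_token}",
        "x-api-key": api_key,
        "x-gw-ims-org-id": org_id,
        "x-sandbox-name": sandbox,
    }
    return url, headers

url, headers = build_runs_request(
    flow_id="<your-flow-id>",
    access_token="<token>",
    api_key="<client-id>",
    org_id="<org-id>",
    sandbox="prod",
)
# A real monitor would GET this URL on a schedule and alert when a
# run's status indicates failure.
print(url)
```

A scheduled job (cron, or an orchestrator you already run) calling this endpoint is usually enough to catch runs that exhausted their automatic retries.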
4. Sample export - Run a query for a sample of the data and write the result to a new dataset using the schedule option. Then export that dataset to the destination for your analysis.
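To make the sample-export step concrete, a CTAS (CREATE TABLE AS SELECT) query in Query Service can copy a single day of data into a new dataset. Below is a minimal Python sketch that assembles such a statement; the table name `your_dataset`, the `timestamp` column, and the exact interval syntax are assumptions to check against the Query Service SQL reference.

```python
# Sketch: assemble a CTAS statement that copies one day's records into
# a new dataset, so file size and nested structure can be inspected
# before running the full export.
# "your_dataset" and the "timestamp" column are placeholders.

def build_sample_ctas(source_table: str, sample_table: str, day: str) -> str:
    """Return a CTAS statement selecting a single day's records."""
    return (
        f"CREATE TABLE {sample_table} AS "
        f"SELECT * FROM {source_table} "
        f"WHERE timestamp >= TIMESTAMP '{day} 00:00:00' "
        f"AND timestamp < TIMESTAMP '{day} 00:00:00' + INTERVAL '1' DAY"
    )

sql = build_sample_ctas("your_dataset", "your_dataset_sample", "2024-06-01")
print(sql)
```

Once the sample dataset exists, configure it as the source of a dataset export flow and inspect the resulting file in the destination.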
Check the document for your reference - https://experienceleague.adobe.com/en/docs/experience-platform/destinations/ui/activate/export-datas...
Thank you,
Jayakrishnaa P.
Hi, @JathiRatnalu,
I will only add some info, as @jayakrishnaaparthasarathy summarised it perfectly.
2. Yes, it automatically adds the 10 new fields from your example.
3. I think it retries up to 3 times.
4. Create a new dataset. Use Query Service to copy the one-day sample data into that new dataset. Export the newly created dataset. Analyse the info.