Hi all,
I created a scheduled export in Data Warehouse using a segment as a filter.
It ran several times before I noticed an adjustment that had to be made to the segment I used as a filter.
To avoid recreating my DWH export, I made the modification directly in the segment and kept running the same DWH export.
My questions are:
- Will the updated segment be taken into account in the next runs of my export? I think so, but it would be great if you could confirm.
- Will it also work if I use the "Resend files" option for the previous runs in the DWH interface, so that the correct files from the past are sent now?
Or is the file just resent without any recalculation, or recalculated using the segment definition that was in scope at that time?
I tried it, and the segment modifications are not taken into account for resent files.
Do I maybe have to wait a certain amount of time for the segment update to be synchronized with the export?
Thanks for your help
Robin
The next time your export runs on its own, it will take the segment updates into account, unless you made a whole new segment, which I doubt.
I believe that resend will re-process the data, so it should take the segment updates into account... but if it processes and sends really quickly, or you don't want to take the chance, you can always duplicate the export to be safe.
Thanks for your answer. Duplicating is also a possibility, but I have to reprocess files from 1st June until 19th June as a daily export (I need one file per day for internal process reasons). I'm not sure I can schedule that in the past, and creating 19 duplicates isn't an ideal solution either.
I will check again tomorrow, by rerunning the past process, whether a sync is needed after the segment update.
I'm not sure the files are reprocessed. Based on the doc (https://experienceleague.adobe.com/docs/analytics/export/ftp-and-sftp/ftp-limits.html?lang=en), there is a retention period, but is it on the server side only? Not sure...
Ah.. daily reports for a long stretch... I wasn't aware of that....
Is your current report using any granularity? If not, I would suggest duplicating it once and pulling the entire range of dates that needs to be re-run by adding a daily granularity... this is how I re-run reports like this.
Depending on the process that reads them, it can either read the whole file and use the dates to break it up... or you can manually split the one file into 19 files and remove the extra date column (see the sketch just below).
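For the manual split, here is a minimal Python sketch of one way it could be done, assuming the export is a CSV with a "Date" column added by the daily granularity. The file names, column name, and delimiter below are placeholders for illustration, so adjust them to match your actual export:

```python
import csv
from collections import defaultdict

SOURCE_FILE = "dwh_export_june.csv"  # hypothetical name of the single export file

# Collect rows per day, keyed by the "Date" column that daily granularity adds,
# and drop that column from the output rows.
rows_by_date = defaultdict(list)
with open(SOURCE_FILE, newline="", encoding="utf-8") as f:
    reader = csv.DictReader(f)  # add delimiter="\t" if your export is tab-separated
    fieldnames = [c for c in reader.fieldnames if c != "Date"]
    for row in reader:
        day = row.pop("Date")
        rows_by_date[day].append(row)

# Write one file per day (e.g. dwh_export_<date>.csv) without the extra column,
# which is what a one-file-per-day internal process would expect.
for day, rows in rows_by_date.items():
    with open(f"dwh_export_{day}.csv", "w", newline="", encoding="utf-8") as out:
        writer = csv.DictWriter(out, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)
```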
The problem with "resend" is that it will use your rolling date range... so if it's set to "yesterday" when you resend it, I believe it will use "yesterday" each time.