Hello Community,
In this scenario we are trying to pull data from Adobe Cloud and save it in an AWS S3 bucket as a pipe-delimited file.
Hi Nitin,
You should use the scheduler available within the incremental query activity.
If this is not possible for some reason, you can instead use a scheduler plus JS to keep track of the last execution date, and a standard query activity that uses that execution date from JS in the workflow.
Regards,
Amit
Hi Amit, but can we determine where the workflow left off? How would we store and read that value without any coding? Is there any custom code we can refer to?
Hey Nitin,
Let me reiterate the problem.
1. Let's say the customer wants to extract data from 1st Jan until now (18th Jan). The workflow first ran on the 14th, so data up to the 14th was exported.
2. The next time the workflow ran, on the 15th, rather than exporting the data from 1–15 Jan again, you want to export only the data after 14 Jan (the last execution) but before 15 Jan, right?
Solution:
Create an option named "last run".
Once this is created, add a scheduler and write JS code to read the option:
1. Schedule the workflow to trigger every day.
2. JS code: read the option and store the last run date, along with getCurrentDate(), in two separate event variables.
3. Pass the last run date into the query activity, e.g.: last modified on or after $datetime(vars/@lastRepDate).
4. Once done, update the option with: setOption(vars.optionName, vars.nextRepDate); // added on the end activity's Advanced tab.
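The steps above can be sketched as a small JS activity. This is only a sketch: in Adobe Campaign, getOption() and setOption() are provided by the platform, so the stub implementations below exist only to make the snippet runnable outside Campaign; the option name "lastRun" and the variable names lastRepDate/nextRepDate are assumptions matching the steps above, and the date formatting uses plain JavaScript rather than Campaign's own helpers.

```javascript
// --- Stubs standing in for Adobe Campaign's platform API (sketch only) ---
var optionStore = { lastRun: "2018-01-14 00:00:00" };   // assumed option name
function getOption(name) { return optionStore[name]; }
function setOption(name, value) { optionStore[name] = value; }

var vars = {};  // Campaign's event-variable object

// Step 2: read the option and capture the last run date and the current
// date in two separate event variables.
vars.lastRepDate = getOption("lastRun");
vars.nextRepDate = new Date().toISOString().slice(0, 19).replace("T", " ");

// Step 3: the query activity then filters on something like:
//   last modified on or after $datetime(vars/@lastRepDate)

// Step 4: on the workflow's end activity (Advanced tab), persist the new
// watermark so the next run picks up from here.
setOption("lastRun", vars.nextRepDate);
```

Because the watermark is only advanced on the end activity, a failed run leaves the option untouched and the next run re-covers the same window instead of skipping data.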
wasim