
SOLVED

Splitting a workflow to avoid the Wait activity


Level 4

Hi Team,

We have several hundred workflows running in parallel every day through a scheduler activity that is triggered daily.
Each of these workflows has a Wait activity in the middle, like below:

 

[Screenshot: Ramaswami_0-1590004706451.png (workflow with the Wait activity)]

The wait period is around 2 days, and because every workflow has a Wait activity that keeps holding its execution, the number of concurrent workflows exceeds the specified limit.

So our idea is to split the workflow into two:
1. The first workflow runs only up to email activity 1 and then exports the recipient results to an S3 bucket.
2. The second workflow picks up the recipient results from the S3 bucket after two days and processes the next step (sending the next email).

My problem: I have exported the results in CSV format to the S3 bucket successfully, so part 1 is done.
Part 2: I have downloaded the recipient file from the S3 bucket, but when I load the data and send the results to a Continuous delivery activity, it throws the error below.

Part 1: extracting the results, sending the email, and storing the result file in the S3 bucket; this went successfully:

[Screenshot: Ramaswami_1-1590004783097.png (part 1 workflow)]

 

Part 2:

[Screenshot: Ramaswami_2-1590004856076.png (part 2 workflow)]

 

I am downloading the file with the columns used as additional data in part 1 and uploading the data in part 2. I am using an Update data activity to link the targeting columns with the recipient table and then sending an email. It should send only 4 emails, but it keeps sending me emails continuously: 20, 21, 22 and so on.

 

If I don't use the Update data activity, my data loading result looks like this:

[Screenshot: Ramaswami_3-1590005071386.png (data loading result)]

and when I feed this to the Continuous delivery activity, I get this error:

[Screenshot: Ramaswami_4-1590005406978.png (delivery error)]

Let me know if there is any ready-made solution or any alternative for this.

Hi @Milan_Vucetic,

I have followed your second suggestion and kept an intersection with the data loading query:

 

[Screenshot: Ramaswami_0-1590162127309.png (intersection with the data loading query)]

 

I am getting the error below:

[Screenshot: Ramaswami_1-1590162162929.png (intersection error)]

 

Am I making a mistake anywhere? I guess this intersection takes some time, since the intersection query has to run over all the records first and then intersect with the temp table.


9 Replies


Correct answer by
Community Advisor

Hello @Ramaswami,

I have skimmed over the question, and it seems you do not have the targeting dimension (the recipient table linked to your working table); you cannot send an email from a file this way.

The error is also self-explanatory: your targeting data are not of type nms:recipient, which is what the delivery expects (as set by the target mapping).

 

Note:

You will always need to reconcile the file data with an Adobe Campaign table, e.g. recipient, if you want to use unions, intersections, change dimensions, etc.

 

Note 2:

Why not simply query the delivery log table in the second workflow? The query could look like:

  • everybody who got email 1 two days ago
    • DateOnly(@logDate) == DateOnly(DaysAgo(2))
  • in case of no results, create a Test activity that checks vars.recCount > 0
  • put it on a scheduler to run daily (a sketch of this query follows below).
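
For anyone who wants the same lookup spelled out, here is a minimal JS-activity sketch of that broadlog query. It assumes the standard nms:broadLogRcp schema; 'DM_EMAIL_1' is a hypothetical internal name for email 1, and the @status code for a sent message should be verified against your instance's enumeration:

  // Sketch only: select everybody whose email 1 was sent exactly two days ago.
  var query = xtk.queryDef.create(
    <queryDef schema="nms:broadLogRcp" operation="select">
      <select>
        <node expr="[@recipient-id]"/>
      </select>
      <where>
        <condition expr="[delivery/@internalName] = 'DM_EMAIL_1'" bool-operator="AND"/>
        <condition expr="@status = 1" bool-operator="AND"/>
        <condition expr="DateOnly(@logDate) = DateOnly(DaysAgo(2))"/>
      </where>
    </queryDef>);
  var logs = query.ExecuteQuery();
  // logs is an E4X document; iterate over the matching broadlog rows.
  for each (var row in logs.broadLogRcp)
    logInfo("recipient id: " + row.@["recipient-id"]);

In the workflow itself the same filter can be built in a plain Query activity's expression editor; the JS form is shown only to make the conditions explicit.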

 

Note 3:

A control group can easily be set at the delivery template level, and it is also easily accessible by a query afterwards from the exclusion logs.

 

Marcel


Level 4
Hi @MarceL, the idea is pretty good. But can you please elaborate? Every day the workflow selects some respondents and sends them email 1:

Day 1: query (some respondents) -> email 1 -> (some respondents) -> JS (I capture the date here) -> (so what comes here?)
Day 2: again the same thing: query (some respondents) -> email 1 -> (some respondents) -> JS

Can you suggest how I can store the respondents from the first run and send them email 2? A bit more detail would be helpful.


Community Advisor
You do not have to, because email 1 is stored in the delivery logs. You only need to query the delivery logs where the send date was 2 days ago and the message has been delivered. And if the query returns zero recipients, just create a zero-population check with a Test activity. There is no need to use any JavaScript or to save dates; it's simple: if email 1 was sent 2 days ago, send the second email, that's it.
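
For concreteness, the zero-population check described here is just a condition on the Test activity; vars.recCount is populated by the preceding Query activity. A minimal sketch of the condition on the "true" transition:

  // Continue to email 2 only when the broadlog query returned at least one recipient.
  vars.recCount > 0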


Level 4
Hi @MarceL, that's actually a fantastic idea: checking the broadlog for respondents from two days ago and sending them email 2. Thanks for this idea. One minor defect I think will occur: for example, on the 26th the query gave me 10 respondents and sent email 1 (so email 1's last-modified date is the 26th), and on the 27th the query again selected some of the same respondents, so their email 1 last-modified date changes to the 27th. This means that if the query returns the same respondents on consecutive days, this approach will not give the correct result. Is there any solution for this?


Community Advisor
Hello @Ramaswami, did you strip the time from the broadlog date (check the broadlog for when the message was sent, not when the delivery was updated or created)? Also, as a fallback, you can add a Split to remove everybody who already got email 2.
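
As a sketch of that fallback in code (the Split activity is the simpler route in the workflow itself), the "already got email 2" check can be expressed as a count on the broadlog. 'DM_EMAIL_2' is a hypothetical internal name and recipientId an example value:

  // Fallback check (sketch): has this recipient already received email 2?
  var recipientId = 12345; // example primary key of the recipient being checked
  var res = xtk.queryDef.create(
    <queryDef schema="nms:broadLogRcp" operation="count">
      <where>
        <condition expr={"[@recipient-id] = " + recipientId} bool-operator="AND"/>
        <condition expr="[delivery/@internalName] = 'DM_EMAIL_2'"/>
      </where>
    </queryDef>).ExecuteQuery();
  // operation="count" returns an element carrying a @count attribute.
  if (Number(res.@count) > 0)
    logInfo("recipient " + recipientId + " already received email 2; exclude from target");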


Community Advisor

Hi @Ramaswami ,

 

Based on your preferences, you can:

  • use the Change dimension activity to switch to the preferred targeting dimension (from temp:fileImport to nms:recipient)
  • use the Intersection activity to intersect the downloaded file with the nms:recipient table before the delivery
  • use an external file target mapping, which allows you to send email to customers who do not exist in the nms:recipient table (in your case they exist, but they are not reconciled)

Take care with data types to avoid errors like "Not a valid integer"; they will break your workflows often (a small validation sketch follows below).
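
As an illustration of that data-type trap (a generic sketch, not a specific Campaign API beyond the logging call): a numeric column in the file containing an empty or non-numeric cell is typically what triggers "Not a valid integer", so a quick validation before loading can save a broken run:

  // Hypothetical pre-check for a CSV cell that is supposed to be an integer.
  var value = " 42 "; // example cell content read from the file
  if (!/^\s*-?\d+\s*$/.test(value))
    logWarning("Not a valid integer: '" + value + "'");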

 

Regards,

Milan

 


Level 4
Hi @Milan, thanks for the reply. I have tried your second suggestion; you can see my observations in my original question (at the bottom). Also, can you elaborate on option 1, using the Change dimension activity to switch to the preferred targeting dimension (from temp:fileImport to nms:recipient)? I guess option 1 would be simpler.


Community Advisor
You cannot do an intersection on a table without a targeting dimension (i.e. recipient); you need to reconcile with the recipient table first. Change dimension won't help either.