Execute scenario in parallel

Level 2

Hello!

Is it possible to execute bundles in parallel? I have a scenario with a long-lasting step, and it would be nice if it could run in parallel to save processing time.

Alternatively, a module that forces "process all bundles up to this point" could do the trick.

8 Replies

Level 10

Hi,

did you try working with a Router module?

Visually it seems to continue processing on each path simultaneously, but still, I'm not really sure if that's what happens internally.

Regards

Lars

Level 2

Hi.

I tried the Router, but I couldn't figure out how to configure it. However, the Flow Control module group contains Aggregator and Iterator steps, and I combined the two to achieve what I wanted.

This, unfortunately, has a drawback: it seems the Aggregator uses field names exactly as they come from the previous modules, so you can't really aggregate step1 name and step2 name. This may be a problem in the future.

For now I've got this monstrosity doing exactly what I wanted:

[Screenshot of the scenario: Mateusz64_0-1694769702241.png]

Level 10

The router itself does not really need to be configured. It depends on what you really want to do...

Apart from that, I don't really understand what you want to achieve with an aggregator and an iterator, as this doesn't really make sense from my point of view.

As we are currently working on the AEM integration, may I ask which connector you are working with? Do you use the "native connector" or the "enhanced connector" in Workfront?

Regards.

Lars

Level 2

My (simplified) scenario needs to:

  1. Read documents from Workfront
  2. Put them in AEM DAM
  3. Set some metadata based on values from Workfront
  4. Publish assets.

My problem is that step 2 puts the files into the DAM but doesn't wait until AEM has processed the metadata, and that takes a significant amount of time. I wanted to add a step that simply waits a few minutes, but the problem is that the scenario processes the documents one by one, so it would wait a few minutes for each document, which is not acceptable.

What I want is to upload all documents (preferably in parallel), then wait a few minutes, and then set the properties and activate.

I don't know how to check which connector I'm using.

Level 10

I would love to have a detailed exchange with you concerning this scenario, as I want to achieve something similar.

As already mentioned, we are currently in the middle of setting up the whole thing in cooperation with Adobe Professional Services.

If you are interested, you could send a direct message to me.

Perhaps another approach could be a native API call using the WF api-unsupported API, as it has an action called sendDocumentsToExternalProvider.
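Roughly, I would expect the call to look something like the sketch below. This is only a guess on my side: the object code, the extra parameters the action expects, and the api-unsupported path are assumptions, since this part of the API is undocumented.

```python
# Sketch only: the object code (DOCU) and any extra parameters the action
# expects are assumptions, not verified against the api-unsupported docs.
import requests

WF_HOST = "https://yourcompany.my.workfront.com"  # placeholder Workfront domain
API_KEY = "your-api-key"                          # placeholder API key
DOC_ID = "documentId"                             # the Workfront document ID

# Workfront object actions are usually invoked as PUT <objcode>/<id>?action=<name>
resp = requests.put(
    f"{WF_HOST}/attask/api-unsupported/DOCU/{DOC_ID}",
    params={
        "action": "sendDocumentsToExternalProvider",
        "apiKey": API_KEY,
        # the destination/provider is almost certainly needed as well,
        # but I don't know the exact parameter name
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```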

To use it in Fusion you will need a few extra steps, but I have not been able to run it successfully yet.

Enough for today.

Have a nice weekend.

Regards

Lars

Community Advisor

Here's an option you could try.

1. Set up a second Fusion scenario triggered by a custom webhook. This scenario should include everything you want to do with the document: downloading/uploading/pause/metadata, etc. Note: one of the first things it should do is return a 200 response with the Webhook response module.

2. In your original scenario, instead of doing all your document manipulation, send a message to your custom webhook for each document, including any information you would need.

As long as you leave the Fusion configuration to run asynchronously, all of the documents will be processed at the same time.
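If it helps to picture the fan-out, the per-document "message" in step 2 is just an HTTP POST to the custom webhook URL, one call per document. A minimal sketch (the URL and payload fields are placeholders):

```python
# Minimal sketch of step 2: one POST per document to the webhook that
# triggers the second scenario. URL and payload fields are placeholders.
import requests

WEBHOOK_URL = "https://hook.app.workfrontfusion.com/xxxxxxxx"  # your custom webhook

documents = [
    {"documentId": "doc-1", "name": "spec.pdf"},
    {"documentId": "doc-2", "name": "brief.docx"},
]

for doc in documents:
    # The receiving scenario answers 200 right away (Webhook response module)
    # and then does the slow download/upload/metadata work on its own.
    r = requests.post(WEBHOOK_URL, json=doc, timeout=10)
    r.raise_for_status()
```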

Level 6

We do this type of thing all the time.  You have two paths, generally, for how to implement it.  Either way, you’re effectively creating a distributed job queue.

One option is to use a data store to hold the information needed for processing.  In your first job, add records to the DS and mark them as “processed=no”.  In the second job, search the DS for records matching that condition, process them, then delete each record or mark it as “processed=yes”.  The second job should be scheduled to run every 5 minutes, so it effectively always checks for new records to process.
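In plain code the same pattern looks roughly like this. The in-memory list is just a stand-in for the Fusion data store (add / search / update record modules); it's only meant to show the shape of the two jobs.

```python
# Stand-in for the data store pattern above: the list plays the role of the
# Fusion data store; in Fusion you'd use the data store modules instead.

store = []  # pretend data store


def first_job(documents):
    """First scenario: enqueue each document, marked as not yet processed."""
    for doc in documents:
        store.append({"documentId": doc, "processed": "no"})


def second_job(batch_size=10):
    """Second scenario: scheduled every 5 minutes, picks up unprocessed records."""
    pending = [r for r in store if r["processed"] == "no"][:batch_size]
    for record in pending:
        process_document(record["documentId"])  # upload, metadata, publish...
        record["processed"] = "yes"             # or delete the record instead


def process_document(document_id):
    print(f"processing {document_id}")


first_job(["doc-1", "doc-2"])
second_job()
```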

You can do a similar thing with Workfront itself and custom fields (“data extensions”, which is what the “DE:” prefix means, btw).  In that case, poll Workfront for a limited number of records matching the criteria.

Doing it in WF is nicer because you don’t have to contend with max DS sizes.  But not all objects have custom field support.
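For the Workfront variant, the poll is just a search with a custom-field filter, something like the sketch below (the object code, field name, and API version are only examples):

```python
# Example only: polling Workfront for unprocessed records via a custom field
# ("data extension"). Object code, field name, and API version are illustrative.
import requests

WF_HOST = "https://yourcompany.my.workfront.com"
API_KEY = "your-api-key"

resp = requests.get(
    f"{WF_HOST}/attask/api/v15.0/DOCU/search",
    params={
        "DE:Processed": "No",  # custom field filter, hence the "DE:" prefix
        "$$LIMIT": 25,         # keep each run to a limited number of records
        "apiKey": API_KEY,
    },
    timeout=30,
)
resp.raise_for_status()
for record in resp.json().get("data", []):
    print(record["ID"])
```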

Hopefully this makes some sense. Let me know if not.  (I’m actually giving a talk on Sep 26 in partnership with Adobe, and this is one of the topics that may come up.)

Thanks,

CV

Level 6

Routers are sequential in processing, with the priority order determined by the order in which the connections are made.