tibormolnar
Level 4
March 27, 2025
Solved

Best Practice for processing many bundles

  • 1 reply
  • 1433 views

Hi All,

This is a rather generic question about how best to handle a high number of search results.

The Workfront Search module in Fusion gives you the option to set a limit on the number of search results returned, but what if I need all matching records and the number can be high (several hundred)? For example, I need to pull a list of all Assignments in a project and process each of them. For a large and complex project with many tasks, this list can be extensive.

How can I make sure that all matching records are returned and processed, without risking hitting the 40-minute runtime limit, etc.?

Is there a way to obtain and process them in batches?
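(For reference, the limit/offset paging this amounts to can be sketched in plain Python. This is only a sketch; `fetch_page` is a hypothetical stand-in for whatever search call actually returns one page of records:)

```python
def fetch_all(fetch_page, limit=2000):
    """Collect every record by paging: request `limit` records at a time,
    advancing the start offset until a short (or empty) page signals the end."""
    records = []
    start = 0
    while True:
        page = fetch_page(start, limit)
        records.extend(page)
        if len(page) < limit:  # short page -> no more records to fetch
            break
        start += limit
    return records

# Usage with a fake data source of 4500 "records":
data = list(range(4500))
result = fetch_all(lambda start, limit: data[start:start + limit], limit=2000)
```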

Any ideas are appreciated.

Thanks,

Tibor

Best answer by Sven-iX

Hi @tibormolnar here is what I've used before:

  • First get a count of items, then fetch batches of up to 2000 records each until you exhaust all items
  • Do the processing in a second scenario that is called from the first one, once per batch. This "multi-threads" your scenario.
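A quick sketch of the arithmetic in plain Python (not Fusion syntax; the count and batch size are made-up example values):

```python
import math

count = 4321       # total returned by the "count" call
batch_size = 2000  # up to 2000 records, or anything smaller

# Round up so a final partial batch is not dropped:
num_batches = math.ceil(count / batch_size)            # 3 batches
starts = [i * batch_size for i in range(num_batches)]  # offset of each batch
```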

 

Example: 

  1. customAPI module with the query, but instead of the "search" action use "count" - this gives you the total number of matching records
  2. setVar to calculate the batches - you can go up to 2000 records or anything smaller
    1. set the batch size
    2. calculate the number of batches = ceil( count / batchSize )
  3. Repeater module with the number of steps set to the number of batches
  4. setVar to define start (the item to start the batch with) = ( i - 1 ) * batchSize, using the "i" from the Repeater
  5. send these parameters to the "worker" scenario (ideally passing along the query from the count module)
  6. in the worker, run the Search based on the query, start and batchSize
  7. process the found items
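The steps above can be sketched as a dispatcher/worker pair in plain Python (a sketch only: `count_items`, `search`, `process` and `run_worker` are hypothetical stand-ins for the Fusion modules and the webhook call between the two scenarios):

```python
import math

def dispatcher(count_items, run_worker, batch_size=2000):
    """Steps 1-5: count, compute batches, hand each (start, batch_size)
    pair to a worker. In Fusion, run_worker would be an HTTP call to the
    second scenario's webhook."""
    count = count_items()                        # step 1: "count" action
    num_batches = math.ceil(count / batch_size)  # step 2: number of batches
    for i in range(1, num_batches + 1):          # step 3: Repeater, i = 1..n
        start = (i - 1) * batch_size             # step 4: batch offset
        run_worker(start, batch_size)            # step 5: call the worker

def worker(search, process, start, batch_size):
    """Steps 6-7: search one batch and process each found item."""
    for item in search(start, batch_size):
        process(item)

# Usage with a fake data source of 4500 records:
data = list(range(4500))
processed = []
dispatcher(
    count_items=lambda: len(data),
    run_worker=lambda start, limit: worker(
        lambda s, l: data[s:s + l], processed.append, start, limit),
)
```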

1 reply

Level 4
June 20, 2025

@sven-ix, do you have an example of such a batch-split setup?
Level 4
July 15, 2025

@viovi, have you tried enabling the "JSON pass-through" option for your webhook in the 2nd scenario?

With this setting, the output of the webhook module will be a string, the same as the payload sent to the webhook. You can then do with it what you want, e.g. parse it as JSON.
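In other words, with pass-through enabled the webhook body arrives as a raw string that you parse yourself. The equivalent in Python (the payload shown is just an example):

```python
import json

# Example raw payload string, as a pass-through webhook would deliver it:
raw = '{"start": 2000, "batchSize": 2000}'

# Parse the raw string into a usable structure:
params = json.loads(raw)
```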

Does this help?


Thank you, I think that was my issue - the JSON formatting was not passing through properly.