tibormolnar
Level 4
March 27, 2025
Solved

Best Practice for processing many bundles


Hi All,

This is a rather generic question about how best to handle a high number of search results.

The Workfront Search module in Fusion gives you the option to set a limit on the number of search results returned, but what if I need all matching records and the number can be high (several hundred)? For example, I need to pull a list of all Assignments in a project and process each of them. For a large and complex project with many tasks this list can be extensive.

How can I make sure that all matching records are returned and processed, without risking hitting the 40-minute runtime limit, etc.?

Is there a way to obtain and process them in batches?

Any ideas are appreciated.

Thanks,

Tibor

Best answer by Sven-iX
1 reply

Sven-iX · Community Advisor · Accepted solution
March 27, 2025

Hi @tibormolnar, here is what I've used before:

  • First get a count of items, then fetch batches of up to 2000 records each until you exhaust all items
  • Do the processing in a second scenario that is called from the first one, once per batch. This "multi-threads" your scenario.

 

Example: 

  1. customAPI module with the same query, but instead of the "search" action use "count"; this gives you the total number of matching records
  2. setVar to calculate the batches - you can go up to 2000 records per batch or anything smaller
    1. set batch size
    2. calculate number of batches = ceil( count / batchSize )
  3. Repeater module with the number of steps set to the number of batches
  4. setVar to define start (the first item of the batch) = ( i - 1 ) * batchSize, where "i" comes from the Repeater and starts at 1
  5. send these parameters to the "worker" scenario (ideally passing along the query from the count module)
  6. in the worker, run the Search based on the query, start, and batchSize
  7. process the found items
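The count/Repeater math above can be sketched outside Fusion. This is a minimal Python illustration of the offset calculation only (the function name `plan_batches` is mine; the 2000-record batch size comes from the steps above, and the real scenario would pass each start/limit pair to the worker scenario instead of returning a list):

```python
import math

BATCH_SIZE = 2000  # up to 2000 records per Search call, per the steps above

def plan_batches(total_count, batch_size=BATCH_SIZE):
    """Reproduce the setVar/Repeater math: how many batches, and the
    'start' offset each worker run should search from."""
    num_batches = math.ceil(total_count / batch_size)  # roundup, not round
    # The Repeater emits i = 1..num_batches; batch i starts at (i - 1) * batch_size
    return [{"start": (i - 1) * batch_size, "limit": batch_size}
            for i in range(1, num_batches + 1)]

# Example: 4500 matching records -> 3 worker calls
print(plan_batches(4500))
# [{'start': 0, 'limit': 2000}, {'start': 2000, 'limit': 2000}, {'start': 4000, 'limit': 2000}]
```

Note the ceiling: with `round` instead of `ceil`, a count of 4500 would yield only 2 batches and the last 500 records would be dropped.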
tibormolnar · Level 4
June 20, 2025

@sven-ix, do you have an example of such a batch split setup?


Sven-iX · Community Advisor
July 15, 2025

Sorry, I've been sidetracked.
You already found a way, but put simply: create batches by grouping the bundles you iterate through.

  • set batch size (setVar)
  • iterate (from a Search or Iterator module)
  • you may need an incrementer to count bundles, unless you use an Iterator (then use bundlePosition)
  • get batch size (getVar)
  • set the batch number (setVar = ceil( bundleCounter / batchSize ))
  • aggregator with groupBy set to the batch number (or you can put the ceil(...) formula in here)

This means: As we run through all the bundles, we group bundles into batches of a certain size.

Each of these batches we then push to the second scenario. 

 

Here's a forced example: I use a Repeater to create 100 bundles, and in the aggregator I group them into batches of 20.
Each batch is converted to JSON and sent to the webhook.
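The grouping in this forced example can be sketched in Python as well. This is only an illustration of the ceil-based groupBy key and the webhook payload shape (the function name `group_into_batches` and the `batchNo` field in the payload are my stand-ins; Fusion's aggregator and HTTP module do the real work):

```python
import json
import math

BATCH_SIZE = 20  # group the 100 repeater bundles into batches of 20

def group_into_batches(bundles, batch_size=BATCH_SIZE):
    """Mimic the aggregator's groupBy: batch number = ceil(position / size),
    with bundle positions starting at 1, like Fusion's bundlePosition."""
    batches = {}
    for position, bundle in enumerate(bundles, start=1):
        batch_no = math.ceil(position / batch_size)
        batches.setdefault(batch_no, []).append(bundle)
    return batches

bundles = list(range(1, 101))          # the repeater's 100 bundles
batches = group_into_batches(bundles)  # 5 batches of 20

# Each batch is sent as a named object, so the webhook receives
# a property "batch" that is an array
payload = json.dumps({"batchNo": 1, "batch": batches[1]})
```

Positions 1-20 land in batch 1 (ceil(20/20) = 1), position 21 starts batch 2 (ceil(21/20) = 2), and so on.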

Since we send the array as a named object, the webhook receives a property "batch" that is an array.

Thank you, I ended up with a similar setup; I just did not realize that I can pass batchNo in the header.