tibormolnar
Level 4
March 27, 2025
Solved

Best Practice for processing many bundles

  • March 27, 2025
  • 1 reply
  • 1407 views

Hi All,

This is a rather generic question about how best to handle a high number of search results.

The Workfront Search module in Fusion gives you the option to set a limit on the number of search results returned, but what if I need all matching records and the number can be high (several hundred)? For example, I need to pull a list of all Assignments in a project and process each of them. For a large and complex project with many tasks, this list can be extensive.

How can I make sure that all matching records are returned and processed, without risking hitting the 40-minute runtime limit, etc.?

Is there a way to obtain and process them in batches?
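For context, the underlying Workfront REST API pages results with $$FIRST (a zero-based offset) and $$LIMIT (capped at 2000 per call), so outside Fusion the retrieval loop would look roughly like this Python sketch (the host, API version, and field names here are illustrative, not something from Fusion itself):

```python
import requests

API = "https://example.my.workfront.com/attask/api/v15.0"  # hypothetical host
API_KEY = "..."  # your API key

def fetch_all_assignments(project_id, page_size=500):
    """Yield every assignment on a project, one page at a time."""
    first = 0
    while True:
        resp = requests.get(
            f"{API}/assgn/search",
            params={
                "projectID": project_id,
                "$$FIRST": first,      # zero-based offset into the result set
                "$$LIMIT": page_size,  # Workfront caps this at 2000
                "apiKey": API_KEY,
            },
        )
        resp.raise_for_status()
        page = resp.json()["data"]
        if not page:
            break  # no more records: every match has been yielded
        yield from page
        first += page_size
```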

Any ideas are appreciated.

Thanks,

Tibor

Best answer by Sven-iX (see reply below)


1 reply

Sven-iX
Community Advisor
Accepted solution
March 27, 2025

Hi @tibormolnar, here is what I've used before:

  • First get a count of items, then fetch batches of up to 2000 records each until you have exhausted all items
  • Do the processing in a second scenario that is called from the first one, once per batch. This "multi-threads" your scenario.

Example: 

  1. customAPI module with the query, but instead of the "search" action use "count" - this gives you the total number of matching records
  2. setVar modules to calculate the batches - you can go up to 2000 records per batch or anything smaller
    1. set the batch size
    2. calculate the number of batches = ceil( count / batchSize )
  3. Repeater module with the number of steps set to the number of batches
  4. setVar to define start (the item the batch begins with) = ( the "i" from the Repeater - 1 ) * batchSize
  5. send these parameters to the "worker" scenario (ideally passing along the query from the count module)
  6. in the worker, run the Search based on the query, start, and batchSize
  7. process the found items (a rough sketch of steps 1-5 follows below)
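As a rough illustration of steps 1 to 5, here is the same arithmetic in Python (the count value and webhook URL are placeholders; in Fusion the math lives in the setVar modules and the hand-off is an HTTP call to the worker scenario's webhook):

```python
import math
import requests

count = 4321       # step 1: result of the customAPI "count" action
batch_size = 2000  # step 2a: up to the 2000-record cap
num_batches = math.ceil(count / batch_size)  # step 2b: 3 batches here

WORKER_WEBHOOK = "https://hook.example/worker"  # hypothetical webhook URL

for i in range(1, num_batches + 1):  # step 3: the Repeater, i = 1..num_batches
    start = (i - 1) * batch_size     # step 4: offsets 0, 2000, 4000, ...
    # step 5: pass the window (and the query) to the worker scenario,
    # which runs the search with start/batch_size and processes the items
    requests.post(WORKER_WEBHOOK, json={
        "query": {"projectID": "..."},  # the same query the count used
        "first": start,
        "limit": batch_size,
    })
```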
viovi
Level 4
June 20, 2025

@sven-ix, do you have an example of such a batch-split setup?

viovi
Level 4
July 7, 2025

Quoting Sven-iX's reply: "Hi @viovi, go ahead and try it. I'll help you along the way."


Our data comes from a file with 1000+ entries that we need to split into batches, as we are hitting the 40-minute runtime limit and other limits.

I tried to set up a Repeater, and it seemed to define the batch size and number of steps correctly, but it still processes each step as an individual operation (1 operation = 1 collection of values/bundle). Also, it looks like everything was simply repeated 3 times rather than split into 3 parts (i.e., 3000+ bundles instead of 1000+) when passing the data to another scenario.

How can I group the bundles and pass them to another scenario to process in 3 batches, so that, e.g., the 1st batch has bundles 1 to 500, the 2nd 501 to 1000, and the 3rd all the rest?
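To make the intended grouping concrete, here is a rough Python equivalent of that split (the row data and webhook URL are stand-ins); in Fusion, the analogous approach is an Array Aggregator grouped by something like ceil(i / 500) before the HTTP hand-off, so the worker scenario receives 3 aggregated payloads instead of 1000+ individual bundles:

```python
import requests

BATCH_SIZE = 500
WORKER_WEBHOOK = "https://hook.example/worker"  # hypothetical webhook URL

# Stand-in for the 1000+ entries parsed from the file
rows = [{"id": n} for n in range(1, 1101)]

# Slice the rows into consecutive batches: 1-500, 501-1000, the rest
batches = [rows[i:i + BATCH_SIZE] for i in range(0, len(rows), BATCH_SIZE)]

for batch in batches:
    # One call per batch, so the worker scenario is triggered 3 times,
    # each time with a whole batch as a single payload
    requests.post(WORKER_WEBHOOK, json={"rows": batch})
```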