Best Practice for processing many bundles
tibormolnar
Level 4
March 27, 2025
Solved


Hi All,

This is a rather generic question about how best to handle a high number of search results.

The Workfront Search module in Fusion gives you the option to set a limit on the number of search results returned, but what if I need all matching records and the number can be high (several hundred)? For example, I need to pull a list of all Assignments in a project and process each of them. For a large and complex project with many tasks, this list can be extensive.

How can I make sure that all matching records are returned and processed, without risking hitting the 40-minute runtime limit, etc.?

Is there a way to obtain and process them in batches?

Any ideas are appreciated.

Thanks,

Tibor


Replies

Sven-iX
Community Advisor
Accepted solution
March 27, 2025

Hi @tibormolnar, here is what I've used before:

  • First get a count of items, then fetch batches of up to 2000 records each until you have exhausted all items.
  • Do the processing in a second scenario that is called from the first one, once per batch. This "multi-threads" your scenario.

 

Example: 

  1. customAPI module with your query, but instead of the "search" action use "count"; this gives you the total number of matching records
  2. setVar module to calculate the batches - you can go up to 2000 records per batch or anything smaller
    1. set the batch size
    2. calculate the number of batches = ceil( count / batchSize ), rounding up so the final partial batch is not dropped
  3. Repeater module with the number of steps set to the number of batches
  4. setVar module to define start (the record the batch begins with) = (the Repeater's "i" - 1) * batchSize
  5. send these parameters to the "worker" scenario (ideally passing along the query from the count module)
  6. in the worker, run the Search based on the query, start, and batchSize
  7. process the found items (see the sketch after this list)
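
For readers outside Fusion, here is a minimal sketch of the same count-then-page pattern in Python against the Workfront REST API. The host, API version, object code, and the process() handler are placeholder assumptions, not part of the original recipe; in Fusion, the body of the outer loop is where the worker scenario would be called.

```python
import math
import requests

# Assumed values: host, API version, object code, and credentials are
# placeholders for illustration; adjust for your own instance.
BASE = "https://example.my.workfront.com/attask/api/v19.0"
PARAMS = {"projectID": "<project GUID>", "apiKey": "<api key>"}
BATCH_SIZE = 2000  # Workfront's maximum page size


def process(assignment: dict) -> None:
    # Placeholder for the real per-assignment work (the "worker" scenario).
    print(assignment["ID"])


# Step 1: use the "count" action instead of "search"; returns just a number.
count = requests.get(f"{BASE}/ASSGN/count", params=PARAMS).json()["data"]["count"]

# Step 2: number of batches, rounded up so a final partial batch is kept.
batches = math.ceil(count / BATCH_SIZE)

# Steps 3-6: one paged search per batch; $$FIRST is the zero-based offset.
for i in range(batches):
    page = requests.get(
        f"{BASE}/ASSGN/search",
        params={
            **PARAMS,
            "$$FIRST": i * BATCH_SIZE,
            "$$LIMIT": BATCH_SIZE,
            "ID_Sort": "asc",  # stable order so pages never overlap or repeat
        },
    ).json()["data"]

    # Step 7: process the found items.
    for assignment in page:
        process(assignment)
```

The ID_Sort parameter matters for exactly the reason raised in the next reply.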
Doug_Den_Hoed__AtAppStore
Community Advisor
March 27, 2025

 

Hi @tibormolnar,

 

To spare you the day I recently lost when api-unsupported started returning needle duplicates among such haystack batches, I strongly urge you to ensure that the request you use in step 3 (and optionally, step 1) from @sven-ix is SORTED, to avoid such duplicates and future-proof your work.

 

Our www.atappstore.com lowest-level plumbing now inspects every such API call to Workfront prior to execution and, if the count exceeds the batch size but there is no "_Sort=" within, adds a magnetic "&ID_Sort=asc" to ensure Good Behavior.
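
As a rough illustration, a guard like Doug describes might look like the sketch below, assuming the query is held as a dict of parameters; the helper name and shape are hypothetical, not AtAppStore's actual code.

```python
def ensure_sorted(query_params: dict, count: int, batch_size: int) -> dict:
    """Force a stable sort on any query whose results span multiple batches.

    Without a sort, batched reads can return duplicates (and omissions),
    because the server is free to order each page differently.
    """
    needs_paging = count > batch_size
    already_sorted = any(key.endswith("_Sort") for key in query_params)
    if needs_paging and not already_sorted:
        return {**query_params, "ID_Sort": "asc"}
    return query_params
```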

 

Regards,

Doug 

Sven-iX
Community Advisor
March 27, 2025

OMG - YES - sorting is a must, thank you for adding, @doug_den_hoed__atappstore 
Had that experience too!
Seems weird the API doesn't already return a default sort...