JS functionality issue

Level 2

Hi All,

I am passing some data through JS. When I pass a small amount of data, around 100 or 500 records, the JS works as expected, but when I pass a huge number of records, around 200,000, it does not work correctly. The data is of exactly the same type in both cases, so the JS should handle it the same way. Is there a restriction on the number of records that can be passed through JS? Can someone please help?


4 Replies

Community Advisor

Hi @at7140,

Yes, the JS code activity has limitations. For example, queryDef processes only 10,000 records by default.
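
A minimal sketch of where that cap shows up, assuming the standard xtk.queryDef API in a workflow JS activity (the nms:recipient schema is just an example):

var query = xtk.queryDef.create(
    <queryDef schema="nms:recipient" operation="select">
        <select>
            <node expr="@id"/>
        </select>
    </queryDef>
);
var result = query.ExecuteQuery();
// Even if the table holds 200,000 rows, 'result' contains at most
// 10,000 <recipient> elements unless the default limit is overridden.
logInfo("Rows returned: " + result.recipient.length());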

Level 2

Hi @ParthaSarathy,

Thanks for this info. How do we solve it so we can pass as many records as we want and the JS still works correctly?

Community Advisor

Hi @at7140,

There are a few attributes, like forceNoLineCount="true", that allow a query to return more than 10,000 records. In general, though, this is not advised for huge volumes due to performance issues. As a solution, you can create a loop instead; for huge volumes there will be multiple iterations. For example, for a volume of 200,000 with batches of 10,000, 20 iterations will happen.
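
A minimal sketch of the attribute approach, assuming forceNoLineCount is accepted on the queryDef element as described above (schema and fields are placeholders):

var query = xtk.queryDef.create(
    <queryDef schema="nms:recipient" operation="select" forceNoLineCount="true">
        <select>
            <node expr="@id"/>
        </select>
    </queryDef>
);
// forceNoLineCount lifts the 10,000-row default, but every row is
// loaded into memory at once; for volumes like 200,000, prefer the
// batching loop described above.
var result = query.ExecuteQuery();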

Level 3

Hi @at7140,

To handle large datasets (e.g., 200,000 records) in Adobe Campaign and ensure your JS works correctly, you can use the following strategy:

1. Paginate Queries with QueryDef

  • Instead of fetching all 200,000 records at once, use pagination to process the data in smaller chunks (e.g., 10,000 records at a time). The queryDef API supports pagination via the startLine and lineCount attributes on the queryDef element, as shown below.

var batchSize = 10000; // Process 10,000 records at a time
var startLine = 0;
var totalProcessed = 0;
var count;

do {
    // startLine and lineCount attributes on the queryDef element drive the pagination
    var query = xtk.queryDef.create(
        <queryDef schema="nms:recipient" operation="select"
                  startLine={startLine} lineCount={batchSize}>
            <select>
                <node expr="@id"/>
                <node expr="@email"/>
            </select>
            <orderBy>
                <node expr="@id"/>
            </orderBy>
        </queryDef>
    );

    // ExecuteQuery returns an E4X document whose <recipient> children hold the rows
    var result = query.ExecuteQuery();
    var records = result.recipient;
    count = records.length();

    if (count == 0) break; // Exit if no more records

    // Process the batch
    for each (var record in records) {
        var id = record.@id;
        var email = record.@email;
        // Your logic here
        logInfo("Processing ID: " + id + ", Email: " + email);
    }

    totalProcessed += count;
    startLine += batchSize; // Move to the next batch
} while (count == batchSize); // Continue if we got a full batch

logInfo("Total records processed: " + totalProcessed);

Key Points:
  • Adjust batchSize based on your server’s capacity (e.g., 5,000 or 20,000).
  • Use an orderBy clause to ensure consistent pagination.
  • This approach avoids loading all 200,000 records into memory at once.

Please check if this helps.

Thanks

Sushant Trimukhe