
SOLVED

JS functionality issue


Level 2

Hi All,

I am passing some data through JS. When I pass a small amount of data (around 100 or 500 records), the JS works as expected, but when I pass a huge number like 200,000 it does not work correctly. The data is exactly the same type, so the JS should handle it the same way. Is there a restriction on the number of records that can be passed through JS? Can someone please help?



6 Replies


Community Advisor

Hi @at7140,

Yes, the JS code activity has limitations. For example, QueryDef by default returns only 10,000 records.
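
For reference, a minimal sketch of how that cap shows up, assuming nms:recipient holds more than 10,000 records:

var result = xtk.queryDef.create(
    <queryDef schema="nms:recipient" operation="select">
        <select>
            <node expr="@id"/>
        </select>
    </queryDef>
).ExecuteQuery();
// With no lineCount set, the result is capped at 10,000 rows even if more records match
logInfo("Rows returned: " + result.recipient.length());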


Level 2

Hi @ParthaSarathy,

Thanks for the info. How can we solve this so that we can pass as many records as we want and the JS still works correctly?


Community Advisor

Hi @at7140,

There are attributes like forceNoLineCount="true" that let a query return more than 10,000 records, but in general this is not advised for huge volumes because of the performance impact. As a solution, you can create a loop; for huge volumes there will be multiple iterations. For example, for 200,000 records there will be 20 iterations of 10,000.
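
A minimal sketch of the forceNoLineCount attribute mentioned above, assuming your instance supports it; since every matching row is loaded into memory at once, test it on a small folder first:

var query = xtk.queryDef.create(
    <queryDef schema="nms:recipient" operation="select" forceNoLineCount="true">
        <select>
            <node expr="@id"/>
            <node expr="@email"/>
        </select>
    </queryDef>
);
var result = query.ExecuteQuery();   // not capped at 10,000, so memory use grows with volume
logInfo("Rows returned: " + result.recipient.length());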


Correct answer by
Level 3

Hi @at7140,

To handle large datasets (e.g., 200,000 records) in Adobe Campaign and ensure your JS works correctly, you can use the following strategies:

1. Paginate Queries with QueryDef

  • Instead of fetching all 200,000 records at once, use pagination to process the data in smaller chunks (e.g., 10,000 records at a time). The QueryDef API supports pagination via the startLine and lineCount attributes on the queryDef element.

var batchSize = 10000;   // Process 10,000 records at a time
var startLine = 0;
var totalProcessed = 0;
var count = 0;

do {
    // startLine and lineCount attributes drive the pagination
    var query = xtk.queryDef.create(
        <queryDef schema="nms:recipient" operation="select"
                  startLine={startLine} lineCount={batchSize}>
            <select>
                <node expr="@id"/>
                <node expr="@email"/>
            </select>
            <orderBy>
                <node expr="@id"/>
            </orderBy>
        </queryDef>
    );

    var result = query.ExecuteQuery();   // E4X XML result
    var records = result.recipient;
    count = records.length();

    if (count == 0) break;               // Exit if no more records

    // Process the batch
    for each (var record in records) {
        var id = record.@id;
        var email = record.@email;
        // Your logic here
        logInfo("Processing ID: " + id + ", Email: " + email);
    }

    totalProcessed += count;
    startLine += batchSize;              // Move to the next batch
} while (count == batchSize);            // Continue if we got a full batch

logInfo("Total records processed: " + totalProcessed);
Key Points:
  • Adjust batchSize based on your server’s capacity (e.g., 5,000 or 20,000).
  • Use an orderBy clause to ensure consistent pagination.
  • This approach avoids loading all 200,000 records into memory at once.

Please check if this helps.

Thanks

Sushant Trimukhe


Level 2

Hi @SushantTrimukheD,

Thanks, this approach worked for me, but it is taking too much time to process the data in JS. I tried a Split before the JS activity to divide the data into different subsets and process them through JS, but the JS processing time is still too long. Is there any way to process the data faster? For me, the JS is processing about 1,000 records per minute.


Level 3

Hi @at7140,

1. Optimize the QueryDef Execution

The current pagination approach is solid, but the query itself might be slowing things down. Here’s how to refine it:

  • Minimize Selected Fields: Only select the fields you absolutely need. If you’re processing @id and @email, don’t include unnecessary columns in the <select> node.
  • Add a Where Clause (if applicable): If you don’t need all 200,000 records, filter the dataset upfront with a <where> condition to reduce the total rows processed.
<queryDef schema="nms:recipient" operation="select">
    <select>
        <node expr="@id"/>
        <node expr="@email"/>
    </select>
    <where>
        <condition expr="@email IS NOT NULL"/> <!-- Example filter -->
    </where>
    <orderBy>
        <node expr="@id"/>
    </orderBy>
</queryDef>

 

  • Index the Schema: Ensure the nms:recipient schema has database indexes on fields used in the orderBy (@id) or where clauses. Contact your Adobe Campaign administrator to verify or add indexes.

2. Batch Processing with SQL Instead of JS

JS in Adobe Campaign can be slow for row-by-row processing. If possible, offload the heavy lifting to the database with the sqlExec() function:

  • Update in Bulk: If your JS logic involves updates (e.g., setting a field), use SQL UPDATE statements instead of looping through records in JS.
var startId = 0, endId = 100000;   // boundaries of the current batch (set by your loop)
var sql = "UPDATE nmsRecipient SET sEmail = UPPER(sEmail)"
        + " WHERE iRecipientId >= " + startId + " AND iRecipientId < " + endId;
sqlExec(sql);

 

  • Temp Table Approach: Insert the subset of records into a temporary table, process it with SQL, then pull the results back into JS if needed (see the sketch below).

This reduces JS overhead significantly since database engines are optimized for bulk operations.
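
A rough sketch of the temp-table idea using sqlExec; the working-table name wkfTempRecipient and its columns are placeholders for illustration and must match whatever table your workflow actually creates:

// Stage the subset in a working table (hypothetical name and columns)
sqlExec("INSERT INTO wkfTempRecipient (iRecipientId, sEmail) "
      + "SELECT iRecipientId, sEmail FROM nmsRecipient WHERE sEmail IS NOT NULL");
// Do the heavy transformation set-based in SQL rather than row by row in JS
sqlExec("UPDATE wkfTempRecipient SET sEmail = LOWER(sEmail)");
// If the results are needed in JS afterwards, read them back with a queryDef
// on a temporary schema mapped to wkfTempRecipient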


3. Parallel Processing with Workflows

If your JS is still the bottleneck, split the workload across multiple Adobe Campaign workflows:

  • Split the Data: Use a Split activity in a workflow to divide the 200,000 records into smaller subsets (e.g., 5 subsets of 40,000).
  • Run JS in Parallel: Trigger separate workflows or JS activities for each subset simultaneously. This leverages Adobe Campaign’s multi-threading capabilities.
  • Monitor Server Load: Ensure your server can handle parallel execution without crashing—test with smaller subsets first (e.g., 10,000).

4. Optimize JS Code

If you must stick with JS, streamline the code:

  • Reduce Logging: Calls to logInfo() are useful for debugging but slow down execution, so keep them out of the per-record loop.
  • Avoid DOM Parsing: Instead of getElementsByTagName and getAttribute, work with the E4X result object directly:
var result = query.ExecuteQuery();   // startLine / lineCount are set as attributes on the queryDef
for each (var record in result.recipient) {
    var id = record.@id;
    var email = record.@email;
    // Your logic here
}

 

  • Precompute Values: If your logic involves repetitive calculations, compute them once outside the loop.
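
For instance, a small self-contained sketch of hoisting a constant computation out of the loop; the suffix and the sample array are purely illustrative values:

var suffix = "@example.com".toUpperCase();   // computed once, outside the loop
var emails = ["a@example.com", "b@test.org", "c@Example.com"];   // illustrative data
var matches = 0;
for (var i = 0; i < emails.length; i++) {
    // reuse the precomputed suffix instead of rebuilding it on every iteration
    if (emails[i].toUpperCase().indexOf(suffix) >= 0) matches++;
}
logInfo("Matching addresses: " + matches);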

5. Increase Batch Size

You’re processing 1,000 records per minute with a batchSize of 10,000, which suggests each batch takes ~10 minutes. Test larger batch sizes (e.g., 20,000 or 50,000) to reduce the number of ExecuteQuery calls. However:

  • Monitor memory usage—too large a batch might crash the JS engine.
  • Adjust based on your server’s RAM and CPU capacity.

Optimized Version

Here’s an optimized version combining some of these ideas:

var batchSize = 20000;   // Test larger batches
var startLine = 0;

while (true) {
    var query = xtk.queryDef.create(
        <queryDef schema="nms:recipient" operation="select"
                  startLine={startLine} lineCount={batchSize}>
            <select>
                <node expr="@id"/>
                <node expr="@email"/>
            </select>
            <orderBy>
                <node expr="@id"/>
            </orderBy>
        </queryDef>
    );

    var result = query.ExecuteQuery();   // pagination handled by the attributes above
    var records = result.recipient;

    if (records.length() == 0) break;

    for each (var record in records) {
        var id = record.@id;
        var email = record.@email;
        // Minimal logic here
    }

    startLine += batchSize;
}

 

Thanks

Sushant Trimukhe