Hi all,
I am passing data through a JS activity. With a small amount of data, around 100 or 500 records, the JS works as expected, but with a large volume such as 200,000 records it does not work correctly, even though the data is of exactly the same type and the JS should handle it. Is there a restriction on the number of records that can be passed through JS? Can someone please help?
Hi @at7140,
To handle large datasets (e.g., 200,000 records) in Adobe Campaign and ensure your JS works correctly, you can process the records in batches:
var batchSize = 10000; // Process 10,000 records at a time
var startLine = 0;
var totalProcessed = 0;
var records;
do {
  var query = xtk.queryDef.create(
    <queryDef schema="nms:recipient" operation="select"
              lineCount={batchSize} startLine={startLine}>
      <select>
        <node expr="@id"/>
        <node expr="@email"/>
      </select>
      <orderBy>
        <node expr="@id"/>
      </orderBy>
    </queryDef>
  );
  // ExecuteQuery() takes no arguments; pagination is driven by the
  // lineCount and startLine attributes on the queryDef above
  var result = query.ExecuteQuery();
  records = result.recipient; // E4X XMLList of <recipient> elements
  if (records.length() == 0) break; // Exit if no more records
  // Process the batch
  for each (var record in records) {
    var id = record.@id;
    var email = record.@email;
    // Your logic here
    logInfo("Processing ID: " + id + ", Email: " + email);
  }
  totalProcessed += records.length();
  startLine += batchSize; // Move to the next batch
} while (records.length() == batchSize); // Continue if we got a full batch
logInfo("Total records processed: " + totalProcessed);
Please check if this helps.
Thanks
Sushant Trimukhe
Thanks for this info. How do we solve it so that we can pass as many records as we want and the JS still works correctly?
Hi @at7140 ,
There are a few attributes, like forceNoLineCount="true", that let you fetch more than 10,000 rows. In general, though, this is not advised for huge volumes due to performance issues. As a solution, you can create a loop; for a huge volume there will be multiple iterations. For example, for a volume of 200,000 records with batches of 10,000, 20 iterations will happen.
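The iteration count above follows directly from the batch size. As a rough sketch in plain JavaScript (outside Adobe Campaign, with a hypothetical `fetchBatch` standing in for the paginated queryDef call), the loop shape looks like this:

```javascript
// Sketch of the batching loop; fetchBatch is a hypothetical stand-in
// for the paginated queryDef call inside Adobe Campaign.
function processInBatches(batchSize, fetchBatch) {
  var startLine = 0;
  var iterations = 0;
  var processed = 0;
  while (true) {
    var batch = fetchBatch(startLine, batchSize);
    if (batch.length === 0) break;        // no more records
    processed += batch.length;
    startLine += batchSize;
    iterations++;
    if (batch.length < batchSize) break;  // last, partial batch
  }
  return { iterations: iterations, processed: processed };
}

// Simulated source of 200,000 records
function makeFetcher(total) {
  return function (start, count) {
    var n = Math.max(0, Math.min(count, total - start));
    return new Array(n); // placeholder "records"
  };
}

var stats = processInBatches(10000, makeFetcher(200000));
// 200,000 records at 10,000 per batch => 20 iterations
```

The partial-batch check at the end of the loop is what avoids one extra empty query when the total divides evenly into the batch size.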
Hi @SushantTrimukheD ,
Thanks, this approach worked for me, but it is taking too much time to process the data in JS. I tried a Split before the JS to divide the data into different subsets and then process them through JS, but the JS processing time is still too long. Is there any way to process the data faster? For me, JS is processing about 1,000 records/min.
Hi @at7140,
1. Optimize the QueryDef Execution
The current pagination approach is solid, but the query itself might be slowing things down. Here’s how to refine it:
<queryDef schema="nms:recipient" operation="select">
<select>
<node expr="@id"/>
<node expr="@email"/>
</select>
<where>
<condition expr="@email IS NOT NULL"/> <!-- Example filter -->
</where>
<orderBy>
<node expr="@id"/>
</orderBy>
</queryDef>
JS in Adobe Campaign can be slow for row-by-row processing. If possible, offload heavy lifting to the database with a set-based SQL statement via sqlExec:
var sql = "UPDATE NmsRecipient SET sEmail = UPPER(sEmail) WHERE iRecipientId >= " + startId + " AND iRecipientId < " + endId;
sqlExec(sql);
This reduces JS overhead significantly since database engines are optimized for bulk operations.
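To see why this matters: a single set-based UPDATE per ID range replaces one statement per row. A sketch in plain JavaScript (the `buildRangeStatements` helper and the table/column names are illustrative, mirroring the UPDATE example above):

```javascript
// Sketch: emit one set-based statement per ID range instead of
// one statement per row. Table/column names are illustrative only.
function buildRangeStatements(minId, maxId, rangeSize) {
  var statements = [];
  for (var start = minId; start <= maxId; start += rangeSize) {
    var end = Math.min(start + rangeSize, maxId + 1);
    statements.push(
      "UPDATE NmsRecipient SET sEmail = UPPER(sEmail)" +
      " WHERE iRecipientId >= " + start +
      " AND iRecipientId < " + end
    );
  }
  return statements;
}

// 200,000 ids in ranges of 10,000 => 20 statements instead of 200,000
var stmts = buildRangeStatements(1, 200000, 10000);
```

The database then applies each range in a single set-based operation, which is exactly the workload database engines are optimized for.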
If your JS is still the bottleneck, split the workload across multiple Adobe Campaign workflows so the subsets run in parallel rather than sequentially.
If you must stick with JS, streamline the code by using E4X accessors directly:
var result = query.ExecuteQuery();
for each (var record in result.recipient) {
  var id = record.@id;
  var email = record.@email;
  // Your logic here
}
You’re processing 1,000 records per minute with a batchSize of 10,000, which suggests each batch takes ~10 minutes. Test larger batch sizes (e.g., 20,000 or 50,000) to reduce the number of ExecuteQuery calls. However, larger batches consume more memory on the application server, so increase the size gradually and watch server load.
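The throughput arithmetic above can be checked with a few lines of plain JavaScript, using the figures quoted in this thread:

```javascript
// Rough throughput arithmetic from the figures quoted above.
var totalRecords = 200000;
var recordsPerMinute = 1000; // observed JS throughput
var batchSize = 10000;

var totalMinutes = totalRecords / recordsPerMinute;   // 200 minutes overall
var minutesPerBatch = batchSize / recordsPerMinute;   // ~10 minutes per batch
var queryCalls = Math.ceil(totalRecords / batchSize); // 20 ExecuteQuery calls

// Doubling batchSize to 20,000 halves the number of query calls:
var fewerCalls = Math.ceil(totalRecords / (batchSize * 2)); // 10 calls
```

Note that a larger batch size only cuts query overhead; the per-record processing time inside the loop is unchanged, which is why the set-based SQL approach above yields the bigger win.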
Here’s an optimized version combining some of these ideas:
var batchSize = 20000; // Test larger batches
var startLine = 0;
while (true) {
  var query = xtk.queryDef.create(
    <queryDef schema="nms:recipient" operation="select"
              lineCount={batchSize} startLine={startLine}>
      <select>
        <node expr="@id"/>
        <node expr="@email"/>
      </select>
      <orderBy>
        <node expr="@id"/>
      </orderBy>
    </queryDef>
  );
  var result = query.ExecuteQuery();
  var records = result.recipient;
  if (records.length() == 0) break;
  for each (var record in records) {
    var id = record.@id;
    var email = record.@email;
    // Minimal logic here
  }
  startLine += batchSize;
}
Thanks
Sushant Trimukhe