
Best approach to target and send large volume email via SOAP API (Adobe Campaign Classic)


Level 5

Hi All,

We need to be able to trigger a large volume email from an external application using the Adobe Campaign Classic SOAP API.

The external application runs processes that produce a large list of customer IDs, which are stored on recipient records in the Adobe Campaign recipient table in the @account field.

Programmatically, we want to target these customer IDs in an email delivery, then prepare and send it.

Can someone suggest the best SOAP method(s) to use to achieve our goal?

Many Thanks


Note: there might be more customer IDs in the external application output than exist in Adobe Campaign - but that's fine; we just want to target those that already exist in Adobe Campaign (i.e. a subset).

2 Replies


Level 4


I can suggest exporting this list into a file, uploading the file to the Adobe server, then calling xtk.workflow.SpawnWithParameters to spawn a workflow from a template:

var parameters = <variables file={pathToFile} />

// all parameters passed this way become visible inside the workflow as vars.*; in this case, file becomes vars.file




Inside the workflow, load the file into a temporary table and spawn a delivery.
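To illustrate how the external application could trigger this, here is a minimal sketch of the SOAP envelope posted to the soaprouter with SOAPAction "xtk:workflow#SpawnWithParameters". The parameter element names (sessiontoken, strWorkflowId, elemParameters) are assumptions following the usual xtk SOAP conventions - verify them against your instance's WSDL:

```javascript
// Sketch only: builds the SOAP envelope an external application could POST
// to /nl/jsp/soaprouter.jsp to spawn a workflow from a template.
// Element names below are assumed, not taken from the WSDL.
function buildSpawnEnvelope(sessionToken, workflowTemplateId, filePath) {
  return [
    '<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"',
    '                  xmlns:urn="urn:xtk:workflow">',
    '  <soapenv:Body>',
    '    <urn:SpawnWithParameters>',
    '      <urn:sessiontoken>' + sessionToken + '</urn:sessiontoken>',
    '      <urn:strWorkflowId>' + workflowTemplateId + '</urn:strWorkflowId>',
    '      <urn:elemParameters>',
    // the file path surfaces inside the workflow as vars.file
    '        <variables file="' + filePath + '"/>',
    '      </urn:elemParameters>',
    '    </urn:SpawnWithParameters>',
    '  </soapenv:Body>',
    '</soapenv:Envelope>'
  ].join('\n');
}
```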

Alternatively, one can configure the delivery to be spawned from a CSV file, call nms.delivery.SubmitNotification, and spawn the delivery directly from an in-memory context:

var ctx = loadFile(csvFilePath)

// the delivery definition then declares its target as an external source:

        <targets fromExternalSource='true'>
          ...
        </targets>





But in this case, the delivery might require a lot of memory to process.
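As a small sketch of the in-memory variant: turn the loaded CSV (one customer ID per line) into the <targets fromExternalSource='true'> element of the delivery context. The <recipient account='...'> child shape here is purely illustrative - the real element layout depends on the delivery's target mapping in your instance:

```javascript
// Sketch only: converts in-memory CSV text into a hypothetical
// external-source targets element. Child element names are assumed.
function csvToTargetsXml(csvText) {
  // one customer ID per line; drop blank lines
  var rows = csvText.split('\n')
    .map(function (line) { return line.trim(); })
    .filter(function (line) { return line.length > 0; });
  var children = rows.map(function (id) {
    return "  <recipient account='" + id + "'/>";
  });
  return "<targets fromExternalSource='true'>\n" +
         children.join('\n') +
         "\n</targets>";
}
```

Because the whole target list lives in the context, memory use grows with list size, which is the drawback noted above.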


Community Advisor


Create a schema to buffer incoming requests, then use its OOTB CRUD API to insert rows into it.

A periodic workflow can then read the table and send emails using ordinary deliveries, at whatever interval matches the use case.
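For the insert side of this approach, a sketch of the xtk:session#Write SOAP envelope the external application could send per buffered request. The staging schema name cus:emailRequest and its attributes are hypothetical - substitute your own schema:

```javascript
// Sketch only: builds a Write (xtk:session#Write) envelope inserting one
// row into a hypothetical staging schema cus:emailRequest.
function buildWriteEnvelope(sessionToken, customerId) {
  return [
    '<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"',
    '                  xmlns:urn="urn:xtk:session">',
    '  <soapenv:Body>',
    '    <urn:Write>',
    '      <urn:sessiontoken>' + sessionToken + '</urn:sessiontoken>',
    '      <urn:domDoc>',
    // _operation="insert" asks the CRUD API to create the row
    '        <emailRequest xtkschema="cus:emailRequest" _operation="insert"',
    '                      account="' + customerId + '"/>',
    '      </urn:domDoc>',
    '    </urn:Write>',
    '  </soapenv:Body>',
    '</soapenv:Envelope>'
  ].join('\n');
}
```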

NB: For a high-performance variant of this, the number of staging tables can be scaled from one to many, with a process that manages the tables and balances load across them, and workers processing them. I've implemented something similar for a very high-volume, low-latency project.
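A toy sketch of the load-balancing idea: route each incoming customer ID to one of N staging tables with a stable hash, so independent workers can each drain their own table without contention. The table-name prefix is illustrative:

```javascript
// Sketch only: stable hash routing of a customer ID to one of tableCount
// staging tables. Same ID always lands in the same table.
function stagingTableFor(customerId, tableCount) {
  var hash = 0;
  for (var i = 0; i < customerId.length; i++) {
    hash = (hash * 31 + customerId.charCodeAt(i)) >>> 0; // keep unsigned 32-bit
  }
  return 'cusEmailRequest_' + (hash % tableCount);
}
```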