SOLVED

What is the maximum file size for the Data loading activity, and how can the time limit for uploading data to the Campaign server be increased?

Level 2

Hi Team,

We are uploading 4.5 GB of data to the Campaign server using the Data loading activity, and we then need to process the file and insert the data into a schema.

We are encountering errors while uploading the data to the server.

We also tried moving the file to the Campaign server over SSH and using a File collector to process it, but that workflow also takes a very long time to execute.

Is there an option for increasing the time limit in the Data loading activity?

Is there a maximum file size restriction on the Campaign server?

Please let us know the best practice for processing bulk files.

1 Accepted Solution

Correct answer by
Employee Advisor

Hi,

Uploading huge files to the server through the File upload functionality is not recommended.

Instead, place the file on the sFTP server and then use a File collector activity to hand it to the Data loading activity.
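For illustration, here is a minimal sketch of pushing the file to the sFTP location that the File collector watches, using Python with paramiko; the host, credentials, and paths are placeholder assumptions, not Campaign specifics:

```python
# Minimal sketch: upload a large file to the sFTP directory that the
# File collector polls. Host, credentials, and paths are hypothetical
# placeholders -- substitute your own.
import paramiko

HOST = "sftp.example.com"          # assumption: your sFTP host
USER = "campaign"                  # assumption: your sFTP account
LOCAL = "/data/export/big_file.csv"
REMOTE = "/incoming/big_file.csv"  # directory the File collector watches

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(HOST, username=USER, password="***")

sftp = ssh.open_sftp()
sftp.put(LOCAL, REMOTE)  # streams the file; no upload size limit on this path
sftp.close()
ssh.close()
```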

Since the file is huge, it will take some time to process. The only practical way to reduce that time is to break the file into smaller chunks and use multiple workflows, each processing one chunk. This gives you a degree of parallelism: the data is still inserted into the same database table, but you should see a performance improvement from the concurrent workflows.
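As a sketch of the chunking step, the snippet below splits a large CSV into fixed-size pieces, repeating the header in each piece so every workflow can load its file independently. The file names and chunk size are assumptions for illustration:

```python
# Minimal sketch: split a large CSV into N-line chunks, repeating the
# header in each chunk so each workflow can process its file alone.
# SOURCE and CHUNK_LINES are hypothetical placeholders.
CHUNK_LINES = 1_000_000
SOURCE = "/data/export/big_file.csv"

with open(SOURCE, "r", encoding="utf-8") as src:
    header = src.readline()
    part, lines = 0, []
    for line in src:
        lines.append(line)
        if len(lines) >= CHUNK_LINES:
            with open(f"{SOURCE}.part{part:03d}", "w", encoding="utf-8") as out:
                out.write(header)
                out.writelines(lines)
            part, lines = part + 1, []
    if lines:  # flush the final, shorter chunk
        with open(f"{SOURCE}.part{part:03d}", "w", encoding="utf-8") as out:
            out.write(header)
            out.writelines(lines)
```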

When the file is delivered to the Data loading activity via sFTP, there is no file size restriction.

Hope this helps
