What is the maximum file size for the Data Loading activity, and how can the time limit for uploading data to the Campaign server be increased?

nithyanandhanm4
Level 2

02-01-2017

Hi Team,

We are uploading 4.5 GB of data to the Campaign server using the Data Loading activity. We also need to process the file and insert the data into a schema.

We are encountering errors while uploading the data to the server.

We also tried moving the file to the Campaign server over SSH and using a File Collector to process it, but that workflow also takes far too long to execute.

Is there any option to increase the time limit in the Data Loading activity?

Is there a maximum file size restriction on the Campaign server?

Please let us know the best practice for processing bulk files.

(Attached screenshot: 1095352_pastedImage_1.jpg)

Accepted Solutions (1)

Vapsy
Employee

02-01-2017

Hi,

Uploading huge files to the server via the File Upload functionality is not recommended.

Instead, place the file on the sFTP server and then use a File Collector to hand it off to the Data Loading activity.

Since the file is huge, it will take some time to process. The only practical way to reduce that time is to break the file into smaller chunks and run multiple workflows.

Use each workflow to process one chunk. This gives you some parallelism, even though the data is ultimately inserted into the same database/table; you should still see a performance improvement with concurrent workflows.
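The split-and-parallelize idea can be sketched outside Campaign as a small Python script that breaks a large delimited file into line-based chunks and processes the chunks concurrently. This is only an illustration of the approach, not a Campaign feature: the chunk size, file names, and the placeholder `process_chunk` step are all assumptions for the example.

```python
import concurrent.futures
import os

def split_file(path, lines_per_chunk, out_dir):
    """Split a large delimited file into chunks of at most
    lines_per_chunk lines each; return the list of chunk paths."""
    os.makedirs(out_dir, exist_ok=True)
    chunk_paths = []
    with open(path, "r", encoding="utf-8") as src:
        chunk, idx = [], 0
        for line in src:
            chunk.append(line)
            if len(chunk) == lines_per_chunk:
                chunk_paths.append(_write_chunk(chunk, out_dir, idx))
                chunk, idx = [], idx + 1
        if chunk:  # flush the final, possibly shorter, chunk
            chunk_paths.append(_write_chunk(chunk, out_dir, idx))
    return chunk_paths

def _write_chunk(lines, out_dir, idx):
    chunk_path = os.path.join(out_dir, "chunk_%03d.txt" % idx)
    with open(chunk_path, "w", encoding="utf-8") as dst:
        dst.writelines(lines)
    return chunk_path

def process_chunk(chunk_path):
    # Placeholder for the per-workflow work (e.g. inserting rows
    # into the schema); here it just counts the lines in the chunk.
    with open(chunk_path, encoding="utf-8") as f:
        return sum(1 for _ in f)

def process_in_parallel(chunk_paths, max_workers=4):
    # One "workflow" per chunk, run concurrently, mirroring the
    # multiple-workflow approach described above.
    with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(process_chunk, chunk_paths))
```

In Campaign itself the equivalent would be several workflows, each pointed at one chunk file on the sFTP server; the script just makes the chunking and concurrency explicit.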

When the file is delivered via sFTP and fed to the Data Loading activity this way, there is no file size restriction.

Hope this helps.

Answers (0)