Asynchronous upload to S3 bucket

keerthana_hn
Level 2

21-06-2021

Hi all,

 

The requirement is to upload files to S3 asynchronously, but batch-wise rather than all at once. For example, if the user uploads 100 files, the first 10 files should be uploaded and the lists of uploaded and failed files displayed; then the next 10 files should be uploaded to S3, and so on.
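The batching described here can be sketched with a small helper that splits the file list into fixed-size groups (a minimal sketch; the function name is illustrative, not from the code below):

```javascript
// Split a list of items into fixed-size batches,
// e.g. 100 file names with size 10 -> 10 batches of 10.
function chunk(items, size) {
    const batches = [];
    for (let i = 0; i < items.length; i += size) {
        batches.push(items.slice(i, i + size));
    }
    return batches;
}

console.log(chunk(["a", "b", "c", "d", "e"], 2)); // [["a","b"],["c","d"],["e"]]
```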

 

for (let i = 0; i < filesNames.length; i++) {
    var dataItems = new FormData();
    dataItems.append("fileName", filesNames[i]);
    dataItems.append("jobID", jobID);
    $.ajax({
        type: "POST",
        url: getSignedUrl,
        data: dataItems,
        contentType: false,
        cache: false,
        processData: false,
        timeout: 1200000,
        success: function(data) {
            fileUploadStatus = "Upload In Progress";
            progressListDisplay(i, fileUploadStatus);
            uploadAsset(data, i);
            form.delay(500).removeClass(CSS_LOADING).addClass(CSS_SHARE_SUCCESS);
        },
        error: function() {
            form.delay(500).removeClass(CSS_LOADING).addClass(CSS_SHARE_ERROR);
        }
    });
}

function uploadAsset(data, i) {
    progressBar.style.display = "block";
    var startUploadTime = new Date();
    var endUploadTime;
    var url = data.signedUrl;
    $.ajax({
        type: 'PUT',
        async: true,
        url: url,
        data: fileList[i],
        contentType: false,
        cache: false,
        processData: false,
        timeout: 1200000,
        success: function() {
            totalUploadSuccessor(startUploadTime, endUploadTime, fileList[i].name, jobID, true, i);
            form.delay(500).removeClass(CSS_LOADING).addClass(CSS_SHARE_SUCCESS);
        },
        error: function() {
            endUploadTime = new Date();
            fileUploadStatus = "Upload Failed";
            totalUploadSuccessor(startUploadTime, endUploadTime, fileList[i].name, jobID, false, i);
            form.delay(500).removeClass(CSS_LOADING).addClass(CSS_SHARE_ERROR);
            changeUploadStatus(i, fileUploadStatus, startUploadTime, endUploadTime);
        }
    });
}

function totalUploadSuccessor(startUploadTime, endUploadTime, filename, jobID, message, i) {
    fileCount++;
    if (message) {
        $.ajax({
            type: 'POST',
            async: true,
            url: getUploadFinishUrl,
            data: {
                filename: filename,
                jobID: jobID,
                allUploadFinish: fileList.length === fileCount
            },
            cache: false,
            timeout: 1200000,
            success: function() {
                endUploadTime = new Date();
                fileUploadStatus = "Upload Successful";
                form.delay(500).removeClass(CSS_LOADING).addClass(CSS_SHARE_SUCCESS);
                changeUploadStatus(i, fileUploadStatus, startUploadTime, endUploadTime);
            },
            error: function() {
                endUploadTime = new Date();
                fileUploadStatus = "Upload Failed";
                form.delay(500).removeClass(CSS_LOADING).addClass(CSS_SHARE_ERROR);
                changeUploadStatus(i, fileUploadStatus, startUploadTime, endUploadTime);
            }
        });
    }
}

 

This code uploads all the files at once; it needs to be modified so that the upload happens batch-wise.
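One way to get the batch-wise behaviour, as a minimal sketch: wrap the signed-URL POST plus the S3 PUT from the snippet above in a single `uploadOne` function (a hypothetical name, not defined here), then process the file names ten at a time, waiting for each batch to settle before starting the next:

```javascript
// Sketch: upload files in batches. `uploadOne(name)` stands in for the
// signed-URL request + S3 PUT pair; it should resolve to true on success
// and false (or reject) on failure.
async function uploadInBatches(fileNames, batchSize, uploadOne) {
    const uploaded = [];
    const failed = [];
    for (let i = 0; i < fileNames.length; i += batchSize) {
        const batch = fileNames.slice(i, i + batchSize);
        // Upload the current batch in parallel, but wait for the whole
        // batch to finish before starting the next one.
        const results = await Promise.all(
            batch.map(name => Promise.resolve()
                .then(() => uploadOne(name))
                .catch(() => false))
        );
        results.forEach((ok, j) => (ok ? uploaded : failed).push(batch[j]));
        // At this point the UI can display `uploaded` and `failed`
        // before the next batch starts.
    }
    return { uploaded, failed };
}
```

Calling it as `uploadInBatches(filesNames, 10, uploadOne)` would give the "10 at a time, then show uploaded/failed lists" flow described in the question.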

 

Thank you.

Accepted Solutions (1)

Dipti_Chauhan
Level 6

22-06-2021

Hi @keerthana_hn,

Did you try exploring this new feature: https://aws.amazon.com/blogs/aws/new-amazon-s3-batch-operations/

 

Thanks

Dipti

 

Answers (1)

shelly-goel
MVP

26-06-2021

@keerthana_hn You can leverage the s3-batch-upload npm module for this:

https://www.npmjs.com/package/s3-batch-upload