Adobe Employee
June 21, 2021
Solved

Asynchronous upload to S3 bucket

  • June 21, 2021
  • 2 replies
  • 1957 views

Hi all,

 

The requirement is to upload files to S3 asynchronously. Instead of uploading all files at once, they should be uploaded in batches. For example, if the user uploads 100 files, the first 10 files are uploaded and the list of uploaded and failed files is displayed; then the next 10 files are uploaded to S3, and so on.

 

for (let i = 0; i < filesNames.length; i++) {
    var dataItems = new FormData();
    dataItems.append("fileName", filesNames[i]);
    dataItems.append("jobID", jobID);

    // Ask the backend for a pre-signed URL for this file, then upload it.
    $.ajax({
        type: "POST",
        url: getSignedUrl,
        data: dataItems,
        contentType: false,
        cache: false,
        processData: false,
        timeout: 1200000,
        success: function(data) {
            fileUploadStatus = "Upload In Progress";
            progressListDisplay(i, fileUploadStatus);
            uploadAsset(data, i);
            form.delay(500).removeClass(CSS_LOADING).addClass(CSS_SHARE_SUCCESS);
        },
        error: function() {
            form.delay(500).removeClass(CSS_LOADING).addClass(CSS_SHARE_ERROR);
        }
    });
}

function uploadAsset(data, i) {
    progressBar.style.display = "block";
    var startUploadTime = new Date();
    var endUploadTime;
    var url = data.signedUrl;

    // PUT the raw file to the pre-signed S3 URL.
    $.ajax({
        type: "PUT",
        async: true,
        url: url,
        data: fileList[i],
        contentType: false,
        cache: false,
        processData: false,
        timeout: 1200000,
        success: function() {
            endUploadTime = new Date();
            totalUploadSuccessor(startUploadTime, endUploadTime, fileList[i].name, jobID, true, i);
            form.delay(500).removeClass(CSS_LOADING).addClass(CSS_SHARE_SUCCESS);
        },
        error: function() {
            endUploadTime = new Date();
            fileUploadStatus = "Upload Failed";
            totalUploadSuccessor(startUploadTime, endUploadTime, fileList[i].name, jobID, false, i);
            form.delay(500).removeClass(CSS_LOADING).addClass(CSS_SHARE_ERROR);
            changeUploadStatus(i, fileUploadStatus, startUploadTime, endUploadTime);
        }
    });
}

function totalUploadSuccessor(startUploadTime, endUploadTime, filename, jobID, message, i) {
    fileCount++;
    if (message) {
        // Tell the backend this file finished; flag when the whole set is done.
        $.ajax({
            type: "POST",
            async: true,
            url: getUploadFinishUrl,
            data: {
                filename: filename,
                jobID: jobID,
                allUploadFinish: fileList.length === fileCount
            },
            cache: false,
            timeout: 1200000,
            success: function() {
                endUploadTime = new Date();
                fileUploadStatus = "Upload Successful";
                form.delay(500).removeClass(CSS_LOADING).addClass(CSS_SHARE_SUCCESS);
                changeUploadStatus(i, fileUploadStatus, startUploadTime, endUploadTime);
            },
            error: function() {
                endUploadTime = new Date();
                fileUploadStatus = "Upload Failed";
                form.delay(500).removeClass(CSS_LOADING).addClass(CSS_SHARE_ERROR);
                changeUploadStatus(i, fileUploadStatus, startUploadTime, endUploadTime);
            }
        });
    }
}

 

This code currently uploads all the files at once; it has to be modified so that the upload happens batchwise, as sketched below.
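One way to do this is sketched below, assuming jQuery 3+ (whose jqXHR objects are Promise-compatible) and a browser with Promise.allSettled support. getSignedUrlForFile, putToS3, BATCH_SIZE, and uploadInBatches are hypothetical names wrapping the two $.ajax calls shown above; this is an illustrative sketch, not a drop-in replacement.

var BATCH_SIZE = 10;

// Hypothetical wrapper around the signed-URL POST above; returns a thenable jqXHR.
function getSignedUrlForFile(file) {
    var dataItems = new FormData();
    dataItems.append("fileName", file.name);
    dataItems.append("jobID", jobID);
    return $.ajax({
        type: "POST",
        url: getSignedUrl,
        data: dataItems,
        contentType: false,
        cache: false,
        processData: false
    });
}

// Hypothetical wrapper around the PUT to S3 above.
function putToS3(signedUrl, file) {
    return $.ajax({
        type: "PUT",
        url: signedUrl,
        data: file,
        contentType: false,
        cache: false,
        processData: false
    });
}

async function uploadInBatches(files) {
    for (var start = 0; start < files.length; start += BATCH_SIZE) {
        var batch = files.slice(start, start + BATCH_SIZE);

        // Wait until every upload in this batch has settled (succeeded or
        // failed) before reporting and moving on to the next batch.
        var results = await Promise.allSettled(
            batch.map(function(file) {
                return getSignedUrlForFile(file).then(function(data) {
                    return putToS3(data.signedUrl, file);
                });
            })
        );

        // Show this batch's uploaded/failed list before the next 10 start.
        results.forEach(function(result, j) {
            var status = result.status === "fulfilled" ? "Upload Successful" : "Upload Failed";
            progressListDisplay(start + j, status);
        });
    }
}

uploadInBatches(fileList);

Promise.allSettled is used rather than Promise.all so that one failed file does not abort the rest of its batch, and the per-batch status list can still be shown before the next 10 files begin.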

 

Thank you.


2 replies

Dipti_Chauhan
Community Advisor
Accepted solution
June 23, 2021

Hi @keerthana_h_n

Did you try exploring this new feature: https://aws.amazon.com/blogs/aws/new-amazon-s3-batch-operations/

 

Thanks

Dipti
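For context, S3 Batch Operations runs server-side over objects that already exist in S3 (copying, tagging, invoking a Lambda per object, and so on), so it would complement rather than replace the browser-side upload above. A rough sketch of creating a job with the AWS SDK for JavaScript v2 follows; every account ID, ARN, bucket name, and the manifest location is a placeholder.

var AWS = require("aws-sdk");
var s3control = new AWS.S3Control({ region: "us-east-1" });

s3control.createJob({
    AccountId: "111122223333",                                // placeholder account ID
    ConfirmationRequired: false,
    Priority: 10,
    RoleArn: "arn:aws:iam::111122223333:role/batch-ops-role", // placeholder IAM role
    Operation: {
        // Example operation: copy every object listed in the manifest.
        S3PutObjectCopy: { TargetResource: "arn:aws:s3:::target-bucket" }
    },
    Manifest: {
        Spec: {
            Format: "S3BatchOperations_CSV_20180820",
            Fields: ["Bucket", "Key"]
        },
        Location: {
            ObjectArn: "arn:aws:s3:::manifest-bucket/manifest.csv",
            ETag: "example-manifest-etag"                     // ETag of the manifest object
        }
    },
    Report: {
        Bucket: "arn:aws:s3:::report-bucket",
        Format: "Report_CSV_20180820",
        Enabled: true,
        ReportScope: "AllTasks"
    }
}, function(err, data) {
    if (err) console.error(err);
    else console.log("Created batch job:", data.JobId);
});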

 

shelly-goel
Adobe Employee
June 26, 2021

@keerthana_h_n You can leverage the s3-batch-upload npm module for this:

https://www.npmjs.com/package/s3-batch-upload
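A rough usage sketch based on the package README (note this module runs in Node.js, not in the browser; the bucket, paths, and config file below are placeholders):

const Uploader = require("s3-batch-upload").default;

new Uploader({
    config: "./config/configS3.json", // AWS credentials; can also come from environment variables
    bucket: "bucket-name",
    localPath: "./files-to-upload",
    remotePath: "uploads",
    glob: "*.*",                      // which local files to include
    concurrency: 100,                 // number of parallel uploads
    dryRun: false                     // true logs what would upload without uploading
})
    .upload()
    .then(function() {
        console.log("All files uploaded");
    });

Because it uploads from the local filesystem with configurable concurrency, it fits a server-side batching step rather than the in-browser flow in the question.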