
This conversation has been locked due to inactivity. Please create a new post.

SOLVED

Asynchronous upload to s3 bucket


Level 5

Hi all,

 

The requirement is to upload files to S3 asynchronously. Instead of uploading all files at once, they should be uploaded batchwise. For example, if the user uploads 100 files, the first 10 files are uploaded and the list of uploaded and failed files is displayed; then the next 10 files are uploaded to S3, and so on.

 

// Request a signed URL for each file, then PUT the file to S3.
for (let i = 0; i < filesNames.length; i++) {
    var dataItems = new FormData();
    dataItems.append("fileName", filesNames[i]);
    dataItems.append("jobID", jobID);
    $.ajax({
        type: "POST",
        url: getSignedUrl,
        data: dataItems,
        contentType: false,
        cache: false,
        processData: false,
        timeout: 1200000,
        success: function(data) {
            fileUploadStatus = "Upload In Progress";
            progressListDisplay(i, fileUploadStatus);
            uploadAsset(data, i);
            form.delay(500).removeClass(CSS_LOADING).addClass(CSS_SHARE_SUCCESS);
        },
        error: function() {
            form.delay(500).removeClass(CSS_LOADING).addClass(CSS_SHARE_ERROR);
        }
    });
}

// PUT the file to the signed URL returned by the server.
function uploadAsset(data, i) {
    progressBar.style.display = "block";
    var startUploadTime = new Date();
    var endUploadTime;
    var url = data.signedUrl;
    $.ajax({
        type: 'PUT',
        async: true,
        url: url,
        data: fileList[i],
        contentType: false,
        cache: false,
        processData: false,
        timeout: 1200000,
        success: function() {
            totalUploadSuccessor(startUploadTime, endUploadTime, fileList[i].name, jobID, true, i);
            form.delay(500).removeClass(CSS_LOADING).addClass(CSS_SHARE_SUCCESS);
        },
        error: function() {
            endUploadTime = new Date();
            fileUploadStatus = "Upload Failed";
            totalUploadSuccessor(startUploadTime, endUploadTime, fileList[i].name, jobID, false, i);
            form.delay(500).removeClass(CSS_LOADING).addClass(CSS_SHARE_ERROR);
            changeUploadStatus(i, fileUploadStatus, startUploadTime, endUploadTime);
        }
    });
}

// Notify the server that a file finished uploading and update the status list.
function totalUploadSuccessor(startUploadTime, endUploadTime, filename, jobID, success, i) {
    fileCount++;
    if (success) {
        $.ajax({
            type: 'POST',
            async: true,
            url: getUploadFinishUrl,
            data: {
                filename: filename,
                jobID: jobID,
                allUploadFinish: fileList.length === fileCount
            },
            cache: false,
            timeout: 1200000,
            success: function() {
                endUploadTime = new Date();
                fileUploadStatus = "Upload Successful";
                form.delay(500).removeClass(CSS_LOADING).addClass(CSS_SHARE_SUCCESS);
                changeUploadStatus(i, fileUploadStatus, startUploadTime, endUploadTime);
            },
            error: function() {
                endUploadTime = new Date();
                fileUploadStatus = "Upload Failed";
                form.delay(500).removeClass(CSS_LOADING).addClass(CSS_SHARE_ERROR);
                changeUploadStatus(i, fileUploadStatus, startUploadTime, endUploadTime);
            }
        });
    }
}

 

This code uploads all the files at once; it needs to be modified so that the upload happens batchwise.
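One way to get the batchwise behaviour described above (a sketch only, not using the exact helpers from the post): split the file list into chunks of 10 and wait for each chunk to settle before starting the next. The `uploadOne` parameter below is a hypothetical stand-in for the signed-URL fetch plus S3 PUT sequence.

```javascript
// Sketch: upload files in batches of `batchSize`, collecting per-file status.
// `uploadOne` is a stand-in for the real per-file upload (signed URL + PUT).
async function uploadInBatches(files, uploadOne, batchSize = 10) {
  const results = [];
  for (let start = 0; start < files.length; start += batchSize) {
    const batch = files.slice(start, start + batchSize);
    // Promise.allSettled waits for the whole batch and keeps a failed
    // upload from aborting the remaining uploads in the same batch.
    const settled = await Promise.allSettled(batch.map(uploadOne));
    settled.forEach((outcome, j) => {
      results.push({
        file: batch[j],
        status: outcome.status === "fulfilled" ? "Upload Successful" : "Upload Failed"
      });
    });
    // At this point the UI could render the uploaded/failed list for
    // this batch before the next batch starts.
  }
  return results;
}
```

The existing `$.ajax` calls can be wrapped in a Promise (or replaced with `fetch`) to serve as `uploadOne`; the per-batch pause is what produces the "10 at a time, then show the list" behaviour asked for.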

 

Thank you.

1 Accepted Solution

2 Replies


Correct answer by
Community Advisor

Hi @keerthana_hn 

Did you try exploring this new feature: https://aws.amazon.com/blogs/aws/new-amazon-s3-batch-operations/

 

Thanks

Dipti

 


Employee Advisor

@keerthana_hn You can leverage the s3-batch-upload npm module for this:

https://www.npmjs.com/package/s3-batch-upload