
S3 Datastore access denied error


Level 1

Whenever we run a workflow to push an asset to S3, we hit the error below.


16.09.2024 19:01:25.771 *ERROR* [oak-ds-async-upload-thread-10] org.apache.jackrabbit.oak.plugins.blob.UploadStagingCache Error adding file to backend
org.apache.jackrabbit.core.data.DataStoreException: Could not upload 7432-595bcfab9c3dac5e033ebaf49f776f6767bde8d064274f215dbb43242df8
	at org.apache.jackrabbit.oak.blob.cloud.s3.S3Backend.write(S3Backend.java:334) [org.apache.jackrabbit.oak-blob-cloud:1.10.1]
	at org.apache.jackrabbit.oak.plugins.blob.AbstractSharedCachingDataStore$2.write(AbstractSharedCachingDataStore.java:173) [org.apache.jackrabbit.oak-blob-plugins:1.22.16]
	at org.apache.jackrabbit.oak.plugins.blob.UploadStagingCache$3.call(UploadStagingCache.java:367) [org.apache.jackrabbit.oak-blob-plugins:1.22.16]
	at org.apache.jackrabbit.oak.plugins.blob.UploadStagingCache$3.call(UploadStagingCache.java:362) [org.apache.jackrabbit.oak-blob-plugins:1.22.16]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
16.09.2024 19:01:25.771 *ERROR* [oak-ds-async-upload-thread-10] org.apache.jackrabbit.oak.plugins.blob.UploadStagingCache Error adding [7432595bcfab9c3dac5e033ebaf49f776f6767bde8d064274f215dbb43242df8] with file [/apps/adobe/aem6.5/author/crx-quickstart/repository/repository/datastore/upload/74/32/59/7432595bcfab9c3dac5e033ebaf49f776f6767bde8d064274f215dbb43242df8] to backend
org.apache.jackrabbit.core.data.DataStoreException: Could not upload 7432-595bcfab9c3dac5e033ebaf49f776f6767bde8d064274f215dbb43242df8
	at org.apache.jackrabbit.oak.blob.cloud.s3.S3Backend.write(S3Backend.java:334) [org.apache.jackrabbit.oak-blob-cloud:1.10.1]
	at org.apache.jackrabbit.oak.plugins.blob.AbstractSharedCachingDataStore$2.write(AbstractSharedCachingDataStore.java:173) [org.apache.jackrabbit.oak-blob-plugins:1.22.16]
	at org.apache.jackrabbit.oak.plugins.blob.UploadStagingCache$3.call(UploadStagingCache.java:367) [org.apache.jackrabbit.oak-blob-plugins:1.22.16]
	at org.apache.jackrabbit.oak.plugins.blob.UploadStagingCache$3.call(UploadStagingCache.java:362) [org.apache.jackrabbit.oak-blob-plugins:1.22.16]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: DFDFEWDEFFF; S3 Extended Request ID: sdfvds/sdfdsf+Goa8uV5CZSw+sdfsdf/sdfsd=)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1632) [com.amazonaws.aws-java-sdk-osgi:1.11.330]
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1304) [com.amazonaws.aws-java-sdk-osgi:1.11.330]
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1058) [com.amazonaws.aws-java-sdk-osgi:1.11.330]
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:743) [com.amazonaws.aws-java-sdk-osgi:1.11.330]
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:717) [com.amazonaws.aws-java-sdk-osgi:1.11.330]
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699) [com.amazonaws.aws-java-sdk-osgi:1.11.330]
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667) [com.amazonaws.aws-java-sdk-osgi:1.11.330]
	at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649) [com.amazonaws.aws-java-sdk-osgi:1.11.330]
	at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513) [com.amazonaws.aws-java-sdk-osgi:1.11.330]
	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4365) [com.amazonaws.aws-java-sdk-osgi:1.11.330]
	at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4312) [com.amazonaws.aws-java-sdk-osgi:1.11.330]
	at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:1755) [com.amazonaws.aws-java-sdk-osgi:1.11.330]
	at com.amazonaws.services.s3.transfer.internal.UploadCallable.uploadInOneChunk(UploadCallable.java:133) [com.amazonaws.aws-java-sdk-osgi:1.11.330]
	at com.amazonaws.services.s3.transfer.internal.UploadCallable.call(UploadCallable.java:125) [com.amazonaws.aws-java-sdk-osgi:1.11.330]
	at com.amazonaws.services.s3.transfer.internal.UploadMonitor.call(UploadMonitor.java:143) [com.amazonaws.aws-java-sdk-osgi:1.11.330]
	at com.amazonaws.services.s3.transfer.internal.UploadMonitor.call(UploadMonitor.java:48) [com.amazonaws.aws-java-sdk-osgi:1.11.330]
	... 4 common frames omitted
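For context, the root cause in the trace is a 403 AccessDenied on the S3 putObject call. AEM's S3 connector reads its credentials and target bucket from crx-quickstart/install/org.apache.jackrabbit.oak.plugins.blob.datastore.S3DataStore.config, so it is worth confirming those values are the ones you expect. A typical file looks roughly like this (all values below are placeholders, not taken from this environment):

```
accessKey="<accessKey>"
secretKey="<secretKey>"
s3Bucket="<bucketName>"
s3Region="<region>"
connectionTimeout="120000"
socketTimeout="120000"
maxConnections="40"
writeThreads="30"
```

If the instance authenticates via an EC2 instance profile instead, accessKey/secretKey are left empty and the role attached to the instance must grant the bucket access.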
2 Replies


Community Advisor

@tejgkingz Can you please check the following:

1. Is this S3 bucket managed by Adobe, or is it in your own AWS account?

2. Check the permissions on the bucket (and prefix) the file is supposed to be uploaded to.

3. Try creating a file in the bucket manually, using the same credentials AEM runs with, to confirm they actually grant write access.

4. Have you created a service account to run the workflow?
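Step 3 can be scripted with the AWS CLI from the AEM host. A sketch, assuming the CLI resolves the same credentials AEM uses; the bucket name below is a placeholder:

```shell
# Show which IAM identity the current credentials resolve to
aws sts get-caller-identity

# Attempt the same kind of write the datastore performs
# (my-aem-datastore-bucket is a placeholder -- use your real bucket)
echo "permission test" > /tmp/ds-perm-test.txt
aws s3api put-object --bucket my-aem-datastore-bucket --key test/ds-perm-test.txt --body /tmp/ds-perm-test.txt
aws s3api delete-object --bucket my-aem-datastore-bucket --key test/ds-perm-test.txt
```

If put-object succeeds here but AEM still gets a 403, the two are probably running under different identities (for example an instance profile vs. configured access keys).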


Level 1

@diksha_mishra I have already been through those steps:


1. AWS owns it

2. Policies and roles look good; there have been no recent changes.

3. I can create or modify files using the same role.

4. Yes
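For comparison, the Oak S3 data store generally needs read, write, delete, and list permissions on the bucket. A sketch of a matching policy statement follows; the bucket name is a placeholder and the exact action list can vary by Oak version, so treat this as indicative rather than definitive:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject",
        "s3:ListBucket",
        "s3:GetBucketLocation",
        "s3:AbortMultipartUpload",
        "s3:ListBucketMultipartUploads"
      ],
      "Resource": [
        "arn:aws:s3:::my-aem-datastore-bucket",
        "arn:aws:s3:::my-aem-datastore-bucket/*"
      ]
    }
  ]
}
```

Note that a 403 can also come from an explicit Deny in the bucket policy or an organization SCP, or from missing KMS permissions (e.g. kms:GenerateDataKey) when the bucket uses SSE-KMS, even when the role's own policy looks correct.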


Thank you!