Hi all,
We are seeing increased data transfer on our S3 bucket, which is driving up cost. When I checked the logs, I found the entry below. I tried to find a solution but could not see anything relevant. Initially the data store garbage collection failed, but it later ran successfully. There are also no missing blobs; I ran the consistency check as well.
GET /bin/acs-commons/jcr-compare.dump.json.servlet.css/1.ico HTTP/1.1] org.apache.jackrabbit.oak.plugins.blob.DataStoreCacheUtils Deleted file [/crx/aem/author/crx-quickstart/repository/datastore/download/88/e2/76/8
How can we stop this activity? Please provide your suggestions.
Hi @user65294,
You mentioned that the garbage collection initially failed but later ran successfully. This process is crucial for cleaning up unused blobs in the Data Store, which helps manage storage costs. The log entry you provided shows two things: a request to a servlet, possibly related to the ACS Commons JCR Compare tool, which can generate data traffic, and DataStoreCacheUtils evicting a file from the local datastore download cache. When files are evicted from that cache, subsequent reads of those binaries have to be fetched from S3 again, which shows up as outbound data transfer.
To identify the root cause, please check the details below:
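First, check where the jcr-compare requests are coming from. If they are external hits (a crawler or a monitoring tool calling the servlet), one way to stop that traffic is to deny the path in the dispatcher filter section. A minimal sketch, assuming a dispatcher fronts this instance; the rule number /0101 is arbitrary and the path is taken from your log:

/filter {
  /0101 { /type "deny" /url "/bin/acs-commons/jcr-compare*" }
}

This only helps if the requests pass through the dispatcher. If they originate from an internal scheduled job or integration, disable or reschedule that job instead.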
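Second, check the local datastore cache size. The DataStoreCacheUtils eviction in your log suggests the local cache may be too small for your working set, so binaries keep getting evicted and re-downloaded from S3, and those re-downloads are billed as transfer. A minimal sketch of crx-quickstart/install/org.apache.jackrabbit.oak.plugins.blob.datastore.S3DataStore.config, with illustrative values only (keep your real credentials and bucket; cacheSize is in bytes, here roughly 60 GB):

accessKey="<access-key>"
secretKey="<secret-key>"
s3Bucket="<bucket-name>"
s3Region="us-east-1"
path="crx-quickstart/repository/datastore"
cacheSize="64424509440"

A restart is needed for data store configuration changes to take effect. Since your consistency check found no missing blobs and garbage collection completed, the cost is most likely read traffic rather than anything GC-related.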
Thanks
Pranay
Anyone have an idea on this?
Hi @user65294,
Please let us know if you need any additional information to resolve the issue.