I am using an AEM 6.5 server and updating node properties on 12K nodes. I am getting a java.lang.OutOfMemoryError: Java heap space error.
- I have tried the approaches below, but with no luck:
- Using a synchronized block, saving the session, and calling resolver.commit() after updating the JCR properties. I still face the same issue.
- I also tried to increase the heap size, but when I set it above 1024 MB (e.g. -XX:MaxPermSize=512M -Xmx2048m), the server does not start at all.
- While searching, I found the documentation below, which suggests setting oak.update.limit=10000 (https://cqdump.wordpress.com/2017/08/30/aem-transaction-size-or-do-a-save-every-1000-nodes/),
but I don't know where this property should be added.
How can I resolve this issue? Please guide me. Thanks in advance.
Increasing the heap size would be the best recommendation here; can you specify the errors you see when increasing the heap? We generally advise that your allocated heap does not exceed 50% of your total RAM.
Related to oak.update.limit, you can check https://helpx.adobe.com/in/experience-manager/kb/performance-tuning-tips.html#TuningyourOakRepositor... for further recommendations.
Add the JVM parameter below in the AEM start script to prevent expensive operations from overloading the system:
-Dupdate.limit=250000 (only for DocumentNodeStore, e.g. MongoMK, RDBMK)
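For a standalone quickstart install, JVM parameters like this are typically appended to CQ_JVM_OPTS in the start script; the exact file and variable depend on your setup (this is a hypothetical excerpt, adjust the path and existing options to match your install):

```shell
# crx-quickstart/bin/start (or start.bat on Windows) -- illustrative excerpt.
# Append the Oak update limit to whatever JVM options are already configured;
# this only takes effect on DocumentNodeStore-based repositories (MongoMK, RDBMK).
CQ_JVM_OPTS="${CQ_JVM_OPTS} -Dupdate.limit=250000"
```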
If you really need to update 12k nodes in a single transaction, you will have to increase the heap, because the whole transaction must fit into it. 1 GB of heap might be sufficient for DEV systems, but for PROD systems you should increase it.
Do you really need to update these 12k nodes in a single transaction? Isn't it possible to split it up?
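Splitting it up could look like the sketch below. This is not AEM-specific API; the commit callback stands in for session.save() (or resourceResolver.commit()) in real code, and BatchedUpdater and updateInBatches are hypothetical names, so treat this as a minimal pattern rather than a drop-in implementation:

```java
import java.util.List;
import java.util.function.Consumer;

// Sketch: apply an update to each item, committing after every `batchSize`
// items instead of holding all 12k changes in one transaction. In AEM the
// `commit` callback would wrap session.save() / resourceResolver.commit().
public class BatchedUpdater {
    public static <T> int updateInBatches(List<T> items, int batchSize,
                                          Consumer<T> update, Runnable commit) {
        int commits = 0;
        int pending = 0;
        for (T item : items) {
            update.accept(item);        // e.g. node.setProperty(...)
            pending++;
            if (pending >= batchSize) {
                commit.run();           // session.save() in real AEM code
                commits++;
                pending = 0;
            }
        }
        if (pending > 0) {              // flush any remainder
            commit.run();
            commits++;
        }
        return commits;
    }
}
```

With a batch size of 1,000, the 12k updates from the question would produce 12 small commits, so only one batch of pending changes has to fit in the heap at a time.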
1. You are not able to set a heap size higher than 1 GB. Please verify whether there are any other system limitations; that is, how much memory does your machine have, and how much of it is free for AEM?
2. If you have sufficient memory, try setting 4 GB on a fresh AEM instance to confirm that the issue is something other than the heap size. Many developers set more than 1 GB locally.
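One thing worth checking before the 4 GB test: -XX:MaxPermSize (used in the question) was removed after Java 8, and on Java 11 (which AEM 6.5 supports) an unrecognized -XX option makes the JVM refuse to start, which could explain why the server will not start with the larger heap. A hypothetical start-script excerpt with current flags (paths and existing options will differ per install):

```shell
# crx-quickstart/bin/start -- illustrative excerpt for the 4 GB test.
# On Java 8+ use MaxMetaspaceSize; do not pass -XX:MaxPermSize on Java 11.
CQ_JVM_OPTS="-server -Xmx4g -XX:MaxMetaspaceSize=512m ${CQ_JVM_OPTS}"
```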
With regards to the 12k updates: I have done something similar in the past, but always chunked the transactions into roughly 1,000 updates/inserts per commit. Ideally this is a configurable number in your logic, so you can test what is optimal for your scenario.
So, in short, you may want to try a higher heap size and a reduced number of updates per commit/save.