
How to resolve the java.lang.OutOfMemoryError: Java heap space error while updating more than 10,000 node properties?


nareshkumarpart

28-06-2020

I am using an AEM 6.5 server and updating node properties on 12K nodes, and I am getting a java.lang.OutOfMemoryError: Java heap space error.

- I have tried the ways below, but no luck:

- Using a synchronized block, saving the session, and calling resolver.commit() after updating the JCR properties. Even then I face the same issue.

- I also tried to increase the heap size, but when I increase it to more than 1024 MB

     -XX:MaxPermSize=512M -Xmx2048m... the server does not start.

 

- While searching Google, I found the documentation below, which suggests setting oak.update.limit=10000 (https://cqdump.wordpress.com/2017/08/30/aem-transaction-size-or-do-a-save-every-1000-nodes/),

but I don't know where this property should be added.

 

How can I resolve this issue? Please guide me. Thanks in advance.

Replies


toimrank

28-06-2020

Are you creating multiple ResourceResolver objects, one for every node update?


vanegi
Employee

28-06-2020

Hi Naresh,

Increasing the heap size would be the best recommendation here. Can you specify the errors you see when increasing the heap? We generally advise that your allocated heap does not exceed 50% of your total RAM.

 

Regarding oak.update.limit, you can check https://helpx.adobe.com/in/experience-manager/kb/performance-tuning-tips.html#TuningyourOakRepositor... for further recommendations.

 

 

Add the JVM parameter below in the AEM start script to prevent large transactions from overloading the system:

 

-Dupdate.limit=250000 (only for DocumentNodeStore, e.g. MongoMK, RDBMK)
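To answer the "where is this property added" question above: on a quickstart installation, JVM flags like this are typically appended to the CQ_JVM_OPTS variable in the AEM start script, crx-quickstart/bin/start (or start.bat on Windows). The excerpt below is a sketch; the heap and limit values are only examples, not recommendations:

```shell
# crx-quickstart/bin/start (excerpt) -- values are illustrative
CQ_JVM_OPTS='-server -Xmx2048m -Djava.awt.headless=true -Dupdate.limit=250000'
```

If you start AEM directly from the jar or as a service, pass the same -D flag on the java command line instead.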

 

 

Thanks,

Vaishali

Jörg_Hoh

Employee

28-06-2020

If you really need to update 12k nodes in a single transaction, you should increase the heap, because the whole transaction must fit into it. 1 GB of heap might be sufficient for DEV systems, but for PROD systems you should increase it.

Do you really need to update these 12k nodes in a single transaction? Isn't it possible to split it up?


nareshkumarpart

28-06-2020

Yes, I need to update the 12k records. Based on this property, I run query operations to fetch search results.


tatvam

28-06-2020

1. You are not able to set a heap size higher than 1 GB. Please verify whether there are other system limitations: how much memory does your machine have, and how much of it is free for AEM?

2. If you have sufficient memory, try setting 4 GB on a fresh AEM instance to confirm the issue is something other than the heap size. Many developers normally set more than 1 GB locally.

 

With regard to the 12k transactions: I have done something similar in the past, but I always chunk the transactions to around 1000 updates/inserts per commit. Ideally this is a configurable number in your logic, so you can test what is optimal for your scenario.

 

So, in short, you might want to try a higher heap size and a reduced number of updates per commit/save.
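The chunked-commit pattern described above can be sketched in plain Java. Since the JCR/Sling APIs are not available outside AEM, the session.save() / resolver.commit() call is represented here by a hypothetical commit callback, and BatchUpdater / updateInBatches are illustrative names, not AEM APIs:

```java
import java.util.List;
import java.util.function.Consumer;

public class BatchUpdater {

    // Applies `update` to every item, invoking `commit` after each full batch
    // so the transient space never holds more than `batchSize` pending changes.
    // Returns the number of commits performed.
    public static <T> int updateInBatches(List<T> items, int batchSize,
                                          Consumer<T> update, Runnable commit) {
        int commits = 0;
        for (int i = 0; i < items.size(); i++) {
            update.accept(items.get(i));
            // Commit after every batchSize items.
            if ((i + 1) % batchSize == 0) {
                commit.run();
                commits++;
            }
        }
        // Flush any remainder that did not fill a full batch.
        if (items.size() % batchSize != 0) {
            commit.run();
            commits++;
        }
        return commits;
    }
}
```

In AEM code, `update` would set the property on one node and `commit` would call session.save() (or resolver.commit()); with 12,000 nodes and a batch size of 1,000 this yields 12 small saves instead of one huge transaction that must fit in the heap.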


nareshkumarpart

29-06-2020

Thanks, Tatvam. I am now managing the transactions in code, committing every 1000 nodes.