
How to resolve the Java heap space OutOfMemoryError while updating more than 10,000 node properties?

nareshkumarpart
Level 2

28-06-2020

I am using an AEM 6.5 server and updating node properties on about 12K nodes, and I am getting a Java heap space OutOfMemoryError.

I have tried the ways below, but no luck:

- Using a synchronized block, saving the session and calling resolver.commit() after updating the JCR properties. I still face the same issue.

- Increasing the heap size. But when I increase it to more than 1024 MB, e.g.

     -XX:MaxPermSize=512M -Xmx2048m... the server does not start.

- While searching online I found documentation that suggests setting oak.update.limit=10000 (https://cqdump.wordpress.com/2017/08/30/aem-transaction-size-or-do-a-save-every-1000-nodes/), but I don't know where this property should be added.

How can I resolve this issue? Please guide me. Thanks in advance.

Replies

toimrank
Level 2

28-06-2020

Are you creating multiple ResourceResolver objects, one for every node update?

vanegi
Employee

28-06-2020

Hi Naresh,

Increasing the heap size is the best recommendation here. Can you share the errors you see when increasing the heap? As a general rule, the allocated heap should not exceed 50% of your total RAM.

Regarding oak.update.limit, see https://helpx.adobe.com/in/experience-manager/kb/performance-tuning-tips.html#TuningyourOakRepositor... for further recommendations.

Add the JVM parameter below in the AEM start script to prevent expensive operations from overloading the system:

-Dupdate.limit=250000 (only for DocumentNodeStore, e.g. MongoMK, RDBMK)

Thanks,

Vaishali
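On a typical quickstart install, JVM flags like the one above go into the `CQ_JVM_OPTS` variable in the start script. A minimal sketch, assuming the default `crx-quickstart/bin/start` layout (the exact path and existing flags vary per installation):

```shell
# Excerpt from crx-quickstart/bin/start (location and defaults vary per install).
# -Xmx raises the heap; -Dupdate.limit applies only to DocumentNodeStore
# backends (MongoMK/RDBMK), as noted above.
CQ_JVM_OPTS="-server -Xmx4096m -Dupdate.limit=250000 -Djava.awt.headless=true"
```

After editing the script, restart the instance for the flags to take effect.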

Jörg_Hoh
Employee

28-06-2020

If you really need to update 12k nodes in a single transaction, you should indeed increase the heap, because the whole transaction must fit into it. 1 GB of heap might be sufficient for DEV systems, but for PROD systems you should increase it.

Do you really need to update these 12k nodes in a single transaction? Isn't it possible to split it up?

nareshkumarpart
Level 2

28-06-2020

Yes, I need to update the 12k records. Based on this property, I run query operations to search for results.

tatvam
Level 2

28-06-2020

1. You are not able to set a heap size higher than 1 GB. Please check for other system limitations: how much memory does your machine have, and how much of it is free for AEM?

2. If you have sufficient memory, try setting 4 GB on a new AEM instance to confirm the issue is something other than the heap size. It is common to set more than 1 GB even on a local instance.

Regarding the 12k transactions: I have done similar updates in the past, but always chunked the transactions to around 1,000 updates/inserts per commit. Ideally, make this a configurable number in your logic so you can test what is optimal for your scenario.

In short: verify a higher heap size, and reduce the number of updates per commit/save.
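The chunking described above can be sketched as plain Java. This is a minimal, self-contained model: `flush()` stands in for the real `resourceResolver.commit()` (or `session.save()`), and `updateProperty()` stands in for the actual property write, so the batching logic itself can be shown without Sling dependencies. The class name and batch size are illustrative, not from the thread.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchedUpdater {
    private final int batchSize; // configurable, e.g. 1000 per the advice above
    private int commits = 0;

    public BatchedUpdater(int batchSize) {
        this.batchSize = batchSize;
    }

    // Applies an update to each path, flushing every batchSize items
    // and once more at the end for any remainder.
    public int updateAll(List<String> paths) {
        int pending = 0;
        for (String path : paths) {
            updateProperty(path);   // in AEM: adaptTo(ModifiableValueMap.class), put(...)
            pending++;
            if (pending == batchSize) {
                flush();            // in AEM: resourceResolver.commit()
                pending = 0;
            }
        }
        if (pending > 0) {
            flush();                // commit the final partial batch
        }
        return commits;
    }

    private void updateProperty(String path) {
        // placeholder for the real JCR/Sling property update
    }

    private void flush() {
        commits++;
    }

    public static void main(String[] args) {
        List<String> paths = new ArrayList<>();
        for (int i = 0; i < 12000; i++) {
            paths.add("/content/demo/node-" + i);
        }
        // 12,000 nodes at 1,000 per batch -> 12 commits instead of one huge one
        System.out.println(new BatchedUpdater(1000).updateAll(paths)); // prints 12
    }
}
```

Keeping each commit small bounds the size of the pending changeset the session must hold in memory, which is what avoids the heap blow-up on a single 12k-node save.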

nareshkumarpart
Level 2

29-06-2020

Thanks, Tatvam. I am now limiting the transactions to 1,000 nodes per commit in the code itself.