Custom index for DAM expiry job to not exceed memory limit

bobkranson

11-08-2020

We are operating on a large repository. I see this warning message recurring and am concerned that expired assets are not being fully listed to authors. Is there an index that can be used or created to solve this?

com.day.cq.dam.core.impl.ExpiryNotificationJobImpl] org.apache.jackrabbit.oak.query.FilterIterators The query read more than 500000 nodes in memory.

Tags: DAM, DAM Schema, index, indexing

Accepted Solutions (1)


Vijayalakshmi_S

MVP

11-08-2020

Hi @bobkranson,

Can you share the query that is being executed, from the existing logs?

If it is not available, add a log configuration for the loggers below (in DEBUG mode) and reproduce the scenario; a sample configuration follows the list.

http://localhost:4502/system/console/slinglog

org.apache.jackrabbit.oak.query.QueryEngineImpl
org.apache.jackrabbit.oak.query.SQL2Parser
org.apache.jackrabbit.oak.query.QueryImpl
org.apache.jackrabbit.oak.query
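For example, here is a minimal sketch of such a configuration (the config name and log file name are illustrative, not a fixed convention): an OSGi factory configuration for org.apache.sling.commons.log.LogManager.factory.config, e.g. placed at crx-quickstart/install/org.apache.sling.commons.log.LogManager.factory.config~oakquery.cfg.json:

{
  "org.apache.sling.commons.log.level": "debug",
  "org.apache.sling.commons.log.file": "logs/oak-query.log",
  "org.apache.sling.commons.log.names": [
    "org.apache.jackrabbit.oak.query.QueryEngineImpl",
    "org.apache.jackrabbit.oak.query.SQL2Parser",
    "org.apache.jackrabbit.oak.query.QueryImpl",
    "org.apache.jackrabbit.oak.query"
  ]
}

The same loggers can also be added interactively through the Log Support console linked above.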

Based on the query executed, we can decide on the index definition.
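To illustrate what to look for (this is only an assumption about how the expiry check is typically expressed, based on the prism:expirationDate metadata property; the literal query from ExpiryNotificationJobImpl in your logs may differ), the statement would usually be a JCR-SQL2 query along these lines:

SELECT * FROM [dam:Asset] AS asset
WHERE asset.[jcr:content/metadata/prism:expirationDate] IS NOT NULL
AND asset.[jcr:content/metadata/prism:expirationDate] <= CAST('2020-11-08T00:00:00.000Z' AS DATE)

The property being compared (here prism:expirationDate, restricted to dam:Asset) is what the index definition would need to cover.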

 

Answers (1)


vanegi

Employee

11-08-2020

Hi @bobkranson,

Have you enabled debug logs for all the classes below? If not, can you please do that? A DEBUG logging configuration can be added for the following packages, writing to a separate log file. Also, let me know what specific queries you are running.

 

org.apache.jackrabbit.oak.plugins.index

org.apache.jackrabbit.oak.query

com.day.cq.search

 

Once you have the query, you can post it in Explain Query (http://host:port/libs/granite/operations/content/diagnosistools/queryPerformance.html) and see whether it is doing a traversal or utilizing any OOTB index.
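If the resulting plan contains a fragment like /* traverse ... */ instead of referencing an index path (for example the OOTB /oak:index/damAssetLucene), the query is reading nodes by traversal.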

 

If it is a traversal, a custom index can be created using https://oakutils.appspot.com/generate/index; a sketch of what such a definition might look like is below.
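Purely as a rough sketch (the index name and the jcr:content/metadata/prism:expirationDate property are assumptions carried over from the sample query above; generate the real definition from the query in your logs), a Lucene property index under /oak:index could look like this:

/oak:index/damAssetExpiry
  - jcr:primaryType = "oak:QueryIndexDefinition"
  - type = "lucene"
  - async = "async"
  - compatVersion = 2 (Long)
  + indexRules
    + dam:Asset
      + properties
        + expirationDate
          - name = "jcr:content/metadata/prism:expirationDate"
          - propertyIndex = true (Boolean)
          - ordered = true (Boolean)

Marking the property as ordered lets the date range comparison be resolved by the index instead of a repository traversal.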

 

Thanks!!