Hi,
We are currently using Adobe Experience Manager (AEM) 6.5 with Akamai cache invalidation handled by custom code that purges by CP (Content Provider) code. However, we are experiencing performance issues: each Akamai cache flush causes high CPU usage and slowness on the AEM Publisher, because after a flush every request is directed back to the Publisher.
Could you please provide guidance on the best approach to resolve this issue? Specifically, we are looking for recommendations on how to implement custom replication through AEM to reduce the load on the Publisher and optimize cache management.
Thank you for your support.
Thanks,
Sai
Hi @Sai1278,
Thanks for reaching out to the Adobe community.
There are multiple ways to handle this, and the right one depends on your business use case: how quickly the latest content needs to appear on the live site.
Based on that use case, you can choose the approach that works best for improving page performance.
Thanks,
Anil
Hi,
I agree with @Anil_Chennapragada: it's difficult to provide a definitive answer because this is tied to your business requirements and how your AEM implementation is structured. In general terms, it seems like you rely heavily on the CDN cache, so I would suggest starting there. Try asking the following questions:
Why is it necessary to flush the CDN cache? Is there a way to switch to an auto-invalidation (TTL) method instead?
Which parts of the CDN cache are being flushed? Is it possible to narrow down and avoid invalidating the cache for the more expensive pages?
Are there specific pages causing the CPU spikes? Can you troubleshoot those pages for bottlenecks or poor coding practices? Fixing them could improve overall performance and reduce the reliance on caching.
Have you ever conducted a performance test on these pages?
Hope this helps
In addition to what @EstebanBustamante added, you should also check:
Are you flushing the cache for the entire CP code with every replication, or just for the activated page?
Why is there load on Publish if you replicate only one page? The rest of the pages should still be served from both the CDN and the Dispatcher cache.
Check the statfile configuration (/statfileslevel) to see whether a single page replication is invalidating the entire Dispatcher cache.
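To illustrate the statfile point above: the /statfileslevel setting in the dispatcher farm controls how deep .stat files are maintained, and therefore how wide the blast radius of one activation is. A minimal sketch (paths and values are illustrative, not your actual farm configuration):

```
# With /statfileslevel "0" (the default), a single .stat file sits at the
# docroot, so one activation effectively invalidates the whole cache.
# Raising it scopes invalidation to the activated page's subtree.
/cache
  {
  /statfileslevel "2"   # maintain .stat files two levels deep, e.g. per site/language
  /invalidate
    {
    /0000 { /glob "*.html" /type "allow" }
    }
  }
```

With a higher level, activating one page only touches the .stat files along its own path, leaving the rest of the Dispatcher cache warm.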
Hi @arunpatidar @EstebanBustamante @Anil_Chennapragada
After a release, I'm not able to see the changes: JS/CSS changes aren't reflected on Publish. However, after clearing the cache through Akamai, the changes become visible. I have also cleared the cache globally using the CP code.
Hi @Sai1278,
if you are correctly using client libraries (clientlibs) in AEM, then with each change you get a different hash value as part of the JS/CSS file name. For example: "/etc.clientlibs/my-project/clientlibs/global.lc-152acd4a1d8361df497f1c77d403780e-lc.min.css"
Therefore, you can cache JS/CSS files for a long time on the CDN, as each change produces a new URL and therefore a new request. It doesn't sound like a use case for a custom CDN flush.
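Building on that, the web server in front of Publish can send a long-lived caching header for those hashed clientlib URLs, since a content change always produces a new file name. An illustrative Apache (mod_headers) sketch, assuming the default /etc.clientlibs proxy path:

```
# Hashed clientlib URLs never change in place, so they can be cached
# aggressively by browsers and the CDN alike.
<LocationMatch "^/etc\.clientlibs/.*\.(js|css)$">
    Header set Cache-Control "public, max-age=31536000, immutable"
</LocationMatch>
```

The exact match pattern depends on your project structure; the key idea is that immutable, content-hashed assets need no purging at all.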
Good luck,
Daniel
As mentioned by @daniel-strmecki, each clientlib generated in AEM includes a unique hash based on its content. This ensures that any change in the clientlib (e.g., code or CSS updates) creates a new file with a new hash, making stale caches a non-issue for the updated JS/CSS files themselves.
However, if a cached page still references an old version of the clientlib, you will need to re-render that page so it references the new clientlib. This requires clearing both the AEM Dispatcher and CDN cache for the affected pages so they load the latest clientlib versions and reflect recent updates.
Hi @Sai1278,
my opinion on this topic is that CDN cache invalidation via publication events usually gets quite complex and is not worth the effort. I generally avoid it unless it is really necessary. Instead, I would usually serve 90% of the content with the TTL approach and the stale-while-revalidate option on the CDN.
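As an illustration of that TTL-plus-stale-while-revalidate idea, a page response could carry a header like the following (values are examples, and whether the CDN honors these directives from origin headers depends on your Akamai configuration):

```
# Serve from CDN cache for up to 5 minutes; after that, keep serving the
# stale copy for up to a day while revalidating against origin in the
# background. No explicit purge needed for routine content updates.
Header set Cache-Control "public, s-maxage=300, stale-while-revalidate=86400"
```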
I only use the CDN's API-based cache invalidation for specific APIs and pages where there is a strict requirement that the latest version be available on the CDN immediately.
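For those few must-be-fresh URLs, Akamai's Fast Purge (CCU v3) API accepts a small explicit list of objects to invalidate. A minimal Python sketch of building such a request; the endpoint path is the real CCU v3 one, but the URL list and the send step are illustrative (an actual call needs EdgeGrid-signed authentication, e.g. via Akamai's edgegrid-python library, against your account host):

```python
import json

# CCU v3 Fast Purge endpoint: "invalidate" marks objects stale (forcing
# revalidation) rather than deleting them outright, which is gentler on origin.
FAST_PURGE_PATH = "/ccu/v3/invalidate/url/production"

def build_purge_payload(urls):
    """Build the Fast Purge request body for a small, explicit URL list."""
    if not urls:
        # Guard against accidental empty/wide purges.
        raise ValueError("refusing to build a purge request for an empty URL list")
    return json.dumps({"objects": list(urls)})

# Hypothetical page with a strict freshness requirement:
payload = build_purge_payload([
    "https://www.example.com/content/site/en/news.html",
])
# This payload would then be POSTed to FAST_PURGE_PATH with EdgeGrid-signed
# headers; only the listed objects are invalidated, not the whole CP code.
```

Purging narrowly like this avoids the thundering-herd effect on Publish that a full CP-code flush causes.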
Good luck,
Daniel