
AEM 6.5 Akamai Custom Replication Agent


Level 3

Hi,

We are currently using Adobe Experience Manager (AEM) 6.5 with Akamai cache invalidation via custom CP (Cache Purge) code. However, we are experiencing performance issues: after an Akamai cache flush, every request is directed to the Publisher, causing high CPU usage and slowness on the AEM Publisher.

Could you please provide guidance on the best approach to resolve this issue? Specifically, we are looking for recommendations on how to implement custom replication through AEM to reduce the load on the Publisher and optimize cache management.

Thank you for your support.


Thanks,
Sai

7 Replies


Level 4

Hi @Sai1278,

Thanks for reaching out to the Adobe community.

 

There are multiple ways to handle this, and the right approach depends on the business use case, i.e. how quickly updated content needs to appear on the live site:

 

  • Generally, we cache the content (pages, static assets, etc.) on both the Dispatcher and the CDN (in this case Akamai) to improve performance and reduce load on the Publishers.
  • For a largely static site, we can go with the TTL option in Akamai, which caches content for a specific time interval. The TTL also depends on the file type; for example, static assets can be cached for a longer duration (days), whereas page HTML can be cached for hours to days, based on the frequency of updates (see the Cache-Control sketch after this list).
  • We can configure an Akamai invalidation utility, if needed, to flush the cache on demand or for specific pages (e.g. by configuring regex paths).
  • Also, we should leverage the Dispatcher cache to its full extent, so that content (images/HTML/static assets) is served from the Dispatcher and invalidated based on page activations.

So, based on the business use case, we can pick the approach that works best for improving page performance.
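
To make the TTL bullet above concrete, here is a minimal sketch of a Sling servlet filter that sets Cache-Control response headers, which Akamai (and browsers) can honor for TTL-based caching. The path prefixes, max-age values, and service ranking below are illustrative assumptions, not recommended values:

```java
import java.io.IOException;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;

import org.apache.sling.api.SlingHttpServletRequest;
import org.apache.sling.api.SlingHttpServletResponse;
import org.osgi.service.component.annotations.Component;

/**
 * Sketch: set Cache-Control headers so Akamai can apply TTL-based caching.
 * Path prefixes and max-age values are illustrative assumptions.
 */
@Component(
        service = Filter.class,
        property = {
                "sling.filter.scope=request",
                "service.ranking:Integer=-700"
        })
public class CacheControlFilter implements Filter {

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        if (request instanceof SlingHttpServletRequest && response instanceof SlingHttpServletResponse) {
            String path = ((SlingHttpServletRequest) request).getRequestPathInfo().getResourcePath();
            SlingHttpServletResponse slingResponse = (SlingHttpServletResponse) response;

            if (path.startsWith("/content/dam/")) {
                // Static assets: long TTL, e.g. one day (hypothetical value)
                slingResponse.setHeader("Cache-Control", "max-age=86400");
            } else if (path.startsWith("/content/my-site/")) {
                // Page HTML: shorter TTL, e.g. one hour (hypothetical value)
                slingResponse.setHeader("Cache-Control", "max-age=3600");
            }
        }
        chain.doFilter(request, response);
    }

    @Override
    public void init(FilterConfig filterConfig) {
        // no-op
    }

    @Override
    public void destroy() {
        // no-op
    }
}
```

In practice these headers are often set at the Dispatcher/web-server layer or directly in the Akamai property configuration instead; the filter form above just keeps the example self-contained.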

 

Thanks,

Anil


Community Advisor

Hi,

 

I agree with @Anil_Chennapragada; it's difficult to provide a definitive answer because this depends on your business requirements and how your AEM implementation is structured. In general terms, it sounds like you rely heavily on the CDN cache, so I would suggest starting there. Try asking the following questions:

 

  • Why is it necessary to flush the CDN cache? Is there a way to switch to an auto-invalidation (TTL) method instead?

  • Which parts of the CDN cache are being flushed? Is it possible to narrow down and avoid invalidating the cache for the more expensive pages?

  • Are there specific pages causing the CPU spikes? Do those pages perform well on their own? Can you troubleshoot for bottlenecks or poor coding practices that could improve overall performance and reduce the reliance on caching?

  • Have you ever conducted a performance test on these pages?

 

Hope this helps



Esteban Bustamante


Community Advisor

In addition to what @EstebanBustamante added, you should also check:

  • Are you flushing the cache for the entire site in your CP code, or only the activated page, on every replication?
  • Why is there load on Publish when you replicate just one page? The rest of the pages should still be cached on both the CDN and the Dispatcher.
  • Check the statfile handling (e.g. the /statfileslevel setting in the Dispatcher farm configuration) to see whether the entire Dispatcher cache is being flushed by a single page replication.



Arun Patidar


Level 3

Hi @arunpatidar @EstebanBustamante @Anil_Chennapragada 

I'm not able to see the changes after a release: JS/CSS changes aren't reflected on Publish. However, after clearing the cache through Akamai, the changes become visible. I have also cleared the cache globally using the CP code.

 

  • Now my requirement is to create a custom replication agent through AEM (a sketch follows below).
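
For reference, the usual way to do this in AEM 6.5 is to implement a custom TransportHandler and point a replication agent at a custom transport URI scheme. Below is a minimal sketch, assuming a hypothetical "akamai://" scheme and a stubbed-out purge call (the actual Fast Purge request is sketched later in this thread); it is not a drop-in implementation:

```java
import com.day.cq.replication.AgentConfig;
import com.day.cq.replication.ReplicationActionType;
import com.day.cq.replication.ReplicationException;
import com.day.cq.replication.ReplicationResult;
import com.day.cq.replication.ReplicationTransaction;
import com.day.cq.replication.TransportContext;
import com.day.cq.replication.TransportHandler;

import org.osgi.service.component.annotations.Component;

/**
 * Sketch of a transport handler that a custom replication agent can use to
 * purge Akamai on page activation/deactivation. The "akamai://" URI scheme
 * is an assumption; the agent's transport URI must be configured to match.
 */
@Component(service = TransportHandler.class)
public class AkamaiTransportHandler implements TransportHandler {

    private static final String AKAMAI_SCHEME = "akamai://";

    @Override
    public boolean canHandle(AgentConfig config) {
        // Only handle agents whose transport URI uses our custom scheme
        String transportUri = (config != null) ? config.getTransportURI() : null;
        return transportUri != null && transportUri.startsWith(AKAMAI_SCHEME);
    }

    @Override
    public ReplicationResult deliver(TransportContext ctx, ReplicationTransaction tx)
            throws ReplicationException {
        ReplicationActionType actionType = tx.getAction().getType();

        // Purge on activate/deactivate; treat test and other actions as no-ops
        if (actionType == ReplicationActionType.ACTIVATE
                || actionType == ReplicationActionType.DEACTIVATE) {
            String contentPath = tx.getAction().getPath();
            // Hypothetical helper: map the content path to its public URL and
            // issue the Akamai Fast Purge request (sketched separately below).
            purgeFromAkamai(contentPath);
        }
        return ReplicationResult.OK;
    }

    private void purgeFromAkamai(String contentPath) throws ReplicationException {
        // Stub: issue the purge request here; wrap failures in a
        // ReplicationException so the replication queue can retry.
    }
}
```

The replication agent itself is then created on the author instance with its transport URI set to the custom scheme (e.g. "akamai://fastpurge", an illustrative value) so that canHandle() selects this handler, and purges flow through the normal replication queue with its retry semantics.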


Level 8

Hi @Sai1278,

if you are correctly using versioned clientlibs in AEM, then with each change you get a different hash value as part of the JS/CSS file name. For example: "/etc.clientlibs/my-project/client-libs/global.lc-152acd4a1d8361df497f1c77d403780e-lc.min.css"

Therefore, you can cache JS/CSS files for a long time on the CDN, as each change produces a new URL and therefore a fresh request. It doesn't sound like a use case for a custom CDN flush.

 

Good luck,

Daniel


Community Advisor

As mentioned by @daniel-strmecki, each clientlib generated in AEM includes a unique hash based on its content. This ensures that any change in the clientlib (e.g., code or CSS updates) creates a new file with a new hash, making caching a non-issue for the updated JS/CSS files themselves.

However, if the cached page HTML still references an old version of the clientlib (due to an outdated cache), the page itself needs to be regenerated to reference the new clientlib URLs. This requires clearing both the AEM Dispatcher and CDN cache for the affected pages so they load the latest clientlib versions and reflect recent updates.



Arun Patidar


Level 8

Hi @Sai1278,

my opinion on this topic is that CDN cache invalidation via publication events usually gets quite complex and is not worth the effort. I generally avoid it unless it is really necessary. Therefore, I would usually serve 90% of the content with the TTL approach and the stale-while-revalidate option on the CDN.

I only use the CDN's API-based cache invalidation for specific APIs and pages where there is a strict requirement that updates be visible on the CDN immediately.
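
For completeness, here is a minimal sketch of what such an API-based invalidation can look like against Akamai's Fast Purge (CCU v3) endpoint. The endpoint host and the "production" network segment are placeholders, and real requests must additionally be signed with Akamai EdgeGrid authentication (e.g. via Akamai's edgegrid-signer libraries), which is omitted here:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.List;

/**
 * Sketch of an on-demand purge via Akamai Fast Purge (CCU v3).
 * The host is a placeholder; production calls must carry an
 * EdgeGrid-signed Authorization header (omitted for brevity).
 */
public class AkamaiFastPurgeClient {

    // Placeholder host; the real value comes from your Akamai API client credentials
    private static final String PURGE_ENDPOINT =
            "https://akab-xxxxxxxx.purge.akamaiapis.net/ccu/v3/invalidate/url/production";

    public static void purgeUrls(List<String> urls) throws Exception {
        // Build the JSON body: {"objects": ["https://...", ...]}
        StringBuilder body = new StringBuilder("{\"objects\":[");
        for (int i = 0; i < urls.size(); i++) {
            if (i > 0) {
                body.append(',');
            }
            body.append('"').append(urls.get(i)).append('"');
        }
        body.append("]}");

        HttpRequest request = HttpRequest.newBuilder(URI.create(PURGE_ENDPOINT))
                .header("Content-Type", "application/json")
                // NOTE: the Authorization header must carry an EdgeGrid
                // signature; omitted here for brevity.
                .POST(HttpRequest.BodyPublishers.ofString(body.toString()))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // Fast Purge returns 201 when the purge request is accepted
        if (response.statusCode() != 201) {
            throw new IllegalStateException("Purge failed: " + response.body());
        }
    }
}
```

Keeping this call limited to a small set of must-be-fresh URLs, with everything else on TTL, is what keeps the setup simple.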

 

Good luck,

Daniel