
AEM 6.5 Akamai Custom Replication Agent

  • November 13, 2024
  • 3 replies
  • 1704 views

Hi,

We are currently using Adobe Experience Manager (AEM) 6.5 with Akamai cache invalidation handled by custom CP (Cache Purge) code. However, we are running into performance issues: after the Akamai cache is flushed, every request is directed back to the AEM Publisher, causing high CPU usage and slowness.

Could you please provide guidance on the best approach to resolve this issue? Specifically, we are looking for recommendations on how to implement custom replication through AEM to reduce the load on the Publisher and optimize cache management.

Thank you for your support.


Thanks,
Sai

Best answer by EstebanBustamante (see the accepted solution below)

3 replies

anil_chennapragada
Level 4
November 13, 2024

Hi @sai1278,

Thanks for reaching out to the Adobe community.

 

There are multiple ways to handle this, and the right choice depends on the business use case: how quickly updated content needs to appear on the live site.

 

  • Generally, we cache content (pages, static assets, etc.) in both the Dispatcher and the CDN (Akamai in this case) to improve performance and reduce load on the Publishers.
  • For a site where the content is largely static, you can use the TTL option in Akamai, which caches content for a specific time interval. The TTL also depends on the file type: static assets can be cached for longer durations (days), whereas page HTML can be cached for anywhere from hours to days, based on how frequently it changes (see the header sketch after this list).
  • If needed, you can configure an Akamai invalidation utility to flush the cache on demand or for specific pages (by configuring specific regex paths).
  • Also, leverage the Dispatcher cache to the fullest extent: images, HTML, and other static assets should be served from the Dispatcher and invalidated on page activations.

Based on the business use case, you can pick the approach that works best for improving the performance of your pages.
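
To illustrate the per-file-type TTL idea above: the Dispatcher/Apache vhost can send Cache-Control headers that Akamai can be configured to honor (via an "honor origin Cache-Control" caching behavior in the Akamai property). This is only a sketch; the extensions and max-age values are placeholders:

  # Hypothetical vhost snippet (requires mod_headers); TTL values are examples only
  <LocationMatch "\.(?:js|css|png|jpe?g|gif|svg|woff2?)$">
      Header set Cache-Control "public, max-age=604800"
  </LocationMatch>
  <LocationMatch "\.html$">
      Header set Cache-Control "public, max-age=3600"
  </LocationMatch>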

 

Thanks,

Anil

EstebanBustamante
Community Advisor and Adobe Champion
Accepted solution
November 13, 2024

Hi,

 

I agree with @anil_chennapragada —it’s difficult to provide a definitive answer because this is tied to your business requirements and how your AEM implementation is structured. In general terms, it seems like you rely heavily on the CDN cache, so I would suggest starting there. Try asking the following questions:

 

  • Why is it necessary to flush the CDN cache? Is there a way to switch to an auto-invalidation (TTL) method instead?

  • Which parts of the CDN cache are being flushed? Is it possible to narrow down and avoid invalidating the cache for the more expensive pages?

  • Are there specific pages causing the CPU spikes? Do those pages seem okay? Can you troubleshoot for bottlenecks or poor coding practices that could improve overall performance and reduce the reliance on cache?

  • Have you ever conducted a performance test on these pages?

 

Hope this helps

Esteban Bustamante
arunpatidar
Community Advisor
November 13, 2024

In addition to what @estebanbustamante added, you must also check:

  • Are you flushing the Akamai cache for the entire CP code (i.e., everything under it) with every replication, or only the replicated page?
  • Why is there load on the Publisher if you replicate just one page? The rest of the pages should still be cached on both the CDN and the Dispatcher.
  • Check the Dispatcher statfile settings to see whether the entire Dispatcher cache is being invalidated by a single page replication.
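
For reference, a minimal sketch of the Dispatcher settings involved, assuming a standard dispatcher.any; the statfileslevel value and the glob rules are placeholders to adapt to your content structure:

  /cache
    {
    # Limit .stat file touches to a directory depth so that one activation
    # does not invalidate the whole docroot (the value is only an example)
    /statfileslevel "2"

    /invalidate
      {
      /0000 { /glob "*" /type "deny" }
      # Auto-invalidate only HTML on activation; let static assets expire via TTL
      /0001 { /glob "*.html" /type "allow" }
      }
    }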

Arun Patidar
daniel-strmecki
Community Advisor and Adobe Champion
November 13, 2024

Hi @sai1278,

My opinion on this topic is that CDN cache invalidation via publication events usually gets quite complex and is not worth the effort, so I generally avoid it unless it is really necessary. I would usually serve 90% of the content with the TTL approach and the stale-while-revalidate option on the CDN.
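
As a small illustration of that approach (the values are placeholders, and the CDN must be configured to honor origin cache directives), a response header along these lines lets the CDN keep serving a cached copy while it revalidates in the background:

  Header set Cache-Control "public, max-age=300, stale-while-revalidate=86400"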

I only use the CDN's API-based cache invalidation for specific APIs and pages where there is a strict requirement that they be available on the CDN immediately.
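
For those few paths that genuinely need immediate invalidation, the custom replication agent mentioned in the original question can be implemented as an AEM TransportHandler that calls Akamai's Fast Purge (CCU v3) API for only the activated page, rather than flushing everything. The sketch below is illustrative, not a drop-in implementation: the "akamai://" transport URI scheme, the host and URL mapping, the Java 11 HTTP client, and the omitted EdgeGrid request signing are all assumptions to adapt to your setup.

package com.example.replication; // hypothetical package name

import com.day.cq.replication.AgentConfig;
import com.day.cq.replication.ReplicationActionType;
import com.day.cq.replication.ReplicationException;
import com.day.cq.replication.ReplicationResult;
import com.day.cq.replication.ReplicationTransaction;
import com.day.cq.replication.TransportContext;
import com.day.cq.replication.TransportHandler;
import org.osgi.service.component.annotations.Component;

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

/**
 * Sketch: handles replication agents whose transport URI uses an assumed
 * "akamai://" scheme and purges only the affected URL from Akamai,
 * instead of flushing the whole CP code.
 */
@Component(service = TransportHandler.class)
public class AkamaiTransportHandler implements TransportHandler {

    // Placeholders: a real call needs your Akamai API host plus EdgeGrid credentials.
    private static final String FAST_PURGE_ENDPOINT =
            "https://<akamai-api-host>/ccu/v3/invalidate/url/production";
    private static final String SITE_HOST = "https://www.example.com";

    @Override
    public boolean canHandle(AgentConfig config) {
        String uri = (config == null) ? null : config.getTransportURI();
        return uri != null && uri.startsWith("akamai://");
    }

    @Override
    public ReplicationResult deliver(TransportContext ctx, ReplicationTransaction tx)
            throws ReplicationException {
        ReplicationActionType type = tx.getAction().getType();
        if (type != ReplicationActionType.ACTIVATE && type != ReplicationActionType.DEACTIVATE) {
            return ReplicationResult.OK; // nothing to purge for other action types
        }

        // Map the repository path to its public URL and purge just that object.
        String pageUrl = SITE_HOST + tx.getAction().getPath() + ".html";
        String body = "{\"objects\":[\"" + pageUrl + "\"]}";

        try {
            // NOTE: Fast Purge requires EdgeGrid authentication (request signing),
            // omitted here for brevity.
            HttpRequest request = HttpRequest.newBuilder(URI.create(FAST_PURGE_ENDPOINT))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());

            if (response.statusCode() == 201) { // Fast Purge returns 201 when the purge is accepted
                return ReplicationResult.OK;
            }
            return new ReplicationResult(false, response.statusCode(),
                    "Akamai purge failed: " + response.body());
        } catch (Exception e) {
            return new ReplicationResult(false, 0, "Akamai purge request failed: " + e.getMessage());
        }
    }
}

The idea is that the replication agent configured on author uses a transport URI starting with akamai://, so canHandle picks this handler up, while the regular Dispatcher flush agent stays unchanged.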

 

Good luck,

Daniel