I want to integrate AEM 6.5 with Akamai to handle replication and caching.
Is there any documentation for implementing Akamai with AEM?
If you get your expiration headers right on the AEM/dispatcher side, just configure Akamai to use your AEM instance as the origin and configure it to accept the expiration dates provided by AEM.
Then you don't need to write any code which connects AEM with Akamai.
There is no specific way in which AEM has to integrate with Akamai.
Adobe Consulting typically recommends using only TTL-based expiration, so there is no need to deal with active invalidation on the Akamai side. That works quite well in most cases and delivers the expected benefit of Akamai.
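If it helps, here is a minimal sketch of what setting such an expiration header from within AEM could look like, assuming you do it with a Sling request filter (class name, TTL value and filter properties are illustrative, not an official Adobe or Akamai sample):

    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletResponse;
    import org.osgi.service.component.annotations.Component;

    // Illustrative sketch: a Sling request filter that adds a TTL-based
    // Cache-Control header so that dispatcher, Akamai and browsers know
    // how long they may cache the response. Values are examples only.
    @Component(
        service = Filter.class,
        property = {
            "sling.filter.scope=request",
            "service.ranking:Integer=100"
        }
    )
    public class CacheControlFilter implements Filter {

        private static final long MAX_AGE_SECONDS = 3600; // 1 hour TTL, adjust per content type

        @Override
        public void init(FilterConfig filterConfig) throws ServletException {
            // nothing to initialize
        }

        @Override
        public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
                throws IOException, ServletException {
            if (response instanceof HttpServletResponse) {
                // Downstream caches read this header to decide how long to keep the response.
                ((HttpServletResponse) response).setHeader("Cache-Control", "max-age=" + MAX_AGE_SECONDS);
            }
            chain.doFilter(request, response);
        }

        @Override
        public void destroy() {
            // nothing to clean up
        }
    }

In practice many teams set these headers at the dispatcher/Apache level instead (for example with mod_headers), which avoids custom code entirely.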
Jörg
Yes, I agree with you, but I need some steps to configure AEM with Akamai and a sample service/utility class to understand how to connect AEM with Akamai.
Oh, that's the point. Thanks Jörg for the quick reply.
@Jörg, I am no Akamai expert, but I assume what you mean is that cached pages in Akamai will only be refreshed from origin after the TTL has elapsed, correct? So if you have a long TTL, say an hour, there can be up to an hour during which stale pages are served?
That's true. It depends on whether you can live with that.
To be honest, unless you provide time-critical content like news or SEC-relevant data, being a bit out of date doesn't really matter.
Ah, OK. Yeah, I agree with you on that, but it's hard to get a client to accept that. Clients typically want an instantaneous experience and want their content live as soon as they hit publish, which is not a bad expectation if the code and APIs support it without complications.
Thanks for the insights!
But even in that case, invalidating a page on Akamai takes minutes to take effect, so it's never instantaneous. If you want full control over your site worldwide, you should not use CDNs.
But that has downsides too, and that's the reason why most customers give up some control for better performance.
Yes, of course, minutes are acceptable. I think Akamai's is 5 minutes or something close to that, which is significantly lower than 30 minutes or an hour and would allow clients to have much longer TTLs.
But that doesn't make sense. If you set a TTL of 1 hour on your content but invalidate it directly on Akamai (say, with an overall delay of 5 minutes), a browser will still cache it for up to 60 minutes. Even if new content is published on the very same page immediately after the browser retrieves it, the browser can keep presenting the old content to the user for another 55 minutes.
Anyway, you can also set the TTL of a page to 5 minutes. In that case your traffic to the origin will increase a bit, but you still get the benefit of using a CDN, without too much delay and without actively invalidating Akamai.
Regards,
Jörg
Are you saying you cannot have a separate cache policy for Akamai and the browser? They HAVE to use the same policy? That seems like a strange limitation...
Does it make sense to have a different policy? Does it make sense to have Akamai refresh every 5 minutes, but instruct the browser to cache for 1 hour?
Anyway, for all these caching things I try to build simple solutions. I would rather accept a lower cache-hit ratio and flawless operations than a highly sophisticated setup with great cache-hit ratios that is hard to understand and even harder to debug.
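For reference on the separate-policy question: standard HTTP caching does distinguish the browser TTL (max-age) from the TTL for shared caches such as a CDN (s-maxage). Whether Akamai honors s-maxage depends on how the property is configured, so treat that as an assumption to verify, not a guarantee. A small, hypothetical helper to express such a split:

    import javax.servlet.http.HttpServletResponse;

    // Hypothetical utility: emits a Cache-Control header with different TTLs
    // for browsers (max-age) and shared caches such as a CDN (s-maxage).
    public final class CachePolicyHeaders {

        private CachePolicyHeaders() {
            // static utility, no instances
        }

        public static void applySplitTtl(HttpServletResponse response,
                                         long browserTtlSeconds,
                                         long cdnTtlSeconds) {
            // e.g. applySplitTtl(response, 300, 3600): browsers revalidate after
            // 5 minutes, while a shared cache may keep the page for 1 hour.
            response.setHeader("Cache-Control",
                    "max-age=" + browserTtlSeconds + ", s-maxage=" + cdnTtlSeconds);
        }
    }

Whether that extra knob is worth it is exactly the trade-off discussed above: more control over the two cache layers, but also a setup that is harder to reason about.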