I have a requirement to use Google Cloud Storage (as a CDN origin) to store AEM pages whenever a page is published to the publisher environment. How can I achieve this? When a page that already exists in the bucket is republished, the stored copy must be updated, and when a new page is published it has to be added to the Google Cloud Storage bucket. We have a simple workflow to publish pages, so after a page is published it has to be added to or updated in Google Cloud Storage.
Can someone please give me some pointers on how to get started with this?
Thanks in advance
So do I understand you correctly: you want to store copies of rendered pages (created right after the pages are published) in Google Cloud Storage, and if a copy of a page already exists there, the existing copy should be overwritten by the latest version of that page.
Is that correct?
To clarify: we have a bucket created specifically for our project in Google Cloud Storage, so the pages have to be added to or updated in this bucket. Does that make sense?
There is no out-of-the-box solution available for this; you would need to implement it on your own. There is a static file exporter available as a replication transport, which renders the page to a local file. Since you already use a workflow, you could trigger this functionality via the workflow and then pick up the files and upload them to Google Cloud Storage. Or you can code it all yourself (which might be required if you need to export CSS/JS/referenced assets as well and store them in the cloud).
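To illustrate the upload step only (not the workflow or rendering part), here is a minimal sketch. It uses the JDK's `java.net.http.HttpClient` against the Google Cloud Storage JSON API simple-upload endpoint so it stays self-contained; in a real project you would more likely use the official `google-cloud-storage` Java client library inside an AEM workflow process step. The class and helper names (`GcsPageUploader`, `objectNameFor`, `uploadUriFor`), the page-path-to-object-name mapping, and obtaining the OAuth access token are all assumptions, not AEM or GCS specifics:

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class GcsPageUploader {

    // GCS JSON API simple-upload endpoint: POST /upload/storage/v1/b/{bucket}/o?uploadType=media&name={object}
    private static final String UPLOAD_ENDPOINT =
            "https://storage.googleapis.com/upload/storage/v1/b/%s/o?uploadType=media&name=%s";

    /** Maps an AEM page path to a GCS object name (hypothetical convention: strip /content/, append .html). */
    static String objectNameFor(String pagePath) {
        String name = pagePath.startsWith("/content/")
                ? pagePath.substring("/content/".length())
                : pagePath.replaceFirst("^/", "");
        return name + ".html";
    }

    /** Builds the simple-upload URI for the given bucket and object name. */
    static URI uploadUriFor(String bucket, String objectName) {
        return URI.create(String.format(UPLOAD_ENDPOINT, bucket,
                URLEncoder.encode(objectName, StandardCharsets.UTF_8)));
    }

    /**
     * Uploads rendered HTML to the bucket. GCS uploads overwrite an existing object
     * with the same name by default, which covers the "update existing page" case.
     * How you obtain the OAuth access token is out of scope here.
     */
    static int upload(String bucket, String pagePath, byte[] html, String accessToken) throws Exception {
        HttpRequest request = HttpRequest.newBuilder(uploadUriFor(bucket, objectNameFor(pagePath)))
                .header("Authorization", "Bearer " + accessToken)
                .header("Content-Type", "text/html")
                .POST(HttpRequest.BodyPublishers.ofByteArray(html))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        return response.statusCode();
    }
}
```

In an AEM workflow you would call something like `upload(...)` from a custom `WorkflowProcess` step after the static file exporter has rendered the page, passing the rendered file's bytes.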
I am not familiar with Google Cloud as a CDN, therefore I cannot tell you how to set that up. But if you base your caching strategy on TTL-based expiration (as I always recommend for any CDN, including Akamai), it shouldn't be too hard to configure Google Cloud CDN to respect those headers.