Hi Team
Is there any utility or process to take a full backup of an AEM server (on-premises), other than zipping or tarring the "crx-quickstart" folder? That folder is very large and we don't have enough space on the server. We are looking to take one full backup and then subsequent backups on an incremental basis, rather than taking a full backup each time.
Regards
Vara
Hi,
The type of backup you mentioned is suitable for a full restoration of AEM. However, if you only need to back up the application content, the Package Manager is your best option. With packages, you can back up specific "parts" of the application, such as content (pages and assets), limiting the scope to what you are interested in.
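As a minimal sketch of scripting this approach, the snippet below builds the URLs for the Package Manager HTTP API (the `/crx/packmgr/service/.json` pattern with `?cmd=build`). The host, port, and package names are placeholders, and you should verify the endpoints against your AEM version's Package Manager documentation before automating anything:

```python
# Sketch: backing up specific content via the AEM Package Manager HTTP API.
# AEM_HOST, group, and package names below are placeholder assumptions.

AEM_HOST = "http://localhost:4502"  # assumption: default author instance

def package_path(group: str, name: str) -> str:
    """JCR path of a package under /etc/packages."""
    return f"/etc/packages/{group}/{name}.zip"

def build_command_url(group: str, name: str) -> str:
    """URL that (re)builds an existing package when POSTed with ?cmd=build."""
    return f"{AEM_HOST}/crx/packmgr/service/.json{package_path(group, name)}?cmd=build"

def download_url(group: str, name: str) -> str:
    """URL to download the built package zip afterwards."""
    return f"{AEM_HOST}{package_path(group, name)}"

# Usage idea (not executed here): POST build_command_url(...) with admin
# credentials, then GET download_url(...) and move the zip off the server.
```

A scheduled job can rebuild and download such packages nightly, keeping only the content you care about instead of the whole repository.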
You can learn more here: https://experienceleague.adobe.com/en/docs/experience-manager-65/content/sites/administering/content...
Hope this helps.
Hi @varaande
You can look into using the AEM Online Backup for this. Here is the Adobe documentation and a helpful article on the topic -
https://medium.com/vrt-digital-studio/consistent-aem-backups-ac8a49d6701e
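The online backup can also be triggered from a script through the Felix JMX console. In the sketch below, the MBean name and the `startBackup` operation are assumptions based on what the JMX console typically exposes on a 6.5 author; check `/system/console/jmx` on your own instance before relying on them:

```python
# Sketch: URL for invoking the repository's online backup via the JMX console.
# The MBean name and operation signature are ASSUMPTIONS - verify them at
# /system/console/jmx on your instance first.
from urllib.parse import quote

AEM_HOST = "http://localhost:4502"  # assumption: default author instance

def start_backup_url() -> str:
    """JMX console URL for a startBackup(String target) POST invocation."""
    mbean = quote("com.adobe.granite:type=Repository", safe="")
    return f"{AEM_HOST}/system/console/jmx/{mbean}/op/startBackup/java.lang.String"

# Usage idea (not executed here): POST this URL with admin credentials and a
# target file name form field to kick off an online backup from cron.
```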
Hope this helps!
Thanks
Narendra
A lot of this will depend on how your server architecture is set up and what tools you have available. But at my previous company we used a backup strategy similar to what you mentioned. Every 3 hours we would take an online backup using the documentation here https://experienceleague.adobe.com/en/docs/experience-manager-65/content/sites/administering/operati...
Then every night we would copy crx-quickstart and everything else over to a separate backups mount on the same server. Because it was the daily backup, it didn't need the same restoration timeline as the online backup, so we could use cheaper but slower-to-recover NFS storage for the backup mount. Depending on how many days of backups you want to keep, you can provision that many mounts on the server. One note: we had to make sure we had three times the size of the repository on the backup mount (so if your repository is 300 GB, your backup mount should be about 1 TB), because while copying the current repository to the backups mount there are briefly two copies of the repository (600 GB used) before you delete the previous day's backup.
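The nightly copy-and-rotate step above can be sketched roughly as follows. The paths, retention count, and the headroom check are placeholders illustrating the idea (two full copies coexist during rotation, hence the free-space guard), not a production script:

```python
# Sketch of a nightly copy-and-rotate backup of crx-quickstart.
# All paths and the retention count are placeholder assumptions.
import shutil
from datetime import date
from pathlib import Path

def rotate_backup(repo: Path, backup_mount: Path, keep_days: int = 3) -> Path:
    """Copy the repository to a dated folder, then prune the oldest copies."""
    repo_size = sum(f.stat().st_size for f in repo.rglob("*") if f.is_file())
    free = shutil.disk_usage(backup_mount).free
    # Two full copies coexist until the old one is pruned, so check headroom.
    if free < 2 * repo_size:
        raise RuntimeError("backup mount too small: need 2x repository size free")

    dest = backup_mount / f"crx-quickstart-{date.today().isoformat()}"
    shutil.copytree(repo, dest, dirs_exist_ok=True)

    # Keep only the newest `keep_days` dated copies (ISO dates sort correctly).
    copies = sorted(backup_mount.glob("crx-quickstart-*"))
    for old in copies[:-keep_days]:
        shutil.rmtree(old)
    return dest
```

Run from cron after stopping (or while quiescing) writes, this reproduces the "copy, then delete yesterday's" rotation described above.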
In addition to that, we also had redundancy in a different data center where if there was a disaster we could recover a snapshot of the server into a newly stood up server (using automated chef scripts). So we had online incremental backups, a daily backup of everything on the same server, and then server snapshots which could be recovered in case of a total disaster.
Did you find the suggestions from users helpful? Please let us know if more information is required. Otherwise, please mark the answer as correct for posterity. If you have found a solution yourself, please share it with the community.
cc: @kautuk_sahni