
Using replication agents or a similar mechanism to push to S3


Level 1

Is it possible to use the replication agents, or a similar mechanism, on an author instance to distribute content to an S3 bucket? I don't intend to use S3 as the datastore for the entire instance; I only need certain content directories. I'd like a push mechanism in place rather than a side-loaded script that exports data from my AEM instance and then pushes it downstream to S3.

 

Apologies for the vague nature of my query; I can only provide so much information due to the sensitive nature of the project.

 

Additional info: AEM 6.5 SP18, on-premise installation on RHEL


1 Reply


Community Advisor

Hi @pronic45,

IMO, you won’t be able to use the OOTB replication agents directly to push to S3 - they’re really meant for AEM-to-AEM communication. There isn’t a native S3 transport handler that you can just configure.

That said, there are a couple of ways I’d approach it depending on how tightly you want to integrate with AEM’s replication framework:

  • Custom TransportHandler -> you could extend the replication agent mechanism by writing a custom transport that uses the AWS SDK (or REST API) to drop content into your S3 bucket. This way you still get replication queues, retries, and can limit it by path (say only /content/dam/myproject). There's a rough sketch of this just after this list.

  • Workflow step triggered by activation -> probably the simpler option IMO. You add a workflow process that listens to replication/activation events and then pushes the asset or JSON extract to S3. This is clean if you only care about specific directories.

  • Other routes -> Sling Distribution or ACS Commons could also be extended, but in practice the two options above are what I’ve seen used most often.
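
To make the TransportHandler option concrete, here is a minimal sketch, assuming the AWS SDK for Java v2 is available in the instance and the agent's Transport URI is set to something like s3://my-project-bucket (the bucket name, package name, and the "s3://" convention are placeholders, not an OOTB contract):

```java
package com.example.replication; // hypothetical package

import java.io.InputStream;

import org.osgi.service.component.annotations.Component;

import com.day.cq.replication.AgentConfig;
import com.day.cq.replication.ReplicationActionType;
import com.day.cq.replication.ReplicationContent;
import com.day.cq.replication.ReplicationException;
import com.day.cq.replication.ReplicationResult;
import com.day.cq.replication.ReplicationTransaction;
import com.day.cq.replication.TransportContext;
import com.day.cq.replication.TransportHandler;

import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

/**
 * Sketch of a custom transport handler that claims replication agents whose
 * Transport URI starts with "s3://" and writes the serialized content to a bucket.
 */
@Component(service = TransportHandler.class)
public class S3TransportHandler implements TransportHandler {

    // Credentials come from the default AWS provider chain (instance profile, env vars, etc.)
    private final S3Client s3 = S3Client.create();

    @Override
    public boolean canHandle(AgentConfig config) {
        String uri = config != null ? config.getTransportURI() : null;
        return uri != null && uri.startsWith("s3://");
    }

    @Override
    public ReplicationResult deliver(TransportContext ctx, ReplicationTransaction tx)
            throws ReplicationException {
        String bucket = ctx.getConfig().getTransportURI().substring("s3://".length());
        String path = tx.getAction().getPath();                  // e.g. /content/dam/myproject/asset.jpg
        String key = path.startsWith("/") ? path.substring(1) : path;

        if (tx.getAction().getType() != ReplicationActionType.ACTIVATE) {
            // Deactivations/deletes would map to s3.deleteObject(...) - omitted in this sketch
            return ReplicationResult.OK;
        }

        ReplicationContent content = tx.getContent();
        try (InputStream in = content.getInputStream()) {
            s3.putObject(PutObjectRequest.builder()
                            .bucket(bucket)
                            .key(key)
                            .contentType(content.getContentType())
                            .build(),
                    RequestBody.fromInputStream(in, content.getContentLength()));
            return ReplicationResult.OK;
        } catch (Exception e) {
            // Throwing keeps the item in the replication queue so the agent retries
            throw new ReplicationException("S3 upload failed for " + path, e);
        }
    }
}
```

With that registered, you create a normal replication agent on Author, point its Transport URI at the bucket, and filter it to the relevant paths; the queueing, retry and logging behaviour comes from the standard agent. One caveat: the ReplicationContent you get is whatever the agent's serialization type produces (Durbo by default), so if you want raw asset binaries in S3 you'd either plug in a different content builder or read the binary from the repository inside the handler.
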

Personally, I’d lean toward the workflow step if you just need certain folders/files to go to S3, since it’s easier to maintain. If you need proper queuing/retry semantics, then implementing a custom TransportHandler for replication is more robust.
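
For the workflow-step option, the process implementation is essentially a Granite WorkflowProcess that reads the payload path and streams the original rendition to S3. Again, this is only a sketch under assumptions: the bucket name, package, and process label are placeholders, and you'd trigger it from a launcher (or your activation workflow) restricted to the folders you care about:

```java
package com.example.workflow; // hypothetical package

import java.io.InputStream;

import org.apache.sling.api.resource.Resource;
import org.apache.sling.api.resource.ResourceResolver;
import org.osgi.service.component.annotations.Component;

import com.adobe.granite.workflow.WorkflowException;
import com.adobe.granite.workflow.WorkflowSession;
import com.adobe.granite.workflow.exec.WorkItem;
import com.adobe.granite.workflow.exec.WorkflowProcess;
import com.adobe.granite.workflow.metadata.MetaDataMap;
import com.day.cq.dam.api.Asset;
import com.day.cq.dam.api.Rendition;

import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

/**
 * Sketch of a workflow process step that copies the original rendition of a
 * DAM asset (the workflow payload) into an S3 bucket.
 */
@Component(service = WorkflowProcess.class,
           property = {"process.label=Push asset to S3 (sketch)"})
public class PushToS3Process implements WorkflowProcess {

    private static final String BUCKET = "my-project-bucket"; // assumption - externalize via OSGi config

    private final S3Client s3 = S3Client.create();

    @Override
    public void execute(WorkItem item, WorkflowSession session, MetaDataMap args)
            throws WorkflowException {
        String path = item.getWorkflowData().getPayload().toString(); // JCR_PATH payload
        ResourceResolver resolver = session.adaptTo(ResourceResolver.class);
        Resource resource = resolver.getResource(path);
        Asset asset = resource != null ? resource.adaptTo(Asset.class) : null;
        if (asset == null) {
            return; // not a DAM asset - nothing to push in this sketch
        }

        Rendition original = asset.getOriginal();
        try (InputStream in = original.getStream()) {
            s3.putObject(PutObjectRequest.builder()
                            .bucket(BUCKET)
                            .key(path.substring(1))        // reuse the content path as the object key
                            .contentType(asset.getMimeType())
                            .build(),
                    RequestBody.fromInputStream(in, original.getSize()));
        } catch (Exception e) {
            throw new WorkflowException("S3 upload failed for " + path, e);
        }
    }
}
```
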

Key things to watch for: large binary handling (multi-part uploads), IAM permissions (bucket-scoped), and making sure you filter the right paths so you don’t overload S3.
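
On the large-binary point: the AWS SDK v2 transfer manager handles multi-part uploads for you once an object is big enough, so you don't need to hand-roll the part logic. A short sketch, with made-up bucket, key and file names:

```java
import java.nio.file.Path;
import java.nio.file.Paths;

import software.amazon.awssdk.transfer.s3.S3TransferManager;
import software.amazon.awssdk.transfer.s3.model.FileUpload;
import software.amazon.awssdk.transfer.s3.model.UploadFileRequest;

public class LargeBinaryUpload {

    public static void main(String[] args) {
        Path source = Paths.get("/tmp/large-video.mp4"); // e.g. a spooled rendition

        // The transfer manager splits large objects into multi-part uploads automatically.
        try (S3TransferManager tm = S3TransferManager.create()) {
            FileUpload upload = tm.uploadFile(UploadFileRequest.builder()
                    .putObjectRequest(req -> req
                            .bucket("my-project-bucket")                   // placeholder
                            .key("content/dam/myproject/large-video.mp4")) // placeholder
                    .source(source)
                    .build());
            upload.completionFuture().join(); // waits for all parts to finish
        }
    }
}
```
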

So, short answer: not OOTB, but yes - with a little customization you can definitely have a push mechanism from Author -> S3 without resorting to a side-loaded script.


Santosh Sai

AEM Blogs | LinkedIn