Hi All,
We are working on using AEM with Amazon S3. In the normal scenario, AEM provides a connector to communicate with S3; all we need to do is change the configuration.
Our case: Amazon S3 stores data in a flat directory structure, but we want to create a new folder every time a folder reaches a particular object count.
Example: suppose we want to store 5 objects and we have a restriction of 3 objects per folder. After 3 objects, the next object should be saved in a new folder.
Is there any way to do this? Thanks in advance.
Hi Devendra,
the connector is not meant to be extended. As such, there does not appear to be any standard/official approach for you to take. You are left with writing your own implementation. You would either need to get in touch with Adobe Consulting to discuss this, or a better approach might be to contact Daycare (Adobe support) with your use case; you may be able to buy a paid-for feature pack, depending on whether it is possible.
Regards,
Opkar
I have asked internal experts to have a look at this one.
~kautuk
Hi,
why do you need to change the behaviour? The connector should be transparent to you. You control how the folders are created in AEM; the connector determines the optimum strategy for how to create the content in S3.
Perhaps you can give more details on the use case here. Is there a limitation you are trying to get around, or do you feel the connector does not save in an optimal way?
Regards,
Opkar
We want to write AEM binaries into an in-house S3 (an exact replica of Amazon S3). However, our in-house S3 is limited to 64K objects per directory (prefix), which means we have a 64K object-count limit per folder. So whenever we write our binaries into S3, we want to split them across three subfolders. We could not see how to achieve this with the Adobe S3 connector, but we are keen to understand whether it is possible. If the S3 connector is not suitable, we also thought of developing a custom workflow that listens for new binary data and pushes it to S3 into three subfolders that we can programmatically set.
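If a custom workflow were pursued, the fan-out into a fixed number of subfolders could be as simple as hashing the object name before the upload. A minimal sketch of that idea, assuming a hypothetical key-naming step applied per binary (the `folder-N` prefix names and the fan-out of 3 are illustrative choices, not anything the Adobe connector provides):

```java
public class FanOutKey {
    // Illustrative only: map each binary's name to one of three fixed
    // subfolder prefixes so no single prefix accumulates all objects.
    static final int FAN_OUT = 3; // assumed from the 3-subfolder requirement

    static String prefixedKey(String objectName) {
        // Math.floorMod keeps the index non-negative even when hashCode() < 0.
        int folder = Math.floorMod(objectName.hashCode(), FAN_OUT);
        return "folder-" + folder + "/" + objectName;
    }

    public static void main(String[] args) {
        // The same name always lands in the same subfolder, so reads
        // can recompute the prefix instead of storing a lookup table.
        System.out.println(prefixedKey("asset-0001.jpg"));
        System.out.println(prefixedKey("asset-0002.jpg"));
    }
}
```

Note the trade-off: a fixed fan-out of 3 only divides the limit by 3, so a hierarchical scheme (as discussed later in the thread) scales further.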
Hi Devendra
Adding a couple more comments from other experts on top of Opkar's (AEM expert):
1. You should not care about how the objects are structured in S3. That is an implementation detail. Implementing your own S3 connector (or your own DataStore in general) is strongly discouraged, although it is possible.
2. The requirement calls for project-level tweaking. All S3 actions are handled by the S3 connector bits released by the product team, and the S3 object IDs are based on a hash of the binary. Is there any specific reason why you need folders in S3? S3 folders are just extra metadata at the S3 object level.
~kautuk
Hi Kautuksahni,
I agree with you. However, as previously stated, our in-house S3 has a limit on how many objects can be stored in a folder. That is what we are trying to get around.
Example: the provided implementation seems to store the data in a flat directory structure. Because of our limit, we would need to create additional directories, perhaps in a similar way to the file system store, where a file with hash f00886f74ecd85bdd1ef27055b0aea529ce5200f is stored as f0/08/86/f00886f74ecd85bdd1ef27055b0aea529ce5200f.
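A hierarchical layout like the one described above could be sketched as follows. This is an illustration of the idea only, not the connector's actual behaviour: the 2-character, 3-level split is assumed from the f0/08/86/... example, and since the segments are hex, each directory level holds at most 256 entries, comfortably under a 64K-per-prefix limit.

```java
import java.security.MessageDigest;

public class ShardedKey {
    // Turn a hex content hash into a 3-level sharded key, mirroring the
    // f0/08/86/f00886... layout of the file system data store example.
    static String shardedKey(String hexHash) {
        return hexHash.substring(0, 2) + "/"
             + hexHash.substring(2, 4) + "/"
             + hexHash.substring(4, 6) + "/"
             + hexHash;
    }

    // SHA-1 of the binary's bytes, hex-encoded. The connector's real
    // ID scheme may differ; this just mirrors the hash-based IDs
    // mentioned earlier in the thread.
    static String sha1Hex(byte[] data) throws Exception {
        StringBuilder sb = new StringBuilder();
        for (byte b : MessageDigest.getInstance("SHA-1").digest(data)) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        String hash = "f00886f74ecd85bdd1ef27055b0aea529ce5200f";
        System.out.println(shardedKey(hash));
        // prints f0/08/86/f00886f74ecd85bdd1ef27055b0aea529ce5200f
    }
}
```

Because the prefix is derived from the hash itself, any client holding the hash can reconstruct the full key without a lookup table.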
Hi Devendra,
the advice here is that the AEM S3 connector cannot be customised, so your approach is not possible with OOTB functionality. If you go down this route, you would be writing your own S3 connector and data store implementation, which is discouraged. I understand you are looking for advice on how to use the S3 connector and what is possible, but as you have seen, very few AEM customers ever go down to this level of detail, so this is the wrong audience for this type of question on S3. It is not that we are being unhelpful; rather, the knowledge to answer such an edge-case question is not here. I'd recommend talking to AWS to get their view on your use case, if you must go down this route.
Regards,
Opkar
Hi Opkar,
Let me put my question more clearly. We have our own cloud data store which uses the same API as Amazon S3; apart from the API, our cloud system is different. There are lots of restrictions, which is why we need to write a connector for our cloud storage. Is there anything Adobe can help with? We are not sure how to read the binaries from AEM, and we also need to take care of other tasks such as caching, garbage collection, etc. Could you please tell me the feasibility of this approach?
Regards,
Devendra