Adobe Developers Live 2021, Session | Asset Bulk Ingestion


Session Details

This session introduces the new asset bulk ingestion feature in AEM as a Cloud Service and covers its scalability and performance.

Experience League Session Recording 

Session Schedule

8-Feb, 10:45 AM - 11:15 AM PST

Speaker(s)

Kaushal Mall & Jun Zhang

Full Schedule

Check Here

Q&A

Please use this thread to ask questions related to this session. Click here for the session slides.

Session FAQs

Q. Will this only work for photo assets and documents, or will it also work for content fragments, which are also assets?

A. This will import all assets in the source, but not Content Fragments, because there is no practical way to put a CF in your source S3/Azure location.

Q. Is this an enhanced version of the S3 ingestor that was used earlier to ingest assets from an AWS S3 bucket?

A. I am not sure which earlier version of the S3 ingestor is being referenced here, but this is not an enhanced version of any existing tool; it was written from scratch.

Q. Is the tool only for a one-off import, or can it be scheduled to run periodically to import the delta?

A. The scheduling feature will be added to the product soon.

Q. When the upload happens, does the DAM Update Asset workflow trigger?

A. In Cloud Service there is no DAM Update Asset workflow. Instead, the Asset Compute Service will process the asset, create renditions, etc., and AEM will run the post-processing workflow that you have configured.

Q. What happens if assets/folders with the same name are already present at the destination?

A. If the folder exists, it won't be created again. For existing assets, you control the behavior (Skip, Overwrite, or Create Version).
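
As a rough illustration only (not the tool's actual code; the names and types are placeholders), the three conflict modes behave along these lines:

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    # Minimal stand-in for an asset already present at the destination; illustrative only.
    current: bytes
    versions: list = field(default_factory=list)

def resolve_conflict(existing: Asset, incoming: bytes, mode: str) -> Asset:
    """Sketch of the Skip / Overwrite / Create Version choices for existing assets."""
    if mode == "skip":
        return existing                               # leave the destination asset untouched
    if mode == "overwrite":
        existing.current = incoming                   # replace the binary in place
    elif mode == "create_version":
        existing.versions.append(existing.current)    # snapshot the old binary first
        existing.current = incoming
    else:
        raise ValueError(f"unknown mode: {mode}")
    return existing
```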

Q. Is there an API available, or some way to schedule these imports without having to invoke them through the UI?

A. No APIs are available right now.

Q. What type of target folders will it create? Ordered folders or sling folders? Will a configuration for that be available?

A. Sling folders; no configuration.

Q. When small batch sizes are mentioned, is that dependent on the JVM or heap, or is it count specific?

A. Count specific.

Q. What is the maximum file size that this tool can support when importing from S3?

A. There is really no recommended batch size; as mentioned, we've been able to import ~20,000 assets/hour. The batch size depends on your use case.

Q. Does it also create asset renditions later?

A. This is a full asset ingestor, so it will trigger rendition generation.

Q. Will the Asset Compute Service cause a memory issue if it is not disabled during upload?

A. Asset Compute Service is a microservice, so no.

Q. Is there a plan to make this available for AMS customers?

A. Making the tool available for AMS is currently not on the roadmap. Note that our product posture is cloud first / cloud recommended.

Q. What is the main limiting factor for the scaling/speed of the import?

A. The current rate limit of the Asset Compute Service, roughly 15k assets per hour, is the major factor for import speed. Since processing cannot keep up with the import rate, a large repository will build up a large backlog in the processing queue.
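
To make that bottleneck concrete, here is a back-of-the-envelope calculation (the ~20k/hour import figure and ~15k/hour processing figure are the numbers quoted in this thread; the repository size is just an example):

```python
# Rough backlog estimate: the bulk import outpaces Asset Compute processing.
import_rate = 20_000       # assets/hour ingested by the bulk import tool (quoted above)
processing_rate = 15_000   # assets/hour processed by Asset Compute (quoted rate limit)
total_assets = 100_000     # example repository size, not a figure from the session

import_hours = total_assets / import_rate                          # 5.0 h of ingestion
backlog_at_end = (import_rate - processing_rate) * import_hours    # 25,000 assets queued
drain_hours = backlog_at_end / processing_rate                     # ~1.7 h to catch up

print(f"Ingest finishes in {import_hours:.1f} h with a backlog of "
      f"{backlog_at_end:,.0f} assets; processing catches up ~{drain_hours:.1f} h later.")
```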

Q. Does it work with customer-owned AWS/Azure buckets or only Adobe-provided buckets?

A. Customer-owned buckets.

Q. Will the default metadata schema be used for the imported folders?

A. Yes; or for existing folders, whatever schema is assigned to the tree.

Q. Does the utility/tool retry failed assets? When an asset fails, will it just be skipped and the next asset tried, or will the whole process fail?

A. The transfer itself happens at the block level, and each block transfer is retried several times until it succeeds. If an asset does ultimately fail, the failure will not block the whole process and will be reported in the final job detail. Rerunning the import job in skip mode will import only the failed assets again.
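
A conceptual sketch of that retry behavior (this is not the tool's implementation; the retry count, backoff, and function parameters are placeholders):

```python
import time

MAX_RETRIES = 3  # placeholder; the real per-block retry count is internal to the tool

def transfer_asset(asset, blocks, upload_block):
    """Transfer one asset block by block; retry each block, fail the asset without aborting the job."""
    for block in blocks:
        for attempt in range(1, MAX_RETRIES + 1):
            try:
                upload_block(asset, block)
                break                          # block transferred, move to the next one
            except IOError:
                if attempt == MAX_RETRIES:
                    return False               # this asset fails, but the job keeps going
                time.sleep(2 ** attempt)       # simple backoff before the next retry
    return True

def run_import(assets, get_blocks, upload_block):
    # Failed assets end up in the final job report; rerunning in skip mode
    # re-imports only these, since already-imported assets are skipped.
    return [a for a in assets if not transfer_asset(a, get_blocks(a), upload_block)]
```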

Q. Are there any plans to allow Dropbox or Box as sources in addition to S3/Azure Blob?

A. We have received feedback asking for Dropbox and other file repositories. Currently there are no committed plans.

Q. If there is custom metadata along with the existing assets, can we map the metadata along with the binaries?

A. This feature will be added to the tool soon.

Q. Does the migration report show AEM paths for all assets?

A. Not right now, but running a query over the target folder will get you the list.
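
One way to build that list yourself is a query over the target folder once the import finishes, for example through the QueryBuilder JSON servlet; a minimal sketch (the host, credentials, and target path are assumptions, not values from the session):

```python
import requests

AEM_AUTHOR = "https://author-pXXXX-eYYYY.adobeaemcloud.com"  # placeholder author host
TARGET_FOLDER = "/content/dam/imported"                      # example import target

# QueryBuilder query: every dam:Asset node under the import target folder.
params = {
    "path": TARGET_FOLDER,
    "type": "dam:Asset",
    "p.limit": "-1",           # return all hits
    "p.hits": "selective",
    "p.properties": "jcr:path",
}

resp = requests.get(f"{AEM_AUTHOR}/bin/querybuilder.json",
                    params=params,
                    auth=("user", "password"))  # substitute real credentials or a token
resp.raise_for_status()

for hit in resp.json().get("hits", []):
    print(hit["jcr:path"])
```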

Q. Can this tool be integrated with a workflow for approval and publishing? Also, does it support auto-publishing?

A. The tool does not support publishing of assets today.

Don't forget to register for this session using the registration link shared above.



Kautuk Sahni