
SOLVED

File size


Level 1

Hi there,

I was wondering if I can upload very large files (up to 10 GB) to an AEM system. The files need to be shared with our agencies around the world and should be downloadable through some kind of API called from a Windows application. Thanks!


3 Replies


Employee Advisor

Yes, you can upload files of that size. You should use the AEM Desktop App [0] to upload files that large. To share files with agencies or external users, you should implement an Asset Share portal [1] on an AEM publish instance, or use Adobe Brand Portal [2], which is a cloud-based SaaS solution. Also, before using AEM Assets to process large assets, make sure you are aware of the impact on performance. [3]

[0] - AEM Desktop App Release Notes

[1] - Asset Share Commons

[2] - Overview of AEM Assets Brand Portal

[3]- Assets Performance Guide


Correct answer by
Employee Advisor

You need to configure the maximum size of the buffered image cache for this scenario.

When uploading large numbers of assets to Adobe Experience Manager, reduce the configured maximum size of the buffered image cache to allow for unexpected spikes in memory consumption and to prevent the JVM from failing with OutOfMemoryErrors. For example, consider a system with a maximum heap (-Xmx) of 5 GB, an Oak BlobCache set to 1 GB, and a document cache set to 2 GB. If the buffered image cache were allowed to take up to 1.25 GB, only 0.75 GB (5 − 1 − 2 − 1.25) would be left for unexpected spikes.

Configure the buffered cache size in the OSGi Web Console. At http://host:port/system/console/configMgr/com.day.cq.dam.core.impl.cache.CQBufferedImageCache, set the property cq.dam.image.cache.max.memory in bytes. For example, 1073741824 bytes is 1 GB (1024 x 1024 x 1024).

From AEM 6.1 SP1, if you're using a sling:osgiConfig node for configuring this property, make sure to set the data type to Long. For more details, see CQBufferedImageCache consumes heap during Asset uploads.
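For reference, here is a minimal sketch of such a sling:osgiConfig node deployed under /apps. The /apps/my-project/config path and the 1 GB value are illustrative assumptions; adjust them to your project and sizing, and note the {Long} type hint on the property.

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Sketch: /apps/my-project/config/com.day.cq.dam.core.impl.cache.CQBufferedImageCache/.content.xml -->
    <!-- The /apps path and the 1 GB value are assumptions for illustration. -->
    <jcr:root xmlns:sling="http://sling.apache.org/jcr/sling/1.0"
              xmlns:jcr="http://www.jcp.org/jcr/1.0"
              jcr:primaryType="sling:OsgiConfig"
              cq.dam.image.cache.max.memory="{Long}1073741824"/>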

Also, be aware that uploading very large files is not recommended if you are using cold standby or an S3 data store:

There are two major known issues related to large files in AEM. When files reach sizes greater than 2 GB, cold standby synchronization can run into an out-of-memory situation. In some cases, this prevents standby sync from running; in other cases, it causes the primary instance to crash. This applies to any file in AEM that is larger than 2 GB, including content packages.

Likewise, when files reach 2 GB in size while using a shared S3 data store, it may take some time for the file to be fully persisted from the cache to the file system. As a result, when using binary-less replication, the binary data may not have been persisted before replication completes. This can lead to issues, especially when availability of the data is important, for example in offloading scenarios.


Employee

Kunal23's advice around the Desktop App is incorrect. It is well documented that the Desktop App is NOT suitable for large file transfers.

Secondly, if you're intending to upload assets this large, you should ensure that binary data is being written to a data store and not being stuffed into the TAR files for TarMK. Performance will be greatly impacted in that scenario.

Third, to allow AEM to upload files larger than 2 GB via the web interface, you need to overlay:

  • /libs/dam/gui/content/assets/jcr:content/actions/selection/create/items/fileupload
  • /libs/dam/gui/content/assets/jcr:content/actions/secondary/create/items/fileupload

and set the sizeLimit property to a value larger than 2147483648 bytes (2 GB), as in the sketch below.
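A rough sketch of one of the two overlays, assuming a 10 GB limit to match the original question; the property's data type may be String or Long depending on your AEM version, so match whatever the corresponding /libs node uses, and overlay the second path (actions/secondary) the same way.

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Sketch: /apps/dam/gui/content/assets/jcr:content/actions/selection/create/items/fileupload/.content.xml -->
    <!-- 10737418240 bytes = 10 GB is an assumed limit; repeat for the actions/secondary path. -->
    <jcr:root xmlns:jcr="http://www.jcp.org/jcr/1.0"
              jcr:primaryType="nt:unstructured"
              sizeLimit="{Long}10737418240"/>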

See:

Inappropriate use of AEM Desktop App

AEM Desktop App Best Practices