Hi,
Is there a way to retrieve the asset's metadata properties during a re-upload? Currently, when the same asset is re-uploaded, all the previous metadata properties are lost because new metadata is being written. How can we read the previous metadata before the asset is replaced?
We are using AEM 6.5.
Thank you.
Hi @Divya_T13
Yes, in AEM 6.5, when an asset is re-uploaded (e.g., via the DAM UI or programmatically), the binary is replaced, and by default, associated metadata (stored under jcr:content/metadata) may be overwritten depending on how the re-upload happens.
To read the existing metadata before a re-upload, and optionally preserve or merge it afterwards, you have a few options depending on your use case.
You can hook into the asset update process and read existing metadata before it's lost using a custom listener or workflow.
Use a Custom Sling Event Listener
import com.day.cq.dam.api.Asset;
import com.day.cq.dam.api.AssetManager;
import com.day.cq.dam.api.DamEvent;
import org.apache.sling.api.resource.LoginException;
import org.apache.sling.api.resource.ResourceResolver;
import org.apache.sling.api.resource.ResourceResolverFactory;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;
import org.osgi.service.event.Event;
import org.osgi.service.event.EventConstants;
import org.osgi.service.event.EventHandler;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.Map;

@Component(service = EventHandler.class,
        immediate = true,
        property = {
                EventConstants.EVENT_TOPIC + "=" + DamEvent.EVENT_TOPIC
        })
public class AssetReplaceListener implements EventHandler {

    private static final Logger log = LoggerFactory.getLogger(AssetReplaceListener.class);

    @Reference
    private ResourceResolverFactory resolverFactory;

    @Override
    public void handleEvent(Event event) {
        DamEvent damEvent = DamEvent.fromEvent(event);
        String assetPath = damEvent.getAssetPath();
        // ORIGINAL_UPDATED fires when the asset binary is replaced (re-upload)
        if (DamEvent.Type.ORIGINAL_UPDATED.equals(damEvent.getType())) {
            try (ResourceResolver resolver = resolverFactory.getServiceResourceResolver(null)) {
                AssetManager assetManager = resolver.adaptTo(AssetManager.class);
                Asset asset = assetManager.getAsset(assetPath);
                if (asset != null) {
                    Map<String, Object> metadata = asset.getMetadata();
                    // Save or process metadata before it’s overwritten
                    log.info("Existing metadata for {}: {}", assetPath, metadata);
                    // Optionally save it elsewhere temporarily
                }
            } catch (LoginException e) {
                log.error("Unable to get resolver", e);
            }
        }
    }
}
Create a Custom Workflow Step
If re-upload triggers a DAM Update Asset workflow, you can insert a custom workflow step at the beginning to back up metadata.
Go to AEM Workflow Models.
Copy or edit the DAM Update Asset workflow.
Add a custom process step before metadata extraction:
WorkflowData data = workItem.getWorkflowData();
String path = data.getPayload().toString(); // may be a rendition path; resolve back to the asset path if needed
// "resolver" is a service ResourceResolver obtained from the ResourceResolverFactory
Resource resource = resolver.getResource(path + "/jcr:content/metadata");
ValueMap vm = (resource != null) ? resource.adaptTo(ValueMap.class) : null;
// Store vm in a temporary location (e.g., a jcr:content/metadataBackup node)
If you only want to preserve custom metadata:
Create a filter of properties to retain and re-apply them after the asset update (see the sketch below).
Use the MetadataWritebackProcess to write the old metadata back.
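For illustration, here is a minimal sketch of the retain-and-reapply idea, written as a small custom process step placed at the end of the DAM Update Asset workflow (rather than the writeback process). The property names, the jcr:content/metadataBackup location, and the "datawrite" subservice are assumptions for the example and must be adapted to your project:

import com.adobe.granite.workflow.WorkflowException;
import com.adobe.granite.workflow.WorkflowSession;
import com.adobe.granite.workflow.exec.WorkItem;
import com.adobe.granite.workflow.exec.WorkflowProcess;
import com.adobe.granite.workflow.metadata.MetaDataMap;
import org.apache.sling.api.resource.*;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.Collections;

@Component(service = WorkflowProcess.class,
        property = {"process.label=Re-apply Retained Metadata"})
public class ReapplyRetainedMetadataStep implements WorkflowProcess {

    private static final Logger log = LoggerFactory.getLogger(ReapplyRetainedMetadataStep.class);

    // Example whitelist of properties to carry over across a re-upload (adjust to your schema)
    private static final String[] RETAINED = {"DocumentId", "dc:description"};

    @Reference
    private ResourceResolverFactory resolverFactory;

    @Override
    public void execute(WorkItem workItem, WorkflowSession session, MetaDataMap args) throws WorkflowException {
        String payload = workItem.getWorkflowData().getPayload().toString();
        // The DAM Update Asset payload is often a rendition path; strip back to the asset path
        String assetPath = payload.contains("/jcr:content")
                ? payload.substring(0, payload.indexOf("/jcr:content")) : payload;
        // "datawrite" is an assumed subservice name; use your own service user mapping
        try (ResourceResolver resolver = resolverFactory.getServiceResourceResolver(
                Collections.singletonMap(ResourceResolverFactory.SUBSERVICE, (Object) "datawrite"))) {
            Resource backup = resolver.getResource(assetPath + "/jcr:content/metadataBackup");
            Resource metadata = resolver.getResource(assetPath + "/jcr:content/metadata");
            if (backup == null || metadata == null) {
                return;
            }
            ValueMap backedUp = backup.getValueMap();
            ModifiableValueMap target = metadata.adaptTo(ModifiableValueMap.class);
            if (target == null) {
                return;
            }
            // Copy only the whitelisted properties from the backup onto the fresh metadata node
            for (String name : RETAINED) {
                if (backedUp.containsKey(name)) {
                    target.put(name, backedUp.get(name));
                }
            }
            resolver.commit();
            log.info("Re-applied retained metadata for {}", assetPath);
        } catch (LoginException | PersistenceException e) {
            log.error("Could not re-apply retained metadata for {}", assetPath, e);
        }
    }
}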
Note that AEM replaces the entire jcr:content/renditions and jcr:content/metadata nodes during a re-upload, so any data that has not been merged or backed up is lost unless it is explicitly handled.
Hope this is helpful. :)
Regards,
Karishma.
Thank you. I am trying the solution with the Event Handler, but it's not working.
By the time this event handler is invoked, the metadata has already been replaced, and I am unable to retrieve the old metadata. Here is my use case: when I initially upload an asset, I set some properties in the metadata. Now, when a user tries to re-upload the same asset and selects "Replace," I want to retrieve that property saved in the metadata and perform some processing. However, with the current solution, the asset is already replaced, and the metadata is overwritten. Below is the code I tried.
import com.marketing.core.workflow.utility.ResourceResolverUtils;
import org.apache.sling.api.resource.*;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;
import org.osgi.service.event.Event;
import org.osgi.service.event.EventHandler;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.day.cq.dam.api.Asset;
import com.day.cq.dam.api.AssetManager;
import com.day.cq.dam.api.DamEvent;
import org.osgi.service.event.EventConstants;
import java.util.Map;
import java.util.Objects;

@Component(service = EventHandler.class,
        immediate = true,
        property = {
                EventConstants.EVENT_TOPIC + "=" + DamEvent.EVENT_TOPIC
        })
public class AssetReplaceListener implements EventHandler {

    @Reference
    private ResourceResolverFactory resolverFactory;

    private static final Logger log = LoggerFactory.getLogger(AssetReplaceListener.class);

    @Override
    public void handleEvent(Event event) {
        log.info("Activating AssetReplaceListener...");
        DamEvent damEvent = DamEvent.fromEvent(event);
        String assetPath = damEvent.getAssetPath();
        // String assetPath = (String) event.getProperty(DamEvent.PATH);
        // if (event.getTopic().equals(DamEventAsset.ASSET_UPDATED)) {
        if (assetPath != null && assetPath.startsWith("/content/dam/sales/sales-and-solution")) {
            if (DamEvent.Type.ORIGINAL_UPDATED.equals(damEvent.getType()) || DamEvent.Type.METADATA_UPDATED.equals(damEvent.getType())) {
                try (ResourceResolver resolver = ResourceResolverUtils.getDefaultServiceResourceResolver(resolverFactory)) {
                    Resource assetResource = resolver.getResource(assetPath);
                    Resource metadataResource = resolver.getResource(assetPath + "/jcr:content/metadata");
                    if (metadataResource != null) {
                        ModifiableValueMap modifiableValueMap = Objects.requireNonNull(metadataResource.adaptTo(ModifiableValueMap.class));
                        String doccID = modifiableValueMap.get("DocumentId", String.class);
                        log.info("Existing metadata for {}: {}", assetPath, doccID);
                    }
                    Asset asset = assetResource.adaptTo(Asset.class);
                    if (asset != null) {
                        Map<String, Object> metadata = asset.getMetadata();
                        log.info("Asset metadata for {}: {}", assetPath, metadata);
                    }
                } catch (LoginException e) {
                    log.error("Unable to get resolver", e);
                }
            }
        }
    }
}
Hi @Divya_T13
In AEM 6.5, by the time `DamEvent.Type.ORIGINAL_UPDATED` or `METADATA_UPDATED` is triggered, the asset’s metadata has already been replaced, making it too late to capture the original values. To preserve existing metadata before it gets overwritten during a re-upload, you need to intercept the process earlier—before the asset is persisted. While customizing the Asset Upload Servlet is one way, a more stable and maintainable approach is to insert a custom step at the beginning of the DAM Update Asset workflow to back up the existing metadata before any update occurs.
Try this custom workflow step:
import com.adobe.granite.workflow.WorkflowException;
import com.adobe.granite.workflow.WorkflowSession;
import com.adobe.granite.workflow.exec.WorkItem;
import com.adobe.granite.workflow.exec.WorkflowProcess;
import com.adobe.granite.workflow.metadata.MetaDataMap;
import org.apache.sling.api.resource.*;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import javax.jcr.Node;
import javax.jcr.RepositoryException;
import java.util.Collections;
import java.util.Map;

@Component(service = WorkflowProcess.class,
        property = {"process.label=Backup Asset Metadata"})
public class BackupMetadataStep implements WorkflowProcess {

    @Reference
    private ResourceResolverFactory resolverFactory;

    private static final Logger log = LoggerFactory.getLogger(BackupMetadataStep.class);

    @Override
    public void execute(WorkItem workItem, WorkflowSession session, MetaDataMap args) throws WorkflowException {
        String payloadPath = workItem.getWorkflowData().getPayload().toString();
        try (ResourceResolver resolver = resolverFactory.getServiceResourceResolver(
                Collections.singletonMap(ResourceResolverFactory.SUBSERVICE, "datawrite"))) {
            Resource metadataRes = resolver.getResource(payloadPath + "/jcr:content/metadata");
            if (metadataRes != null) {
                ValueMap metadataMap = metadataRes.adaptTo(ValueMap.class);
                // Optional: save metadata into a backup node
                Resource parent = resolver.getResource(payloadPath + "/jcr:content");
                if (parent != null && parent.getChild("metadataBackup") == null) {
                    Node parentNode = parent.adaptTo(Node.class);
                    Node backupNode = parentNode.addNode("metadataBackup", "nt:unstructured");
                    for (Map.Entry<String, Object> entry : metadataMap.entrySet()) {
                        if (entry.getValue() instanceof String) {
                            backupNode.setProperty(entry.getKey(), (String) entry.getValue());
                        }
                        // Add more type checks if needed
                    }
                    resolver.commit();
                    log.info("Metadata backed up for {}", payloadPath);
                }
            }
        } catch (LoginException | PersistenceException | RepositoryException e) {
            log.error("Error while backing up metadata for " + payloadPath, e);
        }
    }
}
Hope this is helpful. :)
Regards,
Karishma.
I tried to insert a process step at the beginning of the OOTB Asset Update workflow, but it didn't work. Even after placing the process step at the start of the workflow, the metadata is already overwritten, and I do not have access to the old metadata values. Below is the code I tried (with the property value hardcoded for testing). However, neither the old metadata nor the `metadataBackup` node exists when re-uploading or replacing the asset.
Please let me know if I am doing something wrong.
import com.adobe.granite.workflow.WorkflowException;
import com.adobe.granite.workflow.WorkflowSession;
import com.adobe.granite.workflow.exec.WorkItem;
import com.adobe.granite.workflow.exec.WorkflowProcess;
import com.adobe.granite.workflow.metadata.MetaDataMap;
import com.marketing.core.workflow.utility.ResourceResolverUtils;
import org.apache.sling.api.resource.*;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import javax.jcr.Node;
import java.util.Map;
import java.util.Objects;
@Component(
        service = WorkflowProcess.class,
        property = {
                "process.label=Custom BackupMetadataStep"
        }
)
public class BackupMetadataStep implements WorkflowProcess {

    @Reference
    private ResourceResolverFactory resolverFactory;

    private static final Logger log = LoggerFactory.getLogger(BackupMetadataStep.class);

    @Override
    public void execute(WorkItem workItem, WorkflowSession session, MetaDataMap args) throws WorkflowException {
        String payloadPath = workItem.getWorkflowData().getPayload().toString();
        // Extract the base asset path (e.g., /content/dam/.../test.pdf)
        String baseAssetPath = extractBaseAssetPath(payloadPath);
        try (ResourceResolver resolver = ResourceResolverUtils.getDefaultServiceResourceResolver(resolverFactory)) {
            Resource metadataResource = resolver.getResource(baseAssetPath + "/jcr:content/metadata");
            Resource metadataBackupResource = resolver.getResource(baseAssetPath + "/jcr:content/metadataBackup");
            if (metadataResource != null) {
                ValueMap metadataMap = Objects.requireNonNull(metadataResource.adaptTo(ValueMap.class));
                String docID = Objects.requireNonNull(metadataMap).get("DocumentId", String.class);
                log.info("Existing metadata for {}: {}", payloadPath, docID);
                ValueMap metadataBackUpMap = null;
                if (metadataBackupResource != null) {
                    metadataBackUpMap = Objects.requireNonNull(metadataBackupResource.adaptTo(ValueMap.class));
                    String documentId = Objects.requireNonNull(metadataBackUpMap).get("DocumentId", String.class);
                    log.info("Existing metadataBackup for {}: {}", payloadPath, documentId);
                }
                // Optional: save metadata into a backup node
                Resource parent = resolver.getResource(baseAssetPath + "/jcr:content");
                if (parent != null && parent.getChild("metadataBackup") == null) {
                    Node parentNode = parent.adaptTo(Node.class);
                    Node backupNode = parentNode.addNode("metadataBackup", "nt:unstructured");
                    backupNode.setProperty("DocumentId", "12345");
                    resolver.commit();
                    log.info("Metadata backed up for {}", payloadPath);
                }
            }
        } catch (Exception e) {
            log.error("Error while backing up metadata for " + baseAssetPath, e);
        }
    }
}
Thank you.
Hi @Divya_T13 ,
In AEM 6.5, when an asset is re-uploaded (using the "Replace" option), its binary and metadata are immediately overwritten before the DAM Update Asset workflow is triggered. This means you cannot retrieve the previous metadata within the workflow because it's already lost by that point.
To retrieve the previous metadata before it gets replaced, you need to intercept the upload process earlier — before the asset is saved. The recommended approach is to use a SlingPostProcessor, which lets you run custom logic during the asset upload, before the new metadata is written.
Here’s an example using a SlingPostProcessor to read existing metadata:
import org.apache.sling.api.SlingHttpServletRequest;
import org.apache.sling.api.resource.Resource;
import org.apache.sling.api.resource.ResourceResolver;
import org.apache.sling.api.resource.ValueMap;
import org.apache.sling.servlets.post.Modification;
import org.apache.sling.servlets.post.SlingPostProcessor;
import org.osgi.service.component.annotations.Component;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.List;

@Component(service = SlingPostProcessor.class)
public class AssetReplacePostProcessor implements SlingPostProcessor {

    private static final Logger log = LoggerFactory.getLogger(AssetReplacePostProcessor.class);

    @Override
    public void process(SlingHttpServletRequest request, List<Modification> modifications) throws Exception {
        // Path of the resource the POST is addressed to (the asset being created/updated)
        String path = request.getResource().getPath();
        if (!path.startsWith("/content/dam/")) {
            return;
        }
        // Use the request's resolver directly; it must not be closed here, the request owns it
        ResourceResolver resolver = request.getResourceResolver();
        Resource metadata = resolver.getResource(path + "/jcr:content/metadata");
        if (metadata != null) {
            ValueMap props = metadata.adaptTo(ValueMap.class);
            String docId = props.get("DocumentId", String.class);
            log.info("Pre-upload DocumentId: {}", docId);
            // You can back it up here if needed
        }
    }
}
This allows you to access and preserve existing metadata during a re-upload before it gets replaced.
Thanks & Regards,
Vishal
Hi @VishalKa5
I’ve implemented a SlingPostProcessor to intercept asset uploads, but noticed it doesn’t get triggered when using the "Replace" option. There are also no errors in the logs during the replace action.
Am I missing something, or is this the confirmed behavior for AEM 6.5?
Thank you.
Hi @Divya_T13 ,
In AEM 6.5, when you use the "Replace" option in the Assets UI, it does not trigger a Sling POST, which means SlingPostProcessor will not execute. This is expected behavior and not a bug.
The "Replace" functionality uses a different servlet internally, bypassing the standard upload flow where SlingPostProcessor typically runs. That’s why there are no errors, but also no execution of your custom logic.
Thanks & Regards,
Vishal
Hi @VishalKa5
If this behavior is expected, then what would be the recommended alternative approach to capture existing asset metadata before it is replaced?
Thank you.
Hi @Divya_T13 ,
Yes, the "Replace" option in AEM works differently — it directly replaces the asset without going through the normal upload process, so your custom logic (like SlingPostProcessor) doesn’t get triggered.
Since this is expected behavior, the best way to keep the previous metadata is to capture it before the asset is replaced, using a custom solution.
Set up a listener that watches for changes to assets — especially when the original file (the binary) is being updated.
When a file is about to be replaced, the listener can read and temporarily store the existing metadata — either in a separate place in AEM or in memory.
After the new file is uploaded, your listener can write the old metadata back to the asset, so nothing is lost.
This way, even if AEM skips the normal upload process, you're still able to detect the change and save/restore the metadata.
You could also do this using a custom workflow if you have more control over how assets are uploaded.
Thanks & regards,
Vishal
Hi @VishalKa5
Thank you for all your inputs.
The recommended approach of using listeners to capture and restore metadata won't work effectively, as the metadata would be overridden during the asset replacement process. Currently, we are creating a backup node to store the existing metadata as soon as the asset is uploaded. When the asset is replaced, we retrieve the metadata from the backup node.
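For reference, here is a minimal sketch of that backup-node approach as a DamEvent listener (the "datawrite" subservice and the DocumentId property are specific to our setup; depending on when your property is actually written, the backup half may belong in a later hook such as the end of the DAM Update Asset workflow instead of ASSET_CREATED):

import com.day.cq.dam.api.DamEvent;
import org.apache.sling.api.resource.*;
import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;
import org.osgi.service.event.Event;
import org.osgi.service.event.EventConstants;
import org.osgi.service.event.EventHandler;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.Collections;

@Component(service = EventHandler.class, immediate = true,
        property = {EventConstants.EVENT_TOPIC + "=" + DamEvent.EVENT_TOPIC})
public class MetadataBackupRestoreListener implements EventHandler {

    private static final Logger log = LoggerFactory.getLogger(MetadataBackupRestoreListener.class);

    @Reference
    private ResourceResolverFactory resolverFactory;

    @Override
    public void handleEvent(Event event) {
        DamEvent damEvent = DamEvent.fromEvent(event);
        String assetPath = damEvent.getAssetPath();
        DamEvent.Type type = damEvent.getType();
        if (assetPath == null || !assetPath.startsWith("/content/dam/")
                || (type != DamEvent.Type.ASSET_CREATED && type != DamEvent.Type.ORIGINAL_UPDATED)) {
            return;
        }
        // "datawrite" is our service user mapping; replace with your own subservice name
        try (ResourceResolver resolver = resolverFactory.getServiceResourceResolver(
                Collections.singletonMap(ResourceResolverFactory.SUBSERVICE, (Object) "datawrite"))) {
            Resource content = resolver.getResource(assetPath + "/jcr:content");
            Resource metadata = resolver.getResource(assetPath + "/jcr:content/metadata");
            if (content == null || metadata == null) {
                return;
            }
            if (type == DamEvent.Type.ASSET_CREATED) {
                // Initial upload: copy the property into a sibling backup node, which
                // survives "Replace" (only the renditions/metadata nodes are rewritten)
                String docId = metadata.getValueMap().get("DocumentId", String.class);
                if (docId != null && content.getChild("metadataBackup") == null) {
                    resolver.create(content, "metadataBackup",
                            Collections.<String, Object>singletonMap("DocumentId", docId));
                    resolver.commit();
                }
            } else {
                // ORIGINAL_UPDATED (re-upload/replace): the live metadata is already new,
                // so read the previous value from the backup node and process/restore it
                Resource backup = content.getChild("metadataBackup");
                if (backup != null) {
                    String previousDocId = backup.getValueMap().get("DocumentId", String.class);
                    log.info("Previous DocumentId for {}: {}", assetPath, previousDocId);
                    // ...custom processing or write-back to jcr:content/metadata goes here
                }
            }
        } catch (LoginException | PersistenceException e) {
            log.error("Metadata backup/restore failed for {}", assetPath, e);
        }
    }
}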
@Divya_T13 Just checking in — were you able to resolve your issue? We’d love to hear how things worked out. If the suggestions above helped, marking a response as correct can guide others with similar questions. And if you found another solution, feel free to share it — your insights could really benefit the community. Thanks again for being part of the conversation!
@kautuk_sahni We’re still actively checking the issue on our end, and it’s not yet resolved. The suggestions provided were helpful for understanding the root cause and constraints (especially regarding how the "Replace" functionality bypasses standard hooks), but we’re exploring a few alternate approaches.