SOLVED

AEM 6.4 - External Filedatastore - made simpler

Employee

Hi All,

We have a requirement to set up an external file datastore for binary data in AEM 6.4.3.

From my research, crx2oak or oak-upgrade should be able to address this.

However, when I ran these tools, they failed with an error saying that a record does not exist.

The odd part is that this error is thrown on a brand-new AEM instance as well.
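
The invocation was along these lines (illustrative; the exact options and paths may differ slightly from what I ran):

java -jar oak-upgrade-1.8.5.jar \
    --copy-binaries \
    --skip-checkpoints \
    --datastore /mnt/author6.4_myh_dev \
    crx-quickstart/repository crx-quickstart/repository_new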

29.08.2019 11:28:17.295 [main] *INFO*  org.apache.jackrabbit.oak.upgrade.cli.parser.MigrationOptions - copyVersions parameter set to 1969-12-31

29.08.2019 11:28:17.297 [main] *INFO*  org.apache.jackrabbit.oak.upgrade.cli.parser.MigrationOptions - copyOrphanedVersions parameter set to 1969-12-31

29.08.2019 11:28:17.297 [main] *INFO*  org.apache.jackrabbit.oak.upgrade.cli.parser.MigrationOptions - Checkpoints won't be migrated

29.08.2019 11:28:17.297 [main] *INFO*  org.apache.jackrabbit.oak.upgrade.cli.parser.MigrationOptions - Cache size: 256 MB

29.08.2019 11:28:17.299 [main] *INFO*  org.apache.jackrabbit.oak.upgrade.cli.parser.StoreArguments - Source: SEGMENT_TAR[crx-quickstart/repository]

29.08.2019 11:28:17.301 [main] *INFO*  org.apache.jackrabbit.oak.upgrade.cli.parser.StoreArguments - Destination: SEGMENT_TAR[crx-quickstart/repository_new]

29.08.2019 11:28:17.311 [main] *INFO*  org.apache.jackrabbit.oak.segment.file.FileStore - Creating file store FileStoreBuilder{version=1.8.5, directory=crx-quickstart/repository/segmentstore, blobStore=null, maxFileSize=256, segmentCacheSize=256, stringCacheSize=256, templateCacheSize=64, stringDeduplicationCacheSize=15000, templateDeduplicationCacheSize=3000, nodeDeduplicationCacheSize=1048576, memoryMapping=false, gcOptions=SegmentGCOptions{paused=false, estimationDisabled=false, gcSizeDeltaEstimation=1073741824, retryCount=5, forceTimeout=60, retainedGenerations=2, gcType=FULL}}

29.08.2019 11:28:17.417 [main] *INFO*  org.apache.jackrabbit.oak.segment.file.ReadOnlyFileStore - TarMK ReadOnly opened: crx-quickstart/repository/segmentstore (mmap=false)

29.08.2019 11:28:17.467 [main] *INFO*  org.apache.jackrabbit.oak.segment.file.ReadOnlyFileStore - TarMK closed: crx-quickstart/repository/segmentstore

29.08.2019 11:28:17.477 [main] *INFO*  org.apache.jackrabbit.oak.upgrade.cli.parser.DatastoreArguments - Blobs stored in FileDataStore[crx-quickstart/repository/segmentstore] will be copied to FileDataStore[/mnt/author6.4_myh_dev]

29.08.2019 11:28:17.478 [main] *INFO*  org.apache.jackrabbit.oak.upgrade.cli.parser.DatastoreArguments - Source blob store: FileDataStore[crx-quickstart/repository/segmentstore]

29.08.2019 11:28:17.483 [main] *INFO*  org.apache.jackrabbit.oak.segment.file.FileStore - Creating file store FileStoreBuilder{version=1.8.5, directory=crx-quickstart/repository/segmentstore, blobStore=DataStore backed BlobStore [org.apache.jackrabbit.oak.plugins.blob.datastore.OakFileDataStore], maxFileSize=256, segmentCacheSize=256, stringCacheSize=256, templateCacheSize=64, stringDeduplicationCacheSize=15000, templateDeduplicationCacheSize=3000, nodeDeduplicationCacheSize=1048576, memoryMapping=true, gcOptions=SegmentGCOptions{paused=false, estimationDisabled=false, gcSizeDeltaEstimation=1073741824, retryCount=5, forceTimeout=60, retainedGenerations=2, gcType=FULL}}

29.08.2019 11:28:17.489 [main] *INFO*  org.apache.jackrabbit.oak.segment.file.ReadOnlyFileStore - TarMK ReadOnly opened: crx-quickstart/repository/segmentstore (mmap=true)

29.08.2019 11:28:17.491 [main] *INFO*  org.apache.jackrabbit.oak.segment.SegmentNodeStore$SegmentNodeStoreBuilder - Creating segment node store SegmentNodeStoreBuilder{blobStore=DataStore backed BlobStore [org.apache.jackrabbit.oak.plugins.blob.datastore.OakFileDataStore]}

29.08.2019 11:28:17.495 [main] *INFO*  org.apache.jackrabbit.oak.segment.scheduler.LockBasedScheduler - Initializing SegmentNodeStore with the commitFairLock option enabled.

29.08.2019 11:28:17.502 [main] *INFO*  org.apache.jackrabbit.oak.upgrade.cli.parser.DatastoreArguments - Destination blob store: FileDataStore[/mnt/author6.4_myh_dev]

29.08.2019 11:28:17.505 [main] *INFO*  org.apache.jackrabbit.oak.segment.file.FileStore - Creating file store FileStoreBuilder{version=1.8.5, directory=crx-quickstart/repository_new/segmentstore, blobStore=DataStore backed BlobStore [org.apache.jackrabbit.oak.plugins.blob.datastore.OakFileDataStore], maxFileSize=256, segmentCacheSize=256, stringCacheSize=256, templateCacheSize=64, stringDeduplicationCacheSize=15000, templateDeduplicationCacheSize=3000, nodeDeduplicationCacheSize=1048576, memoryMapping=true, gcOptions=SegmentGCOptions{paused=false, estimationDisabled=false, gcSizeDeltaEstimation=1073741824, retryCount=5, forceTimeout=60, retainedGenerations=2, gcType=FULL}}

29.08.2019 11:28:17.523 [main] *INFO*  org.apache.jackrabbit.oak.segment.file.FileStore - TarMK opened at crx-quickstart/repository_new/segmentstore, mmap=true, size=1.6 MB (1608192 bytes)

29.08.2019 11:28:17.526 [main] *INFO*  org.apache.jackrabbit.oak.segment.SegmentNodeStore$SegmentNodeStoreBuilder - Creating segment node store SegmentNodeStoreBuilder{blobStore=DataStore backed BlobStore [org.apache.jackrabbit.oak.plugins.blob.datastore.OakFileDataStore]}

29.08.2019 11:28:17.526 [main] *INFO*  org.apache.jackrabbit.oak.segment.scheduler.LockBasedScheduler - Initializing SegmentNodeStore with the commitFairLock option enabled.

29.08.2019 11:28:17.537 [main] *INFO*  org.apache.jackrabbit.oak.upgrade.RepositorySidegrade - Checkpoints won't be migrated because of the --skip-checkpoints options

29.08.2019 11:28:17.766 [main] *WARN*  org.apache.jackrabbit.oak.plugins.blob.datastore.DataStoreBlobStore - Error occurred while loading bytes from steam while fetching for id 66f1d39a64998bf2e641765057509c086f2474202d9fd24fa654dabc01aba988#95174

java.util.concurrent.ExecutionException: java.io.IOException: org.apache.jackrabbit.core.data.DataStoreException: Record 66f1d39a64998bf2e641765057509c086f2474202d9fd24fa654dabc01aba988 does not exist

        at org.apache.jackrabbit.oak.cache.CacheLIRS$Segment.load(CacheLIRS.java:1018) ~[oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.cache.CacheLIRS$Segment.get(CacheLIRS.java:975) ~[oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.cache.CacheLIRS.get(CacheLIRS.java:286) ~[oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.plugins.blob.datastore.DataStoreBlobStore.getInputStream(DataStoreBlobStore.java:324) ~[oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.plugins.blob.BlobStoreBlob.getNewStream(BlobStoreBlob.java:47) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.SegmentBlob.getNewStream(SegmentBlob.java:248) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.SegmentBlob.getNewStream(SegmentBlob.java:85) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeBlob(DefaultSegmentWriter.java:583) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeProperty(DefaultSegmentWriter.java:697) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeProperty(DefaultSegmentWriter.java:683) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:901) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:805) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:873) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:805) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:868) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:805) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:868) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:805) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:868) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:805) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:868) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:805) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:873) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:805) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:868) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:805) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:868) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:805) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.access$800(DefaultSegmentWriter.java:258) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$8.execute(DefaultSegmentWriter.java:247) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.SegmentBufferWriterPool.execute(SegmentBufferWriterPool.java:101) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter.writeNode(DefaultSegmentWriter.java:243) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.SegmentWriter.writeNode(SegmentWriter.java:141) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.SegmentNodeBuilder.getNodeState(SegmentNodeBuilder.java:132) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.scheduler.Commit.hasChanges(Commit.java:102) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.scheduler.LockBasedScheduler.execute(LockBasedScheduler.java:258) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.scheduler.LockBasedScheduler.schedule(LockBasedScheduler.java:236) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.segment.SegmentNodeStore.merge(SegmentNodeStore.java:195) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.spi.state.ProxyNodeStore.merge(ProxyNodeStore.java:43) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.upgrade.RepositorySidegrade.migrateWithoutCheckpoints(RepositorySidegrade.java:435) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.upgrade.RepositorySidegrade.copyState(RepositorySidegrade.java:317) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.upgrade.RepositorySidegrade.copy(RepositorySidegrade.java:278) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.upgrade.cli.OakUpgrade.sidegrade(OakUpgrade.java:92) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.upgrade.cli.OakUpgrade.migrate(OakUpgrade.java:78) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.upgrade.cli.OakUpgrade.migrate(OakUpgrade.java:67) [oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.upgrade.cli.OakUpgrade.main(OakUpgrade.java:48) [oak-upgrade-1.8.5.jar:1.8.5]

Caused by: java.io.IOException: org.apache.jackrabbit.core.data.DataStoreException: Record 66f1d39a64998bf2e641765057509c086f2474202d9fd24fa654dabc01aba988 does not exist

        at org.apache.jackrabbit.oak.plugins.blob.datastore.DataStoreBlobStore.getStream(DataStoreBlobStore.java:590) ~[oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.plugins.blob.datastore.DataStoreBlobStore$2.call(DataStoreBlobStore.java:328) ~[oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.plugins.blob.datastore.DataStoreBlobStore$2.call(DataStoreBlobStore.java:324) ~[oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.cache.CacheLIRS$Segment.load(CacheLIRS.java:1014) ~[oak-upgrade-1.8.5.jar:1.8.5]

        ... 45 common frames omitted

Caused by: org.apache.jackrabbit.core.data.DataStoreException: Record 66f1d39a64998bf2e641765057509c086f2474202d9fd24fa654dabc01aba988 does not exist

        at org.apache.jackrabbit.core.data.AbstractDataStore.getRecord(AbstractDataStore.java:59) ~[oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.plugins.blob.datastore.DataStoreBlobStore.getDataRecord(DataStoreBlobStore.java:599) ~[oak-upgrade-1.8.5.jar:1.8.5]

        at org.apache.jackrabbit.oak.plugins.blob.datastore.DataStoreBlobStore.getStream(DataStoreBlobStore.java:584) ~[oak-upgrade-1.8.5.jar:1.8.5]

        ... 48 common frames omitted

29.08.2019 11:28:17.800 [main] *INFO*  org.apache.jackrabbit.oak.segment.file.FileStore - TarMK closed: crx-quickstart/repository_new/segmentstore

29.08.2019 11:28:17.818 [main] *INFO*  org.apache.jackrabbit.oak.segment.file.ReadOnlyFileStore - TarMK closed: crx-quickstart/repository/segmentstore

Exception in thread "main" java.lang.RuntimeException: javax.jcr.RepositoryException: Failed to copy content

        at com.google.common.io.Closer.rethrow(Closer.java:149)

        at org.apache.jackrabbit.oak.upgrade.cli.OakUpgrade.migrate(OakUpgrade.java:81)

        at org.apache.jackrabbit.oak.upgrade.cli.OakUpgrade.migrate(OakUpgrade.java:67)

        at org.apache.jackrabbit.oak.upgrade.cli.OakUpgrade.main(OakUpgrade.java:48)

Caused by: javax.jcr.RepositoryException: Failed to copy content

        at org.apache.jackrabbit.oak.upgrade.RepositorySidegrade.copy(RepositorySidegrade.java:285)

        at org.apache.jackrabbit.oak.upgrade.cli.OakUpgrade.sidegrade(OakUpgrade.java:92)

        at org.apache.jackrabbit.oak.upgrade.cli.OakUpgrade.migrate(OakUpgrade.java:78)

        ... 2 more

Caused by: java.lang.RuntimeException: Error occurred while obtaining InputStream for blobId [66f1d39a64998bf2e641765057509c086f2474202d9fd24fa654dabc01aba988#95174]

        at org.apache.jackrabbit.oak.plugins.blob.BlobStoreBlob.getNewStream(BlobStoreBlob.java:49)

        at org.apache.jackrabbit.oak.segment.SegmentBlob.getNewStream(SegmentBlob.java:248)

        at org.apache.jackrabbit.oak.segment.SegmentBlob.getNewStream(SegmentBlob.java:85)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeBlob(DefaultSegmentWriter.java:583)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeProperty(DefaultSegmentWriter.java:697)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeProperty(DefaultSegmentWriter.java:683)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:901)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:805)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:873)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:805)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:868)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:805)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:868)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:805)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:868)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:805)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:868)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:805)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:873)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:805)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:868)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:805)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNodeUncached(DefaultSegmentWriter.java:868)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.writeNode(DefaultSegmentWriter.java:805)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$SegmentWriteOperation.access$800(DefaultSegmentWriter.java:258)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter$8.execute(DefaultSegmentWriter.java:247)

        at org.apache.jackrabbit.oak.segment.SegmentBufferWriterPool.execute(SegmentBufferWriterPool.java:101)

        at org.apache.jackrabbit.oak.segment.DefaultSegmentWriter.writeNode(DefaultSegmentWriter.java:243)

        at org.apache.jackrabbit.oak.segment.SegmentWriter.writeNode(SegmentWriter.java:141)

        at org.apache.jackrabbit.oak.segment.SegmentNodeBuilder.getNodeState(SegmentNodeBuilder.java:132)

        at org.apache.jackrabbit.oak.segment.scheduler.Commit.hasChanges(Commit.java:102)

        at org.apache.jackrabbit.oak.segment.scheduler.LockBasedScheduler.execute(LockBasedScheduler.java:258)

        at org.apache.jackrabbit.oak.segment.scheduler.LockBasedScheduler.schedule(LockBasedScheduler.java:236)

        at org.apache.jackrabbit.oak.segment.SegmentNodeStore.merge(SegmentNodeStore.java:195)

        at org.apache.jackrabbit.oak.spi.state.ProxyNodeStore.merge(ProxyNodeStore.java:43)

        at org.apache.jackrabbit.oak.upgrade.RepositorySidegrade.migrateWithoutCheckpoints(RepositorySidegrade.java:435)

        at org.apache.jackrabbit.oak.upgrade.RepositorySidegrade.copyState(RepositorySidegrade.java:317)

        at org.apache.jackrabbit.oak.upgrade.RepositorySidegrade.copy(RepositorySidegrade.java:278)

        ... 4 more

Caused by: java.io.IOException: org.apache.jackrabbit.core.data.DataStoreException: Record 66f1d39a64998bf2e641765057509c086f2474202d9fd24fa654dabc01aba988 does not exist

        at org.apache.jackrabbit.oak.plugins.blob.datastore.DataStoreBlobStore.getStream(DataStoreBlobStore.java:590)

        at org.apache.jackrabbit.oak.plugins.blob.datastore.DataStoreBlobStore.getInputStream(DataStoreBlobStore.java:343)

        at org.apache.jackrabbit.oak.plugins.blob.BlobStoreBlob.getNewStream(BlobStoreBlob.java:47)

        ... 41 more

Caused by: org.apache.jackrabbit.core.data.DataStoreException: Record 66f1d39a64998bf2e641765057509c086f2474202d9fd24fa654dabc01aba988 does not exist

        at org.apache.jackrabbit.core.data.AbstractDataStore.getRecord(AbstractDataStore.java:59)

        at org.apache.jackrabbit.oak.plugins.blob.datastore.DataStoreBlobStore.getDataRecord(DataStoreBlobStore.java:599)

        at org.apache.jackrabbit.oak.plugins.blob.datastore.DataStoreBlobStore.getStream(DataStoreBlobStore.java:584)

        ... 43 more

Then I realized that AEM 6.4.3 already has a local file datastore where the binary data is stored.

All it needs is moving that datastore to the external system and configuring the FileDataStore OSGi configuration to point to that external path.

That's it.
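
For example, the FileDataStore OSGi config can be dropped into crx-quickstart/install before starting AEM (the mount path below is ours; adjust as needed):

cat > crx-quickstart/install/org.apache.jackrabbit.oak.plugins.blob.datastore.FileDataStore.config <<'EOF'
path="/mnt/author6.4_myh_dev/datastore"
EOF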

After this change, I can see that any asset upload increases the size on the external system and not in the AEM repository.
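
A quick way to verify (paths as in my setup):

du -sh /mnt/author6.4_myh_dev/datastore crx-quickstart/repository/segmentstore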

Does this sound right?

Jörg Hoh, please confirm.

3 Replies

Employee Advisor

AEM 6.4.3 comes with an external datastore by default (it's the default since AEM 6.3), so you don't have to do anything.

To change the path to the datastore, I would recommend shutting down AEM, moving the datastore to its final location, and replacing the crx-quickstart/repository/datastore directory with a symlink to that location.
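
Roughly like this, using the mount path from your question as an example:

# stop AEM first
mv crx-quickstart/repository/datastore /mnt/author6.4_myh_dev/datastore
ln -s /mnt/author6.4_myh_dev/datastore crx-quickstart/repository/datastore
# start AEM again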

Jörg

Employee

And we need to change the OSGi config of the FileDataStore as well, right?

Correct answer by
Employee Advisor

If you just use the symlink approach, you don't need a changed OSGi configuration.

Jörg