SOLVED

Author instance restarting problems


Former Community Member

Hi,

One of my colleagues tried installing a package (approx. 400 MB) on our shared Author environment. The upload was not successful and the server started behaving erratically, so we decided to restart it. However, the server is not starting now. We are on CQ 5.6.1. When I hit the URL servername:4502 I get "not found", but /system/console comes up fine, although a lot of dependencies are not active and most of the options are missing from the Main and Status tabs.

Also, are there any best practices to follow for installing packages larger than about 500 MB? We have to do that regularly.

Error.log attached.

08.07.2014 14:54:48.093 *INFO* [FelixStartLevel] com.day.crx.sling.server Service [com.adobe.granite.crx.console.CmdWorkspace,135] ServiceEvent REGISTERED
08.07.2014 14:54:48.101 *INFO* [FelixStartLevel] com.day.crx.sling.server Service [com.day.crx.sling.server.impl.DiskBenchmarkPlugin,136] ServiceEvent REGISTERED
08.07.2014 14:54:48.108 *INFO* [FelixStartLevel] com.day.crx.sling.server Service [com.day.crx.sling.server.impl.ProfilerPlugin,137] ServiceEvent REGISTERED
08.07.2014 14:54:48.575 *INFO* [FelixStartLevel] org.apache.jackrabbit.core.RepositoryImpl Starting repository...
08.07.2014 14:54:48.583 *INFO* [FelixStartLevel] org.apache.jackrabbit.core.fs.local.LocalFileSystem LocalFileSystem initialized at path /opt/adobe/cq5/author/crx-quickstart/repository/repository
08.07.2014 14:54:49.789 *INFO* [FelixStartLevel] com.day.crx.core.cluster.ClusterController Node 24b70699-a515-4052-84f6-cfdeb388734e started the master listener, on address: xxx.yyy.zzz/192.133.22.22:8088 force: false
08.07.2014 14:54:49.818 *INFO* [FelixStartLevel] com.day.crx.core.cluster.ClusterController Node 24b70699-a515-4052-84f6-cfdeb388734e started as: master
08.07.2014 14:54:49.942 *INFO* [FelixStartLevel] com.day.crx.persistence.tar.ClusterTarSet activate /opt/adobe/cq5/author/crx-quickstart/repository tarJournal
08.07.2014 14:54:50.000 *INFO* [FelixStartLevel] com.day.crx.persistence.tar.TarSet scanning index /opt/adobe/cq5/author/crx-quickstart/repository/tarJournal/data_00032.tar id:32 length:53113344 append:-1 1560710199
08.07.2014 14:54:50.209 *INFO* [FelixStartLevel] org.apache.jackrabbit.core.fs.local.LocalFileSystem LocalFileSystem initialized at path /opt/adobe/cq5/author/crx-quickstart/repository/version
08.07.2014 14:54:50.228 *INFO* [FelixStartLevel] com.day.crx.persistence.tar.ClusterTarSet activate /opt/adobe/cq5/author/crx-quickstart/repository version
08.07.2014 14:54:50.233 *INFO* [FelixStartLevel] com.day.crx.persistence.tar.TarSet scanning index /opt/adobe/cq5/author/crx-quickstart/repository/version/data_00025.tar id:25 length:62866944 append:-1 335642604
08.07.2014 14:54:50.544 *INFO* [FelixStartLevel] org.apache.jackrabbit.core.RepositoryImpl initializing workspace 'crx.default'...
08.07.2014 14:54:50.544 *INFO* [FelixStartLevel] org.apache.jackrabbit.core.fs.local.LocalFileSystem LocalFileSystem initialized at path /opt/adobe/cq5/author/crx-quickstart/repository/workspaces/crx.default
08.07.2014 14:54:50.546 *INFO* [FelixStartLevel] com.day.crx.persistence.tar.ClusterTarSet activate /opt/adobe/cq5/author/crx-quickstart/repository crx.default
08.07.2014 14:54:50.572 *INFO* [FelixStartLevel] com.day.crx.persistence.tar.TarSet scanning index /opt/adobe/cq5/author/crx-quickstart/repository/workspaces/crx.default/data_00105.tar id:105 length:331587072 append:-1 1182665092
08.07.2014 14:54:52.378 *INFO* [FelixStartLevel] org.apache.jackrabbit.core.query.lucene.SearchIndex Index initialized: /opt/adobe/cq5/author/crx-quickstart/repository/repository/index Version: 3
08.07.2014 14:54:53.954 *INFO* [FelixStartLevel] org.apache.jackrabbit.core.query.lucene.Recovery Found uncommitted redo log. Applying changes now...
08.07.2014 14:55:26.471 *INFO* [Calculate File System Status] com.day.crx.persistence.tar.TarUtils File system status: created 200 files in 12 ms (16666 ops/sec)
08.07.2014 14:55:26.481 *INFO* [Calculate File System Status] com.day.crx.persistence.tar.TarUtils File system status calculated in 22 ms
08.07.2014 14:55:50.707 *INFO* [FelixStartLevel] org.apache.jackrabbit.core.persistence.bundle.AbstractBundlePersistenceManager cachename=crx.defaultBundleCache[ConcurrentCache@4984bbd4], elements=0, usedmemorykb=0, maxmemorykb=8192, access=381, miss=379
08.07.2014 14:56:58.997 *INFO* [FelixStartLevel] org.apache.jackrabbit.core.persistence.bundle.AbstractBundlePersistenceManager cachename=crx.defaultBundleCache[ConcurrentCache@4984bbd4], elements=0, usedmemorykb=0, maxmemorykb=8192, access=1016, miss=1014
08.07.2014 14:58:06.125 *INFO* [FelixStartLevel] org.apache.jackrabbit.core.persistence.bundle.AbstractBundlePersistenceManager cachename=crx.defaultBundleCache[ConcurrentCache@4984bbd4], elements=0, usedmemorykb=0, maxmemorykb=8192, access=1651, miss=1649
08.07.2014 14:59:14.800 *INFO* [FelixStartLevel] org.apache.jackrabbit.core.persistence.bundle.AbstractBundlePersistenceManager cachename=crx.defaultBundleCache[ConcurrentCache@4984bbd4], elements=0, usedmemorykb=0, maxmemorykb=8192, access=2286, miss=2284


5 Replies


Level 6

Please try to rename the index folder under each of the locations below, then try to restart again (a minimal sketch follows the paths):

$crx-quickstart/repository/repository

$crx-quickstart/repository/workspaces/crx.default
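For reference, a minimal sketch of the rename on a Linux install laid out like the one in the error.log above (the install path and the .bak names are assumptions; adjust them to your environment):

# stop the author instance first
/opt/adobe/cq5/author/crx-quickstart/bin/stop

# move the Lucene search indexes aside; they are rebuilt on the next start
mv /opt/adobe/cq5/author/crx-quickstart/repository/repository/index \
   /opt/adobe/cq5/author/crx-quickstart/repository/repository/index.bak
mv /opt/adobe/cq5/author/crx-quickstart/repository/workspaces/crx.default/index \
   /opt/adobe/cq5/author/crx-quickstart/repository/workspaces/crx.default/index.bak

# start again and watch the log while the indexes are re-created
/opt/adobe/cq5/author/crx-quickstart/bin/start
tail -f /opt/adobe/cq5/author/crx-quickstart/logs/error.log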


Correct answer by
Level 10

There are a couple of things. First, what is the content inside the package, and why do you need to upload it regularly? Is it a product importer?

1)   Make sure the temp folder has sufficient space, at least 3 times the size of the package, and check what replication is happening at that point in time (a quick check follows this list).

2)   If you have DAM assets inside the package, or any workflows run based on its content, make sure to offload them.

3)   Have regular maintenance activities on the server, such as purging, optimization, and deletion of packages that are no longer required.
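As a rough illustration of point 1, a quick pre-flight check on the author host (the package path and the 3x rule of thumb are taken from the advice above; adjust the paths to your setup):

# free space on the temp location and the CQ install (adjust if java.io.tmpdir is overridden)
df -h /tmp /opt/adobe/cq5/author

# compare against roughly 3x the package size
PKG=/path/to/migration-content.zip     # hypothetical package path
du -m "$PKG"
echo "plan for at least $(( $(du -m "$PKG" | cut -f1) * 3 )) MB of free space"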


Former Community Member


Thanks for the recommendations.

The reason we have to import content regularly is that we have only just finalized our migration approach, there are still bugs in what has been done so far, and when people change the code and run the migration again it creates packages that sometimes swell up to 700 MB.

I would appreciate it if you could let me know the procedure for uploading large content packages. Is there any other way to upload a package, apart from Package Share and WebDAV, that is also reasonably efficient?


Level 10


Hi Kumar Lal,

The way it currently works with a large package is that the package is copied into /tmp, then copied to the datastore, with further temporary copies of the original along the way, so the package ends up being copied 3 or 4 times before its content actually starts to be extracted. Hence the maintenance overhead adds up. On the other hand, 700 MB should be fine; just make sure to take care of my earlier recommendations. The Package Manager service is the option.
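If it helps, a sketch of going through the Package Manager HTTP service with curl, using the standard /crx/packmgr endpoints (host, credentials, package group and package name are placeholders):

# upload without installing (install=false), so upload and install are separate steps
curl -u admin:admin -F file=@"migration-content.zip" \
     -F name="migration-content" -F force=true -F install=false \
     http://localhost:4502/crx/packmgr/service.jsp

# install it afterwards, once disk space is verified and workflows are offloaded
curl -u admin:admin -X POST \
     "http://localhost:4502/crx/packmgr/service/.json/etc/packages/my_packages/migration-content.zip?cmd=install"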

Anyhow, for migration, what I would personally recommend is to extract the package on your local filesystem and then use vlt import, thereby creating the content in the repository directly.
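A rough sketch of that approach with the FileVault (vlt) command-line tool; the local directory, repository URI and target path are illustrative, so verify the exact argument order with vlt import --help on your version:

# extract the content package locally (a package zip contains jcr_root/ and META-INF/)
unzip migration-content.zip -d migration-content

# import the extracted tree straight into the repository over DavEx
cd migration-content
vlt --credentials admin:admin import -v http://localhost:4502/crx . /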

Thanks,
Sham


Level 10

Hi Kumar,

  Not sure what migration approach you are following, but ideally try to break up the packages if possible; otherwise you can still upload bigger packages, just make sure you have sufficient disk space available.