
SOLVED

Write AEMaaCS logs directly to Elasticsearch


Level 1

Hello,

 

Is it possible to configure an AEM as a Cloud Service (AEMaaCS) project to write logs directly to an external Elasticsearch cluster so we can monitor them in Kibana?

I know that we can download/tail logs via the aio CLI and then use Beats to submit the data to Elasticsearch, but I was asked to investigate whether there is a way to send the data directly. I've checked ACS Commons and saw its SyslogAppender feature, which is not compatible with AEMaaCS, and I believe the entire ch.qos.logback package is marked as deprecated with the message "This internal logback API is not supported by AEM as a Cloud Service." So I'm wondering whether this is even a valid approach. I'm also not sure how much performance overhead sending logs over the network would cost us.


6 Replies


Community Advisor

Hi Rustam,

 

Sending logs to Elastic directly from the JVM process is definitely not a good idea; it may indeed impact many parts of the Adobe application.

 

Adobe I/O gives you the ability to download logs via the download-logs API [1].

On the ELK side, you can use Logstash's scheduled, poll-based inputs to import that data into your index [2]. Once the data is in the index, you can do all of the monitoring you need. This way you rely on Adobe I/O calls that live outside of your JVM process; a minimal sketch of such a setup follows the links below.

 

[1] https://developer.adobe.com/experience-cloud/cloud-manager/reference/api/#tag/Environments/operation...

[2] https://www.elastic.co/guide/en/logstash/current/plugins-inputs-http_poller.html
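
To make this concrete, here is a rough sketch of such a pipeline. It assumes a cron job outside AEM uses the Cloud Manager aio plugin to drop unpacked .log files into a local directory, and a Logstash pipeline then ships them to Elasticsearch. The environment ID, paths, index name, credentials and the grok pattern for the error.log layout are placeholders/assumptions, and the exact aio command and flags should be verified with "aio cloudmanager:environment:download-logs --help". If you prefer to poll the download API directly, the http_poller input from [2] can do it, but you would need to handle token refresh and the redirect to the gzipped file yourself.

```
# Assumed cron entry (runs on a box outside AEM; verify the exact command and flags
# with: aio cloudmanager:environment:download-logs --help):
#   0 * * * * aio cloudmanager:environment:download-logs <ENVIRONMENT_ID> author aemerror 1 --outputDirectory /var/aem-logs

# Logstash pipeline that ships the downloaded files to Elasticsearch.
input {
  file {
    path => "/var/aem-logs/*.log"     # wherever the cron/aio step drops the unpacked files
    mode => "tail"
    start_position => "beginning"
  }
}

filter {
  # Assumed AEM error.log layout: "01.01.2024 12:00:00.000 *INFO* [thread] logger message"
  grok {
    match => { "message" => "^%{DATA:log_timestamp} \*%{LOGLEVEL:level}\* %{GREEDYDATA:msg}" }
    tag_on_failure => ["_aem_grok_miss"]   # lines that don't match are only tagged, not dropped
  }
}

output {
  elasticsearch {
    hosts    => ["https://your-elasticsearch:9200"]   # placeholder host
    index    => "aemaacs-logs-%{+YYYY.MM.dd}"
    user     => "${ES_USER}"
    password => "${ES_PASSWORD}"
  }
}
```

This keeps all of the network I/O in Logstash, outside the AEM JVM, which also addresses the performance concern you mentioned.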

 

Regards,

Peter


Level 1

Hi Peter,

 

Thanks for your response!

Yes, I'm aware of the aio utility to download/tail logs, and my suggestion was to use it to gather the logs and then feed them to Elasticsearch. But I also want to understand whether it's possible to send logs from the AEM application itself, and if not, why. If it were a regular Java app, I would add the logback-elasticsearch-appender dependency, but it depends on ch.qos.logback packages, which are not compatible with AEM as a Cloud Service.

Nevertheless, AEM as a Cloud Service uses something to write logs to a file system, so maybe there is documentation I couldn't find, or something undocumented, that sheds light on this topic.


Correct answer by
Community Advisor

Hi Rustam,

 

We have very limited direct access to the file system in the Cloud Service version of AEM.

 

There is no functionality in AEM as a Cloud Service to ship log files directly to Elastic.

 

There's an option for Splunk (not Elastic) to set up a forwarder, which gets you a bit closer to your instance, but you may still be blocked, depending on your Adobe/Azure security settings, when connecting to your organisation.

 

AFAIK, a cron job driving Logstash together with the aio/Cloud Manager API is the best option I have found so far.

 

Maybe someone else in the community has found a better approach.

 

Regards,

Peter


Level 1

Thanks, Peter.

I'm going to mark this reply as correct, because I don't believe we can get any closer to an answer without insight from Adobe.

Have a nice weekend!

Rustam


Employee Advisor

I am aware of some thinking about supporting log forwarding to systems other than Splunk. You might want to contact Customer Support, who can probably put you in contact with the right Adobe folks.


Level 1

Hello Jorg,

Apologies for the delay in responding.

I reached out to our contact at Adobe via email about the issue and quoted your previous response.

 

Your help is really appreciated.