

Adobe Analytics real time data (as near as possible) on the webpage


Level 2

We have a use case where we want to show a few analytics attributes directly on the front end.

I was checking the AA reporting APIs for this. Can we make direct calls (from the FE) to fetch those attributes? Are there any performance issues?

I don't have much hands-on experience with these APIs, so if anyone has used them for a similar use case, please share your suggestions.

 

Also, https://experienceleague.adobe.com/docs/analytics/analyze/reporting-api.html?lang=en - this link talks about fully processed vs. real-time data. Is there any performance difference between these?

4 Replies


Employee

You need to elaborate your use case better so that people can respond.

 

In general, API access from the client side is not recommended from a security perspective. You can proxy these calls via your AEM layer, but keep the usage charges in mind (see your contract). Hitting the APIs from the webpage will also have an impact on page performance and user experience.

 

If the use case is, let's say, showing the most popular content on the page, I'd design it so that you fetch this data using the reporting APIs on a schedule and cache it (either in AEM or somewhere else). Even every 15 minutes would be fine, as that results in fewer than 100 hits a day. You can then request this data from the cache on the page.
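A rough sketch of that pattern in TypeScript (Node), as a minimal illustration rather than production code. The endpoint path, headers, and report payload follow the Analytics 2.0 API docs but should be verified for your setup; the company ID, report suite, metric/dimension IDs, and the file-based cache are placeholder assumptions.

```typescript
// scheduled-report-cache.ts
// Sketch: pull a report from the Analytics 2.0 API every 15 minutes and cache
// the result server-side so pages never call Adobe Analytics directly.
// Verify the endpoint, globalCompanyId, rsid, metric and dimension IDs against
// your own Adobe project -- the values below are assumptions.
import { writeFile } from "node:fs/promises";

const COMPANY_ID = "yourcompanyid";          // assumption: your global company id
const RSID = "yourreportsuite";              // assumption: your report suite id
const API_KEY = process.env.ADOBE_API_KEY!;  // assumption: client id from your Adobe project
const TOKEN = process.env.ADOBE_TOKEN!;      // assumption: server-to-server access token

async function fetchTopPages(): Promise<unknown> {
  const res = await fetch(`https://analytics.adobe.io/api/${COMPANY_ID}/reports`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${TOKEN}`,
      "x-api-key": API_KEY,
      "x-proxy-global-company-id": COMPANY_ID,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      rsid: RSID,
      dimension: "variables/page",                         // top pages by page name
      metricContainer: { metrics: [{ id: "metrics/pageviews" }] },
      settings: { limit: 10 },
    }),
  });
  if (!res.ok) throw new Error(`Reporting API returned ${res.status}`);
  return res.json();
}

async function refreshCache(): Promise<void> {
  const report = await fetchTopPages();
  // Cache wherever your stack prefers (AEM, Redis, a DB); a JSON file keeps the sketch simple.
  await writeFile("top-pages-cache.json", JSON.stringify(report), "utf8");
  console.log("Top-pages cache refreshed", new Date().toISOString());
}

// ~96 API hits per day at a 15-minute interval; a cron job works equally well.
refreshCache().catch(console.error);
setInterval(() => refreshCache().catch(console.error), 15 * 60 * 1000);
```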

 

Regarding the API types, the key difference is in latency and data quality. If by performance you mean the API response time, I don't think there will be any significant difference.

 

HTH.

 


Community Advisor

I agree with separating the API calls into a backend process that places the data in AEM, a database, etc., for the website to pull from.

 

We used to use the APIs to drive our "Top Stories" containers. We set up the API to pull hourly (you can do it more frequently; it really depends on your use case) and stored the result in our DB (we actually pulled additional information such as category). The containers on the front end would then pull the top 5 stories from all categories on general pages, or the top 5 from the category the visitor was currently looking at (i.e. top 5 news, top 5 sports, top 5 opinion, etc.). Since we pulled the content IDs via the API, the developers could code the containers to pull the images, URLs, and other information live from the DB to show in the container. If story titles changed, the site would always reflect the current values, not the version tracked earlier in the day.
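As a hypothetical illustration of that container logic (the record shape and names are made up for the sketch, not the actual implementation described above), the page containers only ever touch the cached rows, never Adobe Analytics:

```typescript
// top-stories.ts
// Hypothetical shape of the rows the scheduled API pull writes to the DB/cache.
interface CachedStory {
  contentId: string;   // pulled via the reporting API
  category: string;    // e.g. "news", "sports", "opinion"
  pageViews: number;
  title: string;       // joined live from the CMS/DB so renamed titles stay current
  url: string;
  imageUrl: string;
}

// Return the top N stories overall, or for one category. This is what the
// front-end containers call; the reporting API is never hit per page view.
export function topStories(stories: CachedStory[], n = 5, category?: string): CachedStory[] {
  return stories
    .filter((s) => !category || s.category === category)
    .sort((a, b) => b.pageViews - a.pageViews)
    .slice(0, n);
}

// Usage: general pages show the overall top 5; section pages pass their category.
// const top5Sports = topStories(cachedRows, 5, "sports");
```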


Level 1

Hi, 

 

Thank you for the response. 

So, if we make these calls from AEM, will we persist this data on the AEM publisher and sync it across all nodes? And even though we do this in a scheduler, won't it still add load on the publishers and nodes for every market?

Please help me understand whether this is still the right way of doing it.

I was under the impression that, since this is more of a data layer attribute, it would be better for the FE to call the AA real-time reporting API directly. But even with this approach I have the following concerns:

1. On every page load, the FE will make a call to the AA reporting API.

2. When traffic is huge, can AA handle such a large number of requests?

 


Community Advisor

There is no reason that every page should make a reporting call. Your website doesn't even do that for its own content: you will likely have a CDN (content delivery network) and/or one or more cache layers, and none of your website's content makes direct calls for updated content on every request.
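To make that concrete, here is a hedged sketch assuming a Node/Express service sitting in front of the cached file from the earlier example (an AEM Sling servlet or any other backend would play the same role). The response carries a Cache-Control TTL so the CDN and browsers absorb the page traffic, and Adobe Analytics is never called per page view. The route, file name, and TTL are assumptions.

```typescript
// serve-cached-report.ts
// Sketch only: serve the cached report with a Cache-Control TTL so the
// CDN/browser cache absorbs page traffic instead of the reporting API.
import express from "express";
import { readFile } from "node:fs/promises";

const app = express();

app.get("/api/top-pages", async (_req, res) => {
  try {
    const cached = await readFile("top-pages-cache.json", "utf8"); // written by the scheduled job
    // 5-minute TTL here; tune it (5 minutes, 1 hour, ...) to how fresh the widget needs to be.
    res.set("Cache-Control", "public, max-age=300");
    res.type("application/json").send(cached);
  } catch {
    res.status(503).json({ error: "report cache not ready" });
  }
});

app.listen(3000, () => console.log("Cached report endpoint on :3000"));
```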

 

I don't know enough about AEM to know its capabilities, but when we did this, we were making the calls from MSSQL. The API calls were scheduled for every hour, as I said; @himanshu suggested every 15 minutes. You will need to work with your team to determine the best schedule for you.

 

The content will be stored, and it should be pulled when new content gets pulled from your actual web server (i.e. when the various cache layers, CDN, etc. have all expired).

 

Depending on where you need the information to appear, the cache might be set to 5 minutes, or it could be set to 1 hour... I don't know your site.