Level 2
January 5, 2024
Question

Adobe Analytics real time data (as near as possible) on the webpage

  • January 5, 2024
  • 1 reply
  • 1475 views

We have a use case where we want to show a few analytics attributes directly on the frontend.

I was checking the AA reporting APIs for this. Can we make a direct call (from the FE) to fetch those attributes? Are there any performance issues?

I do not have a lot of working experience with these APIs. Could someone please share whether they have used these reports for a similar use case?

 

Also, https://experienceleague.adobe.com/docs/analytics/analyze/reporting-api.html?lang=en - this link talks about fully processed vs. real-time data. Is there any performance difference between the two?


1 reply

Adobe Employee
January 5, 2024

You need to elaborate your use case a bit more so that people can respond.

 

In general, API access from the client side is not recommended from a security perspective. You can proxy these calls via your AEM layer, but the thing to keep in mind is the usage charges (see your contract for details). Hitting the APIs directly from the webpage will also have an impact on page performance and user experience.
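
To illustrate, here is a minimal sketch (Node/TypeScript) of keeping the Reports API call server-side so the access token and API key never reach the browser. The company ID ("examplecompany"), report suite ("examplersid"), environment variable names, and the metric/dimension in the request body are placeholders, not your actual values; adjust them per the Analytics 2.0 Reports API documentation.

```ts
// Sketch only: a server-side call to the Analytics 2.0 Reports API, with
// credentials read from environment variables so they never reach the page.
// Company ID, rsid, dimension, and metric IDs are placeholders.
const REPORTS_URL = "https://analytics.adobe.io/api/examplecompany/reports";

export interface RankedRow {
  page: string;
  pageviews: number;
}

export async function fetchRankedReport(): Promise<RankedRow[]> {
  const response = await fetch(REPORTS_URL, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.ADOBE_ACCESS_TOKEN}`, // server-to-server token
      "x-api-key": process.env.ADOBE_API_KEY ?? "",
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      rsid: "examplersid",
      dimension: "variables/page",
      metricContainer: { metrics: [{ columnId: "0", id: "metrics/pageviews" }] },
      globalFilters: [
        { type: "dateRange", dateRange: "2024-01-04T00:00:00.000/2024-01-05T00:00:00.000" },
      ],
      settings: { limit: 5 },
    }),
  });
  if (!response.ok) {
    throw new Error(`Reports API returned ${response.status}`);
  }
  const report = await response.json();
  // The ranked response contains a rows[] array; keep only what the page needs.
  return report.rows.map((row: { value: string; data: number[] }) => ({
    page: row.value,
    pageviews: row.data[0],
  }));
}
```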

 

If the use case is, let's say, showing the most popular content on the page, I'd design it so that you fetch this data using the reporting APIs on a schedule and cache it (either in AEM or somewhere else). Even a 15-minute interval would be fine, as that results in only about 100 calls a day. You can then request this data from the cache on the page.
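
A minimal sketch of that schedule-and-cache approach, reusing the fetchRankedReport helper from the sketch above (the "./analytics" module path, the route name, and the port are illustrative):

```ts
// Sketch only: refresh the report on a schedule (every 15 minutes is roughly
// 96 calls/day) and let the page read a cheap cached endpoint instead of the
// Analytics API. fetchRankedReport is the helper from the previous sketch.
import express from "express";
import { fetchRankedReport, RankedRow } from "./analytics";

let cachedRows: RankedRow[] = [];
let lastRefreshed = 0;

async function refresh(): Promise<void> {
  cachedRows = await fetchRankedReport();
  lastRefreshed = Date.now();
}

// Refresh once at startup, then on a 15-minute interval.
refresh().catch(console.error);
setInterval(() => refresh().catch(console.error), 15 * 60 * 1000);

const app = express();

// The frontend requests this endpoint instead of hitting the Analytics API.
app.get("/api/top-content", (_req, res) => {
  res.json({ lastRefreshed, rows: cachedRows });
});

app.listen(3000);
```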

 

Regarding the API types, the key difference is latency and data quality. If by performance you mean the API response time, I don't think there will be any significant difference.

 

HTH.

 

Jennifer_Dungan
Community Advisor and Adobe Champion
January 5, 2024

I agree about separating the API calls into a backend process that places the data in AEM, a database, etc., for the website to pull from.

 

We used to use the APIs to drive our "Top Stories" containers. We set up the API to pull hourly (you can pull more frequently if you need to; it really depends on your use case) and stored the result in our DB (we actually pulled additional information such as category). The containers on the front end would then show either the top 5 stories across all categories on general pages, or the top 5 stories from the category the user was currently viewing (i.e. top 5 news, top 5 sports, top 5 opinion, etc.). Since we pulled the content IDs via the API, the developers could code the containers to pull the images, URLs, and other information live from the DB to show in the container. If story titles changed, the site would always reflect the current values, not the version tracked earlier in the day.
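
A rough sketch of that pattern: the hourly job stores only the ranked content IDs and categories returned by the API, and the site joins the live details by ID at render time. The table layout, category handling, and the better-sqlite3 library are illustrative stand-ins for whatever storage is already in place.

```ts
// Sketch only: an hourly job stores ranked content IDs (plus category) from
// the Reports API, and the site joins the live titles, images, and URLs by
// content ID when rendering the containers.
import Database from "better-sqlite3";

const db = new Database("top-stories.db");
db.exec(`
  CREATE TABLE IF NOT EXISTS top_stories (
    content_id TEXT NOT NULL,
    category   TEXT NOT NULL,
    item_rank  INTEGER NOT NULL,
    pulled_at  TEXT NOT NULL
  )
`);

// Hourly job: replace the table contents with the latest ranked rows
// (in practice these come from the Reports API, not from the caller).
export function storeRankedReport(rows: { contentId: string; category: string }[]): void {
  const insert = db.prepare(
    "INSERT INTO top_stories (content_id, category, item_rank, pulled_at) VALUES (?, ?, ?, ?)"
  );
  const replaceAll = db.transaction((latest: typeof rows) => {
    db.prepare("DELETE FROM top_stories").run();
    latest.forEach((row, i) =>
      insert.run(row.contentId, row.category, i + 1, new Date().toISOString())
    );
  });
  replaceAll(rows);
}

// Container query: top 5 overall, or top 5 within the category being viewed.
// The site then looks up the current title/image/URL for each content ID,
// so it always reflects the live values rather than the tracked ones.
export function topStories(category: string | null, limit = 5): { content_id: string }[] {
  const sql = category
    ? "SELECT content_id FROM top_stories WHERE category = ? ORDER BY item_rank LIMIT ?"
    : "SELECT content_id FROM top_stories ORDER BY item_rank LIMIT ?";
  const params = category ? [category, limit] : [limit];
  return db.prepare(sql).all(...params) as { content_id: string }[];
}
```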