Challenges using Adobe Reporting API 2.0
Hello everyone,
We are implementing the Adobe Reporting API 2.0, as version 1.4 is scheduled for deprecation in August 2026. During implementation, we ran into a limitation in the API design that affects our use case.
In Reporting API 2.0, data retrieval is based on dimensions (descriptive attributes such as customer ID, date, page) and metrics (calculated measures such as pageviews and visits). The API always returns aggregated results based on the dimensions provided.
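For context, a single 2.0 Reports API call is shaped roughly like this (a sketch only; the `rsid`, metric IDs, and date range are placeholders, and the exact body follows our reading of the public docs — notably, each request accepts just one `dimension` field, which is what forces the breakdown pattern):

```python
def build_report_request(rsid, dimension, metric_ids, start, end):
    """Build the JSON body for one aggregated report, grouped by a single dimension."""
    return {
        "rsid": rsid,  # placeholder report suite ID
        "globalFilters": [
            {"type": "dateRange", "dateRange": f"{start}/{end}"}
        ],
        "metricContainer": {
            "metrics": [{"columnId": str(i), "id": m} for i, m in enumerate(metric_ids)]
        },
        "dimension": dimension,      # only ONE dimension per request
        "settings": {"limit": 400},  # rows returned per page
    }

body = build_report_request(
    "examplersid",
    "variables/page",
    ["metrics/pageviews", "metrics/visits"],
    "2025-01-01T00:00:00.000",
    "2025-01-02T00:00:00.000",
)
```

Because only one dimension can be specified per request, every additional dimension has to be obtained via a breakdown request against each value of the previous one.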
Our requirement is to ingest historical data at the lowest possible granularity (near event-level) and/or perform daily loads. To achieve this, we need to include multiple dimensions (e.g., date, customer ID, page) in one result set. However, the API only accepts a single dimension per request, so getting all dimensions and metrics into a single table requires nested breakdowns: a separate request for each combination of dimension values.
Example:
- 50 days
- 30 customers
- 10 pages

This would result in 50 × 30 × 10 = 15,000 API requests.
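The arithmetic above can be sketched as follows, using toy stand-in values and the one-request-per-combination model described in this post (no actual API calls are made):

```python
from itertools import product

# Toy stand-ins matching the example: 50 days, 30 customers, 10 pages.
days = [f"day_{i}" for i in range(50)]
customers = [f"cust_{i}" for i in range(30)]
pages = [f"page_{i}" for i in range(10)]

# If every (day, customer, page) combination needs its own filtered
# report request, the request count is the product of the cardinalities.
requests_needed = sum(1 for _ in product(days, customers, pages))
print(requests_needed)  # 50 * 30 * 10 = 15000
```

Real report suites have far higher cardinalities than this toy example, so the request count grows multiplicatively with each dimension added.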
Given this structure, the API does not appear to scale for historical ingestion, and daily loads would also be challenging. The nested breakdown logic significantly increases processing time and complexity, and raises the risk of hitting API rate limits.
We would greatly appreciate any recommendations or best practices for:
- Efficiently ingesting historical data at near-event granularity
- Managing large-scale daily loads without generating excessive API requests
We have heard about Adobe Data Feeds and are waiting for a response from our Adobe team, but we would also like to know whether anyone is using the Adobe Analytics 2.0 API for daily workloads involving multiple dimensions and metrics in a single table.
Thank you in advance for your guidance!