Challenges using Adobe Reporting API 2.0


  • March 16, 2026
  • 1 reply
  • 38 views

Hello everyone,

We are implementing Adobe Reporting API 2.0, since version 1.4 will be deprecated in August 2026. During implementation, we encountered a limitation in the API design that affects our use case.

In Reporting API 2.0, data retrieval is based on dimensions (descriptive attributes such as customer ID, date, page) and metrics (calculated measures such as pageviews and visits). The API always returns aggregated results based on the dimensions provided.

Our requirement is to ingest historical data at the lowest possible granularity (near event-level) and/or to perform daily loads. To achieve this, we would need to include multiple dimensions (e.g., date, customer ID, page). However, the API handles multiple dimensions through nested breakdowns, which means sending a separate request for every combination of dimension items if we want all dimensions and metrics in a single table.

Example:

  • 50 days

  • 30 customers

  • 10 pages

This would result in 50 × 30 × 10 = 15,000 API requests.
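To make the fan-out concrete, here is a small sketch of the arithmetic (the day/customer/page labels are placeholders, not real dimension items; this only models the request count, not actual API calls):

```python
# Rough sketch of how nested breakdowns multiply request volume in
# Reporting API 2.0: flattening three dimensions into one table needs
# one breakdown request per combination of parent dimension items.

days = [f"day-{d}" for d in range(50)]          # 50 days (placeholder labels)
customers = [f"cust-{c}" for c in range(30)]    # 30 customer IDs
pages = [f"/page-{p}" for p in range(10)]       # 10 pages

# One request per (day, customer, page) combination:
requests_needed = len(days) * len(customers) * len(pages)
print(requests_needed)  # 15000
```

Even with modest cardinalities the request count grows multiplicatively, which is the scalability concern described above.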

Given this structure, the API does not appear to scale for historical ingestion, and daily loads would also be challenging. The nested breakdown logic significantly increases processing time, complexity, and the risk of hitting API rate limits.

We would greatly appreciate any recommendations or best practices for:

  1. Efficiently ingesting historical data at near-event granularity

  2. Managing large-scale daily loads without creating excessive API requests

We have heard about Adobe Data Feeds and are waiting for a response from our Adobe team, but we would also like to know whether anyone is using the Analytics 2.0 API for daily workloads involving multiple dimensions and metrics in a single table.

Thank you in advance for your guidance!

    1 reply

    Jennifer_Dungan
    Community Advisor and Adobe Champion
    March 16, 2026

    Yeah, I don’t actually like API 2.0 as much as I liked API 1.4.


    For simple requests, 2.0 is fine… but as you said, the new API isn’t scalable!


    Data Feeds are good, but a very different way to pull data… first off, they are full raw data… so you will have to process the exclusions (exclude_hit) and handle the User and Visit calculations yourself… You will also have to map the “backend numerical event references” to the correct events, etc.
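As a rough sketch of what that post-processing looks like (column order and the event ID mapping here are hypothetical — check them against the column manifest and event lookup file delivered with your own feed):

```python
import csv

# Minimal sketch of post-processing a tab-delimited Data Feed hit file.
# The column list is a hypothetical subset of a real feed manifest.
columns = ["exclude_hit", "post_event_list", "post_pagename", "post_page_url"]

# Hypothetical mapping from backend numeric event IDs to friendly names
# (normally built from the event lookup file shipped with the feed).
event_lookup = {"200": "purchase", "201": "cart_addition"}

def parse_hits(path):
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f, fieldnames=columns, delimiter="\t")
        for row in reader:
            # Excluded rows (bots, filters, etc.) arrive in the raw feed
            # and must be dropped manually.
            if row["exclude_hit"] not in ("", "0"):
                continue
            # post_event_list is a comma-separated list of numeric IDs.
            events = [event_lookup.get(e.strip(), e.strip())
                      for e in row["post_event_list"].split(",") if e.strip()]
            yield row, events
```

The point is that every rule Adobe applies at report time (exclusions, event naming) becomes your code when you consume the raw feed.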


    Some key pointers… make sure you are using “post_***” values wherever possible… any data that gets processed via Processing Rules or VISTA Rules will be set correctly only in the post_ versions of the columns.


    Also note, there is no specific “Page View” event; a page view is inferred from the presence of a post_page_url or post_pagename value.


    The raw data export will provide a table of every hit with the columns you specify… but all of the calculations will be on you.
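For example, deriving the two most basic metrics from raw hits might look like the sketch below (column names assume the standard Clickstream layout — post_pagename, post_page_url, post_visid_high/low, visit_num — so verify them against your feed's manifest):

```python
# Sketch of deriving Page Views and Visits from raw Data Feed rows,
# since the feed contains no precomputed metrics.

def summarize(hits):
    page_views = 0
    visits = set()
    for hit in hits:
        # A page view is inferred from a populated page name/URL --
        # there is no explicit "page view" event column in the feed.
        if hit.get("post_pagename") or hit.get("post_page_url"):
            page_views += 1
        # A visit is identified by the visitor ID pair plus visit number.
        visits.add((hit.get("post_visid_high"),
                    hit.get("post_visid_low"),
                    hit.get("visit_num")))
    return {"page_views": page_views, "visits": len(visits)}
```

Unique-visitor, attribution, and calculated-metric logic would all need similar hand-rolled equivalents.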