SOLVED

Adobe Analytics 2.0 API

Level 1

Hi,

We are interested in using the Adobe Analytics 2.0 API. I have a couple of questions:

1) Are there quotas in terms of how many calls can be made concurrently per user/report suite/org?

2) Is there a standard variables/pageurl dimension across report suites that is accessible from API 2.0? It seems this dimension is not usable from the API.

3) We would like to pull page name and page URL in a single page views call, so the response looks like:

Home Page, https://site.com/, 101

Home Page, https://a.site.com/, 100

Products, https://site.com/products/, 100

It seems like I would need to make a single call to get a list of all the page names and then make a separate breakdown call for each page name by page_url.

Thanks for your help,

Sammy

10 Replies

Employee Advisor

Hi sammyy10113488,

1. The API quota limit is 20,000 calls per hour per Analytics login company. Additionally, there is a throttle limit of around 300 calls per minute.

2. I don't think we have a standard dimension for Page URL available in Adobe Analytics. It's very much possible when the URL is captured in a custom variable.

3. To break down a dimension by another dimension, we will need item IDs. To get the item IDs, you will need to make a call that returns all the values for a dimension along with their itemIds, as in the sketch below.
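
For illustration, here is a rough sketch of that first call against the 2.0 /reports endpoint. The company ID, report suite ID, credentials, and date range below are placeholders, not values from this thread:

import requests

COMPANY_ID = "examplecompany"   # placeholder
RSID = "examplersid"            # placeholder
HEADERS = {
    "Authorization": "Bearer <access_token>",   # placeholder credentials
    "x-api-key": "<client_id>",
    "x-proxy-global-company-id": COMPANY_ID,
    "Content-Type": "application/json",
}

# Rank variables/page by page views; each row in the response carries the
# itemId needed for a later breakdown call.
body = {
    "rsid": RSID,
    "globalFilters": [{
        "type": "dateRange",
        "dateRange": "2019-01-01T00:00:00.000/2019-02-01T00:00:00.000",
    }],
    "metricContainer": {"metrics": [{"id": "metrics/pageviews"}]},
    "dimension": "variables/page",
    "settings": {"limit": 50, "page": 0},
}

resp = requests.post(f"https://analytics.adobe.io/api/{COMPANY_ID}/reports",
                     headers=HEADERS, json=body)
resp.raise_for_status()
for row in resp.json()["rows"]:
    print(row["itemId"], row["value"], row["data"])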

Hope this helps!

Level 2
Hi ishans52004352, I've run two different queries from two different machines (with the same credentials), but I've noticed some decrease in performance. I'm sure I'm not reaching the query limit (judging by the speed of retrieval from the API). Is there any configuration that would let me run the two extractions at the highest throughput, so that neither extraction's performance is compromised?

Level 1

Thank you so much for the response. Is there any way to raise that limit, and what will the response look like when we hit it?

Thanks!

Employee Advisor

There isn't a way to increase the API limit. On hitting the limit, you will see a "too many requests" error.

Correct answer by
Employee

ishans9314858 is correct about the API quota limit of 20K calls per hour per Analytics company; however, that is for the 1.3/1.4 Analytics APIs. For the Analytics 2.0 APIs, the throttle limit is set at 120 calls per minute, per user, regardless of report suite or company. When the throttle limit is crossed, the server returns an HTTP 429 status to the user with the message content: "too many requests". Note that the throttle applies at the API gateway layer.
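
For callers that bump into the throttle, the usual client-side handling is to back off and retry. A minimal sketch in Python; the Retry-After header is an assumption (the gateway may send only the status and message):

import time
import requests

def post_report(url, headers, body, max_retries=5):
    """POST a report request, backing off when the gateway returns HTTP 429."""
    for attempt in range(max_retries):
        resp = requests.post(url, headers=headers, json=body)
        if resp.status_code != 429:
            resp.raise_for_status()
            return resp.json()
        # Throttled: honor Retry-After if present (an assumption), otherwise
        # back off exponentially before trying again.
        time.sleep(float(resp.headers.get("Retry-After", 2 ** attempt)))
    raise RuntimeError(f"still throttled after {max_retries} retries")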

There is also a throttle applied by the underlying reporting engine per report suite, which is independent of any reporting client (e.g., Workspace, Report Builder, API). When that throttle is hit, no errors are returned, but report requests take longer to process. The end result is that report requests across all clients take longer as the load on the report suite increases. Thus, it's possible for Workspace or Report Builder usage to cause API requests to run slower, and for API requests to cause Workspace or Report Builder to run slower, on a given report suite. The Analytics reporting system is a shared, multi-tenant system, and this throttle is designed to prevent reporting activities on any single report suite from consuming too high a percentage of the capacity of a given data center.

The Page variable id is always /variables/page. The Title or Name of the variable can be renamed on a per-report-suite basis, but the id /variables/page shouldn't change. You can use the /dimensions endpoint for a report suite to get the list of dimensions; the page variable appears as in the following example:

{
  "id": "variables/page",
  "title": "Page",
  "name": "Page",
  "type": "string",
  "category": "Content",
  "support": [
    "oberon",
    "dataWarehouse"
  ],
  "pathable": true,
  "segmentable": true,
  "reportable": [
    "oberon"
  ],
  "supportsDataGovernance": true
}
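
For example, you can fetch that list and scan it for the stable id with a simple GET (placeholder company ID, report suite ID, and credentials again; adjust if your response shape differs):

import requests

COMPANY_ID = "examplecompany"   # placeholder
RSID = "examplersid"            # placeholder

resp = requests.get(
    f"https://analytics.adobe.io/api/{COMPANY_ID}/dimensions",
    headers={
        "Authorization": "Bearer <access_token>",
        "x-api-key": "<client_id>",
        "x-proxy-global-company-id": COMPANY_ID,
    },
    params={"rsid": RSID},
)
resp.raise_for_status()

# The id is stable even when the title/name has been customized per suite.
page_dim = next(d for d in resp.json() if d["id"] == "variables/page")
print(page_dim["title"], page_dim["support"])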

ishans9314858 is also correct that you have to make multiple calls in the 2.0 API to do a breakdown.

A minimum of 2 calls is required: one call to get the item IDs of the dimension values and a second call to do the breakdown. This is because a breakdown of Dimension A by Dimension B is actually a report on Dimension B filtered to hits where Dimension A had a specific value (itemId). Multi-level breakdowns across large numbers of dimension items require more calls. For example, breaking down a list of 1000 items two levels deep, retrieving 5 items per first-level breakdown and 10 per second-level breakdown, requires 6001 actual API requests (1 initial call + 1000 first-level calls + 1000 × 5 second-level calls). With a throttle of 120 calls per user per minute, the 1000x5x10 breakdown would take approximately 50 minutes to complete, assuming no other report requests are loading the report suite.
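
To make the two-call flow concrete, here is a sketch of the second (breakdown) call: a metric filter of type "breakdown" carries the itemId returned by the first call. The itemId, the second dimension (variables/evar1), and the credentials are placeholders:

import requests

COMPANY_ID = "examplecompany"   # placeholder
RSID = "examplersid"            # placeholder
HEADERS = {
    "Authorization": "Bearer <access_token>",
    "x-api-key": "<client_id>",
    "x-proxy-global-company-id": COMPANY_ID,
    "Content-Type": "application/json",
}

# Break down one page (identified by its itemId) by a second dimension.
body = {
    "rsid": RSID,
    "globalFilters": [{
        "type": "dateRange",
        "dateRange": "2019-01-01T00:00:00.000/2019-02-01T00:00:00.000",
    }],
    "metricContainer": {
        "metrics": [{"id": "metrics/pageviews", "filters": ["0"]}],
        "metricFilters": [{
            "id": "0",
            "type": "breakdown",
            "dimension": "variables/page",
            "itemId": "1234567890",   # placeholder itemId from the first call
        }],
    },
    "dimension": "variables/evar1",  # placeholder second dimension
    "settings": {"limit": 10, "page": 0},
}

resp = requests.post(f"https://analytics.adobe.io/api/{COMPANY_ID}/reports",
                     headers=HEADERS, json=body)
resp.raise_for_status()
for row in resp.json()["rows"]:
    print(row["value"], row["data"])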

Previous versions of the API made the multiple breakdown calls on behalf of the caller; however, this has the effect of hiding the performance costs of large, complex breakdowns from the callers. API calls had to be queued and processed because of the load they placed on the reporting system. Callers had no idea how long a breakdown call would take and couldn't display any sort of progress while a request was working its way through the queue.

API 2.0 doesn't do any queuing at the API layer but it requires callers to make the breakdown calls themselves. Callers can know exactly how many calls need to be made and thus calculate and provide an indication or update on progress when working through large multi-level breakdowns. API 2.0 is a reporting API designed to support responsive interactive reporting and exploration but is not well-suited for bulk data export use cases. The 1.4 Data Warehouse API or Data Feeds are more suitable for use cases that require bulk data export.
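
Because the total call count is known up front (1 + 1000 + 1000 × 5 in the example above), progress reporting is simple bookkeeping; a sketch with the actual report calls stubbed out:

# Counts mirror the 1000x5x10 example: 1 initial call, 1000 first-level
# breakdowns, and 5 second-level breakdowns per first-level item.
first_level_items = 1000
items_per_first_level_breakdown = 5

total_calls = 1 + first_level_items + first_level_items * items_per_first_level_breakdown
done = 1  # the initial call that fetched the first-level itemIds

for _ in range(first_level_items):
    # ... issue the first-level breakdown call here ...
    done += 1
    for _ in range(items_per_first_level_breakdown):
        # ... issue the second-level breakdown call here ...
        done += 1
    print(f"progress: {done}/{total_calls} ({100 * done / total_calls:.1f}%)")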

Level 2

Is this still the case? I have been trying to pull larger time frames of data using the Adobe Analytics v2 OAuth Partition Connector - Server to Server, but it does seem to run into issues with throttling. It sounds like, if I want to import more than a couple of months' data on a daily basis, I should switch to v1.4.

Level 1

We also need bulk export of data. I believe the 1.4 Data Warehouse API is going to be deprecated, right? And what exactly do you mean when you say "Data Feeds" are more suitable for use cases that require bulk data export? What are these Data Feeds?

Employee

The 1.4 Data Warehouse API will be deprecated only when equivalent functionality is supported in a newer API version. Adobe has not announced any timeline for Data Warehouse functionality to be available in a newer API version yet so the 1.4 Data Warehouse API continues to be fully supported.

For information on Data Feeds I recommend this help document as a starting place:  Analytics Data Feed

Hi Brian, the mentioned link takes us to a page that talks about FTP configuration. How can I fetch Data Feeds via an API? Why are we talking about FTP when Data Feeds is an API?