Hi all,
I am exploring best practices for leveraging Adobe Analytics data within Power BI to create in-depth web analytics reports. My goal is to report effectively (as close to AA Workspace as possible) on KPIs such as Form Completions (event5), Unique Visitors, Visits, and Average Time on Site, broken down by various dimensions (e.g., Form Name, Country, Marketing Channel, Hostname, Device, etc.) across different time frames (weekly, monthly, yearly). To achieve this, I am considering three data extraction methods from Adobe Analytics:
I am reaching out for your expert opinions on which of these methods is most conducive to seamless and effective integration, emphasizing practicality, data flexibility, and reliability. Specifically, I am looking for guidance on:
I am eager to hear your thoughts and recommendations, particularly any real-world experiences you can share, which will guide me in selecting the most appropriate method for integrating Adobe Analytics data with Power BI for advanced and dynamic web analytics reporting. Thank you for your expertise and support.
Data Warehouse and Report Builder are both going to be limited in what you can do with "Unique Visitors", mostly because the data in those sources is flattened.
Let's say I as a user hit Page A, Page B and Page C
In both of the above, if you export Pages, Page Views and Unique Visitors, you are going to get something that looks like:
| Pages  | Page Views | Unique Visitors |
|--------|------------|-----------------|
| Page A | 1          | 1               |
| Page B | 1          | 1               |
| Page C | 1          | 1               |
You cannot see that these rows are from the same user, so the only thing you can do is add them up; but then you will report 3 Page Views and 3 Unique Visitors, which is incorrect.
In Workspace you would actually see:
| Pages          | Page Views | Unique Visitors |
|----------------|------------|-----------------|
| Pages (total)  | 3          | 1               |
| Page A         | 1          | 1               |
| Page B         | 1          | 1               |
| Page C         | 1          | 1               |
Where the totals are properly de-duplicated.
With Data Feeds, you can of course use SQL to compensate for this, since you will have access to fields like the Visitor ID, on which you can build logic to identify users and their individual visits. But since this is actual raw data, a lot of processing has to be done to clean it up to match Workspace (removal of excluded hits, etc.).
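To illustrate the de-duplication logic that raw data makes possible, here is a minimal Python sketch. The hit data is hypothetical; in a real Data Feed the visitor and page would come from feed columns (e.g., the post visitor ID fields and post_pagename):

```python
from collections import defaultdict

# Hypothetical raw hits: (visitor_id, page). In a real Data Feed these
# would be parsed out of the feed files, one row per hit.
hits = [
    ("visitor-1", "Page A"),
    ("visitor-1", "Page B"),
    ("visitor-1", "Page C"),
]

# Per-page unique visitors: count distinct visitor IDs per page.
visitors_by_page = defaultdict(set)
for visitor_id, page in hits:
    visitors_by_page[page].add(visitor_id)

per_page = {page: len(ids) for page, ids in visitors_by_page.items()}

# De-duplicated totals, the way Workspace reports them: summing the
# per-page column would give 3 visitors, but the true total is 1.
total_page_views = len(hits)
total_unique_visitors = len({visitor_id for visitor_id, _ in hits})
```

The per-page rows each show 1 unique visitor (matching the flattened export), while the de-duplicated total is 1 rather than 3, which is exactly the gap between the flat exports and Workspace.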
I personally haven't used Power BI, so I am not sure of the full capabilities of that tool. However, I believe it can also call the APIs directly, and the APIs will return summary data with de-duplicated totals as well as the individual rows of data. The APIs are what Workspace itself uses to build its reports, so this may be your best bet for matching the data.
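As a rough sketch of what a 2.0 /reports call looks like, assuming the standard endpoint shape from the Analytics 2.0 API docs. All of the IDs here (company ID, report suite ID, client ID, token) are placeholders you would substitute from your own Developer Console project:

```python
import json
import urllib.request

# Placeholders: substitute your own company ID, report suite, and credentials.
COMPANY_ID = "examplecoid"
API_KEY = "<client-id>"
ACCESS_TOKEN = "<access-token>"

def build_report_body(rsid, dimension, metrics, start, end):
    """Build a 2.0 /reports request body: one dimension, N metrics, a date range."""
    return {
        "rsid": rsid,
        "globalFilters": [{
            "type": "dateRange",
            "dateRange": f"{start}T00:00:00.000/{end}T00:00:00.000",
        }],
        "metricContainer": {
            "metrics": [{"columnId": str(i), "id": m} for i, m in enumerate(metrics)],
        },
        "dimension": dimension,
    }

body = build_report_body(
    "my_rsid", "variables/page",
    ["metrics/pageviews", "metrics/visitors"],
    "2024-01-01", "2024-01-31",
)

req = urllib.request.Request(
    f"https://analytics.adobe.io/api/{COMPANY_ID}/reports",
    data=json.dumps(body).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "x-api-key": API_KEY,
        "Content-Type": "application/json",
    },
)
# With real credentials, uncomment to run:
# rows = json.load(urllib.request.urlopen(req))["rows"]
```

The response's summary data is where the de-duplicated totals come back, which is what Data Warehouse and Report Builder exports cannot give you.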
Is it advisable to extract data for each KPI and dimension separately and then combine them in Power BI, allowing end users to slice and dice the data quickly and efficiently? We are facing a data discrepancy between Workspace and our Power BI dashboards when trying to do 4 to 5 breakdowns in a single Data Warehouse/Report Builder request. Do you have any suggestions on how to overcome this?
Coming to API 2.0, I can view the data in JSON format using the Swagger UI by enabling the debugger (https://developer.adobe.com/analytics-apis/docs/2.0/apis/), but I have been unsuccessful in connecting via Postman. Do you know the process to build these API calls, run them at regular intervals, extract the results, save them somewhere, and then load them into Power BI?
Unfortunately, as I said, I really haven't used Power BI specifically.
You did say you were having trouble connecting in Postman? I believe the debugger will only provide the request string and not the authentication; could that be the problem you are encountering?
Swagger handles authentication through your login, but in Postman you have to configure it yourself.
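For reference, with the newer OAuth Server-to-Server credentials the token exchange is a single form-encoded POST to Adobe IMS. Here is a minimal sketch; the client ID, secret, and scope string are placeholders you would take from your Developer Console project, and the scope list shown is an assumption, not necessarily what your project requires:

```python
import urllib.parse
import urllib.request

# Placeholders from your Adobe Developer Console project.
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"
SCOPES = "openid,AdobeID,additional_info.projectedProductContext"

def build_token_request(client_id, client_secret, scopes):
    """Build the form-encoded IMS token request (OAuth client_credentials grant)."""
    form = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scopes,
    }).encode("utf-8")
    return urllib.request.Request(
        "https://ims-na1.adobe.com/ims/token/v3",
        data=form,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )

req = build_token_request(CLIENT_ID, CLIENT_SECRET, SCOPES)
# With real credentials, uncomment to run; the JSON response
# carries the bearer token under "access_token":
# import json
# token = json.load(urllib.request.urlopen(req))["access_token"]
```

Once you have a token this way, you can paste it into Postman's Authorization header (or skip Postman entirely and call the API from the same script).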
I am having trouble generating access tokens via Postman.
I followed the steps in https://developer.adobe.com/analytics-apis/docs/2.0/guides/use-cases/postman/ but received a 302 status with no error message.
Without using Postman, can I retrieve the data using the Swagger UI? Honestly, I would like to know when to use Postman and when to use the Swagger UI.
If you are just doing isolated pulls of data, either Swagger or Postman should work... but if you can't find the proper 1.4 version of the API in Swagger, you might have to use Postman.
I see both Swagger and Postman as tools for making quick, isolated pulls of data; more like testing tools, really. If you are going to use the APIs regularly, I would expect the calls to be configured on a server with some sort of scheduler.
Bear in mind that it's been many years since I was doing a lot of API pulls (using version 1.4 and the old Web Service credentials).
I don't have an OAuth credential handy for 2.0 at the moment.
But when we were actively using them, we had a cron job set up to make the API pulls and store the info either into a TXT file (used with a SOLR search implementation for showing popular content) or into a MySQL or MSSQL database; I believe the latter jobs were scheduled with the SQL server itself.
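The "scheduled pull into a file" pattern above can be sketched roughly as follows. The rows here are hypothetical stand-ins for parsed API results; cron would invoke a script like this on whatever interval you need, and Power BI would read the resulting CSV on refresh:

```python
import csv

def write_report_csv(rows, out_path):
    """Write report rows to a CSV file that Power BI can pick up on refresh."""
    fieldnames = ["page", "page_views", "unique_visitors"]
    with open(out_path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)

# Hypothetical rows, as if parsed from a /reports API response.
rows = [
    {"page": "Page A", "page_views": 1, "unique_visitors": 1},
    {"page": "Page B", "page_views": 1, "unique_visitors": 1},
]
write_report_csv(rows, "aa_report.csv")

# A cron entry to run a script like this daily at 06:00 might look like
# (paths are placeholders):
# 0 6 * * * /usr/bin/python3 /opt/scripts/pull_aa_report.py
```

Writing to a database instead of a CSV follows the same shape; you would just swap the csv writer for inserts against your MySQL/MSSQL connection.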
Have you tried leveraging a developer? They tend to be good resources for helping to set up processes like this, rather than an analyst trying to do everything on their own.
Thanks for the additional details. I will check with my IT team on it.
The current Power BI connector isn't ideal for in-depth reporting in Power BI (or really any other BI tool), and neither is pulling data out of Data Warehouse. Trying to pull everything you may need and then re-modeling it in another tool is rough. Generally speaking, most people use these exports for aggregate reporting in conjunction with outside data sources. I wouldn't expect your numbers to line up perfectly if you are doing any modeling or transformation in Power BI; there are too many variables, and the likelihood that you are missing something is high.
If you are using, or will be using, Customer Journey Analytics: Adobe announced that they will release a database connection for CJA that should help you here. This connector will remove the row limitation and will be a live connection, meaning you can pull back whatever data is available in CJA Workspace. In the words of the Adobe employee who hosted the session, almost anything you can do in Workspace, you should be able to do in Power BI or any other BI tool that connects to a database. The expectation is that this will be available in late May.