
SOLVED

Data Source error refreshing Adobe Analytics Data in Power BI


Level 2

Recently, we transitioned our team's Power BI dashboards from SharePoint to the Adobe Analytics Connector 2.0. This all worked fine until we uploaded the desktop .pbix files to the Power BI web service. After uploading, refreshing the dataset took a ton of time (10+ hours), and at the end it kept throwing the following data source error:

Data source error: [ValidateMarkupTags][ccon]Expression.Error: The field '[ccon]rows[/ccon]' of the record wasn't found.. error_code = [ccon]429050[/ccon]. message = [ccon]Too many requests[/ccon]. [/ccon]. The exception was raised by the IDataReader interface. Please review the error message and provider documentation for further information and corrective action. Table: 2023 - Export 2 - BID specific Post-Click.
Cluster URI: WABI-US-NORTH-CENTRAL-E-PRIMARY-redirect.analysis.windows.net
Activity ID: 20486221-9bdd-4cee-b186-1d9fae60fca9
Request ID: a43dd13d-98c7-a3f5-ad59-9a7783960676
Time: 2023-10-13 11:29:36Z

 

Any ideas on how to fix this error? Also, is the Power BI Adobe Analytics connector worth investing our team's time and resources in? The fundamental reason we switched to the Adobe Analytics Connector was to save time, but if it takes an unreasonable number of hours just to refresh, we're worried about whether we made the right decision. Any help is appreciated.

 

----------------------

 

 

UPDATE: I was able to fix this "data refresh" error using two different methods. Both methods were successful. I left a detailed reply below elaborating on both methods.

1 Accepted Solution


Correct answer by
Level 2

Hey everyone! I have an UPDATE:

 

So, I found two solutions, and both of them fixed the "data source" error I was getting:

 

Solution #1:  

 

I created two dataflows: one with old data (2021 & 2022) and one with new data (2023). I set the old dataflow to refresh once with no schedule, set the new dataflow to refresh on the schedule I need, and then appended the old dataflow to the new one. This gave me a bulletproof pattern that is in line with Microsoft's best practices.
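In Power Query (M) terms, the append step looks roughly like this. This is a minimal sketch: the query names `#"Dataflow 2021-2022"` and `#"Dataflow 2023"` are placeholders for your own dataflow queries, not names from my actual model:

```
// Minimal sketch - 'Historical' and 'Current' stand for the two dataflow
// queries already connected in Power Query (names are placeholders).
let
    // 2021 & 2022 data: its source dataflow refreshes once, no schedule
    Historical = #"Dataflow 2021-2022",
    // 2023 data: its source dataflow refreshes on the normal schedule
    Current = #"Dataflow 2023",
    // Append the two; column names and types must match across both queries
    Combined = Table.Combine({Historical, Current})
in
    Combined
```

On refresh, only the `Current` dataflow pulls from Adobe Analytics; the historical data is served from the already-stored dataflow, so the connector request volume drops sharply.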

 

What actually happens is that the historical data (2021 & 2022) never refreshes; only the live 2023 dataflow refreshes. As a result, I no longer get the "Too many requests" data source error, because the refreshed data size is now considerably smaller (only the 2023 data is allowed to refresh).

 

Solution 2: 

 

  • Load table A (2021 & 2022 data) and table B (2023 data) into the model (table A is excluded from report refresh);
  • Hide tables A & B (optional step);
  • Create 'table C' in DAX:

```
UNION (
    SELECTCOLUMNS ( 'table A', expression ),
    SELECTCOLUMNS ( 'table B', expression )
)
```

You MUST use SELECTCOLUMNS() so that both tables expose the same columns in the same order: UNION matches columns by position, not by name.
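As a concrete illustration of the `expression` arguments (the table and column names here, like `[Date]` and `[Revenue]`, are hypothetical placeholders, not the ones from my model), each output column is spelled out as a name/expression pair:

```
Table C =
UNION (
    SELECTCOLUMNS ( 'table A', "Date", 'table A'[Date], "Revenue", 'table A'[Revenue] ),
    SELECTCOLUMNS ( 'table B', "Date", 'table B'[Date], "Revenue", 'table B'[Revenue] )
)
```

Because the same column list appears in the same order in both SELECTCOLUMNS calls, UNION stacks the rows cleanly.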

 

This solution also worked seamlessly for me, as it fixes the problem of the entire dataset (2021, 2022 & 2023) loading every time I press 'refresh'.

 

Let me know if you have any questions. 


10 Replies


Community Advisor and Adobe Champion

Unfortunately, I don't use Power BI, so I can't help specifically with this issue. I would suggest contacting support, who can look at the actual files and what is happening, and would be able to dig deeper into that error message.

 

Good Luck.


Level 2

Thank you, Jennifer. Will reach out to support.


Level 2

Hey @Jennifer_Dungan ,

 

I have an update. I fixed the "data source" error I was facing using two different methods. Both methods were successful. I have left a reply above elaborating my two solutions. Thank you!



Level 1

After following your solution steps, it was working fine until last week, but now I am getting that error again.


Level 2

What's the exact error message? Try clearing the cache, signing out, saving, and then signing back in and refreshing again. Let me know if the error persists.


Level 1

Yes, I have tried that and am still getting the error below.

Data source error: [ValidateMarkupTags][ccon]Expression.Error: The field '[ccon]rows[/ccon]' of the record wasn't found.. error_code = [ccon]429050[/ccon]. message = [ccon]Too many requests[/ccon]. [/ccon]. The exception was raised by the IDataReader interface. Please review the error message and provider documentation for further information and corrective action.
Cluster URI: WABI-WEST-US-redirect.analysis.windows.net
Activity ID: 6d807254-3439-4bec-9196-9e38cdaaccf9
Request ID: 09df0100-bdcd-4160-a1d9-89998051bec8
Time: 2024-03-17 11:31:48Z


Level 2

What's the size of your data? Can you try breaking it into smaller chunks? For example, split your data by year: 2021 in query A, 2022 in query B, and so on...
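A per-year split can be sketched in Power Query (M) like this. The source name `#"Adobe Analytics Export"` and the `[Date]` column are placeholders; assume the data is already landed in a base query:

```
// Query A - 2021 rows only (placeholder source and column names)
let
    Source = #"Adobe Analytics Export",
    Filtered2021 = Table.SelectRows(Source, each Date.Year([Date]) = 2021)
in
    Filtered2021
```

Duplicate the query for 2022, 2023, etc., changing only the year, and append them afterwards. Each query then makes smaller requests, which helps stay under the connector's rate limit.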


Level 1

We are already breaking our data into two queries: one query fetches the current month, and another fetches Sep '22 through last month. The combined queries total 70k rows.

We have other queries with 190k+ rows that work fine, but for the above queries we get the error when refreshing the data in the Power BI service (Power BI Desktop is fine).