20% Discrepancy in Raw Data vs Adobe Reports

Level 2

Hi, on average I'm seeing a 20-27% difference when re-running year-over-year reports. For example, when I ran a report in May of 2017, Adobe said I had 100,000 visitors; however, when I re-ran the month of May 2017 in May of 2018, it said I had 80,000 visitors in May 2017. This discrepancy exists across all major metrics, for EVERY month.

Additionally, I've noticed that when using Ad Hoc Analysis (which I know reports raw numbers), it shows a discrepancy of about 20% between its numbers and the numbers shown in Reports.

20% seems like a huge difference; can anyone explain this? What goes into Adobe's "Continual Processing," and how could it be off by so much in each different view? Where do the other numbers go? This is extremely problematic for the numbers our company is reporting.

5 Replies

Community Advisor

It seems odd to me that data for a completed month, like PVs, UVs, and visits, would change from month to month over a year. (I assume you're pulling the exact same time frames.) Once a month is complete, I would expect PVs, UVs, and visits to stay the same.

Things I have seen that can change numbers between pulls:

1. Are you using any year-based KPIs (Yearly Unique Visitors, as an example)? If the time frame changes when they are reported on, the results may change.

2. Is the visit metric on the report in question restricted to a segment that may have changed over time (like new pages added to, or removed from, a segment)?

3. Was any custom data uploaded between the dates of the report pulls?

4. Do you have any special processing rules against any metrics that could be causing a change over time?

5. Were any changes made to the report suite itself (like a service issue)?

Any screen grabs would help, as I honestly haven't seen this type of issue before.
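If you want something more durable than screen grabs, one option is to snapshot the same report programmatically on a schedule and diff the snapshots later. Here's a minimal sketch assuming the Analytics 2.0 /reports endpoint; the company ID, report suite ID, token, and API key are placeholders you'd swap for your own, and the metric IDs are the standard ones (adjust to your setup):

```python
import json
import datetime as dt
import requests  # assumes the 'requests' package is installed

# Placeholders -- substitute your own values.
COMPANY_ID = "yourcompanyid"
RSID = "yourreportsuite"
ACCESS_TOKEN = "..."   # bearer token from your Adobe I/O integration
API_KEY = "..."        # client ID from the same integration

def pull_monthly_totals(date_range, metrics):
    """Pull totals for a fixed date range via the 2.0 reports API."""
    url = f"https://analytics.adobe.io/api/{COMPANY_ID}/reports"
    headers = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "x-api-key": API_KEY,
        "x-proxy-global-company-id": COMPANY_ID,
        "Content-Type": "application/json",
    }
    body = {
        "rsid": RSID,
        "globalFilters": [{"type": "dateRange", "dateRange": date_range}],
        "dimension": "variables/daterangemonth",
        "metricContainer": {
            "metrics": [{"columnId": str(i), "id": m}
                        for i, m in enumerate(metrics)]
        },
    }
    resp = requests.post(url, headers=headers, json=body)
    resp.raise_for_status()
    return resp.json()

# Same fixed month, pulled today; saving the raw JSON with a timestamp
# means later pulls of the identical range can be diffed against it.
may_2017 = "2017-05-01T00:00:00.000/2017-06-01T00:00:00.000"
data = pull_monthly_totals(
    may_2017, ["metrics/visitors", "metrics/visits", "metrics/pageviews"]
)
stamp = dt.date.today().isoformat()
with open(f"snapshot_may2017_{stamp}.json", "w") as f:
    json.dump(data, f, indent=2)
```

Running that monthly against the same date range would give you a timestamped paper trail instead of relying on old Excel records.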

Level 2

I would expect them to stay the same as well, which is why it's so troubling to me. I always pull the same time frames (the entire month), and I typically pull them around the 15th of the following month to let data processing or random issues settle down. (Example: I'll pull stats for May on the 15th of June.)

I'm not using any time-constrained KPIs, just base-level ones. The report is not restricted to segments involving pages or anything else that could have changed over time (the only segments applied are mobile/desktop device segments, and the discrepancy is consistent regardless of the segment applied). No custom data has been uploaded besides campaign classifications (which shouldn't be interfering, because there is no campaign-based filter applied). No special processing rules have been implemented, and to the best of my knowledge there have been no service-issue alterations to the s_code or anything of the sort.

I'd pull screen grabs, but I don't think they would change the explanation, because I don't have screen grabs from the ORIGINAL times I ran the reports, only past records in Excel. Every month since I noticed this occurring, I re-run the previous year's report and calculate the % difference. It has never failed to be between 20% and 27%.
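For reference, here's roughly how I calculate the drift each month (a minimal sketch with made-up numbers standing in for my Excel records and the re-run report):

```python
# Figures recorded when the reports were first pulled (illustrative
# numbers, not our real data), keyed by metric.
original = {"visitors": 100_000, "visits": 150_000, "page_views": 400_000}

# The same month, re-pulled a year later.
rerun = {"visitors": 80_000, "visits": 118_000, "page_views": 310_000}

for metric, orig_value in original.items():
    new_value = rerun[metric]
    pct_change = (new_value - orig_value) / orig_value * 100
    print(f"{metric}: {orig_value:,} -> {new_value:,} ({pct_change:+.1f}%)")

# visitors: 100,000 -> 80,000 (-20.0%)
# visits: 150,000 -> 118,000 (-21.3%)
# page_views: 400,000 -> 310,000 (-22.5%)
```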

Level 4

If the discrepancy is that consistent, it suggests something is wrong in the logic somewhere.

I would recommend going back to basics on each variable/metric, and don't forget about exclusions if you have any set up.

Level 2

By saying "it's consistent," I mean that it's the same difference regardless of whether a segment is applied. Without any segments, AND using out-of-the-box basic KPIs, there is still a discrepancy of 20+% in the YoY numbers. It makes absolutely no sense.

Employee Advisor

Two things I can think of:

  • Are you using data sources?
  • What is "Count repeat instances" set to?