
Compare and validate AA data to CJA data


Level 1

I have migrated a client's website from Adobe Analytics to Customer Journey Analytics (CJA) and want to ensure that the data is populating consistently across both platforms. Specifically, I need to compare the data in an Adobe Analytics report suite to the corresponding data view in CJA.

 

Is there a more efficient way to validate the data rather than going through each individual dimension/metric to compare?

 

Appreciate any insight on:

- Best practices for ensuring data consistency between Adobe Analytics and CJA.

- Common issues / discrepancies to watch out for during this comparison.

- Tools or workflows that can streamline the validation process.


4 Replies


Community Advisor

Hi @Frankalytics 

I totally get how time-consuming it can be to manually validate every dimension and metric.

Here are some thoughts and best practices:

1. Start with a high-level comparison:
Use core metrics like visits, unique visitors, and page views over identical date ranges in both AA and CJA. Make sure your filters and segment logic are aligned. This gives you a quick health check before diving into granular validation.
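As a starting point, the high-level check above can be scripted. This is a minimal sketch in pandas, assuming you have already exported core totals from both tools for the same date range; the metric names and numbers here are made-up placeholders, not real exports.

```python
import pandas as pd

# Hypothetical totals for the same date range, pulled manually from
# Workspace in AA and CJA (replace with your own exported values).
aa_totals = {"visits": 120_450, "unique_visitors": 98_310, "page_views": 402_880}
cja_totals = {"visits": 118_900, "unique_visitors": 97_050, "page_views": 401_120}

def compare_totals(aa: dict, cja: dict, tolerance: float = 0.05) -> pd.DataFrame:
    """Flag metrics whose AA-vs-CJA difference exceeds the tolerance."""
    df = pd.DataFrame({"aa": aa, "cja": cja})
    df["pct_diff"] = (df["cja"] - df["aa"]) / df["aa"]
    df["within_tolerance"] = df["pct_diff"].abs() <= tolerance
    return df

report = compare_totals(aa_totals, cja_totals)
print(report)
```

A tolerance band (here 5%) is useful because some variance between AA and CJA is expected; the goal of the first pass is to catch metrics that are wildly off, not to chase exact matches.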

2. Validate key events and dimensions in chunks:
Rather than comparing everything, focus first on business-critical metrics (e.g., conversions, engagement events, top content). Pick 4–5 key dimensions (device type, marketing channel, geolocation, etc.) and verify those for a specific period.

3. Pay attention to attribution models and data stitching:
One of the most common discrepancies comes from how CJA handles identity stitching (e.g., cross-device) and how attribution windows differ from AA. If you're using a Report Suite with Marketing Channels in AA, and a stitched person ID in CJA, expect some variance.

4. Watch for processing differences:
CJA’s event-based model processes data a bit differently. For example, fallout reports or visit-level segments might behave differently if your data schema or component mapping isn't aligned. Also, data latency and deduplication rules can affect totals.

5. Use tools like Data Warehouse + Workspace exports:
In AA, use Data Warehouse for raw extracts, and in CJA, consider Analysis Workspace exports or Query Service (if you're comfortable with SQL). Comparing outputs in Excel or a tool like Power BI can surface mismatches quickly.
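Once you have exports from both sides, an outer join makes mismatches easy to spot. A small sketch, assuming hypothetical page-level exports (the page names and visit counts are placeholders):

```python
import pandas as pd

# Hypothetical exports: AA Data Warehouse vs. a CJA Workspace export.
aa = pd.DataFrame({
    "page": ["/home", "/products", "/checkout"],
    "visits": [5000, 3200, 800],
})
cja = pd.DataFrame({
    "page": ["/home", "/products", "/cart"],
    "visits": [4950, 3185, 600],
})

# An outer merge keeps rows that exist in only one export; the
# indicator column shows which side each row came from.
merged = aa.merge(cja, on="page", how="outer",
                  suffixes=("_aa", "_cja"), indicator=True)
mismatches = merged[merged["_merge"] != "both"]
print(mismatches[["page", "_merge"]])
```

Rows flagged `left_only` or `right_only` often point to naming differences in the component mapping (here, `/checkout` vs. `/cart`) rather than genuine data loss, so check the data view configuration before assuming data is missing.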

6. Document everything as you go:
Track what you validated, any assumptions, segment definitions, and edge cases you’ve tested. It’ll save you huge amounts of time when your team (or client) wants to review the validation logic.

Hope this helps!


Community Advisor and Adobe Champion

In addition to what @Vinay_Chauhan said:


You can also probably use Report Builder to pull data into Excel from AA and CJA for comparison, particularly if you aren't comfortable with SQL.

 

You are probably aware of this, but Marketing Channels process a little differently between AA and CJA (particularly since there is no concept of "First Hit of Visit" in CJA rules)


Comparing everything one-to-one isn't possible, but ensuring that trends and patterns are consistent, and that expected values are captured for the most important items, should be the main focus of your testing.
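The trend check described above can be quantified: if both pipelines see the same traffic, the day-over-day pattern should track closely even when absolute totals differ. A sketch with made-up daily visit counts:

```python
import pandas as pd

# Hypothetical daily visits over the same week (values are invented).
dates = pd.date_range("2024-06-01", periods=7, freq="D")
aa_visits = pd.Series([1000, 1100, 950, 1200, 1300, 900, 1050], index=dates)
cja_visits = pd.Series([985, 1082, 940, 1178, 1275, 890, 1030], index=dates)

# High correlation with a small, stable gap suggests the pipelines
# agree on the pattern, even if totals differ slightly.
correlation = aa_visits.corr(cja_visits)
avg_gap = ((cja_visits - aa_visits) / aa_visits).abs().mean()
print(f"correlation={correlation:.3f}, avg daily gap={avg_gap:.2%}")
```

A high correlation with a consistent small offset is usually fine (and often explained by identity stitching or bot-filter differences); a low correlation or a gap that swings day to day is what warrants deeper digging.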


Level 5

Best Practices for Ensuring Data Consistency:
 
Compare key summary metrics (Visits, Unique Visitors, Page Views, Revenue, etc.) for the same time range and segments. This helps identify broader issues before diving into specifics.
 
Ensure both AA and CJA are using the same time zone and date ranges. This is one of the most common sources of discrepancies.
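To see why the time-zone point matters: an event near midnight can land on different calendar days depending on the reporting time zone. A toy sketch with invented timestamps, assuming AA reports in America/New_York while the CJA data view was left on UTC:

```python
import pandas as pd

# Hypothetical raw event timestamps stored in UTC.
events = pd.to_datetime(
    ["2024-06-01 23:30:00", "2024-06-02 01:15:00"], utc=True
)

# Bucket the same events into days under each reporting time zone.
utc_days = events.normalize()
local_days = events.tz_convert("America/New_York").normalize()

print(utc_days.date)    # the two events fall on different UTC days
print(local_days.date)  # but on the same local day in New York
```

Here the second event (01:15 UTC) is still 21:15 the previous evening in New York, so daily totals for both days disagree between the two tools even though the underlying data is identical.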
 
Try to recreate the same logic from AA segments in CJA filters. Due to differences in how segments are evaluated (especially over time or across datasets), you might need to adapt the segment logic.
 
When setting up the CJA schema, mirror the naming of metrics and dimensions from AA where possible, to ease comparisons.
 
Common Issues / Discrepancies:
 
Attribution model differences.
Segment/filter logic may not match.
Data latency or freshness mismatches.
Bot/internal traffic filters might differ.
Schema changes or deduplication issues.


Community Advisor

Hi @Frankalytics ,

 

Best Practices for Ensuring Data Consistency:

1. Establish Baseline Comparisons

Instead of comparing every dimension and metric individually, start by comparing aggregated totals for high-priority metrics and segments:

  • Visits, Visitors (Unique Visitors), and Page Views

  • Conversion events (e.g., purchases, form submissions, custom events)

  • Key dimensions (e.g., page names, campaigns, marketing channels)

Recommendation:

  • Perform comparisons over identical date ranges (use at least 7 days, ideally a month).

  • Confirm time-zone alignment between AA and CJA (both UTC, or a matching local time zone).

2. Use Summary Reporting (Segment-level)

Rather than individual dimension validation, group dimensions and metrics into logical segments or reports. For instance:

  • Traffic sources: Channels, Campaign IDs

  • Product interactions: Product views, add-to-cart, purchases

  • Device and Browser metrics: Device category, browsers, operating systems

Recommendation:

  • Export summarized reports from AA (via Workspace or Report Builder).

  • Reproduce the same segmented views in CJA Workspace to compare totals quickly.

3. Check Identity Stitching and Data Model Differences

  • Confirm how visitor identification is handled between AA (visitor IDs, ECIDs) and CJA (identity namespaces, ECID integration).

  • Ensure your CJA data view is configured consistently to match AA sessionization logic (e.g., session timeout, session refresh logic).
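To illustrate why the sessionization settings called out above matter, here is a toy sketch (not Adobe's actual implementation) showing how the same hit stream yields different visit counts under different inactivity timeouts; the timestamps and the 30/60-minute timeouts are assumptions for the example:

```python
from datetime import datetime, timedelta

def count_sessions(timestamps, timeout_minutes=30):
    """Count sessions: a gap longer than the timeout starts a new one."""
    if not timestamps:
        return 0
    timestamps = sorted(timestamps)
    sessions = 1
    for prev, curr in zip(timestamps, timestamps[1:]):
        if curr - prev > timedelta(minutes=timeout_minutes):
            sessions += 1
    return sessions

# One visitor's hits, with gaps of 10, 45, and 5 minutes.
hits = [
    datetime(2024, 6, 1, 10, 0),
    datetime(2024, 6, 1, 10, 10),
    datetime(2024, 6, 1, 10, 55),
    datetime(2024, 6, 1, 11, 0),
]
print(count_sessions(hits, timeout_minutes=30))  # 45-min gap splits: 2 sessions
print(count_sessions(hits, timeout_minutes=60))  # no gap exceeds 60 min: 1 session
```

If the CJA data view's session timeout doesn't match the AA report suite's visit definition, visit counts will diverge for exactly this reason, with no data actually missing on either side.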

Thanks.

Pradnya