
How to Build an A4T Table in Adobe Analytics


Employee

3/7/25

Overview  

With the Analytics for Target (A4T) panel, you can analyze Adobe Target activities and experiences in Analysis Workspace. The integration between Analytics and Target provides powerful analysis and time-saving tools for your optimization program. This article provides a step-by-step guide to building A4T panels and analyzing the results. 

How to Build an A4T Table


1. Create a new project: 

  • Click the Create Project button in the top-right corner of the screen. 
  • Select Blank Workspace Project and click Create. 

2. Set up the panel: 

  • Navigate to the Panels section on the left-hand side. 
  • Select Analytics for Target and drag it into the center of the screen. 


3. Configure the drop-down selections: 

  • Target Activity: Use the drop-down to type or select the name of the test activity you want to analyze. 
  • Control Experience: This auto-populates based on the selected Target Activity. Verify that it matches the control for your test and adjust it if necessary, since the control can vary with each test's setup. 


  • Success Metrics: Add metrics such as Cart Additions, Checkouts, and Orders to evaluate the test’s success. 
  • Normalizing Metric: By default, this is set to Visitors, but depending on your organization’s goals, you may want to adjust it. Use Visits when you are measuring short-term behavior, frequency per session, and session-based goals. Use Visitors when you are measuring long-term behavior, cross-visit behavior, unique reach, and visitor-centric goals. 

4. Build the table: 

  • Click Build to generate the A4T table and analyze your test results. 

Reading Results of the A4T Table 

 

[Screenshot: example A4T table results]

Using the Adobe Target Sample Size Calculator 

To determine when your test will reach statistical significance: 

1. Go to the Sample Size Calculator. 

2. Configure the following inputs: 

  • Confidence Level: Set to 95%. 
  • Statistical Power: Set to 80%. 
  • Number of Offers: Count every experience in the test, including the control. For a typical A/B test, enter 2; for a test with Experiences A, B, C, D, and E, enter 5. 

3. Calculate the Unique Daily Visitors: 

  • Drag the relevant page (e.g., Homepage, PDP, Checkout) into a Freeform table in Adobe Analytics. 
  • Add the Unique Visitors metric and set the date range to the last 60 days. You can increase the time window if desired. 
  • Divide the total number of visitors by 60 (or by the length of whatever window you used, as long as it is longer than 30 days, to smooth out external factors such as seasonality) and enter this value into the calculator as Total Number of Daily Visitors. A worked sketch of this arithmetic follows the list below. 

4. Determine the Baseline Conversion Rate: 

  • Calculate the baseline conversion rate by dividing your conversion event by Unique Visitors. For example, Orders divided by Unique Visitors (Orders/Unique Visitors) is a common conversion calculation. You can also use Clicks/Unique Visitors, Cart Additions/Unique Visitors, Checkouts/Unique Visitors, etc. 

5. Review the Results: 

  • Take note of the number of weeks required to complete the test and the sample size per offer needed to achieve statistical significance. Ensure the test has reached the required sample size before making decisions. 
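
The Sample Size Calculator does this arithmetic for you, but the sketch below shows roughly what is happening behind the inputs, using the standard two-proportion sample size approximation. All of the numbers (traffic, baseline rate, minimum detectable lift) are hypothetical placeholders, and Adobe's calculator may apply its own adjustments, so treat this as a sanity check rather than a replica of the tool.

    from math import ceil
    from scipy.stats import norm

    # --- Hypothetical inputs; replace with your own numbers ---
    total_visitors = 1_200_000   # Unique Visitors on the test page over the window
    window_days = 60             # length of the window used to pull the visitor count
    num_offers = 2               # control + variations (2 for a typical A/B test)
    baseline_cr = 0.03           # e.g., Orders / Unique Visitors
    minimum_lift = 0.05          # smallest relative lift you want to detect (5%)
    confidence = 0.95            # confidence level
    power = 0.80                 # statistical power

    # Step 3: average unique daily visitors
    daily_visitors = total_visitors / window_days

    # Conversion rate of the variation at the minimum detectable lift
    variant_cr = baseline_cr * (1 + minimum_lift)

    # Standard two-proportion sample size approximation, per offer:
    # n = (z_{alpha/2} + z_{beta})^2 * (p1(1-p1) + p2(1-p2)) / (p2 - p1)^2
    z_alpha = norm.ppf(1 - (1 - confidence) / 2)
    z_beta = norm.ppf(power)
    variance_sum = baseline_cr * (1 - baseline_cr) + variant_cr * (1 - variant_cr)
    n_per_offer = ceil((z_alpha + z_beta) ** 2 * variance_sum / (variant_cr - baseline_cr) ** 2)

    # Traffic is split evenly across offers, so estimate the run time in weeks
    weeks_to_run = n_per_offer / (daily_visitors / num_offers) / 7

    print(f"Daily visitors:        {daily_visitors:,.0f}")
    print(f"Sample size per offer: {n_per_offer:,}")
    print(f"Estimated run time:    {weeks_to_run:.1f} weeks")

When planning a real test, rely on the calculator's own output; this sketch is only meant to make the relationship between traffic, baseline conversion rate, and test duration concrete.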

Understanding Lift and Confidence 

As you monitor your organization’s activities, you will notice three types of lift: flat, positive, and negative. A flat lift (hovering around 0%) indicates no statistically significant difference between the control and variation groups. A positive lift suggests the alternate experience is outperforming the control; continue monitoring to ensure the lift remains above zero. A negative lift suggests that the alternate experience performs worse than the control; if you see a negative lift and have reached an appropriate sample size, consider turning off the test. 

Confidence intervals and levels are key when analyzing your A4T results. The Lift Lower and Lift Upper bounds define the confidence interval for the lift. A narrow interval indicates greater certainty, while a wide interval suggests insufficient sample size or high variability. The test must reach 95% confidence and meet the required sample size to confirm results. 
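
The A4T panel reports these figures for you, and Adobe's exact statistical methodology is not reproduced here, but the rough normal-approximation sketch below illustrates where a lift estimate, its lower and upper bounds, and a confidence level come from. The visitor and conversion counts are hypothetical, and the interval is approximated by scaling the difference-in-rates interval by the control rate, so expect the numbers to differ somewhat from what the panel shows.

    from scipy.stats import norm

    def lift_summary(control_conversions, control_visitors,
                     variant_conversions, variant_visitors,
                     confidence=0.95):
        """Rough lift and confidence estimates using a normal approximation.

        A simplified illustration only; the A4T panel's own statistics
        may be calculated differently.
        """
        p_c = control_conversions / control_visitors
        p_v = variant_conversions / variant_visitors
        diff = p_v - p_c

        # Relative lift of the variation over the control
        lift = diff / p_c

        # Standard error of the difference in conversion rates
        se_diff = (p_c * (1 - p_c) / control_visitors
                   + p_v * (1 - p_v) / variant_visitors) ** 0.5

        # Approximate lift bounds by scaling the difference interval by the control rate
        z = norm.ppf(1 - (1 - confidence) / 2)
        lift_lower = (diff - z * se_diff) / p_c
        lift_upper = (diff + z * se_diff) / p_c

        # Two-sided confidence level from a pooled two-proportion z-test
        pooled = (control_conversions + variant_conversions) / (control_visitors + variant_visitors)
        se_pooled = (pooled * (1 - pooled) * (1 / control_visitors + 1 / variant_visitors)) ** 0.5
        confidence_level = 1 - 2 * norm.sf(abs(diff) / se_pooled)

        return lift, lift_lower, lift_upper, confidence_level

    # Hypothetical example: 30,000 visitors per experience
    lift, lower, upper, conf_level = lift_summary(900, 30_000, 990, 30_000)
    print(f"Lift: {lift:+.1%}  (bounds: {lower:+.1%} to {upper:+.1%})  confidence: {conf_level:.1%}")

If the interval straddles zero, as in the flat-lift case described above, you cannot yet call a winner; the interval narrows as the sample size grows.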

Custom Analysis for Target Activities in Adobe Analytics 

You can use the A4T integration to build a custom analysis for your Target activities instead of relying on the automatic A4T panel. Additionally, you can use the standard A4T panel to analyze Auto-Target activities. 

To create a custom analysis: 

  1. Navigate to Adobe Analytics Analysis Workspace and add a Freeform table to the panel. 
  2. Locate the ‘Target Activities (Analytics for Target)’ variable and drag it into the table. 
  3. Right-click the activity you want to analyze and select ‘Display only selected rows’ to focus on that activity.  
  4. Find the ‘Target Experiences (Analytics for Target)’ variable and drop it onto the Target Activity variable. The table will automatically populate with the experiences within the selected activity.  
  5. Drag and drop relevant metrics to analyze performance.  

Keep in mind that a custom table will not automatically include conversion rate, upper and lower lifts, and confidence level, as an A4T panel would. 

The A4T integration supports A/B, Auto-Allocate, Experience Targeting (XT), Multivariate Test (MVT), Recommendations, and Auto-Target activities. While Adobe Analytics Analysis Workspace offers powerful analysis capabilities, modifications to the default Analytics for Target panel are required to accurately interpret Auto-Target activities. These adjustments are necessary due to the fundamental differences between experimentation activities (manual A/B Test and Auto-Allocate) and machine learning activities (Auto-Target). 

Conclusion 

Building an A4T table in Adobe Analytics enables organizations to effectively manage and analyze their Adobe Target tests. By following the configuration steps, inputting proper test parameters, and leveraging statistical tools, organizations can ensure an accurate evaluation of their tests’ performance. Consistently monitoring each test’s lift and confidence levels keeps your organization’s optimization decisions data-driven. 

Resources 

https://experienceleague.adobe.com/en/docs/target/using/integrate/a4t/a4t 

https://experienceleague.adobe.com/en/docs/analytics/analyze/analysis-workspace/panels/a4t-panel 

https://experienceleaguecommunities.adobe.com/t5/adobe-analytics-blogs/using-adobe-target-sample-siz... 

https://experienceleague.adobe.com/en/docs/target/using/activities/abtest/sample-size-determination 

https://experienceleague.adobe.com/en/docs/target/using/activities/abtest/create/create-a4t 

https://experienceleague.adobe.com/en/docs/target-learn/tutorials/integrations/set-up-a4t-reports-in...

7 Comments


Level 3

3/12/25

@jenmarti When using an Adobe Analytics A4T panel, the date range is automatically populated by retrieving the relevant details from Adobe Target, e.g. 1st-31st March. However, what happens when the test's date range is extended in Adobe Target from 31st March to, say, 10th April? I can re-open the Adobe Analytics Workspace or click on 'Project' > 'Refresh project', but that only updates the table's data and doesn't update the dates to the new 1st March - 10th April range. So unless I recreate the A4T panel from scratch each time, I won't know whether the 1st-31st March date range is the definitive, accurate one or not. Any ideas please? Thanks   


Level 3

3/27/25

Any thoughts please @jenmarti ? 🙂 It does make me question the date range each time I open an 'Analytics for Target' panel if I can't be sure that the shown date range is still 100% valid - thanks


Adobe Champion

3/27/25

@Biggsy50: I put in a request to lock dates to a panel in Workspace, but no word so far. I believe that the panel refreshes the data, so if you extend the activity, the panel dates should change. I'm guessing that it is not working that way for you?


Level 3

3/27/25

@ShariLynnDeutsch Thanks. Seemingly not, in my experience. I had an A4T panel ending 28th February ... but in Adobe Target the test had been extended to end 31st March, yet the A4T panel didn't acknowledge this or auto-update. I'm going to find another example soon so hopefully I can then 100% confirm this.


Employee

4/14/25

@Biggsy50 You will need to manually update the date range in the A4T panel within Adobe Analytics. To do this, confirm the duration of the test in Adobe Target and adjust the date range in the Analytics dashboard to ensure it reflects the updated timeline of your test. This step is necessary to ensure that the data and analysis correspond accurately to the entire period the test has been live.


Level 3

4/14/25

Thanks @jenmarti. I love how the AA 'Analytics for Target' panel grabs the correct date range from AT in the first place, e.g. March 1st-31st ... but it's a shame that when the AA 'Analytics for Target' panel is then refreshed or re-opened it doesn't do that same date range check with AT in case the test's been extended in AT to, say, March 1st - April 10th.

 

I don't have access to AT, so I find I often have to rebuild the AA 'Analytics for Target' panel each time to be confident of the date range ... which reduces its "quick" reporting effectiveness somewhat.  


Level 3

4/24/25

FYI - I can confirm that unfortunately none of the following updates the date range in an Adobe Analytics 'Analytics for Target' panel if the dates have been changed in Adobe Target (whether that be stopping the test early or extending it):

  1. Opening the project
  2. Clicking on 'Project' > 'Refresh project'
  3. Clicking on the pencil icon on the right-hand side and then clicking on 'Build'.

Therefore I can never be 100% sure that I'm looking at the correct data unless I create a brand new 'Analytics for Target' panel each time ... which is a pain.