I am supporting a client that implemented a test on their own and is having some issues.
We are seeing a serious discrepancy (85%-90%) in visitor/visit counts between Adobe Target and Adobe Analytics.
Visitor counts not matching between Target & Analytics on a redirect A/B test.
- Target is the reporting source (the A4T integration has not been implemented).
- Adobe Analytics is showing much more traffic than Target.
- Visitor counts in Target are only about 15% of the traffic shown in Adobe Analytics (both experiences).
- There is a known issue of a slow site experience; could mbox calls be timing out?
- The team that implemented this only added one URL to the activity location
- I saw a few secure (https) and non-secure (http) versions of the page they want to test in an Entry Page URL report.
My thought was to simply add all the non-secure and secure versions of the URL to the default experience's "add additional pages" to see if all the traffic we see in Adobe Analytics will appear in Target.
The timeout was changed from 5 to 15 seconds to see if we can get some improvement in the match numbers.
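If the client is on an at.js implementation, that timeout change is typically applied through `window.targetGlobalSettings`, which must be defined before the at.js library loads. A minimal sketch, assuming at.js (adjust to however the client's tag manager injects Target):

```javascript
// Sketch assuming an at.js implementation: raise the global mbox timeout.
// This snippet must execute BEFORE at.js itself is loaded on the page.
window.targetGlobalSettings = {
  timeout: 15000, // milliseconds (15 s); the value the team changed to
};
```

On a slow site, a longer timeout gives the mbox call more time to return before Target gives up, at the cost of a longer possible delay before the experience renders.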
Have you checked the implementation of the Visitor ID service, Target, and Analytics on the website and confirmed it meets the minimum criteria? Please refer to the following article for the implementation requirements for A4T with redirect tests: Before You Implement
The Visitor ID service requirement applies to the A4T integration only. However, as standalone solutions, Analytics and Target count visitors differently: Analytics counts any visitor who lands on the page, whereas Target only counts a visitor if they qualify for the test. You can refer to the following article for more information on expected data variance: Expected Data Variances (Deprecated)
I would highly recommend using A4T going forward to stitch visitors across both solutions in a unified manner for better reporting.
What cases, other than these, would exclude a visitor from qualifying for the test?
1. Audience targeting: visitors who do not match the activity's audience (segment) criteria.
2. Users who did not hit the activity location page while the test was live.
3. Users who participated in another live test on the same page?
I am asking to find out whether there are any reasons, beyond those above, why a visitor would not qualify for a test.
With the redirect A/B test, we are seeing only a fraction of the visitors who came in on a specific day, driven mainly by a paid search program (the highest variance occurred during the few days the paid search campaign was live). Both the control and the variation had much smaller visitor numbers than the paid search campaign.
1. Could the person who implemented the test have missed checking off "add query parameters"?
2. Did Target not qualify those visitors because the campaign IDs were not captured?
It is a lot to digest, but thanks for the help so far.