What To Do When the Control Experience Wins

5/17/24

Eight columns or four? Which layout do web visitors prefer on your product detail pages? Marketing, IT, and several executive stakeholders are all eager for the results of the A/B test you launched to answer that question. It’s Friday afternoon, and time for one last check of the test’s performance. To your disappointment, you find that the four-column control experience is beating the alternate treatment the business is so excited about. Now what do you do?

Many companies using Adobe Target run into this scenario as they scale their A/B testing programs. In this blog post, we will walk through a few strategies for how to message and iterate after the control experience wins an A/B test.

Emphasize Positives from Existing Site Experience Winning

Whenever you run a basic A/B test, you are testing a challenger experience against the existing site experience that users see and interact with every single day. If the challenger experience is losing, your hypothesis was incorrect. But it can also be viewed from a positive perspective: your existing site experience was the best one for users.

You can also find ways to apply learnings from the winning control experience to other areas of the site. Using our earlier example of a four-column control versus an eight-column challenger: where else can your company leverage the winning four-column layout?

This is a glass-half-full approach, certainly, but it’s still important to emphasize when speaking to stakeholders about the test results. The test may have invalidated your hypothesis, but it also produced solid, quantitative data that speaks to the strength of your existing site experience.

Look for Insights in Segmented Test Data

Even if the control experience wins out, there is still a lot you can learn from running an A/B test. Make sure you look at the test data against the segments that matter most to your organization; basic examples include desktop vs. mobile visitors, or first-time vs. returning visitors. Reviewing test results by segment can uncover critical insights that inform future testing hypotheses or help you optimize the site experience for specific device types or audiences. For example, if a challenger experience performed well only with mobile visitors, it could be rolled out to just that segment.
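
To make that kind of review concrete, here is a minimal sketch of a per-segment breakdown in Python. The counts, column names, and segments are hypothetical; in practice they would come from your Adobe Target activity report or an Adobe Analytics export.

```python
import pandas as pd

# Hypothetical aggregated results; in practice these counts would come from
# your Adobe Target activity report or an Adobe Analytics export.
results = pd.DataFrame([
    # segment,  experience,   visitors, conversions
    ("desktop", "control",      48_200,       1_540),
    ("desktop", "challenger",   48_550,       1_470),
    ("mobile",  "control",      31_900,         720),
    ("mobile",  "challenger",   32_100,         810),
], columns=["segment", "experience", "visitors", "conversions"])

results["conv_rate"] = results["conversions"] / results["visitors"]

# Put control and challenger side by side for each segment, then compute the
# challenger's relative lift over control within that segment.
by_segment = results.pivot(index="segment", columns="experience", values="conv_rate")
by_segment["lift_pct"] = (by_segment["challenger"] / by_segment["control"] - 1) * 100
print(by_segment.round(4))
```

In a breakdown like this, a challenger that loses overall can still show a clear positive lift for one segment, which is exactly the kind of finding that justifies a segment-specific rollout or a follow-up test.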

When you review the test data this way, make sure you also review the test’s secondary KPIs. The experiment may have shown a statistically significant negative lift on the primary KPI, such as orders, while still showing a positive lift on other, secondary KPIs. It’s also important to validate that you selected the correct metrics as the primary and secondary KPIs for each test; measuring a test against the wrong KPIs can lead to the control experience being called the winner when it shouldn’t have been. Working through these nuggets of insight will also help inform the design of future tests on your roadmap.
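
As a rough illustration of that check, the sketch below runs a pooled two-proportion z-test against one primary and one secondary KPI. All counts and KPI names are made up, and Adobe Target already reports confidence for each metric, so treat this as a way to reason about exported data rather than a replacement for the product’s built-in statistics.

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Pooled two-proportion z-test: returns (z statistic, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts for a primary KPI (orders) and a secondary KPI (add-to-cart):
# (conversions, visitors) per experience.
kpis = {
    "orders":      {"control": (1_640, 48_200), "challenger": (1_450, 48_550)},
    "add_to_cart": {"control": (4_980, 48_200), "challenger": (5_410, 48_550)},
}

for name, arms in kpis.items():
    c_conv, c_n = arms["control"]
    t_conv, t_n = arms["challenger"]
    z, p = two_proportion_z(c_conv, c_n, t_conv, t_n)
    lift = (t_conv / t_n) / (c_conv / c_n) - 1
    print(f"{name}: lift {lift:+.1%}, z = {z:.2f}, p = {p:.4f}")
```

If the primary KPI shows a significant loss while a secondary KPI shows a significant gain, that tension is itself an insight worth bringing back to stakeholders and into the next hypothesis.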

Be Data-Driven in your Testing

One common mistake that leads to control experiences consistently winning is launching tests whose hypotheses aren’t backed by data. A string of control wins is a good opportunity to take stock and ask your team a few questions. Are we reviewing our existing analytics data for fallout points and other testing opportunities? Are we engaging the right teams in our organization who may be generating these insights? And are those insights informing the next tests on our roadmap?

This is a problem many clients face, and it isn’t limited to A/B testing; many personalization programs could also be more data-driven in their approach. For future tests, try to include specific data points from Adobe Analytics that support your hypotheses.
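
One lightweight way to ground a hypothesis in data is to quantify where visitors fall out of a key flow before deciding what to test. The sketch below assumes a hypothetical set of step-level visitor counts, similar to what an Adobe Analytics fallout report provides; the step names and numbers are invented for illustration.

```python
# Minimal sketch: turn step-level visitor counts into drop-off rates so the
# biggest fallout point can anchor the next test hypothesis.
# The funnel steps and counts below are hypothetical.
funnel = [
    ("product detail page", 120_000),
    ("add to cart",          34_000),
    ("checkout start",       21_500),
    ("order confirmation",   12_900),
]

print(f"{'step':<22}{'visitors':>10}{'drop-off':>10}")
for (step, count), (_, next_count) in zip(funnel, funnel[1:]):
    drop = 1 - next_count / count
    print(f"{step:<22}{count:>10,}{drop:>10.1%}")

# The step with the largest drop-off is a natural candidate for the next
# data-backed hypothesis.
worst_step, worst_drop = max(
    ((step, 1 - nxt / cnt) for (step, cnt), (_, nxt) in zip(funnel, funnel[1:])),
    key=lambda pair: pair[1],
)
print(f"\nLargest fallout: after '{worst_step}' ({worst_drop:.1%} of visitors drop off)")
```

A hypothesis framed as “visitors abandon the product detail page because the current layout buries key information, so a simplified layout will lift add-to-cart rate” is much easier to defend, and to iterate on, than one based on opinion alone.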

Iterate on your Hypothesis

This last recommendation may be the most important of all. An A/B testing program should be a cyclical process. Within Adobe Consulting Services, we talk about the Optimization Framework, an end-to-end process that takes you from test hypothesis to test design to measuring the test’s results, and that repeats at the conclusion of every A/B test. It’s critical to review the data from your testing, understand the insights the test surfaced, and then use those insights to shape the next A/B tests you pull from your testing roadmap.

Go back to the hypothesis behind the unsuccessful A/B test and build on it for a future experiment. If executive stakeholders are frustrated after the control experience wins out, bring them into the process of developing your next hypothesis. And don’t be afraid to iterate and try something new! It could be a new page, a new segment of visitors, or a new look for a challenger experience. The important thing is not to stop when control experiences win out, but to keep testing and looking for insights.

***

It’s never fun to go back to executive stakeholders and explain that one of your company’s A/B tests was unsuccessful. It’s especially tough if you dedicated significant development resources to building a complex experience, or if you are on a run of challenger experiences losing to the control. However, this is a part of the A/B testing process that every practitioner goes through. Use the four strategies outlined in this blog post to move past the A/B tests where the control experience wins out, and keep looking for meaningful insights from A/B testing for your organization.

 

Are you looking to optimize your website's performance and drive conversions?

Look no further than Adobe, the industry leader in A/B testing. With more A/B tests under our belt than any other organization in the world, we have the expertise and knowledge to take your online presence to the next level. Adobe Target is unrivaled in the industry, and with Adobe Consulting, you get access to this powerful tool and the expertise behind it. Don’t miss out on the opportunity to unlock the full potential of your website. Contact us today and discover how Adobe can help your business thrive. To get started, reach out to Adobe Consulting Services.