At Adobe, we know that our customers rely on Adobe Journey Optimizer (AJO) to deliver seamless, timely, and personalized customer experiences. That’s why we have made it our mission to catch issues before they reach our customers – and resolve them quickly when they do.
How? With something we call Customer Use Case (CUC) Automation.
While we continue to invest in traditional testing methods like unit tests, functional tests, and integration checks, Customer Use Case (CUC) Automation goes a step further. It tests AJO the way our customers actually use it—across journeys, segments, profiles, message authoring, delivery, and more. By mirroring real user workflows end to end, CUC ensures everything works together seamlessly, just as you’d expect in a live environment.
This means better reliability, faster issue resolution, and a smoother experience for our team and our customers.
Here’s the uncomfortable truth: if automation doesn’t reflect how our customers actually use the product, it’s not doing its job.
That was the gap we saw and filled with Customer Use Case (CUC) Automation. Instead of just testing services, we test entire user workflows: how real users build, execute, and depend on them. This includes not just our app, Adobe Journey Optimizer (AJO), but all the platform services it’s built on—like Adobe Experience Platform, profile, and segmentation services.
We’ve now automated 200+ critical user workflows. These tests run hourly or every few hours across environments and regions. And they’re not flaky. We hold a 98%+ pass rate across the suite, and our smoke tests have a 100% success rate. When something breaks, it’s real and it matters. That’s why our developers trust these tests so much that they let them gate production deployments.
But catching issues is only half the equation.
The other half is responding fast. When a test fails, we don’t just log it—we auto-trigger war rooms. We treat stage like prod. Our goal isn’t to be bug-free forever (that’s a fantasy). Our goal is to fail fast, resolve faster, and make sure customers never feel the pain.
In short: we’re proving that fast development with great quality is not a myth. With the right focus—CUC coverage, test health discipline, and Continuous Integration/Continuous Deployment (CI/CD) integration—we are giving our developers full autonomy, without compromising trust.
In this blog, you’ll learn how we built this system, what it looks like in action, and how it’s changed the way we ship software at scale.
Most teams don’t have a testing problem. They have a testing blind spot.
Unit tests catch broken logic. Feature tests validate new capabilities. Integration tests check API contracts and service responses. These layers are critical—and we use them too.
But here’s the catch: none of them tells you what the customer sees.
Traditional tests focus on systems—not scenarios. They assume that if each service behaves as expected in isolation, the customer experience will be fine. But in reality, what customers interact with is a chain of services, stitched together in workflows that span teams, platforms, and microservices.
We saw this clearly in AJO.
Adobe Journey Optimizer is built on top of public cloud platforms like Azure and AWS, as well as Adobe Experience Platform. This means that even if our own APIs are functioning correctly, the overall customer experience can still be affected by issues in underlying services—like cloud infrastructure, profile data, segmentation logic, or activation flows. Traditional test cases often miss these kinds of problems because they focus on isolated responses, not whether a customer journey actually runs end to end.
That’s the difference.
Customer Use Case (CUC) testing looks at the product the way our users do. It doesn’t ask, “Did this API respond correctly?” It asks, “Can a customer build and execute a Journey/Campaign, with all the dependencies working together, end to end?”
Once we started thinking that way, we started catching issues before customers ever saw them.
A Customer Use Case (CUC) is more than an integration test. It’s a full representation of how a user interacts with our product—from start to finish—across every system involved.
For us, that meant mapping out exactly how customers use AJO: building journeys, creating segments, sending data to platforms, triggering events, and more. And not just the “happy path”—but edge cases too.
We prioritized based on impact:
Once we had the list, we built automation for each. Today, we’ve covered 200+ CUCs using a backend-focused framework powered by Cucumber and Selenium. These tests are API-first and simulate full end-to-end flows, not just isolated calls.
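As a rough illustration of what "API-first, end-to-end" means here, consider the sketch below. The real suite is built on Cucumber; the `AJOClient` class and its endpoint names are hypothetical stand-ins, not Adobe’s actual API. The point is that the test chains the same calls a customer’s workflow would make, and asserts on the final outcome rather than any single response:

```python
# Hypothetical API-first CUC sketch. AJOClient and its methods are
# illustrative stand-ins for the real service calls, not Adobe's API.

class AJOClient:
    """Stub client that records the workflow steps a customer would perform."""

    def __init__(self):
        self.state = {}

    def create_segment(self, name, rule):
        self.state["segment"] = {"name": name, "rule": rule}
        return self.state["segment"]

    def build_journey(self, name, segment):
        self.state["journey"] = {"name": name, "segment": segment["name"], "status": "draft"}
        return self.state["journey"]

    def publish_journey(self, journey):
        journey["status"] = "live"
        return journey

    def trigger_event(self, profile_id):
        # In a real CUC this would fire an event and poll delivery status.
        return {"profile": profile_id, "delivered": self.state["journey"]["status"] == "live"}


def test_customer_can_build_and_execute_journey():
    ajo = AJOClient()
    segment = ajo.create_segment("cart-abandoners", "cart.items > 0")
    journey = ajo.build_journey("abandon-cart-reminder", segment)
    ajo.publish_journey(journey)
    result = ajo.trigger_event("profile-123")
    # The assertion targets the end-to-end outcome, not one API response.
    assert result["delivered"]
```

Each step mirrors a real user action—create a segment, build a journey, publish, trigger—so a failure anywhere in the chain surfaces as a failed customer workflow, not an isolated API error.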
Automation doesn’t matter if no one trusts the results.
That’s why we invested early in test reliability. We made test health a priority, not an afterthought. We stabilized data setups, fixed flaky tests, and monitored failure patterns closely. The result?
This stability turned the test suite from a “nice to have” into a core dependency for engineers. Developers now deploy to staging—or a single production region—and wait for CUCs to run. If they pass, the rollout continues. If not, they halt or roll back.
That trust only happens when test results are clear, accurate, and fast.
Here’s how we structured our test execution:
All of this is deeply integrated into CI/CD pipelines. If a test fails, the pipeline stops. If smoke tests fail in stage, we treat it like a prod failure. Stage is no longer a dumping ground; it’s a production-grade environment.
The real breakthrough wasn’t just the technology—it was the cultural shift that followed.
Because engineers now trust the tests, they’ve started using them to make real decisions:
We even auto-trigger war rooms when smoke tests fail. This ensures rapid response and coordination, long before a customer ever feels the impact.
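Conceptually, the failure-to-war-room path is simple. The sketch below is hypothetical (the `notify` callback stands in for a chat or incident-management webhook; the field names are illustrative), but it captures the idea that a stage smoke failure is escalated with production severity, automatically:

```python
# Hypothetical sketch of the "smoke failure -> war room" response path.
# notify() is a stand-in for a chat/incident webhook, not a real Adobe API.

def on_smoke_failure(test_name, environment, notify):
    # Stage failures are treated with production severity by design.
    incident = {
        "title": f"Smoke test failed: {test_name} ({environment})",
        "severity": "SEV-1",
        "action": "auto-open war room",
    }
    notify(incident)
    return incident
```

Because the escalation is automatic, the clock on resolution starts the moment the test fails—not when someone notices a dashboard.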
We’ve also seen adoption across microservice teams. Many now follow the same “regional rollout + CUC validation” pattern to ship safely and confidently.
Software will never be 100% bug-free, but we’re committed to catching issues before they reach our customers and resolving them quickly if they do.
Customer Use Case Automation lets us:
By shifting our approach from reactive fixes to proactive quality, we’ve built a foundation that allows us to move faster, with greater confidence.
The result? Customers get the latest features and innovations in AJO, delivered faster and with quality they can count on.