Adobe’s Real-Time CDP, powered by Adobe Experience Platform, offers three types of segment evaluation: Batch, Streaming, and Edge. As the names hint, not all of these are “real-time”. This sometimes surprises marketers; after all, isn’t “real-time” in the product name? In reality, however, not every segment needs to be calculated in real time. If you’re looking for customers who haven’t purchased in the last six months, there’s no need for that to be real time.
Where I find people get confused, though, is in understanding the relationship between the Data Ingestion method (which can also be either Batch or Streaming) and the Segment Evaluation type. A question I’ve often heard is: “If I ingest data using a batch data source, is it limited to batch segmentation?”
The short answer is no: batch data can be used in a streaming segment, but the ingestion of batch data does not trigger segmentation. As a result, when batch data will and won’t be leveraged in a streaming segment varies depending on the situation.
To explain this fully, we need to understand how segmentation works and when segment evaluation is triggered. For the sake of simplicity, I’m going to lump edge and streaming segments together in this explanation.
First, let’s look at how segmentation differs in Adobe Experience Platform. Most marketers are familiar with traditional database segmentation: a SQL query is written and then executed against your customer database to get a filtered set of results. It’s a bit like a funnel, where every profile gets pushed through the segment and only some come out the other end.
Adobe’s approach is different: segmentation is evaluated on a per-profile basis. Imagine inverting the funnel; instead of running rules against the full dataset, we push through a single profile, and what comes out the bottom is a list of segments that the profile belongs to (or has exited from).
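To make that inversion concrete, here’s a minimal TypeScript sketch. All of the types and function names here are hypothetical, invented for illustration; this is not the Adobe Experience Platform API.

```typescript
// Hypothetical types for illustration only; not the real AEP data model.
type Profile = { id: string; attributes: Record<string, unknown> };
type Segment = { name: string; rule: (p: Profile) => boolean };

// Traditional model (the funnel): one query, many profiles out the bottom.
function runSegmentQuery(profiles: Profile[], segment: Segment): Profile[] {
  return profiles.filter((p) => segment.rule(p));
}

// Adobe's model (the inverted funnel): one profile in, and out the bottom
// comes the list of segments that profile belongs to.
function evaluateProfile(profile: Profile, segments: Segment[]): string[] {
  return segments.filter((s) => s.rule(profile)).map((s) => s.name);
}
```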
So the next question is: when does this 1:1 segmentation run?
There are two processes that trigger segment evaluation on a profile. The first is the nightly scheduled job: every night, segmentation is evaluated for every profile. This “batch” job doesn’t just evaluate “batch” segments; it evaluates every segment against every profile, including streaming and edge segments.
The second process that triggers segmentation is receiving a streaming update. Every time a customer views a web page, or a streaming event arrives from any other channel, segmentation is triggered for that profile. The catch, of course, is that only the streaming segments are evaluated at this time. As you can imagine, this is a huge amount of work: if we receive 100K events per hour and you have 100 streaming segments, that’s 10 million individual segment evaluations per hour! This is why we can’t expect every segment to be evaluated this way, and why there are guardrails on both the complexity of segment rules and how many streaming segments can be created per instance.
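As a rough mental model of the two triggers (again hypothetical code, not how the platform is actually implemented), the difference is simply which profiles and which segments each one covers:

```typescript
// Extends the hypothetical Segment type above with an evaluation method.
type EvaluatedSegment = Segment & { evaluation: "batch" | "streaming" | "edge" };

// Trigger 1: the nightly job evaluates every segment for every profile.
function nightlyJob(profiles: Profile[], segments: EvaluatedSegment[]): void {
  for (const profile of profiles) {
    const memberships = evaluateProfile(profile, segments);
    // ...persist memberships, recording segment entries and exits.
  }
}

// Trigger 2: a streaming update evaluates only the streaming (and edge)
// segments, and only for the one profile that received the event.
function onStreamingEvent(profile: Profile, segments: EvaluatedSegment[]): void {
  const streaming = segments.filter((s) => s.evaluation !== "batch");
  const memberships = evaluateProfile(profile, streaming);
  // ...update this one profile's memberships immediately.
}
```

In this model, the 10 million figure above is just the second path: 100K calls to something like onStreamingEvent per hour, each one running 100 streaming rules.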
So while it technically doesn’t matter how the data is ingested into the profile, we need a streaming update to trigger segmentation on that data, whether the update is related to that data or not. In most cases this means newly ingested batch data might be ignored until the nightly job. Previously ingested batch data, however, can still support your real-time use cases; and should your customer engage on a real-time channel before the nightly job, even newly ingested batch data can be leveraged immediately.
Confused? Let’s walk through an example.
Marketing Objective: Encourage customers to leave an onsite review for a recently purchased product. The incentive should be personalised based on the customer’s location: customers who live in metro areas will be offered a discount on their next purchase, while those who live in regional locations will receive an offer for free shipping.

Segment Definition: Customers who live in [list of postcodes] who have made a purchase today

Evaluation Type: Streaming

Data Sources:
1. Customer data from CRM (including home address postcodes), batch ingested hourly
2. Online transaction data, streamed through WebSDK
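Expressed as a hypothetical predicate (segments in AEP are actually built in the Segment Builder, so this TypeScript is purely illustrative, with placeholder postcodes standing in for the real list), the rule combines a batch-sourced attribute with a streamed event:

```typescript
// Hypothetical profile shape; the real schema would come from your XDM setup.
interface ReviewCampaignProfile {
  homePostcode?: string; // from CRM, batch ingested hourly
  lastPurchaseAt?: Date; // from the WebSDK purchase event, streamed
}

const TARGET_POSTCODES = new Set(["2000", "3000"]); // placeholder postcodes

function qualifies(p: ReviewCampaignProfile, now: Date = new Date()): boolean {
  const purchasedToday =
    p.lastPurchaseAt !== undefined &&
    p.lastPurchaseAt.toDateString() === now.toDateString();
  const inTargetPostcode =
    p.homePostcode !== undefined && TARGET_POSTCODES.has(p.homePostcode);
  return purchasedToday && inTargetPostcode;
}
```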
When a customer qualifies for this experience differs depending on the journey the customer takes:

- Existing customers, whose batch data was ingested before the streaming purchase event, will be segmented in real time.
- New customers, whose address information won’t be ingested until the next hourly batch, won’t be segmented until the nightly job executes.
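To see why the two journeys diverge, consider the hypothetical qualifies() predicate above at the moment the purchase event streams in and triggers evaluation:

```typescript
// Existing customer: the CRM batch ran earlier, so the postcode is already
// on the profile when the streaming purchase event triggers evaluation.
const existing: ReviewCampaignProfile = {
  homePostcode: "2000", // placeholder postcode
  lastPurchaseAt: new Date(),
};
qualifies(existing); // true: segmented in real time

// New customer: the postcode won't arrive until the next hourly batch, so
// evaluation at purchase time sees an incomplete profile.
const brandNew: ReviewCampaignProfile = { lastPurchaseAt: new Date() };
qualifies(brandNew); // false for now; the nightly job picks them up
```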
Understanding when segmentation is triggered and how your data is ingested is imperative to developing an effective segmentation strategy. In the above example, if you wanted to be sure that all customers receive a personalised experience immediately after purchasing, you would need to consider one of the following:
- Add a fallback segment which includes only the purchase event and doesn’t consider location; new customers would receive this generic post-purchase journey or experience rather than the location-specific one (see the sketch after this list)
- Instead of using the address information provided by your customer data source, use the geo-location data in the purchase event, derived from the customer’s IP address
- Change the customer data ingestion from batch to streaming
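Here is what the fallback option might look like, in the same hypothetical predicate style as before: it drops the location clause entirely, so a brand-new customer still qualifies off the streaming purchase event alone.

```typescript
// Fallback: purchase event only, no location clause. Qualifies even when
// the CRM postcode hasn't been batch ingested yet.
function qualifiesForFallback(
  p: ReviewCampaignProfile,
  now: Date = new Date()
): boolean {
  return (
    p.lastPurchaseAt !== undefined &&
    p.lastPurchaseAt.toDateString() === now.toDateString()
  );
}
```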
Of course, some of these options are easier than others. It would be great if every customer could integrate every data source using streaming connectors, but in reality this isn’t possible. Most companies have legacy systems that, even if there are plans to move towards real time, aren’t there yet. That’s why Adobe focuses on profiles that are actively engaging with your brand: even if batch data is needed, segmentation will run as soon as the customer engages on a real-time channel. Looking again at the example above, if the new customer returns to the site after the batch data has been processed, Adobe will evaluate the segment immediately, moving their qualification time forward to ensure a personalised experience can be delivered as soon as possible.
Let’s take this one step further and imagine the purchase was made in-store instead of online. It’s common for point-of-sale data to be integrated via batch processes, so in this case both of the data sources feeding the segment would be batch and would not trigger segment evaluation. The next (possibly entirely unrelated) streaming event received for that profile will trigger segmentation. So if the customer next engages on the website, for example, we can still personalise that website experience based on the batch data received earlier that same day.
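Played out with the hypothetical helpers from earlier, the in-store timeline looks like this:

```typescript
// Hypothetical walkthrough of the in-store scenario.
const shopper: ReviewCampaignProfile = {};

// Morning: the in-store purchase and CRM record land via hourly batches.
// The profile is updated, but batch ingestion triggers no evaluation.
shopper.lastPurchaseAt = new Date();
shopper.homePostcode = "2000"; // placeholder postcode

// Afternoon: the customer browses the website. This unrelated streaming
// event triggers evaluation, which now sees the morning's batch data.
qualifies(shopper); // true: personalise the web experience
```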
This unique approach to segmentation is one of the ways Adobe Experience Platform stands out from the crowd and delivers on the promise of real-time personalisation at scale, making sure you can deliver the right experience at the right time.
In summary, keep these three points in mind when you’re scratching your head trying to understand why a profile has or hasn’t qualified for your segment:
- Streaming segments evaluate on a per-profile basis when the profile is updated via a streaming method.
- All segments (batch, streaming, and edge) are evaluated against all profiles on a nightly basis.
- It doesn’t matter how the data was ingested, as long as it’s in the profile at the time segmentation occurs.