How to view the "Recs Box" for debugging Target Recommendations with AEP Web SDK? | Community
Level 2
March 12, 2026
Question


I'm struggling to find an efficient way to debug Recommendations. With at.js, I used mboxTrace, which provided a very clear "Recs Box" showing exactly which criteria were used, which entities were returned, and why certain items were excluded.

Now that I'm using Web SDK and the AEP Debugger, I find the debugging process much more difficult. I can see the network calls, but I miss that high-level summary of the Recommendations logic.

What I've tried:

  • Using the AEP Debugger Extension to look at Edge Traces.

  • Digging through the interact response payloads in the Network tab.

My Questions:

  1. Is there a way to get a visual "Recs box" similar to the old mboxTrace functionality in at.js?

  2. In the AEP Debugger, where exactly should I look to see the Criteria details and Exclusion reasons for a specific Recommendations activity?

  3. Are there any third-party tools or specific "Edge Trace" configurations that make the JSON output more readable for Recs-heavy implementations?

  4. How do you effectively validate that the entity.id being passed in the current session is actually being used by the algorithm in real-time?

2 replies

Level 3
April 13, 2026

Hi @khush_b,

To debug Recommendations specifically with the Web SDK, you have to shift from browser-rendered summaries to Edge Traces and Response Payloads. Here is the most efficient way to get that deep-level visibility back.

1. Recreating the "Recs Box" Data

Since the automatic visual box doesn't exist in the Web SDK, you can use Response Tokens to surface the same data in the browser's Network tab.

  • Setup: In the Target UI, go to Administration > Response Tokens and enable attributes like option.criteria.id, option.algorithm.id, and entity.id.

  • Result: These will now be included in the propositions section of the Web SDK interact response. It isn't a "box," but it tells you exactly which algorithm and which items were selected.
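As a quick sanity check, that response-token data can also be pulled out of the interact response programmatically rather than read by eye. The sketch below assumes a typical Web SDK response shape (handle → personalization:decisions → payload → items); the scope name, the token values, and the exact location of the tokens on each item (shown here as data.responseTokens) are illustrative assumptions, so adjust them to what your payload actually contains:

```javascript
// Sketch: flatten Recommendations response tokens out of a Web SDK
// interact response. The whole sample payload below is an assumption
// based on a typical response shape; your scopes and IDs will differ.
const sampleResponse = {
  handle: [
    {
      type: "personalization:decisions",
      payload: [
        {
          scope: "recs-mbox", // hypothetical decision scope
          items: [
            {
              data: {
                type: "html",
                responseTokens: {
                  "option.criteria.id": "top-sellers",
                  "option.algorithm.id": "14",
                  "entity.id": "SKU-123"
                }
              }
            }
          ]
        }
      ]
    }
  ]
};

// Collect every item's response tokens, keyed by scope, so you can see
// at a glance which criteria/algorithm served each location.
function extractResponseTokens(response) {
  const rows = [];
  for (const fragment of response.handle || []) {
    if (fragment.type !== "personalization:decisions") continue;
    for (const proposition of fragment.payload || []) {
      for (const item of proposition.items || []) {
        const tokens = (item.data && item.data.responseTokens) || {};
        rows.push({ scope: proposition.scope, ...tokens });
      }
    }
  }
  return rows;
}

console.table(extractResponseTokens(sampleResponse));
```

Paste a real response body (copied from the Network tab) over `sampleResponse` and you get one row per served item instead of scrolling nested JSON.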

2. Finding Exclusion Reasons in AEP Debugger

To see the "Why" (e.g., why an item was filtered out), you must use an Edge Trace.

  1. Generate a Token: In Target, go to Administration > Implementation > Generate Trace Token.

  2. Connect in Debugger: Open the AEP Debugger, navigate to Logs > Edge, click Connect, and paste your token.

  3. Inspect the JSON: Look for the com.adobe.target.recommendations node in the log.

  4. Exclusion Details: Inside that node, look for inventoryFilterExecution. This section explicitly lists which entities were filtered out and the specific rule (e.g., "Category Match" or "Out of Stock") that caused the exclusion.
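Once a trace is saved to disk, a small script can summarize the exclusions instead of scanning the raw JSON manually. The node names below follow the steps above; the field names inside inventoryFilterExecution (entityId, excluded, rule) are assumptions you should adapt to whatever your actual trace contains:

```javascript
// Sketch: list exclusion reasons from a saved Edge Trace JSON.
// The sample trace is illustrative; field names inside
// inventoryFilterExecution are assumed, not guaranteed.
const sampleTrace = {
  "com.adobe.target.recommendations": {
    inventoryFilterExecution: [
      { entityId: "SKU-123", excluded: false },
      { entityId: "SKU-456", excluded: true, rule: "Out of Stock" },
      { entityId: "SKU-789", excluded: true, rule: "Category Match" }
    ]
  }
};

// Return one human-readable line per excluded entity.
function summarizeExclusions(trace) {
  const recs = trace["com.adobe.target.recommendations"] || {};
  return (recs.inventoryFilterExecution || [])
    .filter((entry) => entry.excluded)
    .map((entry) => `${entry.entityId}: excluded by "${entry.rule}"`);
}

summarizeExclusions(sampleTrace).forEach((line) => console.log(line));
```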

3. Validating entity.id in Real-Time

To confirm Target is actually "seeing" the product you are passing:

  • Check the Request: In the Network tab, inspect the interact payload. Ensure entity.id is correctly mapped under data > __adobe > target.

  • Check the Trace "Key": In the Edge Trace, look for the key or currentEntity attribute. If your algorithm is "People Who Viewed This," the currentEntity in the trace must match the entity.id sent in the request. If they don't match, Target is defaulting to a general algorithm because it didn't recognize the product context.
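To make the first check repeatable, you can paste a captured interact request body into a small checker that confirms entity.id is mapped under data > __adobe > target. The sample body below is illustrative, not a real capture:

```javascript
// Sketch: verify entity.id sits where Target expects it in a captured
// interact request body (events[n].data.__adobe.target). The sample
// body and SKU are assumptions for illustration.
const sampleRequest = {
  events: [
    {
      xdm: { eventType: "decisioning.propositionFetch" },
      data: { __adobe: { target: { "entity.id": "SKU-123" } } }
    }
  ]
};

// Return every entity.id found across the request's events.
function findEntityIds(requestBody) {
  return (requestBody.events || [])
    .map((ev) => ev.data?.__adobe?.target?.["entity.id"])
    .filter((id) => id !== undefined);
}

console.log(findEntityIds(sampleRequest)); // ["SKU-123"]
```

An empty array means the product context never left the browser, which explains a trace whose currentEntity falls back to a general algorithm.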

4. Efficient JSON Reading

Because Web SDK traces are massive, avoid reading them in the small Debugger window:

  • Copy to Editor: Copy the full Edge Trace JSON and paste it into a tool like VS Code.

  • Search for "recs": Search for the recommendations or propositions keys to jump straight to the logic blocks. This is much faster than scrolling through the AEP Debugger logs.

Level 1
April 20, 2026

Thanks PrasanthV, how are you able to enable the option.criteria.id, option.algorithm.id, and entity.id response tokens? I am only able to add profile tokens.

Level 3
April 21, 2026

Hi @catkim_dg,

To get the visibility you're looking for, focus on these two methods:

1. Enabling Built-in Response Tokens

In the Administration > Response Tokens screen, look for a toggle or a "built-in" tab.

  • Many Recommendations-specific tokens (like activity.name, experience.name) are already there.

  • If you don't see option.criteria.id in the list, it's because Target doesn't allow custom Recommendations attributes to be toggled globally the same way as profile attributes.

2. The "Web SDK Workaround" (the most reliable method)

Since you are using the Web SDK, you don't actually need to "enable" these as response tokens in the admin UI to see them in the logs. Instead, use the AEP Debugger Edge Trace.

If you have a Trace Token active (generated in Target > Administration > Implementation):

  1. Open the AEP Debugger and go to Logs > Edge.

  2. Look for the com.adobe.target node.

  3. Inside, navigate to propositions > items > data.

  4. The Web SDK automatically includes the criteriaId, algorithmId, and the entityIds returned in the response payload of the trace, even if they aren't enabled in the "Response Tokens" UI.

3. Using the "Custom Code" Option in the Activity

If you specifically want these attributes to appear in your browser's Network tab (interact call) as part of the JSON response:

  • Open your Recommendations Activity.

  • Go to the Experiences step.

  • Instead of just a "JSON Offer" or "HTML Template," use the "Custom Code" or "JSON" option.

  • You can manually map the attributes into the template using standard Recs syntax (e.g., ${campaign.name}, ${recommendations.algorithm.name}).
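As a rough illustration, a JSON design along these lines would surface those attributes in the interact response. The variable names (${campaign.name}, ${recommendations.algorithm.name}, $entity1.id, and so on) should be verified against the Recommendations design documentation for your account, so treat this as a sketch, not a tested template:

```
{
  "activity": "${campaign.name}",
  "algorithm": "${recommendations.algorithm.name}",
  "entities": [
    { "id": "$entity1.id", "name": "$entity1.name" },
    { "id": "$entity2.id", "name": "$entity2.name" }
  ]
}
```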

April 22, 2026

@khush_b Perhaps you should also try edge tracing with Adobe Assurance.
https://experienceleague.adobe.com/en/docs/platform-learn/implement-web-sdk/tags-configuration/validate-with-assurance
Search for the recommendations section under the context > targetTrace node in the payload.