Level 2
March 12, 2026
Question

How to view the "Recs Box" for debugging Target Recommendations with AEP Web SDK?

  • 1 reply
  • 32 views

I'm struggling to find an efficient way to debug Recommendations. With at.js, I used mboxTrace, which provided a very clear "Recs Box" showing exactly which criteria were used, which entities were returned, and why certain items were excluded.

Now that I'm using Web SDK and the AEP Debugger, I find the debugging process much more difficult. I can see the network calls, but I miss that high-level summary of the Recommendations logic.

What I've tried:

  • Using the AEP Debugger Extension to look at Edge Traces.

  • Digging through the interact response payloads in the Network tab.

My Questions:

  1. Is there a way to get a visual "Recs box" similar to the old mboxTrace functionality in at.js?

  2. In the AEP Debugger, where exactly should I look to see the Criteria details and Exclusion reasons for a specific Recommendations activity?

  3. Are there any third-party tools or specific "Edge Trace" configurations that make the JSON output more readable for Recs-heavy implementations?

  4. How do you effectively validate that the entity.id being passed in the current session is actually being used by the algorithm in real-time?

1 reply

Level 3
April 13, 2026

Hi @khush_b,

 

To debug Recommendations specifically with the Web SDK, you have to shift from browser-rendered summaries to Edge Traces and Response Payloads. Here is the most efficient way to get that deep-level visibility back.

1. Recreating the "Recs Box" Data

Since the automatic visual box doesn't exist in the Web SDK, you can use Response Tokens to surface the same data in the browser's Network tab.

  • Setup: In the Target UI, go to Administration > Response Tokens and enable attributes like option.criteria.id, option.algorithm.id, and entity.id.

  • Result: These will now be included in the propositions section of the Web SDK interact response. It isn't a "box," but it tells you exactly which algorithm and which items were selected.
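To make those tokens easy to read, you can log a small summary table from the interact response instead of eyeballing the raw JSON. This is a sketch only: the payload shape below (`handle` → `personalization:decisions` → `payload` → `items[].meta`) is simplified, so check your own Network tab to confirm where `meta` lands in your response.

```javascript
// Sketch: pull response-token metadata out of an interact response and
// build a "Recs Box"-style summary. Payload shape is simplified; verify
// it against your own interact response in the Network tab.
function summarizeRecsTokens(interactResponse) {
  const summary = [];
  for (const handle of interactResponse.handle || []) {
    if (handle.type !== "personalization:decisions") continue;
    for (const proposition of handle.payload || []) {
      for (const item of proposition.items || []) {
        const meta = item.meta || {};
        summary.push({
          scope: proposition.scope,
          criteria: meta["option.criteria.id"],
          algorithm: meta["option.algorithm.id"],
          entity: meta["entity.id"],
        });
      }
    }
  }
  return summary;
}

// Hypothetical response fragment for illustration:
const sample = {
  handle: [{
    type: "personalization:decisions",
    payload: [{
      scope: "recs-mbox",
      items: [{ meta: {
        "option.criteria.id": "viewed-also-viewed",
        "option.algorithm.id": "42",
        "entity.id": "SKU-123",
      }}],
    }],
  }],
};
console.table(summarizeRecsTokens(sample));
```

Paste something like this into the browser console with the copied response, and you get a one-line-per-item view of which criteria and entities came back.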

2. Finding Exclusion Reasons in AEP Debugger

To see the "Why" (e.g., why an item was filtered out), you must use an Edge Trace.

  1. Generate a Token: In Target, go to Administration > Implementation > Generate Trace Token.

  2. Connect in Debugger: Open the AEP Debugger, navigate to Logs > Edge, click Connect, and paste your token.

  3. Inspect the JSON: Look for the com.adobe.target.recommendations node in the log.

  4. Exclusion Details: Inside that node, look for inventoryFilterExecution. This section explicitly lists which entities were filtered out and the specific rule (e.g., "Category Match" or "Out of Stock") that caused the exclusion.
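Because the exact nesting of the trace varies, a key-based search is more robust than memorizing a fixed path. Here is a small helper (my own utility, not an Adobe tool) that walks the trace JSON and collects every value stored under a given key, such as `inventoryFilterExecution`:

```javascript
// Recursively walk an Edge Trace object and collect every value stored
// under `key`, regardless of nesting depth. Useful because the trace
// structure differs between activity types.
function findByKey(node, key, found = []) {
  if (Array.isArray(node)) {
    for (const item of node) findByKey(item, key, found);
  } else if (node && typeof node === "object") {
    for (const [k, v] of Object.entries(node)) {
      if (k === key) found.push(v);
      findByKey(v, key, found);
    }
  }
  return found;
}

// Hypothetical trace fragment for illustration:
const trace = {
  "com.adobe.target.recommendations": {
    criteria: {
      inventoryFilterExecution: [
        { entityId: "SKU-9", excludedBy: "Out of Stock" },
      ],
    },
  },
};
console.log(findByKey(trace, "inventoryFilterExecution"));
```

Run it in the console against the copied trace and you get every exclusion block in one array, without hunting through the nesting by hand.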

3. Validating entity.id in Real-Time

To confirm Target is actually "seeing" the product you are passing:

  • Check the Request: In the Network tab, inspect the interact payload. Ensure entity.id is correctly mapped under data > __adobe > target.

  • Check the Trace "Key": In the Edge Trace, look for the key or currentEntity attribute. If your algorithm is "People Who Viewed This," the currentEntity in the trace must match the entity.id sent in the request. If they don't match, Target is defaulting to a general algorithm because it didn't recognize the product context.
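On the request side, a small builder function keeps the `data > __adobe > target` nesting correct and makes it easy to assert in unit tests that `entity.id` is actually in the payload. The `entity.id` key is a standard Target entity parameter; everything else here (function name, extra params) is my own illustration:

```javascript
// Sketch: build the sendEvent options with Target entity parameters
// nested under data.__adobe.target, where the Web SDK forwards them.
function buildTargetEventOptions(entityId, extraParams = {}) {
  return {
    renderDecisions: true,
    data: {
      __adobe: {
        target: {
          "entity.id": entityId, // must match currentEntity in the trace
          ...extraParams,        // e.g. "entity.categoryId"
        },
      },
    },
  };
}

// On the page you would then send it with the Web SDK, e.g.:
//   alloy("sendEvent", buildTargetEventOptions("SKU-123"));
console.log(buildTargetEventOptions("SKU-123").data.__adobe.target["entity.id"]);
```

If the trace's `currentEntity` still doesn't match after this, the mismatch is happening server-side (e.g. a datastream mapping issue) rather than in your page code.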

4. Efficient JSON Reading

Because Web SDK traces are massive, avoid reading them in the small Debugger window:

  • Copy to Editor: Copy the full Edge Trace JSON and paste it into a tool like VS Code.

  • Search for "recs": Jump straight to the logic blocks by searching for the recommendations or propositions keys. This is much faster than scrolling through the AEP Debugger logs.