Identity namespace should be same for the identities? | Community
Solved


  • June 2, 2025
  • 1 reply
  • 1239 views

Hi, I am creating two schemas, an individual profile schema and an experience event schema, and I am using loyalty_Id as the identity in both. Should I create a single namespace for both IDs, or create two namespaces and assign them separately?

Best answer by Tof_Jossic

In reply to @adlsst200's follow-up ("Okay, thank you so much for the resolution. May I know why event ingestion fails to promote the data to Profile? Is it some kind of system or background issue?"):

@adlsst200 I suspect you have something in your payload that is not in the schema, but as I said, a Support ticket will allow for a deeper dive at this stage.


Tof_Jossic
Adobe Employee
June 2, 2025

@adlsst200 The same namespace should be available for use in both the Profile and Event schemas.
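To illustrate the point above, here is a minimal sketch (the namespace symbol "LoyaltyId" and all field values are assumed, not from the thread): a profile record and an experience-event record each carry the same namespace code in their identityMap, which is what lets Identity Service resolve both records to the same person.

```python
# Hypothetical records: the same custom namespace code ("LoyaltyId")
# appears in the identityMap of both the profile record and the
# experience-event record, so they can be linked to one identity.

profile_record = {
    "identityMap": {
        "LoyaltyId": [  # assumed custom namespace symbol
            {"id": "L-12345", "primary": True}
        ]
    },
    "loyaltyStatus": "gold",
}

experience_event = {
    "identityMap": {
        "LoyaltyId": [
            {"id": "L-12345", "primary": True}
        ]
    },
    "eventType": "commerce.purchases",
}

def shared_namespaces(a: dict, b: dict) -> set:
    """Return the identity namespaces two records have in common."""
    return set(a["identityMap"]) & set(b["identityMap"])

print(shared_namespaces(profile_record, experience_event))  # {'LoyaltyId'}
```

If the two schemas used different namespaces for the same loyalty ID value, the records would not share a namespace and would not resolve to the same identity.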

ADLSSt200 (Author)
Level 2
June 2, 2025

Thank you! I did the same, and when I ingested data and validated it in Profile by searching for an identity and its namespace, it shows only the attributes, not the events from the event schema.

And in the identity graph only one dataset is showing. Why is the other dataset not there? I think the identities aren't getting stitched. How can I solve this problem?

[Screenshots attached]

Tof_Jossic
Adobe Employee
June 2, 2025

@adlsst200 It might be tricky to fully troubleshoot here, but I'm trying to trace your data from those screenshots. As far as I can tell, the Identity Graph contains two identities stitched together, which I believe are 'loyaltyid' and 'crmid'.
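A simplified sketch of the stitching behaviour described above (all namespace and ID values are assumed for illustration): when a single record carries both identities in its identityMap, an edge between them is recorded, which is how 'loyaltyid' and 'crmid' end up linked in the graph.

```python
# Sketch: build the set of identity-graph edges implied by records
# whose identityMap contains more than one identity.

def build_graph(records):
    """Collect an edge for every pair of identities seen in the same record."""
    edges = set()
    for rec in records:
        ids = [(ns, ident["id"])
               for ns, idents in rec["identityMap"].items()
               for ident in idents]
        for i in range(len(ids)):
            for j in range(i + 1, len(ids)):
                edges.add(frozenset({ids[i], ids[j]}))
    return edges

# Assumed example: one record carrying both a loyalty ID and a CRM ID.
records = [
    {"identityMap": {
        "loyaltyid": [{"id": "L-12345"}],
        "crmid": [{"id": "C-98765"}],
    }},
]
edges = build_graph(records)
print(len(edges))  # 1 edge linking loyaltyid L-12345 and crmid C-98765
```

Records that only ever carry one of the two identities contribute no edges, which is consistent with the symptom of identities not getting stitched.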

In terms of the events, it looks like you are uploading via file upload into the dataset 'Luma Offline Purchase Events Dataset - Vishal'. As far as I can see, the data lands in the Data Lake but is not being promoted to the Unified Profile Service (UPS).

It looks like the dataset was enabled for Profile after the data started ingesting; that would be the issue. Data ingested before a dataset is enabled for Profile is not promoted retroactively, so those batches would need to be re-ingested.
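The timing issue described above can be sketched as follows (timestamps and batch IDs are invented for illustration): only batches ingested after the dataset was enabled for Profile are promoted to the Profile store; earlier batches remain in the Data Lake only.

```python
# Sketch of the behaviour: batches ingested before the dataset was
# Profile-enabled stay in the Data Lake and are not promoted to Profile.

from datetime import datetime

profile_enabled_at = datetime(2025, 6, 2, 12, 0)  # assumed enablement time

batches = [
    {"id": "batch-1", "ingested_at": datetime(2025, 6, 2, 11, 30)},  # before
    {"id": "batch-2", "ingested_at": datetime(2025, 6, 2, 13, 0)},   # after
]

promoted = [b["id"] for b in batches if b["ingested_at"] >= profile_enabled_at]
print(promoted)  # ['batch-2'] — batch-1 would need to be re-ingested
```

This matches the symptom in the thread: the data is visible in the Data Lake, but the events never appear against the profile.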