SOLVED

Profile Ingestion using DCS Streaming HTTP API


Level 2

Hello Experts,

 

We are setting up a pipeline for profile data ingestion into AEP using the Streaming HTTP API. The pipeline will also send update events for existing attributes, so we have enabled upsert. Below is a sample call:

 

curl --location 'https://dcs.adobedc.net/collection/batch/{INLET-ID}?synchronousValidation=true' \
--header 'Authorization: bearer {TOKEN}' \
--header 'x-gw-ims-org-id: {ORG-ID}' \
--header 'x-api-key: {API-KEY}' \
--header 'x-sandbox-name: {SANDBOX}' \
--header 'Content-Type: application/json' \
--data-raw '{
    "messages": [
        {
            "header": {
                "imsOrgId": "{ORG-ID}",
                "datasetId": "{DATASET-ID}",
                "flowId": "{FLOW-ID}",
                "operations": {
                    "data": "merge",
                    "identity": "create",
                    "identityDatasetId": "{IDENTITY-DATASET-ID}"
                }
            },
            "body": {
                "_id": "test05152025-0101",
                "createdByBatchID": "test",
                "modifiedByBatchID": "test",
                "personID": "test05152025-01",
                "repositoryCreatedBy":"testUser",
                "repositoryLastModifiedBy":"testUser",
                "_tenant1": {                    
                    "fieldGroup1": {
                        "attribute1": false
                    },
                    "identifiers": {
                        "customId": "test05152025-01"
                    },
                    "fieldGroup2": {
                        "attribute2": "TEST"
                    }
                }
            }
        }
    ]
}'

- In the streaming source, I selected the "XDM Compatible" option, since we don't want to maintain a mapping in the dataflow. I am getting the below error during data ingestion:

"responses": [
    {
        "xactionId": "482254a8-410",
        "status": 400,
        "message": "The value supplied for the 'body' field does not match your input schema. Update the 'body' value and try again."
    }
]
 
When I use the same payload with another source where data prep mappings are present, the data is ingested without error. Please suggest how I can resolve this.

1 Accepted Solution


Correct answer by
Level 5

Hello @learner_monk ,

When you use a streaming source with mappings, AEP can transform your incoming data to match the XDM schema, which allows flexibility in field names, structure, and types. With "XDM Compatible" enabled, that mapping step is skipped, so the payload must match the dataset's XDM schema exactly: no automatic mapping or adjustments happen in the background. Every field, its structure, and its data type in your JSON must line up precisely with what is defined in the schema. Omit any fields not present in the schema, and ensure all required fields are present. I believe the JSON you're sending doesn't match the XDM schema of your target dataset, so review your schema and payload for discrepancies in field names, nesting, and types.
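To illustrate the kind of mismatch "XDM Compatible" mode rejects, here is a minimal sketch in Python (purely illustrative; `unknown_fields`, `schema_props`, and the simplified schema tree are my own stand-ins, not an AEP API) that flags payload fields absent from a schema's property tree before you send the request:

```python
# Minimal sketch: recursively flag payload fields that are not declared
# in the target schema's property tree. The schema dict below is a
# hypothetical, simplified stand-in for a real XDM schema definition.

def unknown_fields(payload, schema_props, path=""):
    """Return dotted paths of payload keys not present in schema_props."""
    missing = []
    for key, value in payload.items():
        here = f"{path}.{key}" if path else key
        if key not in schema_props:
            missing.append(here)
        elif isinstance(value, dict):
            missing.extend(unknown_fields(value, schema_props.get(key) or {}, here))
    return missing

# Hypothetical schema tree and a payload with one extra field group.
schema_props = {
    "_id": {},
    "personID": {},
    "_tenant1": {"fieldGroup1": {"attribute1": {}}},
}
payload = {
    "_id": "test05152025-0101",
    "personID": "test05152025-01",
    "_tenant1": {"fieldGroup1": {"attribute1": False}, "extraGroup": {"x": 1}},
}
print(unknown_fields(payload, schema_props))  # ['_tenant1.extraGroup']
```

Any path this reports is a candidate for the 400 error above, since XDM Compatible mode will not silently drop or remap it.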

Thanks

AJODev


3 Replies



Level 6

Hi @learner_monk ,

 

@AJODev has already pointed out the problem. However, to add one further point on top of that comment:

Since you have already set up this HTTP API, open the dataflow and use its "Copy payload" option. It gives you the full curl command, so you do not have to build it yourself. The command comes with dummy data already structured according to your schema. Once you paste the curl into Postman, you can add or delete attributes and you should be ready to go.

 

[Screenshot: the "Copy payload" option in the dataflow UI]

Hope this helps nail the issue!

 


Level 2

Thanks @supratim320 and @AJODev 

I used the "Copy payload" option from the dataflow and was able to make a successful API call.

 

My original payload was missing "xdmEntity" inside the body.
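For anyone hitting the same 400 error: wrapping the attributes in an "xdmEntity" object inside "body" gives a payload along these lines (a sketch based on my earlier call; the placeholder IDs are the same, and your schema's field groups will differ):

```json
{
    "messages": [
        {
            "header": {
                "imsOrgId": "{ORG-ID}",
                "datasetId": "{DATASET-ID}",
                "flowId": "{FLOW-ID}",
                "operations": {
                    "data": "merge",
                    "identity": "create",
                    "identityDatasetId": "{IDENTITY-DATASET-ID}"
                }
            },
            "body": {
                "xdmEntity": {
                    "_id": "test05152025-0101",
                    "personID": "test05152025-01",
                    "_tenant1": {
                        "fieldGroup1": { "attribute1": false },
                        "identifiers": { "customId": "test05152025-01" },
                        "fieldGroup2": { "attribute2": "TEST" }
                    }
                }
            }
        }
    ]
}
```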