Hi team,
I am trying to ingest multiple records into AEP using the HTTP API endpoint, but the request fails. Could you please help me out here?
I am referring to the document below:
https://experienceleague.adobe.com/docs/experience-platform/ingestion/tutorials/streaming-multiple-m...
Currently I am able to ingest only one record at a time.
Regards,
Sandip
@sandip_surse Check this documentation; it should answer your case.
Let me know if you need any further guidance.
Hi nnakirikanti,
Thanks for the response.
I am already following the same document, but the records still fail to ingest.
Regards,
Sandip
@sandip_surse Please post your code/results/errors so we can check further and recommend a fix.
Hi nnakirikanti,
Thanks for the response. My issue is now resolved.
I was making one mistake in the POST call: I had missed adding the 'batch' segment to the path.
POST /collection/batch/{CONNECTION_ID}
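For anyone else hitting this, below is roughly what my working call looks like, as a minimal Python (requests) sketch based on the streaming-multiple-messages guide linked above. The dcs.adobedc.net host is the one from that guide, and the connection ID, tenant/schema/dataset/org IDs, token, and XDM fields are all placeholders, not real values.

import requests

# All values below are placeholders - substitute your own.
CONNECTION_ID = "{CONNECTION_ID}"   # streaming connection (inlet) ID
ACCESS_TOKEN = "{ACCESS_TOKEN}"     # IMS bearer token (only needed for an authenticated inlet)

# Note the "/batch/" segment - leaving it out was my original mistake.
url = f"https://dcs.adobedc.net/collection/batch/{CONNECTION_ID}"

# All records go into a single "messages" array; each entry carries its own
# header (schema ref, IMS org, dataset) and XDM body.
message = {
    "header": {
        "schemaRef": {
            "id": "https://ns.adobe.com/{TENANT_ID}/schemas/{SCHEMA_ID}",
            "contentType": "application/vnd.adobe.xed-full+json;version=1",
        },
        "imsOrgId": "{IMS_ORG_ID}",
        "datasetId": "{DATASET_ID}",
    },
    "body": {
        "xdmMeta": {
            "schemaRef": {
                "id": "https://ns.adobe.com/{TENANT_ID}/schemas/{SCHEMA_ID}",
                "contentType": "application/vnd.adobe.xed-full+json;version=1",
            }
        },
        "xdmEntity": {
            "_id": "record-001",
            "person": {"name": {"firstName": "Jane", "lastName": "Doe"}},
        },
    },
}

payload = {"messages": [message]}  # append more message objects here, one per record

response = requests.post(
    url,
    json=payload,
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {ACCESS_TOKEN}",
    },
)
print(response.status_code)  # 207 Multi-Status when the messages are accepted individually
print(response.json())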
Thanks
Hi nnakirikanti,
Greetings!
I am performing streaming ingestion for multiple records using Postman, and the response comes back as 207 Multi-Status with a created batch ID and so on.
At the dataset level the data is also ingested successfully.
Question 1: One batch ID is generated in the Postman response and another batch ID is created at the dataset level. Should these two batch IDs be the same or different? In my case they are different.
Question 2: When the source team feeds data into AEP, they create two separate batches, one batch ID for 600 records and another batch ID for 300 records. But in AEP we observe all 900 records ingested in one single batch. Is that correct, or is AEP also supposed to create two separate batch IDs, one for the 600 records and another for the 300 records?
Is there any limitation on the AEP side for sending/pushing data from the source across two separate batches?
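For reference, this is roughly how the two pushes look from the source side, as a simplified Python sketch with placeholder IDs and trimmed record bodies; in the real feed the two groups are 600 and 300 records.

import requests

URL = "https://dcs.adobedc.net/collection/batch/{CONNECTION_ID}"  # placeholder inlet ID

def make_message(record_id):
    # Wrap one record in the header/body envelope; IDs are placeholders.
    return {
        "header": {
            "schemaRef": {"id": "{SCHEMA_REF_ID}",
                          "contentType": "application/vnd.adobe.xed-full+json;version=1"},
            "imsOrgId": "{IMS_ORG_ID}",
            "datasetId": "{DATASET_ID}",
        },
        "body": {"xdmEntity": {"_id": record_id}},
    }

def push(messages):
    # One call per group of records; in my tests the endpoint answers 207 Multi-Status.
    resp = requests.post(URL, json={"messages": messages},
                         headers={"Content-Type": "application/json"})
    print(resp.status_code)
    return resp.json()

# Two separate pushes, mirroring the 600-record and 300-record groups.
first = push([make_message(f"rec-{i}") for i in range(600)])
second = push([make_message(f"rec-{i}") for i in range(600, 900)])

# Each 207 response includes its own batch reference, yet on the dataset all
# 900 records show up under a single batch ID - that is the behaviour I am asking about.
print(first)
print(second)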
Please help me understand this.
Regards,
Sandip
Hi @sandip_surse - As @nnakirikanti mentioned, could you please upload the request here and make sure the schemaId, datasetId, and orgId are all correct?
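In the meantime, a quick sanity check along these lines can flag empty or missing IDs in each message header before you post the request; the field names follow the header layout from the streaming payload, so adjust them if yours differ.

def check_header(message):
    # Report which required IDs are empty or missing in one message's header.
    header = message.get("header", {})
    missing = []
    if not header.get("schemaRef", {}).get("id"):
        missing.append("schemaRef.id")
    if not header.get("datasetId"):
        missing.append("datasetId")
    if not header.get("imsOrgId"):
        missing.append("imsOrgId")
    return missing

# Example: an empty header reports all three IDs as missing.
print(check_header({"header": {}}))   # ['schemaRef.id', 'datasetId', 'imsOrgId']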