Dear Friends,
We are trying to upload sample data to test data ingestion through the file-upload method for the schema we have created. Whenever we upload a CSV containing a single record, we get the following error. Screenshot attached below for your reference.
NOTE: The strange thing is that we can see the data being ingested into the dataset. However, during the ingestion process, it still throws this 422 error (Unprocessable Entity).
Please help us understand the cause of this error and how to resolve it.
Thanks in advance.
Have you done a quick test to make sure all the data is reaching the dataset? Does the number of records in the dataset match the number in the CSV?
As of now I have tested with only a single record. All the field values are being updated as per the CSV sheet.
Adobe doesn't publicly document the details of that error response, so we may need to see what they say. On the API side, ref, the xdmEntity is where the actual field values are described, so "the entity can't be processed" is pretty vague, but it probably has to do with the mapping. It seems most likely to be a bug in the connector itself, which may be using the API incorrectly or not performing validation on your behalf. On your side, there could still be a field-validation issue: double-check that you are both mapping, and have CSV values for, all required fields. Some data types can also be picky, e.g. Adobe's currency-code data type enforces the pattern ^[A-Z]{3}$, so it would reject any mapped value that doesn't match that pattern.
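To illustrate the kind of pre-upload check described above, here is a minimal sketch that validates CSV rows before uploading. The field names (`email`, `currencyCode`) and the required-field list are hypothetical examples, not your actual schema; only the `^[A-Z]{3}$` currency-code pattern comes from the discussion above.

```python
import csv
import io
import re

# Assumption: example field names standing in for your schema's required fields.
REQUIRED_FIELDS = ["email", "currencyCode"]
# Adobe's currency-code data type pattern, per the note above.
CURRENCY_PATTERN = re.compile(r"^[A-Z]{3}$")

def validate_rows(rows, required=REQUIRED_FIELDS):
    """Return a list of (row_number, problem) tuples; an empty list means OK."""
    problems = []
    for i, row in enumerate(rows, start=1):
        # Every required field must be mapped and non-empty.
        for field in required:
            if not (row.get(field) or "").strip():
                problems.append((i, f"missing required field '{field}'"))
        # Currency codes must be exactly three uppercase letters.
        code = (row.get("currencyCode") or "").strip()
        if code and not CURRENCY_PATTERN.match(code):
            problems.append((i, f"invalid currency code '{code}'"))
    return problems

# Usage: run the check on the CSV before uploading it to the dataset.
sample_csv = "email,currencyCode\na@example.com,USD\n,usd\n"
issues = validate_rows(list(csv.DictReader(io.StringIO(sample_csv))))
```

Running this against a file with a blank `email` and a lowercase currency code would surface both problems before the upload ever hits the 422.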
Thanks a ton for your kind support. That turned out to be a Platform-side error, and it has since been fixed by them. It is working fine now.