I used Postman to create a dataflow (https://platform.adobe.io/data/foundation/flowservice/flows) successfully. However, when I run the Python code exported from Postman in Databricks with the same credentials, I get a status code of 500. What could the issue be on the server end? I use Databricks to make various API calls to this server and have not experienced any issues until now. I'm not sure why the same code works in Postman but not in Databricks.
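The call from Databricks is roughly along these lines (credentials redacted and the payload heavily simplified here; the real header values and flow body come from my Postman environment):

```python
import requests

# Placeholder credentials -- the real values come from Databricks secrets / the Postman environment.
ACCESS_TOKEN = "<IMS access token>"
API_KEY = "<client id>"
ORG_ID = "<IMS org id>"
SANDBOX = "<sandbox name>"

headers = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "x-api-key": API_KEY,
    "x-gw-ims-org-id": ORG_ID,
    "x-sandbox-name": SANDBOX,
    "Content-Type": "application/json",
}

# Simplified placeholder payload; the actual request carries the flow spec ID
# and the source/target connection IDs for the dataflow being created.
payload = {
    "name": "Example dataflow",
    "flowSpec": {"id": "<flow spec id>", "version": "1.0"},
    "sourceConnectionIds": ["<source connection id>"],
    "targetConnectionIds": ["<target connection id>"],
}

response = requests.post(
    "https://platform.adobe.io/data/foundation/flowservice/flows",
    headers=headers,
    json=payload,
)
print(response.status_code, response.text)
```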
Here is the error message.
Request failed with status code: 500 {"type":"https://ns.adobe.com/aep/errors/FLOW-1400-500","title":"Internal Error","status":500,"report":{"detailed-message":"An internal error has occurred. Please try again. If the problem persists, please contact customer support.","request-id":"PKcrlPGnGiEzQM8EXg3Jkcgy1TbrtciS"},"errorMessage":"An internal error has occurred. Please try again. If the problem persists, please contact customer support.","errorDetails":"An internal error has occurred. Please try again. If the problem persists, please contact customer support."}
A 500 Internal Server Error usually indicates a server-side issue, but since the same call works in Postman and fails from Databricks, the two requests are probably not identical. Compare what each client actually sends: the headers (Postman adds several automatically, such as Connection and Accept), the request body and Content-Type, and the authentication headers.
If the issue persists, contact Adobe support with the request-id from the error message.
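One quick way to compare is to print the prepared request from Databricks and diff it against the Postman console output for the working call. A sketch using the `requests` library, with placeholder values:

```python
import json
import requests

# Same endpoint, headers, and payload as the failing call; values are placeholders.
url = "https://platform.adobe.io/data/foundation/flowservice/flows"
headers = {
    "Authorization": "Bearer <IMS access token>",
    "x-api-key": "<client id>",
    "x-gw-ims-org-id": "<IMS org id>",
    "x-sandbox-name": "<sandbox name>",
    "Content-Type": "application/json",
}
payload = {"name": "Example dataflow"}  # replace with the real flow payload

session = requests.Session()
prepared = session.prepare_request(
    requests.Request("POST", url, headers=headers, json=payload)
)

# Compare these against the Postman console for the successful request.
print(prepared.url)
print(json.dumps(dict(prepared.headers), indent=2))  # includes defaults requests adds
print(prepared.body)                                 # exact body bytes to be sent

response = session.send(prepared)
print(response.status_code, response.text)
```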
The call is working now. I added Connection: keep-alive to the headers, and I also made separate calls to create the source connection and the target connection, then passed their IDs in the create-flow call.
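For anyone hitting the same thing, the working sequence looks roughly like this (payloads heavily simplified; the real source and target connection bodies depend on your connection specs):

```python
import requests

BASE = "https://platform.adobe.io/data/foundation/flowservice"

headers = {
    "Authorization": "Bearer <IMS access token>",
    "x-api-key": "<client id>",
    "x-gw-ims-org-id": "<IMS org id>",
    "x-sandbox-name": "<sandbox name>",
    "Content-Type": "application/json",
    "Connection": "keep-alive",
}

# 1. Create the source connection (simplified body; depends on the source type).
source_resp = requests.post(f"{BASE}/sourceConnections", headers=headers,
                            json={"name": "Example source connection"})
source_id = source_resp.json()["id"]

# 2. Create the target connection (simplified body; depends on the target dataset).
target_resp = requests.post(f"{BASE}/targetConnections", headers=headers,
                            json={"name": "Example target connection"})
target_id = target_resp.json()["id"]

# 3. Create the dataflow, passing both IDs.
flow_resp = requests.post(f"{BASE}/flows", headers=headers, json={
    "name": "Example dataflow",
    "flowSpec": {"id": "<flow spec id>", "version": "1.0"},
    "sourceConnectionIds": [source_id],
    "targetConnectionIds": [target_id],
})
print(flow_resp.status_code, flow_resp.text)
```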
Maybe check your credentials/keys? If it used to work and suddenly doesn't, it could be something as simple as an expired key or outdated project permissions.
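If you want to rule that out quickly, you can fetch a brand-new access token outside the notebook and retry the call with it. A sketch assuming an OAuth Server-to-Server credential (the scopes come from your Developer Console project, and the endpoint may differ for other credential types):

```python
import requests

# Request a fresh IMS access token with the project's client credentials.
resp = requests.post(
    "https://ims-na1.adobelogin.com/ims/token/v3",
    data={
        "grant_type": "client_credentials",
        "client_id": "<client id>",
        "client_secret": "<client secret>",
        "scope": "<scopes listed in the Developer Console project>",
    },
)
print(resp.status_code)
token = resp.json().get("access_token")
print("Got token:", bool(token))
```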
@JohnYa5 Did you find the suggestions helpful? Please let us know if you need more information. If a response worked, kindly mark it as correct for posterity; alternatively, if you found a solution yourself, we’d appreciate it if you could share it with the community. Thank you!