
SOLVED

Delivery failed while sending out campaigns


Level 4

Hi,

I see the following error popping up in many of my campaigns, but when I recreate the delivery, it is sent out successfully. What could be the reason?

[screenshot of the delivery error message attached]

Regards,

Priyanka

1 Accepted Solution


Correct answer by
Employee Advisor

This error occurred due to a duplicate key value constraint violation: the delivery tried to insert a primary key into the nmsMirrorPageInfo schema that already existed there.

So you can simply copy the delivery and send it again; it should work this time. If it still fails, you have to remove the duplicate primary keys from the affected table manually.

Why the same primary key gets generated:

There is a single sequence in Adobe Campaign, named "xtknewid", that generates the primary keys for all schemas except the broadlog and tracking-log tables. The sequence can only produce a limited number of values in one cycle; once those are all exhausted, it wraps around and starts generating the same values again.

But because those keys are spread across all the schemas, the chance that a recycled key lands in the same table where it was previously inserted is very small.

That is exactly what happened here. Hope this clears your doubt.
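The wrap-around behaviour described above can be sketched with a toy simulation. The sequence limit and table names below are illustrative assumptions, not Adobe Campaign's real values; the point is only to show why a shared, wrapping ID sequence rarely, but occasionally, recycles a key into the same table:

```python
import random

# Toy model of a shared ID sequence like Adobe Campaign's xtknewid.
# SEQUENCE_LIMIT and TABLES are made-up values for illustration only.
SEQUENCE_LIMIT = 10          # sequence yields 0..9, then wraps around
TABLES = ["mirrorPageInfo", "delivery", "operation"]

def next_id(counter):
    """Return the next sequence value, wrapping at SEQUENCE_LIMIT."""
    return counter % SEQUENCE_LIMIT

def insert_rows(n_inserts, seed=0):
    """Simulate n_inserts inserts spread randomly over TABLES.

    Returns the (table, id) pairs that collided, i.e. a wrapped
    sequence value landing in a table that already holds that id,
    which is what raises the duplicate key value violation.
    """
    rng = random.Random(seed)
    seen = {t: set() for t in TABLES}
    collisions = []
    for counter in range(n_inserts):
        key = next_id(counter)
        table = rng.choice(TABLES)
        if key in seen[table]:
            collisions.append((table, key))  # duplicate-key violation
        else:
            seen[table].add(key)
    return collisions

# Fewer inserts than SEQUENCE_LIMIT: every key is distinct, so no
# collision is possible in any table.
print(insert_rows(5))    # -> []

# Many more inserts than the sequence can cover: recycled keys must
# eventually hit a table that already contains them.
print(len(insert_rows(100)))
```

With only 3 tables and 10 distinct keys there are at most 30 unique (table, key) slots, so 100 inserts are guaranteed to produce collisions; with Adobe Campaign's real sequence range and many schemas, the same effect is just far rarer.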


2 Replies



Level 4

Hi Kapil,

Yes, that was a detailed note. Thank you very much.

Regards,

Priyanka