
SOLVED

Update data activity fails due to unique key constraint violation


Level 3

Hi,

 

There is a data schema with a composite primary key (email + ticket number), and data flows from the source system to Adobe Campaign every 2 hours. We have observed the following behavior: if the file contains only a small number of records that are duplicates on the email + ticket number combination, the Update data activity passes without causing the workflow to fail, whereas if the duplicates go past 300+ records in the file, the workflow fails at the Update data activity. Could someone help me understand this issue, and is there any setting/configuration that causes it?

 

Thanks. 

1 Accepted Solution


Correct answer by
Community Advisor

Hi @Rajesh_SS01,

Here is why your data update fails intermittently when duplicate rows are present.

By design, the Update data activity inserts/updates data in the database in batches; the default batch size is 10,000.

For every batch that is processed, say the operation is an insert, Adobe Campaign issues an insert command for each row of data in that batch. If two or more identical rows fall into the same batch, an insert command is executed for each of them: it succeeds for only one row and raises the unique constraint violation error for the subsequent duplicates.
The best way to avoid such situations is to deduplicate the data on the key fields before feeding it into the Update data activity.
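For illustration only (not part of the original answer): a minimal pre-load deduplication sketch in Python, run on the extract file before the workflow picks it up. The file name and the email / ticketNumber column names are assumptions; it keeps the first occurrence of each composite key.

import csv

SRC = "ticket_feed.csv"          # hypothetical incoming extract
DST = "ticket_feed_dedup.csv"    # deduplicated file to feed the workflow

seen = set()
with open(SRC, newline="", encoding="utf-8") as src, \
     open(DST, "w", newline="", encoding="utf-8") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        # Composite key: email + ticket number, as defined on the schema.
        key = (row["email"].strip().lower(), row["ticketNumber"])
        if key in seen:
            continue  # drop later duplicates of the composite key
        seen.add(key)
        writer.writerow(row)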

 

Thanks,

Ishan


8 Replies


Community Advisor

Hello @Rajesh_SS01 ,

Can you deduplicate the records, or are you passing duplicates on purpose? You can add a Deduplication activity beforehand or deduplicate directly in the Update data activity.

(screenshot: Marcel_Szimonisz_0-1681802156198.png)

 

Enabling Transitions -> "Generate an outbound transition for rejects" may also help in some cases.

Marcel


Level 3

Hi @Marcel_Szimonisz ,

 

Thanks for your response. Yes, this has been handled with a Deduplication activity; however, I wanted to understand why the Update data activity fails when the number of duplicate records is high, while at other times, when the duplicate count is lower, it still inserts or updates the remaining records and only writes error messages to the audit logs.

 

Thanks.


Community Advisor

It could be that sometimes the duplicates end up in the same batch, and at other times they are split across separate batches, so the record is inserted in the first batch and simply updated in the later ones.
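As a toy model of that behaviour (not how Campaign literally executes its batches): a short Python sketch where a key repeated inside one batch collides, while a key already loaded in an earlier batch simply becomes an update. Batch size and data are made up.

def load_in_batches(rows, batch_size):
    # Simplified model: a key seen earlier in the *same* batch collides,
    # a key already in the target table just becomes an update.
    existing = set()
    for start in range(0, len(rows), batch_size):
        seen_in_batch = set()
        for key in rows[start:start + batch_size]:
            if key in seen_in_batch:
                raise RuntimeError(f"unique constraint violation on {key}")
            seen_in_batch.add(key)
            existing.add(key)  # insert (new key) or update (existing key)

duplicate = [("a@x.com", "T1"), ("a@x.com", "T1")]  # same email + ticket twice
load_in_batches(duplicate, batch_size=1)  # split across batches: no error
try:
    load_in_batches(duplicate, batch_size=2)  # same batch: fails
except RuntimeError as err:
    print(err)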

 

Marcel


Level 3

Hi @Rajesh_SS01 ,
It's most likely due to duplicate data coming back into the Update data activity again through the transition.
The best approach is to split the incoming data on the primary keys of the target table and prevent records that already exist from being passed to the Update data activity.
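A rough sketch of that split, done in Python outside Campaign purely to illustrate the idea; the file name, column names, and the existing_keys set (which would be built from the target schema) are all assumptions.

import csv

def split_new_vs_existing(feed_path, existing_keys):
    """Split feed rows by whether their email + ticket number key already exists."""
    new_rows, existing_rows = [], []
    with open(feed_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            key = (row["email"].strip().lower(), row["ticketNumber"])
            (existing_rows if key in existing_keys else new_rows).append(row)
    return new_rows, existing_rows

# existing_keys would come from the target table, e.g. an exported key list.
new_rows, existing_rows = split_new_vs_existing("ticket_feed.csv", existing_keys=set())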

Hope that helps!

 

Thanks


Level 7

Hi @Rajesh_SS01 

Based on my understanding, the reject count is set to 100 records by default. So as long as the number of duplicate records (failing with a unique constraint violation) is less than or equal to 100, the Update data activity won't fail.

But if that threshold is exceeded, e.g. 300+ duplicates as in your case, a failure is recorded on the Update data activity.
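If the reject threshold really is the deciding factor (the 100-record default is taken from this reply, not verified), here is a quick way to estimate how many rows of a given file would end up rejected; the file and column names are assumptions.

import csv
from collections import Counter

with open("ticket_feed.csv", newline="", encoding="utf-8") as f:
    counts = Counter((row["email"].strip().lower(), row["ticketNumber"])
                     for row in csv.DictReader(f))

# Every repeat after the first occurrence of a key would hit the constraint.
rejects = sum(c - 1 for c in counts.values())
print("estimated rejects:", rejects,
      "(past the reject threshold)" if rejects > 100 else "(within the reject threshold)")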

A better way, as suggested by the other community members, is to apply deduplication based on the key combination defined in the respective schema.

Thanks,

SSB


Community Advisor

Hi @Rajesh_SS01 

 

This issue occurs because the data coming in on the transition is not unique with respect to the primary key defined in the schema. You need to split out the records that share the same primary key combination, ignore those, and then load only the remaining records into the schema.
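Read literally, that suggestion could look like the following Python sketch, which ignores every row whose email + ticket number combination appears more than once in the file (file and column names are assumptions).

import csv
from collections import Counter

with open("ticket_feed.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

counts = Counter((r["email"].strip().lower(), r["ticketNumber"]) for r in rows)
# Ignore every row whose composite key occurs more than once in the file.
rows_to_load = [r for r in rows
                if counts[(r["email"].strip().lower(), r["ticketNumber"])] == 1]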

 

Regards,

Pravallika.


Employee Advisor

@Rajesh_SS01 ,

Based on the information provided, it appears that the primary key constraint is being violated when there are duplicate records in the combination of email and ticket number. This is likely causing the update data activity to fail when the number of duplicate records exceeds 300.

One potential solution is to identify and remove the duplicate records before they are loaded into Adobe Campaign. Alternatively, you could modify the workflow or the Update data activity to handle duplicate records more gracefully, for example by skipping or merging them.
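A minimal sketch of the merging option, assuming a CSV extract with email and ticketNumber columns (hypothetical names), where later non-empty values overwrite earlier ones so each composite key collapses to a single row.

import csv

merged = {}
with open("ticket_feed.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        key = (row["email"].strip().lower(), row["ticketNumber"])
        # Later non-empty values overwrite earlier ones for the same key.
        merged[key] = {**merged.get(key, {}), **{k: v for k, v in row.items() if v}}

rows_to_load = list(merged.values())  # one merged row per email + ticket number
print(len(rows_to_load), "rows after merging duplicates")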
