I would be thrilled if somebody could come up with some suggestions on how to solve a "challenge" we are facing.
Background:
- We are using offline data import to add some offline events and eVars to our reporting.
- We are using transactionID to associate these with user behaviour.
- The events are set as counters.
The challenge:
- We were running a few imports of offline events to Adobe these past few weeks, all in all, 3 months' worth of data.
- By mistake, we uploaded the whole database every time, not only the incremental data 😞
- We found out that every event we uploaded as a duplicate by mistake is treated by Adobe as if it happened multiple times, not as a duplicate of the same event.
- So, if an event with the same transactionID and the same timestamp was included in three offline data imports, Adobe treats it as if it happened three times.
- Obviously, the data is ruined.
- Obviously, we would love not to lose 3 months' worth of data.
- It is not possible to remove the existing data from Adobe in any way.
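To make the failure mode concrete, here is a minimal sketch, assuming counter events are simply summed per transactionID across imports (as described above), with made-up transaction IDs and counts:

```python
from collections import defaultdict

# Each import is a list of (transactionID, event_count) rows.
import_1 = [("T1001", 1)]
import_2 = [("T1001", 1)]  # same row re-uploaded by mistake
import_3 = [("T1001", 1)]  # and again

totals = defaultdict(int)
for batch in (import_1, import_2, import_3):
    for txn_id, count in batch:
        totals[txn_id] += count  # counters are summed; nothing deduplicates on transactionID

print(totals["T1001"])  # → 3, although the event happened once
```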
The solution? We are thinking about two ways to solve this:
- Create new events and eVars, and try to upload the same data, with the same transaction IDs, but using the new events and eVars. The big question is whether Adobe would be able to associate these newly created events and eVars, which did not exist until today, with previous user behaviour.
- Use some existing events and eVars that were not used these last few months, and upload the data using those. But Adobe recommends exercising caution when doing this, without specifying what the dangers are...
So... I know this answer is over a year late, but if you want to undo events that have previously been uploaded, you can upload negative numbers in your data source file, and those negatives will cancel out the previously uploaded events.
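Building such a correction file can be as simple as negating the event counts of the rows that were uploaded in error, keeping the date and transactionID identical so the negatives land on the same transactions. A minimal sketch (the column names "Date", "transactionID", and "Event 1" and the sample values are assumptions; match them to your actual Data Sources template):

```python
import csv

def build_correction_rows(duplicate_rows, event_cols=("Event 1",)):
    """Negate the counter-event columns of rows that were uploaded
    by mistake, leaving all other columns untouched."""
    corrections = []
    for row in duplicate_rows:
        fixed = dict(row)
        for col in event_cols:
            fixed[col] = str(-int(row[col]))  # e.g. "2" -> "-2"
        corrections.append(fixed)
    return corrections

# Hypothetical duplicate row from an earlier, erroneous import.
duplicates = [{"Date": "01/15/2014", "transactionID": "T1001", "Event 1": "2"}]
corrections = build_correction_rows(duplicates)

# Write the correction file as tab-delimited, mirroring the original upload.
with open("correction.tab", "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["Date", "transactionID", "Event 1"], delimiter="\t"
    )
    writer.writeheader()
    writer.writerows(corrections)
```

Upload the generated file through the same data source as the originals; once processed, the negative counts should net the duplicates out of reporting. It would be prudent to validate on a small date range first.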