Updating Destination Dataflow writing to GCS Bucket

Hi Team,

Is there any way to update a destination dataflow in AEP? For example, I built a dataflow to write data to a GCS bucket. The first run extracted all the data and pushed it to the bucket, and subsequent runs brought in only the incremental data, so everything worked fine up to that point. However, we then decided to include a manifest file in the export so that we could use it as a trigger for our pipelines in Google Cloud. When we tried to implement this, AEP wouldn't let us update the existing dataflow, and we ended up building a new one instead. The challenge is that the newly built dataflow brings in all the data again, even when the incremental option is selected.
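
For reference, if there is an API route for this, I'd expect the update to look roughly like the sketch below, using the Flow Service API's PATCH endpoint. This is only a guess on my part: the flow ID, sandbox name, and especially the JSON Patch path for the manifest setting are placeholders I'm assuming, not values I've confirmed to work.

import requests

# Sketch: toggling the manifest option on an existing dataflow through the
# AEP Flow Service API instead of rebuilding the flow. All IDs and the JSON
# Patch path below are placeholders/assumptions.
BASE = "https://platform.adobe.io/data/foundation/flowservice"
FLOW_ID = "<DATAFLOW_ID>"  # hypothetical: ID of the existing GCS dataflow

headers = {
    "Authorization": "Bearer <ACCESS_TOKEN>",
    "x-api-key": "<API_KEY>",
    "x-gw-ims-org-id": "<ORG_ID>",
    "x-sandbox-name": "<SANDBOX_NAME>",
}

# Fetch the flow first: PATCH requires the current etag in an If-Match header.
resp = requests.get(f"{BASE}/flows/{FLOW_ID}", headers=headers)
resp.raise_for_status()
etag = resp.json()["items"][0]["etag"]

# JSON Patch body. The path is my assumption about where the manifest toggle
# would live in the flow document; inspect the GET payload for the real one.
patch = [{
    "op": "replace",
    "path": "/transformations/0/params/includeManifest",  # hypothetical path
    "value": True,
}]

resp = requests.patch(
    f"{BASE}/flows/{FLOW_ID}",
    headers={**headers, "If-Match": etag, "Content-Type": "application/json"},
    json=patch,
)
resp.raise_for_status()
print(resp.json())

If there is a supported attribute path (or a different endpoint) that allows adding the manifest file to an existing dataflow, a pointer to the docs would be much appreciated.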

Is there any other option to update the destination dataflow without recreating it?

Thanks

Naresh
