Row-level ingestion from the SnowflakeDB connector works for the initial backfill; however, subsequent incremental loads based on the delta column are not being ingested into AEP, and no updates are flowing through

Level 2
April 7, 2026
Solved

@AmitVishwakarma ​@Asheesh_Pandey ​@kautuk_sahni ​@akshay_kashyap 
Any suggestions would be appreciated, thank you.
Hi Folks,

Based on the referenced document, I created a date filter to limit ingestion from a Snowflake table that contains the last 10 years of historical data. The requirement was to ingest only the last one month of data initially and then continue with incremental ingestion going forward.

The initial load completes successfully, and we can see the expected data in AEP. However, during the incremental runs, the dataflow fails to ingest any new records, even though new data is available in Snowflake.

It appears that the delta column–based incremental logic is not picking up the latest data, resulting in no records being ingested during incremental executions.

The PQL filter in the source connection works as expected:

{
  "type": "PQL",
  "format": "pql/json",
  "value": {
    "nodeType": "fnApply",
    "fnName": ">=",
    "params": [
      {
        "nodeType": "fieldLookup",
        "fieldName": "business_process_dt"
      },
      {
        "nodeType": "literal",
        "value": "2026-03-01"
      }
    ]
  }
}


https://experienceleague.adobe.com/en/docs/experience-platform/sources/api-tutorials/filter
https://experienceleague.adobe.com/en/docs/experience-platform/sources/api-tutorials/collect/payments
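For anyone reproducing this setup, the filter payload above can be generated programmatically before it is sent to the Flow Service API. This is only a sketch: `build_pql_date_filter` is a hypothetical helper of mine, not part of any Adobe SDK; the JSON shape simply mirrors the filter shown above.

```python
import json
from datetime import date

def build_pql_date_filter(field_name: str, cutoff: date) -> dict:
    """Build a PQL row filter of the form `field >= cutoff`.

    Hypothetical helper for illustration; the payload shape mirrors
    the filter JSON shown in the post above.
    """
    return {
        "type": "PQL",
        "format": "pql/json",
        "value": {
            "nodeType": "fnApply",
            "fnName": ">=",
            "params": [
                {"nodeType": "fieldLookup", "fieldName": field_name},
                {"nodeType": "literal", "value": cutoff.isoformat()},
            ],
        },
    }

# Reproduce the filter from the post: keep rows on or after 2026-03-01.
flt = build_pql_date_filter("business_process_dt", date(2026, 3, 1))
print(json.dumps(flt, indent=2))
```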


Inside the dataflow creation, the delta column is defined as follows:

{
  "transformations": [
    {
      "name": "Copy",
      "params": {
        "deltaColumn": {
          "name": "business_process_dt",
          "dateFormat": "YYYY-MM-dd",
          "timezone": "UTC"
        }
      }
    },
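For context, this is roughly where that fragment sits in a complete dataflow-creation request body. A sketch only: every id, the schedule values, and the presence of a Mapping step are placeholders and assumptions on my part, not values from the original post.

```python
# Sketch of a complete dataflow-creation body; all ids are placeholders.
dataflow_body = {
    "name": "Snowflake incremental dataflow",
    "flowSpec": {"id": "<FLOW_SPEC_ID>", "version": "1.0"},
    "sourceConnectionIds": ["<SOURCE_CONNECTION_ID>"],
    "targetConnectionIds": ["<TARGET_CONNECTION_ID>"],
    "transformations": [
        {
            "name": "Copy",
            "params": {
                "deltaColumn": {
                    # Date-only delta column, as in the post; the accepted
                    # answer below recommends a timestamp column instead.
                    "name": "business_process_dt",
                    "dateFormat": "YYYY-MM-dd",
                    "timezone": "UTC",
                }
            },
        },
        {
            # Assumed mapping step; the mappingId would come from Data Prep.
            "name": "Mapping",
            "params": {"mappingId": "<MAPPING_ID>", "mappingVersion": 0},
        },
    ],
    "scheduleParams": {
        "startTime": "<EPOCH_SECONDS>",
        "frequency": "hour",
        "interval": 1,
    },
}
```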

 

Best answer by Asheesh_Pandey

1 reply

Asheesh_Pandey · Community Advisor · Accepted solution
April 8, 2026

Delta ingestion in AEP only works reliably when the delta column is strictly increasing, high-precision, and never reuses values.

You are using business_process_dt (format: YYYY-MM-dd), which is a date-only field.

  • Initial load --> all March data ingested
  • Incremental run --> new rows still carry the same date value ("fnName": ">=")

AEP decides it has already processed this value; as a result, 0 records are ingested.

To solve this, change the delta column to a timestamp such as business_process_ts (YYYY-MM-dd HH:mm:ss). Alternatively, once the initial backfill is done, remove the static PQL filter and let the delta column alone control ingestion. If nothing works, recreate the dataflow with a new delta column.
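The failure mode can be sketched with a toy high-watermark simulation (my own illustration of the general incremental-ingestion pattern, not AEP's actual internals):

```python
from datetime import date, datetime

def incremental_rows(rows, watermark):
    """Toy delta logic: ingest only rows strictly newer than the last
    recorded delta-column value (illustrative, not AEP's real code)."""
    return [r for r in rows if r["delta"] > watermark]

# After the backfill, the stored watermark is the date-only value 2026-03-31.
date_watermark = date(2026, 3, 31)
same_day_rows = [{"id": 1, "delta": date(2026, 3, 31)},
                 {"id": 2, "delta": date(2026, 3, 31)}]
print(len(incremental_rows(same_day_rows, date_watermark)))  # 0: same-day rows are skipped

# With a timestamp column the value keeps increasing, so new rows pass.
ts_watermark = datetime(2026, 3, 31, 10, 0, 0)
ts_rows = [{"id": 1, "delta": datetime(2026, 3, 31, 11, 30, 0)},
           {"id": 2, "delta": datetime(2026, 3, 31, 12, 45, 0)}]
print(len(incremental_rows(ts_rows, ts_watermark)))  # 2: both ingested
```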

 

 

- Asheesh
Level 2
April 10, 2026

 ​@Asheesh_Pandey 
After changing the delta column to yyyy-MM-dd HH:mm:ss, incremental ingestion worked.
Thank you