SOLVED

Snowflake Data Ingestion Error (Native Connector method) on Email Format


Level 3

Team - We are trying to ingest data from one of our Snowflake tables into the AEP data lake. We have created the corresponding schema and datasets for the ingestion. The Snowflake source has a column named "Email" (VARCHAR(16777216) data type) that holds customer email addresses, and a matching field of string data type has been created in the AEP schema to receive it. We have mapped the email data to that string field and scheduled the flow on a frequent basis to onboard the Snowflake data.

However, a couple of errors are being thrown for this specific field, and we have tried changing the data type in the AEP schema, but still no luck. Refer to the screenshot below for reference. Please let me know your thoughts on this error.

 

(Attachment: MicrosoftTeams-image (2).png)

1 Accepted Solution


Correct answer by
Level 3

Ah, I see. Which hashing function are you using within the calculated field? Also, following the decryption, are you escaping the reserved keywords with ${} ?

 

Keywords:

new, mod, or, break, var, lt, for, false, while, eq, gt, div, not, null, continue, else, and, ne, true, le, if, ge, return
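For reference, a common pattern for hashed email identifiers is SHA-256 over a trimmed, lowercased value. The Python below is only an illustration of that behavior, not AEP Data Prep syntax; normalizing before hashing matters because otherwise the same address in different casing yields different digests:

```python
import hashlib

def hash_email(email):
    """Normalize (trim + lowercase) an email, then return its SHA-256 hex digest.

    Without the normalization step, 'User@X.com' and 'user@x.com'
    would produce different digests and fail to join downstream.
    """
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

print(hash_email("  User@Example.com "))
print(hash_email("user@example.com"))  # same digest as above
```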


3 Replies


Level 3

Has the data been cleaned before the import? For example, it could be that the data contains special characters, nulls, etc.

Did you try converting the data to a string before the import?

This sounds more like a data issue to me.
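As a sanity check, here is a minimal Python sketch of the kind of pre-import cleanup suggested above (trim whitespace, drop nulls, basic format validation). The function name and regex are illustrative assumptions, not part of AEP or Snowflake:

```python
import re

# Basic shape check: something@something.something, no spaces.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def clean_email(value):
    """Return a trimmed, lowercased email string, or None when the
    value is missing or fails the basic format check."""
    if value is None:
        return None
    email = str(value).strip().lower()
    return email if EMAIL_RE.match(email) else None

rows = ["  User@Example.COM ", None, "not-an-email", "a@b.co"]
print([clean_email(r) for r in rows])
# → ['user@example.com', None, None, 'a@b.co']
```

Rows that come back as None are the ones likely to trip a strict string mapping during ingestion.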


Level 3

Hi @adobechat, thanks for the reply. Yes, we ensure the data is cleaned before we ingest it into the AEP platform. However, the data being pushed to the AEP data lake is encrypted, so the OOTB schemas are restricted from accepting that data directly. Hence, we used a calculated field, did some manipulation before ingesting into the AEP data lake, and it worked as expected.

However, we still intermittently face issues related to the OOTB field even though we use a calculated field to avoid such errors.
