Hello Community Advisors/Advocates,
Thanks for joining the product call with Senior PM Abhishek, where he walked us through the framework that enables frictionless, quick ingestion of data in any format from any data source.
This is a contextual thread to post your queries which will be answered by internal AEP experts.
One of the features we greatly miss while ingesting data into AEP is decrypting it on the fly. Data Prep allows you to hash data, but there is no way to decrypt it. Any thoughts on this?
While ingesting data, especially from a source like Snowflake, I noticed that AEP doesn't list which Snowflake schema a table belongs to. We've been in a situation where tables with the same name under different schemas were tough to distinguish. It would be good to have the path details of source tables.
I'm not sure about the practicality, but I'm curious whether we can have a connector without an Auth spec. I'm thinking of a source that hosts some industry test data, where the connector is open to anyone who wants to bring that test data into AEP and get some hands-on experience.
How is the self-serve source connector framework different from building your own extension in Adobe Launch/Data Collection, especially in terms of the publishing process? Is it more or less similar?
What are the industry-standard ways of validating and quality-checking data ingested into AEP? What common practices do AEP customers follow for this?
In order to use offers in AJO messages, how can we ingest an offer and its metadata into AEP from an external (non-Adobe) offer management system? Could you share a real-world scenario?
Hi @Sneha-Parmar ,
I would like to hear from the Adobe experts as well, but here is something I personally built externally to create offers in AEP ODS.
ODS offers APIs for you to define your offer and its components. I worked on a requirement to build offers and rules in bulk, so I created a CSV file of offer metadata and wrote a script that looped through each CSV record and called the create-offer API via a parameterized function.
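For illustration, here's a minimal Python sketch of that loop-and-create pattern, assuming a CSV of offer metadata. The endpoint URL, headers, and payload fields (`offer_name`, `channel`, `content`) are placeholders for illustration, not the actual ODS API contract — check the Offer Decisioning API reference for the real request shape and auth requirements.

```python
# Hypothetical sketch: bulk-create offers by looping through a CSV of offer
# metadata and POSTing each row to a create-offer endpoint. The URL, headers,
# and payload fields below are illustrative assumptions, not the real ODS API.
import csv
import io
import json
import urllib.request

API_URL = "https://platform.adobe.io/example-create-offer"  # placeholder URL


def build_offer_payload(row):
    """Map one CSV row to a create-offer request body (fields are assumed)."""
    return {
        "name": row["offer_name"],
        "status": row.get("status", "draft"),
        "representations": [
            {"channel": row["channel"], "content": row["content"]},
        ],
    }


def create_offer(payload, token, opener=urllib.request.urlopen):
    """POST one offer; `opener` is injectable so the loop can be dry-run."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    return opener(req)


def bulk_create(csv_text, token, opener=urllib.request.urlopen):
    """Loop through each CSV record and call the create-offer function."""
    responses = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        responses.append(create_offer(build_offer_payload(row), token, opener))
    return responses
```

Keeping `build_offer_payload` separate from the HTTP call makes it easy to validate the CSV-to-payload mapping before firing hundreds of real API requests.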