Hello Team,
Welcome to the Adobe Real-Time CDP Community Mentorship Program 2024! This is the featured Community Discussion/Contextual thread for your Adobe Real-Time CDP Community Mentor, Manoj Kumar!
Manoj Kumar will be your dedicated mentor, providing valuable support and guidance on your Adobe Real-Time CDP queries as you upskill yourself and prepare for Real-Time CDP certification throughout the program.
Know your Mentor Manoj Kumar (aka @_Manoj_Kumar_ )
Manoj is a leading contributor to the Experience League communities and a CDP expert with a decade of experience.
Aspirants mapped to Manoj Kumar
How to participate in the program
Suggested Next Steps for Aspirants:
Remember that every post / like / comment you make in your contextual thread and the Real-time CDP Community throughout the program helps increase your chance to be recognized by your Mentor and win exclusive Adobe swag, so bring your best efforts!
We wish you all the best as you embark on this learning experience!
Hello @Amit_Shinde1 @JoyceLu1 @HamishMo @vabs95 @vprasad @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar @somen-sarkar @Bhanu8562 @fondofhatsKey
Welcome to the 3rd week of learning Adobe RT-CDP.
This week's topic is:
Data Ingestion
Hello @Amit_Shinde1 @JoyceLu1 @HamishMo @vabs95 @vprasad @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar @somen-sarkar @Bhanu8562 @fondofhatsKey
Here is the question #1 for Module 3 topic: Describe data ingestion capabilities with the CDP
A cloud storage source was scheduled to ingest data daily but it was stopped because of some data issues in upstream systems. Later on, it was found that the data available on cloud storage was correct and could be ingested into the platform.
Now we have to ingest the backlog of files and restart the daily import process.
How can this be achieved?
@_Manoj_Kumar_,
I believe the answer to this question is to create a new dataflow with backfill enabled. When creating a new dataflow, it is automatically created with a backfill of 13 months of historical data, and data can then be ingested going forward using a daily schedule.
This is based on my understanding of dataflows in AEP.
Respectfully,
@thewadeng
Hello @thewadeng Yes, you are correct. Creating a new dataflow with backfill enabled is the only way to achieve this. Also, when creating a new dataflow there is a toggle that lets you enable backfill, though it is enabled by default.
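For those curious what this looks like outside the UI, here is a minimal sketch of the request body you would send to the Flow Service API when recreating the dataflow, with backfill enabled so the backlog of files is picked up before the daily schedule resumes. The connection IDs and flow spec ID are placeholders, and the `scheduleParams` fields are my assumption based on the documented cloud storage source workflow, so verify against the current API reference:

```python
import json
import time

# Sketch of a Flow Service dataflow payload with backfill enabled.
# All IDs are placeholders -- substitute your own environment's values.
flow_body = {
    "name": "Cloud storage daily import (restarted)",
    "flowSpec": {"id": "<FLOW_SPEC_ID>"},
    "sourceConnectionIds": ["<SOURCE_CONNECTION_ID>"],
    "targetConnectionIds": ["<TARGET_CONNECTION_ID>"],
    "scheduleParams": {
        "startTime": str(int(time.time())),  # begin now
        "frequency": "day",                  # resume the daily cadence
        "interval": 1,
        "backfill": True,  # ingest the files already sitting in the source path
    },
}

print(json.dumps(flow_body, indent=2))
```

The key line is `"backfill": True`: with it set, the first run ingests the existing backlog, and subsequent runs follow the daily schedule as before.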
Hello @Amit_Shinde1 @JoyceLu1 @HamishMo @vabs95 @vprasad @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar @somen-sarkar @Bhanu8562 @fondofhatsKey
Here is the question #2 for Module 3 topic: Describe the capabilities of Edge ingestion
What is the sequence of flow for a personalization request using Adobe Target and Adobe Experience Platform Web SDK to reach the Target Edge via the Edge network?
@_Manoj_Kumar_,
Based on the documentation and my understanding of AEP thus far, the sequence of a personalization request is as follows:
This was generated based on my understanding from previous modules and the documentation provided above.
Respectfully,
@thewadeng
Below are the flow details:
Thanks,
Somen
Hello @Amit_Shinde1 @JoyceLu1 @HamishMo @vabs95 @vprasad @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar @somen-sarkar @Bhanu8562 @fondofhatsKey
Here is the question #3 for Module 3 topic: Describe the advanced techniques around data ingestion architecture
How can a data engineer prevent missing values when correcting partial updates in a profile fragment, such as fixing a first name typo while ensuring other fields, like the last name, remain intact?
@_Manoj_Kumar_,
There are a few ways to prevent missing values when correcting partial updates in a profile fragment:
My answer is based on my knowledge of RT-CDP.
Respectfully,
@thewadeng
Hello @thewadeng Yes, you are correct.
But the most important part is enabling the datasets to support upserts:
Hello @Amit_Shinde1 @JoyceLu1 @HamishMo @vabs95 @vprasad @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar @somen-sarkar @Bhanu8562 @fondofhatsKey
You can get a 50% discount coupon for the certification by filling out the form linked below.
Form Link: https://forms.office.com/r/fNYJstwxKJ
Note: The form will be open for responses from Aug 11 to Aug 19.
Hello @Amit_Shinde1 @JoyceLu1 @HamishMo @vabs95 @vprasad @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar @somen-sarkar @Bhanu8562 @fondofhatsKey
Welcome to the 4th week of learning Adobe RT-CDP.
This week's topic is:
Segmentation
Hello @Amit_Shinde1 @JoyceLu1 @HamishMo @vabs95 @vprasad @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar @somen-sarkar @Bhanu8562 @fondofhatsKey
Here is the question #1 of the week topic: Describe different ways to build audiences & segments within the CDP
What type of segmentation would you use if the criterion is to create an audience of people who visit the pages abc.com/1 or abc.com/2 after logging in to the website? The login action is captured as an event.
@_Manoj_Kumar_,
Because the question specifies a single captured event and criteria that build an audience based on a sequence, I believe streaming segmentation is the correct choice here.
I initially thought the answer might be edge segmentation because of the two pages being served after a specific point, but then I considered that this would apply only if the segmentation occurred on the Edge Network, where Target delivers personalization server-side without the use of an event. I came to this answer using my own knowledge of segmentation.
Respectfully,
@thewadeng
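To make the streaming choice concrete, here is a sketch of a segment-definition payload for the Segmentation API, flagged for streaming (continuous) evaluation. The PQL expression is a placeholder, not a working query: the real sequence (a login event followed by a visit to abc.com/1 or abc.com/2) would be authored in Segment Builder or written in PQL against your actual schema, and the `evaluationInfo` shape here is my assumption from the API documentation:

```python
import json

# Sketch of a segment definition flagged for streaming evaluation.
segment_definition = {
    "name": "Logged-in visitors of abc.com/1 or abc.com/2",
    "schema": {"name": "_xdm.context.profile"},
    "expression": {
        "type": "PQL",
        "format": "pql/text",
        "value": "<PQL sequence: login event, then visit to abc.com/1 or abc.com/2>",
    },
    "evaluationInfo": {
        "batch": {"enabled": False},
        "continuous": {"enabled": True},    # streaming evaluation
        "synchronous": {"enabled": False},  # edge evaluation
    },
}

print(json.dumps(segment_definition, indent=2))
```

The three flags under `evaluationInfo` map to the three segmentation types discussed in this thread: batch, streaming, and edge.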
Hello @Amit_Shinde1 @JoyceLu1 @HamishMo @vabs95 @vprasad @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar @somen-sarkar @Bhanu8562 @fondofhatsKey
Here is the question #2 of the week topic:
What namespaces can be used to check data overlap when using Segment Match to share data with a partner?
@_Manoj_Kumar_,
According to the Segment Match documentation, the supported namespaces (each comprising an identity value and an identity namespace) are:
I used the documentation to find this answer.
Respectfully,
@thewadeng
Hello @Amit_Shinde1 @JoyceLu1 @HamishMo @vabs95 @vprasad @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar @somen-sarkar @Bhanu8562 @fondofhatsKey
Here is the question #3 of the week topic: Demonstrate an understanding of how to apply use case(s) to segment/audience activation
At what frequency is data shared with Meta when an audience is activated to the Facebook destination?
@_Manoj_Kumar_,
When a data connection is set up with Facebook and audiences are activated to this destination, data is shared at a streaming frequency. This is because the documentation describes the Facebook connector as a streaming, API-based connection.
I reached my answer by initially remembering about the connector itself, and then cross-checking my answer using the documentation.
Respectfully,
Hello @thewadeng The documentation says streaming, but in practice the data is shared every hour.
Hello @Amit_Shinde1 @JoyceLu1 @HamishMo @vabs95 @vprasad @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar @somen-sarkar @Bhanu8562 @fondofhatsKey
I hope you are all doing well and on your path to certification.
Please feel free to add any questions or doubts you might have on Activation, Governance & Administration.
@_Manoj_Kumar_,
I have a few questions:
These are the ones I have for now, but I will ask more as they come up.
Respectfully,