[Mentor Manoj Kumar] Adobe Real-Time CDP Community Mentorship Program 2024


Administrator

Hello Team,

Welcome to the Adobe Real-Time CDP Community Mentorship Program 2024! This is the featured Community Discussion/Contextual thread for your Adobe Real-Time CDP Community Mentor, Manoj Kumar!

Manoj Kumar will be your dedicated mentor, providing valuable support and guidance on your Adobe Real-Time CDP queries as you upskill yourself and prepare for Real-Time CDP certification throughout the program.

Know your Mentor Manoj Kumar (aka @_Manoj_Kumar_   )

Manoj is a prolific contributor to the Experience League communities and a CDP expert with a decade of experience.

Aspirants mapped to Manoj Kumar

1) Amit Shinde aka @Amit_Shinde1 
2) Joyce Luey aka @JoyceLu1 
3) Hamish Moffatt aka @HamishMo 
4) Vaibhav Seth aka @vabs95 
5) Vinay Prasad aka @vprasad 
6) Priya Singh aka @PriyaSi1 
7) Ravi kiran Paturi aka @RavikiranPaturi 
8) Waden Greaux aka @thewadeng 
9) Sahil Ramrao Bhogekar aka @sramraobhogekar 
10) Somen Sarkar aka @somen-sarkar 
11) Bhanu Chander Pasala aka @Bhanu8562 
12) Michael Giddings aka @fondofhatsKey 

How to participate in the program

  • Post your Questions in this thread to connect with your Mentor, Manoj, and your fellow Aspirant peers.
  • Stand a chance to win the ‘Most Engaging Aspirant’ recognition from your mentor by participating in a weekly quiz.
  • Test your knowledge by replying to the unresolved questions in the Real-Time CDP and AEP community and tag your Mentor to get recognized as an ‘Exceptional Contributor’ by your mentor.
  • Stick to the schedule to cover one module per week and clear the Adobe Real-Time CDP Certification during the program: July 15 – Aug 30

Suggested Next Steps for Aspirants:

  • Update your Community Profile photo with your latest headshot to stand out to your Mentor and Peer Aspirants.
  • "Like" this thread to confirm your participation in the program.
  • Introduce yourself to Manoj and your Aspirant peers by replying to this thread! Break the ice by introducing yourself (location, org/company, etc.) and your experience with or interest in the Adobe DX stack. 
  • Post your Questions to this thread as you begin learning more about the Adobe Real-Time Customer Data Platform Developer Expert certification (Exam ID: AD0-E605)
  • Stick to the schedule and track your progress in the exam prep guide.
  • Test your learning by replying to the weekly quiz posted by your mentor.
  • Practice the modules by replying to unresolved queries in the AEP and RT-CDP Communities, tagging your mentor. 

Remember that every post, like, or comment you make in your contextual thread and the Real-Time CDP Community throughout the program increases your chance of being recognized by your Mentor and winning exclusive Adobe swag, so bring your best efforts!

We wish you all the best as you embark on this learning experience!

85 Replies


Community Advisor

Hello @Amit_Shinde1  @JoyceLu1 @HamishMo  @vabs95 @vprasad  @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar  @somen-sarkar @Bhanu8562  @fondofhatsKey 

 

Welcome to the 3rd week of learning Adobe RT-CDP. 

 

This week's topic is:

Data Ingestion

  • Describe data ingestion capabilities with the CDP
  • Describe the capabilities of Edge ingestion
  • Describe the advanced techniques around data ingestion architecture

 


     Manoj
     Find me on LinkedIn


Community Advisor

Hello  @Amit_Shinde1  @JoyceLu1 @HamishMo  @vabs95 @vprasad  @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar  @somen-sarkar @Bhanu8562  @fondofhatsKey 

 

Here is question #1 for the Module 3 topic: Describe data ingestion capabilities with the CDP

 

A cloud storage source was scheduled to ingest data daily, but it was stopped because of data issues in upstream systems. Later, it was found that the data available in cloud storage was correct and could be ingested into the platform.

Now we have to ingest the backlog of files and restart the daily import process. 

How can this be achieved?

  • Create a new dataflow with backfill enabled
  • Enable the existing dataflow and change its start date
  • Enable the existing dataflow and create a new dataflow to import the old files

     Manoj
     Find me on LinkedIn


Level 3

@_Manoj_Kumar_

I believe that the answer to this question is to create a new dataflow with backfill enabled. When a new dataflow is created, it is automatically created with a backfill of 13 months of historical data, and data can be ingested going forward using a daily schedule. 

This is based on my understanding of dataflows in AEP. 

Respectfully, 
@thewadeng 


Community Advisor

Hello @thewadeng Yes, you are correct. Creating a new dataflow with backfill enabled is the only way to achieve this. Also, when creating a new dataflow there is a toggle that lets you enable backfill, though it is enabled by default.

[Screenshot: the backfill toggle shown when creating a new dataflow]
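For anyone who prefers the API route, the same thing can be done through the Flow Service API when creating the new dataflow. Below is a minimal sketch of the request payload with backfill enabled; the connection IDs, flow spec ID, and timestamp are placeholders, not real values.

```python
# Hedged sketch: a Flow Service dataflow payload with backfill enabled.
# All IDs and the start time below are placeholders for illustration.
import json

def build_dataflow_payload(source_conn_id, target_conn_id, flow_spec_id, start_time):
    """Build a dataflow request body that backfills files created before start_time."""
    return {
        "name": "Cloud storage daily import (restarted)",
        "flowSpec": {"id": flow_spec_id, "version": "1.0"},
        "sourceConnectionIds": [source_conn_id],
        "targetConnectionIds": [target_conn_id],
        "scheduleParams": {
            "startTime": start_time,   # Unix epoch seconds
            "frequency": "day",
            "interval": 1,
            "backfill": True,          # ingest the backlog of older files
        },
    }

payload = build_dataflow_payload("src-1234", "tgt-5678", "flowspec-abcd", 1723400000)
print(json.dumps(payload, indent=2))
```

POSTing a body like this to the Flow Service dataflows endpoint would pick up the old files once and then continue on the daily schedule.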

 


     Manoj
     Find me on LinkedIn


Community Advisor

Hello  @Amit_Shinde1  @JoyceLu1 @HamishMo  @vabs95 @vprasad  @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar  @somen-sarkar @Bhanu8562  @fondofhatsKey 

 

Here is question #2 for the Module 3 topic: Describe the capabilities of Edge ingestion

 

What is the sequence of flow for a personalization request using Adobe Target and Adobe Experience Platform Web SDK to reach the Target Edge via the Edge network?

 

Hint: https://experienceleague.adobe.com/en/docs/experience-platform/web-sdk/personalization/adobe-target/...


     Manoj
     Find me on LinkedIn


Level 3

@_Manoj_Kumar_

Based on my understanding of the documentation and of AEP thus far, the sequence of a personalization request is as follows: 

 

  1. The request is generated from activity based on WebSDK, which is needed to activate data to Target based on the initial information provided from the website. One thing to remember is that the Active On-Edge merge policy must be selected to enable same-page and next-page personalization using Target. 
  2. The information is sent to the Edge Network where it is enriched with further resources such as the visitorID, consent (which can be gathered through the Consents and Preferences object within a schema) and other information.
  3. After leaving the Edge Network, the request makes its way to Target with all of the necessary information.
  4. Upon arriving, Profile scripts execute on the back-end to facilitate the communication between Target and Profile storage. At this time, information from segments generated across other Adobe products is received.
  5. Once all of the data has been collected, Target determines what to do, and sends the information back to the Edge network within the same personalization request.
  6. Once back at the Edge Network, this information is shown to the user; depending on the nature of the action, the content is cached so that it can be served to the user. It's important to remember that the same information collected in Step 2 remains as the information is served through Target.
  7. After being sent the notification from the device via WebSDK, the Edge Network forwards the request to the Analytics for Target instance, where the data collected from the personalization request can be used downstream--namely in Analytics.

This was generated based on my understanding from previous modules and the documentation provided above.

 

Respectfully, 
@thewadeng 


Level 7

@_Manoj_Kumar_ ,

Below are the flow details:

  1. a. The device loads the Web SDK. Page container is pre-hidden.
    b. The Web SDK sends a request to the Edge Network with XDM data, passed-in parameters, and the Customer ID (optional).
  2. The Edge Network sends the request to the edge services to enrich it with the Visitor ID, consent, and other visitor context info, such as geolocation and device-friendly names.
  3.  The Edge Network sends the enriched personalization request to the Target edge with the Visitor ID and passed-in parameters.
  4.  Profile scripts execute and then feed into Target profile storage. Profile storage fetches segments from the Audience Library (for example, segments shared from Adobe Analytics, Adobe Audience Manager, the Adobe Experience Platform).
  5.  Based on URL request parameters and profile data, Target determines which activities and experiences to display for the visitor for the current page view and for future prefetched views. Target then sends this back to the Edge Network.
  6. a. The Edge Network sends the personalization response back to the page. It also sends the Visitor ID and other values in cookies, such as consent, Session ID, identity, cookie check, and personalization.
    b. Pre-hidden containers are loaded with personalized offers, and those offers are shown to the customer.
  7. The Web SDK sends the notification from the device to the Edge Network as content is shown.
  8. The Edge Network forwards the details to the Analytics edge, and that data is used again to build/enrich customer segments.
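The client side of the flow above boils down to a single interact call carrying XDM data plus a personalization query. Here is a hedged sketch of what that request body looks like; the ECID, URL, and decision scope are illustrative placeholders, not real values.

```python
# Hedged sketch: the body of a personalization request to the Edge Network
# (as sent by the Web SDK or the Edge Network Server API). The identity,
# page URL, and decision scope below are placeholders.
import json

def build_interact_body(ecid, scope):
    """Assemble an interact request: one event with XDM data and a personalization query."""
    return {
        "events": [{
            "xdm": {
                "eventType": "web.webpagedetails.pageViews",
                "identityMap": {"ECID": [{"id": ecid, "primary": True}]},
                "web": {"webPageDetails": {"URL": "https://example.com/"}},
            },
            "query": {
                "personalization": {
                    "schemas": [
                        "https://ns.adobe.com/personalization/html-content-item"
                    ],
                    "decisionScopes": [scope],
                },
            },
        }]
    }

body = build_interact_body("12345", "homepage-hero")
print(json.dumps(body, indent=2))
```

The Edge Network enriches this request (step 2 above), forwards it to the Target edge, and returns the personalization payload in the same round trip.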

Thanks,

Somen


Community Advisor

Hello  @Amit_Shinde1  @JoyceLu1 @HamishMo  @vabs95 @vprasad  @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar  @somen-sarkar @Bhanu8562  @fondofhatsKey 

 

Here is question #3 for the Module 3 topic: Describe the advanced techniques around data ingestion architecture

 

How can a data engineer prevent missing values when correcting partial updates in a profile fragment, such as fixing a first name typo while ensuring other fields, like the last name, remain intact?

 

 


     Manoj
     Find me on LinkedIn


Level 3

@_Manoj_Kumar_

There are a few ways to prevent missing values when correcting partial updates in a profile fragment:

  • Use Data Prep functions and features to validate data as it comes through
  • Utilize partial ingestion to send a few updates in and ensure that data is coming through as expected
  • Use regex on values within the schema to ensure that the values provided adhere to a format
  • Audit your events using the Logs feature to determine whether events are being deduplicated

My answer is based on my knowledge of RT-CDP. 

Respectfully, 
@thewadeng 
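Another technique worth knowing for this scenario is upsert-tagged batch ingestion, which merges a record containing only the corrected field into the existing profile fragment instead of replacing it. A hedged sketch of the batch payload and a patch record follows; the dataset ID and record values are placeholders, not real values.

```python
# Hedged sketch: a batch-creation payload tagged for upsert, so a record
# carrying only the corrected firstName patches the profile fragment
# without wiping fields like lastName. The dataset ID is a placeholder.
import json

def build_upsert_batch(dataset_id):
    """Build a batch request that merges records into existing profile rows."""
    return {
        "datasetId": dataset_id,
        "inputFormat": {"format": "json"},
        "tags": {"isUpsert": ["true"]},  # merge instead of overwrite
    }

# The record sent into the batch contains only the field being corrected.
record_patch = {
    "_id": "crm-001",
    "person": {"name": {"firstName": "Jane"}},
}

batch = build_upsert_batch("placeholder-dataset-id")
print(json.dumps({"batch": batch, "record": record_patch}, indent=2))
```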


Community Advisor

Hello  @Amit_Shinde1  @JoyceLu1 @HamishMo  @vabs95 @vprasad  @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar  @somen-sarkar @Bhanu8562  @fondofhatsKey 

 

You can get a 50% discount coupon for the certification by filling out the form linked below.

Form Link: https://forms.office.com/r/fNYJstwxKJ

Note: The form will be open for responses from Aug 11 to Aug 19


     Manoj
     Find me on LinkedIn


Community Advisor

Hello @Amit_Shinde1  @JoyceLu1 @HamishMo  @vabs95 @vprasad  @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar  @somen-sarkar @Bhanu8562  @fondofhatsKey 

 

Welcome to the 4th week of learning Adobe RT-CDP. 

 

This week's topic is 

Segmentation

  • Describe different ways to build audiences & segments within the CDP
  • Explain the various segmentation types and how they operate
  • Demonstrate an understanding of how to apply use case(s) to segment/audience activation

     Manoj
     Find me on LinkedIn


Community Advisor

Hello @Amit_Shinde1  @JoyceLu1 @HamishMo  @vabs95 @vprasad  @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar  @somen-sarkar @Bhanu8562  @fondofhatsKey 

 

Here is question #1 of the week's topic: Describe different ways to build audiences & segments within the CDP

 

What type of segmentation would you use if the criterion for the audience is that someone visits page abc.com/1 or abc.com/2 after logging in to the website? The login action is captured as an event.


     Manoj
     Find me on LinkedIn


Level 3

@_Manoj_Kumar_

Because the question specifies the use of a single event to determine which segmentation to use, and the criteria require building an audience based on a sequence, I believe streaming segmentation is the correct choice here.

I initially thought the answer might be edge segmentation because the two pages are served after a specific point, but that would apply if the segmentation occurred on the Edge Network, where Target delivers personalization server-side without the use of an event. I came to this answer using my own knowledge of segmentation.

 

Respectfully, 
@thewadeng 
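To make the sequence concrete, a segment like this is typically defined as a PQL expression over events. The sketch below is illustrative only; the event field names and the exact sequence syntax should be checked against your own schema.

```python
# Hedged sketch: a segment-definition payload for the Segmentation API.
# The PQL string is illustrative; real field paths depend on your schema.
import json

pql = (
    'chain(xEvent, timestamp, ['
    'A: WHAT(eventType.equals("userLogin", false)), '
    'B: WHAT(web.webPageDetails.URL.equals("abc.com/1", false) '
    'or web.webPageDetails.URL.equals("abc.com/2", false))])'
)

segment_definition = {
    "name": "Logged-in visitors of page 1 or 2",
    "description": "Visited abc.com/1 or abc.com/2 after a login event",
    "schema": {"name": "_xdm.context.profile"},
    "expression": {"type": "PQL", "format": "pql/text", "value": pql},
}
print(json.dumps(segment_definition, indent=2))
```

Because the definition is a sequence of events rather than a single-event lookup, the platform evaluates whether it qualifies for streaming evaluation.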


Community Advisor

Hello @Amit_Shinde1  @JoyceLu1 @HamishMo  @vabs95 @vprasad  @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar  @somen-sarkar @Bhanu8562  @fondofhatsKey 

 

Here is question #2 of the week's topic: 

 

What namespaces can be used to check data overlap when using Segment Match to share data with a partner?


     Manoj
     Find me on LinkedIn


Level 3

@_Manoj_Kumar_

According to the Segment Match documentation, the supported namespaces (composed of both an identity value and an identity namespace) are:

  • Emails (SHA256, lowercase)
  • Phone (SHA256_E.164)
  • ECID
  • Apple's IDFA
  • Google Ad ID

I used the documentation to find this answer. 

Respectfully, 
@thewadeng 
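A practical detail behind the email namespace in that list: Segment Match expects email identities to be normalized (trimmed and lowercased) and then SHA-256 hashed before ingestion. A minimal sketch of that normalization:

```python
# Normalize and hash an email for the Email (SHA256, lowercase) namespace:
# trim whitespace, lowercase, then SHA-256 the UTF-8 bytes.
import hashlib

def hash_email(email: str) -> str:
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Differently cased/padded inputs produce the same hash after normalization.
print(hash_email("  Jane.Doe@Example.com "))
print(hash_email("jane.doe@example.com"))
```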


Community Advisor

Hello @Amit_Shinde1  @JoyceLu1 @HamishMo  @vabs95 @vprasad  @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar  @somen-sarkar @Bhanu8562  @fondofhatsKey 

 

Here is question #3 of the week's topic: Demonstrate an understanding of how to apply use case(s) to segment/audience activation

 

At what frequency is data shared with Meta when an audience is activated to the Facebook destination?


     Manoj
     Find me on LinkedIn


Level 3

@_Manoj_Kumar_

When a data connection is set up with Facebook and audiences are activated to this destination, data is shared at a streaming frequency. The documentation for the Facebook connector describes it as a streaming-based connection via APIs. 

I reached my answer by first recalling the connector itself and then cross-checking against the documentation. 

Respectfully,

@thewadeng 


Community Advisor

Hello @thewadeng The documentation says streaming, but in reality the data is shared every hour.


     Manoj
     Find me on LinkedIn


Community Advisor

Hello @Amit_Shinde1  @JoyceLu1 @HamishMo  @vabs95 @vprasad  @PriyaSi1 @RavikiranPaturi @thewadeng @sramraobhogekar  @somen-sarkar @Bhanu8562  @fondofhatsKey 

 

I hope you are all doing well and on track for the certification.

 

Please feel free to post any questions or doubts you might have on Activation, Governance & Administration.


     Manoj
     Find me on LinkedIn


Level 3

@_Manoj_Kumar_

I have a few questions:

  • Will the exam cover anything related to Federated Audience Composition? I don't believe it is mentioned in the Exam Prep Guide, but it sticks out as something to at least be aware of.
  • What are the best practices around determining when to use a certain type of segmentation over the other?
  • When it comes to scenario-based questions on the exam that cover a specific type of connector, should we focus on learning about the most commonly used connectors? 

These are the ones I have for now, but I will ask more as they come up. 

Respectfully, 

@thewadeng