
Looking for guidance/tips on how to set up DevOps CI/CD pipelines via EDS


Level 1

Hey Everyone,

 

Context: Our team is currently trying to set up a client's project in the best way possible, and we want to create a smooth CI/CD system that takes into account both the development side of contributions and the actual content-authoring side (client side) of the flow.

 

The tooling and process we're using is as follows (see the attached screenshot of our current infrastructure thinking and layout):

- Document/Content Authoring & Asset Management = SharePoint

- Version Control and Development Organization = GitHub

- Deployment and CDN = AEMaaCS (we're using Edge Delivery)

 

For maximum safety and to prevent unwanted mistakes, multiple environments should ideally exist. So far, our team understands the flow as follows:

- Feature branches take in local code changes/pushes

- These are merged into a Staging branch

- Staging is merged into QA or Main, from which the production pipeline runs

- The pipeline is triggered from main

- The pipeline deploys to the production (.com) site directly from .live (published content)

- Content authors can then see the changes on the main site and choose to make further changes
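For what it's worth, the staging-to-main promotion step above is the part we imagine automating; a minimal sketch, assuming GitHub Actions and the `gh` CLI on the runner (the workflow name and PR text are illustrative, only the branch names come from our flow):

```yaml
# .github/workflows/promote.yml (hypothetical)
# Opens a pull request from staging into main whenever staging is updated,
# so the production pipeline only ever runs off reviewed merges to main.
name: Promote staging to main
on:
  push:
    branches: [staging]
jobs:
  open-pr:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: |
          gh pr create --base main --head staging \
            --title "Promote staging to main" \
            --body "Automated promotion PR" || true
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```

The `|| true` just keeps the job green when a promotion PR is already open.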

 

What we want to know is:

  • How do we set up the different environments in Edge Delivery, considering we can only add EDS domains?
    • Do we set up separate domains (staging.abc.com, for example) for each environment?
    • Would having multiple domains or subdomains be a good approach?
    • Is that necessary when working with an EDS process? Is a staging environment even needed?
  • How do we separate/manage content and page differences between the QA and Staging environments? Do we need different SharePoint mount URLs?
  • Can SharePoint's version history fully address rollback scenarios, especially for deleted content in production?
  • What specific measures (e.g., robots.txt, meta tags) can be implemented to ensure non-production environments are excluded from search engine crawling?

Not all questions need to be addressed; we're just providing context for where we are stuck and seeking advice or tips on the best approach.
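For the last bullet, what we have in mind so far is the usual combination of a disallow-all robots.txt plus a per-page noindex tag on non-production environments, along these lines:

```html
<!-- In the <head> of every non-production page -->
<meta name="robots" content="noindex, nofollow">
```

We'd appreciate confirmation on whether EDS needs any of this explicitly or handles it by default.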


3 Replies


Administrator

@aanchal-sikka@gkalyan@BrianKasingli@HeenaMadan@SaumyaJa@martin_ecx_io@ChitraMadan@narendiran_ravi@Asutosh_Jena_

@Rajumuddana

Kindly take a moment to review this question and share your valuable insights. Your expertise would be greatly appreciated!



Kautuk Sahni


Level 4

Hi @Saad-A.

I worked on EDS with document-based authoring for one of our clients. We followed some procedures that can answer a few of your questions.

1. Separate repo for each environment: we created separate GitHub repositories for each environment and added a workflow to push changes automatically to the higher environment's repo.

2. Separate mount URL / SharePoint path: we created separate SharePoint paths for each environment, and each is mapped to the mount URL in the respective GitHub repo.

3. CI/CD pipeline: with Cloud Manager, we created a private repository pointing to the client's GitHub repo and generated a secret code, which needs to be added to the GitHub repo in a .well-known/adobe/cloudmanager-challenge.txt file. This automates the pipeline with the client's GitHub repository.
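To illustrate point 2: in EDS, the mount URL lives in the fstab.yaml at the root of each code repo, so with one repo per environment each repo simply points at its own SharePoint folder. A sketch (tenant, site, and folder names below are made-up examples):

```yaml
# fstab.yaml in the QA environment's repo (paths hypothetical)
mountpoints:
  /: https://yourtenant.sharepoint.com/sites/site-qa/Shared%20Documents/website
```

The staging and production repos would carry the same file with their own SharePoint paths, which is what keeps content cleanly separated per environment.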

 

Thanks


Level 7
  • We don't need any specific settings to exclude non-production content from crawling. By default, EDS will not let search engines index any content on *.aem.live or *.aem.page, because it serves a default robots.txt with the config below:
User-agent: *
Disallow: /
  • Separate branches can be used for different environments
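To complement that: the disallow-all default applies to the *.aem.page/*.aem.live hosts, so once you attach a production domain you would typically need to serve your own robots.txt there (the domain and sitemap path below are placeholders):

```
User-agent: *
Allow: /
Sitemap: https://www.example.com/sitemap.xml
```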