
SOLVED

Are there settings that could be preventing our website from getting Google traffic?


Level 2

We recently migrated our site from .net to AEM. It's a fairly large site with different directories for different states (e.g., www.site.com/fl/).

Our marketing department has not finished our home page design, so our root home page currently redirects to our old .net home page.

After approximately 4-5 months, our AEM site is getting very little search ranking/traffic.

Are there technical settings in AEM or on our server I should be looking at to correct this issue? I have been working on on-page optimization without much result in Google. Thanks in advance!

1 Accepted Solution


Correct answer by
Community Advisor

Hi @KaySmilesAlot,

Your site's SEO performance depends on many factors, such as:

1. Whether your content is rendered server-side.

2. Whether any configuration at the CDN (if you use one), Dispatcher, or AEM code level is blocking robots from crawling your site.

Check these factors first, then apply the corresponding fix:

If your content is not rendered server-side, most search engines will not be able to read it, so there is little point chasing rankings until you have server-side rendering (SSR) in place. A quick way to verify this is sketched below.
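A minimal command-line check, assuming www.site.com/fl/ stands in for one of your state pages and "welcome to florida" is a placeholder for some visible copy that page should contain:

```
# Fetch the raw HTML a crawler sees and count matches of real page copy.
# A count of 0 here, while the text is visible in a browser, means the
# page is rendered client-side and many crawlers see an empty shell.
curl -s https://www.site.com/fl/ | grep -ci "welcome to florida"

# Also check for anything actively telling crawlers to stay away:
curl -s https://www.site.com/robots.txt
curl -sI https://www.site.com/fl/ | grep -i "x-robots-tag"
curl -s https://www.site.com/fl/ | grep -i 'name="robots"'
```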

If you are running a Single Page Application (SPA), there are good articles available online on improving SEO for SPAs; one of them is https://www.searchenginewatch.com/2018/04/09/an-seos-survival-guide-to-single-page-applications-spas...
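One common workaround such articles describe is dynamic rendering: known crawlers are served a prerendered snapshot while regular visitors get the SPA. A rough sketch at the Apache/Dispatcher tier, assuming mod_rewrite and mod_proxy are enabled; the prerender service URL and bot list are placeholders, not anything specific to your setup:

```
# Hypothetical dynamic-rendering rule: proxy known crawlers to a
# prerender service; everyone else gets the normal SPA response.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|duckduckbot) [NC]
RewriteCond %{REQUEST_URI} !\.(js|css|png|jpg|gif|svg|ico)$ [NC]
RewriteRule ^(.*)$ http://prerender.example.com/https://www.site.com$1 [P,L]
```

Treat dynamic rendering as a stopgap; proper SSR remains the more durable fix.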

If your site is blocking robots from crawling, first identify the good bots and the bad bots, then allow or block them accordingly at the CDN, Dispatcher, or AEM code level. Do not simply allow everything: letting all bots through will significantly impact your publish instance and overall site performance. One way to express this is sketched below.
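A minimal sketch at the Apache (Dispatcher) vhost level, assuming Apache 2.4; the bot names are real crawlers but are used here only as examples, so substitute whatever actually shows up in your access logs:

```
# Flag unwanted crawlers by User-Agent, then deny them while leaving
# legitimate crawlers such as Googlebot untouched.
SetEnvIfNoCase User-Agent "AhrefsBot|SemrushBot|MJ12bot" bad_bot

<Directory "/var/www/html">
    <RequireAll>
        Require all granted
        Require not env bad_bot
    </RequireAll>
</Directory>
```

Pair this with a robots.txt at your site root so well-behaved crawlers know what they may index.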

Hope this helps.

Umesh Thakur

