Hi everyone,
I’ve noticed that our AEM site has a very low AI visibility score (around 6%). Could this be happening because most of the content is rendered client-side?
If so, what are the recommended approaches in AEM to overcome this issue and make page content more accessible to AI crawlers and indexing tools (e.g., using server-side rendering or prerendering)?
Thanks,
L
Hello @Learning4me ,
Yes, the low AI visibility score (around 6%) is most likely because your site relies heavily on client-side rendering, meaning crawlers and AI tools don’t see the actual HTML content at load time.
Enable Server-Side Rendering (SSR):
For AEM SPAs, use the AEM SPA Editor SDK with a Node.js renderer so the HTML is generated on the server before it reaches the browser.
If you’re using Next.js with AEM as a headless CMS, use getServerSideProps or getStaticProps to prerender content.
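For example, with Next.js in front of AEM headless, the page can be rendered on the server with getServerSideProps. This is only a minimal sketch: the persisted-query URL, project name, and field names below are placeholders for your own setup.

```tsx
// pages/articles/[slug].tsx: a minimal sketch. It assumes AEM is used headlessly and
// exposes a GraphQL persisted query at the URL below; host, project and field names
// are placeholders to adapt to your environment.
import type { GetServerSideProps } from 'next';

type ArticleProps = {
  title: string;
  body: string;
};

export default function Article({ title, body }: ArticleProps) {
  // This markup is produced on the server, so crawlers see the real content in the HTML.
  return (
    <main>
      <h1>{title}</h1>
      <div dangerouslySetInnerHTML={{ __html: body }} />
    </main>
  );
}

export const getServerSideProps: GetServerSideProps<ArticleProps> = async ({ params }) => {
  // Hypothetical AEM persisted-query endpoint.
  const res = await fetch(
    `https://publish.example.com/graphql/execute.json/my-project/article-by-slug;slug=${params?.slug}`
  );
  const json = await res.json();
  const article = json?.data?.articleList?.items?.[0];

  if (!article) {
    return { notFound: true };
  }

  return {
    props: { title: article.title, body: article.body?.html ?? '' },
  };
};
```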
Use Prerendering (if SSR isn’t feasible):
Tools like Rendertron or Puppeteer can generate static HTML snapshots for bots.
These can be served to crawlers through user-agent detection or an edge rule.
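A minimal prerendering sketch with Puppeteer could look like the following. The URL, output path, and how you route bots to the snapshots are assumptions to adapt to your setup.

```ts
// prerender.ts: a minimal snapshot generator using Puppeteer. URL and output path are
// placeholders; in practice you would run this per route (e.g. on publish) and serve
// the resulting files to bots via a user-agent rule at the CDN or Dispatcher.
import puppeteer from 'puppeteer';
import { writeFile } from 'node:fs/promises';

async function snapshot(url: string, outFile: string): Promise<void> {
  const browser = await puppeteer.launch();
  try {
    const page = await browser.newPage();
    // Wait until the network is idle so the SPA has finished rendering its content.
    await page.goto(url, { waitUntil: 'networkidle0' });
    const html = await page.content(); // the fully rendered DOM as HTML
    await writeFile(outFile, html, 'utf8');
  } finally {
    await browser.close();
  }
}

snapshot('https://www.example.com/content/mysite/en/home.html', './snapshots/home.html')
  .catch((err) => {
    console.error(err);
    process.exit(1);
  });
```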
Make sure the HTML itself has meaningful content:
Include titles, headings, meta tags, and JSON-LD server-side — not just in client-side scripts.
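In a Next.js page, for instance, this metadata can be emitted server-side via next/head. The schema.org fields below are illustrative, not a complete SEO setup.

```tsx
// ArticleSeo.tsx: a minimal sketch of emitting the title, meta description and JSON-LD
// in the server-rendered HTML (Next.js pages router). The fields shown are illustrative.
import Head from 'next/head';

type SeoProps = {
  title: string;
  description: string;
  url: string;
};

export function ArticleSeo({ title, description, url }: SeoProps) {
  const jsonLd = {
    '@context': 'https://schema.org',
    '@type': 'Article',
    headline: title,
    description,
    url,
  };

  return (
    <Head>
      <title>{title}</title>
      <meta name="description" content={description} />
      {/* Serialized on the server, so crawlers get structured data without running JS. */}
      <script
        type="application/ld+json"
        dangerouslySetInnerHTML={{ __html: JSON.stringify(jsonLd) }}
      />
    </Head>
  );
}
```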
Cache smartly:
Store prerendered or SSR HTML at the Dispatcher/CDN level and clear it automatically when content is published.
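If you go the Next.js SSR route, one common pattern is to set Cache-Control headers from getServerSideProps so the Dispatcher/CDN can cache the rendered HTML. The TTL values below are assumptions, and the flush-on-publish wiring (Dispatcher flush agents or a CDN purge) still has to be configured separately.

```ts
// A sketch of setting cache headers from getServerSideProps so the Dispatcher/CDN can
// cache the SSR output. The TTL values are assumptions; invalidation on publish still
// needs to be wired up separately.
import type { GetServerSideProps } from 'next';

export const getServerSideProps: GetServerSideProps = async ({ res }) => {
  // Allow the edge to cache the rendered HTML for 5 minutes and serve it stale for
  // another minute while it revalidates in the background.
  res.setHeader('Cache-Control', 'public, s-maxage=300, stale-while-revalidate=60');

  return { props: {} };
};
```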
Docs for reference: