How to Make Your JavaScript Website SEO Friendly
- Last Edited April 19, 2026
- by Garenne Bigby
Google renders JavaScript well in 2026 — but AI crawlers mostly don’t, and even Googlebot’s two-pass indexing has caveats. Here’s how modern JS-heavy sites actually earn rankings.
A decade ago, JavaScript was a search-engine liability. Googlebot read raw HTTP response bodies and saw empty containers where React, Angular, or Vue would eventually inject content. Articles from that era recommended prerender services, hashbang URLs, and the official “AJAX crawling scheme” Google published in 2009. Nearly all of that is deprecated: Google retired the AJAX crawling scheme in 2015, deprecated dynamic rendering as a recommendation in 2023, and now operates an evergreen Chromium-based renderer that executes JavaScript on the same engine a modern user’s browser does. What hasn’t changed: rendering is still expensive, second-pass indexing still introduces delay, and a new audience — AI crawlers like GPTBot, ClaudeBot, and PerplexityBot — often doesn’t render JavaScript at all. This guide covers what 2026 JavaScript SEO actually looks like.
How Google actually renders JavaScript today
Googlebot’s rendering pipeline since 2019 runs on an evergreen Chromium engine — continuously updated to match current stable Chrome. Functionally, that means anything a recent Chrome browser can execute (ES2022+, fetch, IntersectionObserver, custom elements, most modern APIs), Googlebot can too.
The catch: rendering happens in two passes.
- First pass — Google fetches the raw HTML and parses it. Anything present in the server response (links, metadata, content) is indexed immediately.
- Second pass — the page enters a render queue. When Googlebot gets to it (minutes to days later, depending on crawl budget), Chromium executes the JavaScript and extracts the fully-rendered DOM.
For a site whose entire content depends on JavaScript (classic client-side SPA), nothing appears in the index until the second pass completes. During that delay, links don’t get crawled, content doesn’t get ranked, and freshness signals don’t update. For a news site, that delay is catastrophic. For a marketing site publishing occasional content, it’s usually survivable.
The meaningful 2026 problem isn’t “Google can’t render JS” — it’s that AI crawlers mostly don’t. ChatGPT’s retrieval crawler, Anthropic’s Claude crawler, Perplexity’s crawler, CCBot, and most of the LLM-training bot fleet fetch raw HTML and don’t run a full browser. Pages whose content only exists after client-side JavaScript runs are invisible to those crawlers and, by extension, don’t show up in AI Overview citations or LLM answers. Since pages cited in AI Overviews earn roughly 35% more organic clicks, this has become a real business problem, not an edge case.
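A quick way to check this for any page: fetch the raw HTML the way a non-rendering crawler does and look for content that should be there. A minimal Node/TypeScript sketch (the URL and marker phrase are placeholders; swap in your own):

```ts
// check-raw-html.ts: run with `npx tsx check-raw-html.ts`.
// Fetches a page without executing any JavaScript, which approximates what
// GPTBot-style crawlers receive. The URL and marker are hypothetical.

const url = "https://example.com/pricing";
const marker = "Compare plans"; // text that should exist in the raw HTML

const res = await fetch(url, {
  headers: { "User-Agent": "raw-html-check/1.0" }, // generic test UA
});
const html = await res.text();

if (html.includes(marker)) {
  console.log("Marker found in raw HTML: non-rendering crawlers can see it.");
} else {
  console.log("Marker missing: this content exists only after client-side JS runs.");
}
```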
The four rendering strategies (and when each wins)
Modern JavaScript SEO is really a choice between four rendering models. Most frameworks support all of them; the hybrid approach — mixing them per route — has become the 2026 default.
1. Client-Side Rendering (CSR)
The browser receives a near-empty HTML shell and downloads JavaScript that fetches data and builds the page. Classic React/Angular/Vue SPA. Fast subsequent navigation, but the first page load is slow, Googlebot relies on second-pass rendering, and AI crawlers see nothing. Fine for authenticated dashboards and internal tools; a bad default for public SEO-critical pages.
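For illustration, here is a minimal, hypothetical CSR component. Everything meaningful appears only after the client-side fetch resolves, so a crawler that reads raw HTML sees nothing but the loading fallback:

```tsx
// csr-product.tsx: classic client-side rendering (illustrative sketch;
// the endpoint and fields are hypothetical).
import { useEffect, useState } from "react";

export function Product({ id }: { id: string }) {
  const [name, setName] = useState<string | null>(null);

  useEffect(() => {
    // Data arrives only after JS downloads, parses, and runs in the browser.
    fetch(`/api/products/${id}`)
      .then((res) => res.json())
      .then((data) => setName(data.name));
  }, [id]);

  // A non-rendering crawler's view of this page is just this fallback.
  if (!name) return <p>Loading…</p>;
  return <h1>{name}</h1>;
}
```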
2. Server-Side Rendering (SSR)
The server runs your JavaScript on every request and returns fully-rendered HTML. Googlebot and AI crawlers see complete content immediately; the client hydrates the page with JavaScript for interactivity. The best default for dynamic, personalized, or frequently-changing content. Higher server cost and complexity; care needed around hydration mismatches. Implementations: Next.js (App Router or Pages Router with getServerSideProps), Nuxt (SSR mode), Remix / React Router v7, SvelteKit, Astro in SSR mode.
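A sketch of the SSR pattern in the Next.js App Router, assuming a recent release where route params arrive as a Promise; the API endpoint and response shape are hypothetical:

```tsx
// app/products/[id]/page.tsx: SSR in the Next.js App Router (a sketch).
export default async function ProductPage({
  params,
}: {
  params: Promise<{ id: string }>;
}) {
  const { id } = await params;

  // cache: "no-store" opts this fetch out of caching, so the route renders
  // on every request and the returned HTML already contains the content.
  const res = await fetch(`https://api.example.com/products/${id}`, {
    cache: "no-store",
  });
  const product: { name: string; description: string } = await res.json();

  return (
    <main>
      <h1>{product.name}</h1>
      <p>{product.description}</p>
    </main>
  );
}
```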
3. Static Site Generation (SSG)
Every page is pre-rendered to static HTML at build time and served from a CDN. Fastest possible Core Web Vitals, zero runtime server cost, universally crawlable. Ideal for content that doesn’t change per-request: blogs, marketing sites, documentation, e-commerce catalog pages. Rebuild required when content changes. Implementations: Next.js SSG (getStaticProps, or App Router static export), Astro (static mode, arguably the strongest SSG story in 2026), Nuxt generate, Hugo / 11ty for pure-static use cases.
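A minimal SSG sketch in the same App Router style, with a hypothetical CMS endpoint: generateStaticParams enumerates every post at build time, and each page is rendered to HTML once and served statically afterward.

```tsx
// app/blog/[slug]/page.tsx: SSG in the Next.js App Router (a sketch; the
// CMS endpoint and fields are hypothetical).

type Post = { slug: string; title: string; body: string };

// Runs at build time and enumerates every page to prerender.
export async function generateStaticParams() {
  const posts: Post[] = await fetch("https://cms.example.com/posts").then(
    (r) => r.json()
  );
  return posts.map((post) => ({ slug: post.slug }));
}

export default async function BlogPost({
  params,
}: {
  params: Promise<{ slug: string }>;
}) {
  const { slug } = await params;
  const post: Post = await fetch(
    `https://cms.example.com/posts/${slug}`
  ).then((r) => r.json());

  return (
    <article>
      <h1>{post.title}</h1>
      <p>{post.body}</p>
    </article>
  );
}
```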
4. Incremental Static Regeneration (ISR)
Pages are pre-rendered like SSG but re-generated on a timer or on-demand trigger. Combines SSG’s speed with SSR’s freshness. Next.js pioneered the pattern and remains the strongest implementation; Astro supports it via on-demand rendering; Nuxt supports it via Nitro’s stale-while-revalidate. Best for e-commerce with many pages that change occasionally (product listings, category pages) or large content sites where full rebuilds become impractical.
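An ISR sketch, again with a hypothetical endpoint: the route is built statically, then re-generated in the background at most once per hour after the revalidation window expires.

```tsx
// app/category/[slug]/page.tsx: ISR in the Next.js App Router (a sketch;
// the endpoint and fields are hypothetical).

// Serve the cached static page; re-generate it in the background when a
// request arrives after the window has expired.
export const revalidate = 3600; // seconds

export default async function CategoryPage({
  params,
}: {
  params: Promise<{ slug: string }>;
}) {
  const { slug } = await params;
  const products: { id: string; name: string }[] = await fetch(
    `https://api.example.com/categories/${slug}/products`
  ).then((r) => r.json());

  return (
    <ul>
      {products.map((p) => (
        <li key={p.id}>{p.name}</li>
      ))}
    </ul>
  );
}
```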
The hybrid default
Most production sites in 2026 mix strategies by route type:
- SSG for marketing pages, blog posts, documentation — whatever is stable and public.
- ISR for product listings and content that updates regularly but not per-request.
- SSR for personalized dashboards, checkout flows, and pages with per-user content.
- CSR for authenticated app interiors where SEO doesn’t matter.
Next.js App Router, Nuxt 3, Astro’s islands architecture, and SvelteKit all support this split natively — you declare the rendering mode per route.
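In the Next.js App Router, for instance, the mode is a one-line export per route segment. A sketch with illustrative paths and endpoints:

```tsx
// app/products/page.tsx: per-route rendering declarations (a sketch).
// Each route segment picks its own mode:
//
//   export const dynamic = "force-static";  // SSG: prerender at build time
//   export const revalidate = 600;          // ISR: re-generate periodically
//   export const dynamic = "force-dynamic"; // SSR: render on every request
//   "use client" + useEffect fetching       // CSR: client-only interiors
//
// This particular route opts into ISR:
export const revalidate = 600; // seconds

export default async function ProductsPage() {
  const products: { id: string; name: string }[] = await fetch(
    "https://api.example.com/products" // hypothetical endpoint
  ).then((r) => r.json());

  return (
    <ul>
      {products.map((p) => (
        <li key={p.id}>{p.name}</li>
      ))}
    </ul>
  );
}
```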
Framework-native solutions replace the prerender-service era
In 2016, the common answer to “my Angular app isn’t getting indexed” was to bolt on a prerender service (Prerender.io, BromBone, SEO.js, etc.) that would intercept bot requests and serve pre-rendered snapshots. That approach — dynamic rendering — was Google’s official recommendation from 2018 until 2023, when Google deprecated it and recommended migrating to SSR or SSG instead.
Framework-native rendering is now the default answer for almost every case:
- React → Next.js (App Router with React Server Components is the modern default) or Remix / React Router v7.
- Vue → Nuxt (Nuxt 3 with Nitro server).
- Angular → Angular SSR (built-in since Angular 17, replacing the older Angular Universal package).
- Svelte → SvelteKit.
- Vanilla or multi-framework → Astro (islands architecture; ship almost no JavaScript by default).
- Solid → SolidStart.
Choosing a framework that supports SSR or SSG natively is dramatically simpler than adding a prerender proxy layer — and the resulting architecture is faster, cheaper, and more robust.
When prerender services still make sense
There’s one narrow remaining case: you have an existing, production, pure-CSR SPA (often Angular 1.x, early React, or legacy Vue 2) where migrating to SSR/SSG would be a multi-quarter engineering project. In that case, a prerender proxy can be an acceptable short-term bridge.
- Prerender.io — the surviving commercial service in this category. In 2026 it has repositioned around AI-crawler support: it caches rendered HTML and serves it to GPTBot, ClaudeBot, PerplexityBot, CCBot, and other AI crawlers that don’t execute JavaScript. For legacy SPAs that also want visibility in AI Overviews and LLM answers, this is the primary remaining commercial option. Free tier available; paid plans scale by cached URL count.
- Rendertron — Google’s open-source Puppeteer-based prerender server. Originally a reference implementation for dynamic rendering. It still works, but Google no longer actively maintains it or recommends it. Self-hosting is viable if you want zero vendor lock-in.
- Self-hosted Puppeteer or Playwright — for teams comfortable running their own headless-browser rendering service. More control, more operational burden.
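For the self-hosted route, a minimal Puppeteer-based prerender server might look like the following sketch. It assumes puppeteer is installed and that some upstream layer (a CDN rule matching crawler user-agents, say) forwards bot requests here with a ?url= parameter; the port and timeout values are illustrative.

```ts
// prerender-server.ts: a minimal self-hosted prerender proxy (sketch).
// Run with `npx tsx prerender-server.ts`; assumes `puppeteer` is installed.
import http from "node:http";
import puppeteer from "puppeteer";

const browser = await puppeteer.launch();

http
  .createServer(async (req, res) => {
    const target = new URL(req.url ?? "/", "http://localhost")
      .searchParams.get("url");
    if (!target) {
      res.writeHead(400).end("missing ?url= parameter");
      return;
    }

    const page = await browser.newPage();
    try {
      // Wait for the network to go idle so client-side content has rendered.
      await page.goto(target, { waitUntil: "networkidle0", timeout: 15_000 });
      const html = await page.content(); // the fully-rendered DOM as HTML
      res.writeHead(200, { "Content-Type": "text/html" }).end(html);
    } catch {
      res.writeHead(502).end("render failed");
    } finally {
      await page.close();
    }
  })
  .listen(3000);
```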
The services listed in the 2016 version of this article — BromBone, AngularJS SEO, SEO.js, Backbone JS SEO — are either dead or effectively dead. Their entire premise (Google’s AJAX crawling scheme, hashbang URLs) stopped being relevant in 2015 when Google retired the scheme. Don’t try to revive them.
Core Web Vitals are a JavaScript problem
Hydration, third-party scripts, and client-side routing are the dominant causes of Core Web Vitals failures on JavaScript sites. The three metrics in 2026:
- LCP (Largest Contentful Paint) — target under 2.5 seconds. SSR and SSG both help because the largest element renders from HTML, not from a JavaScript-driven fetch.
- INP (Interaction to Next Paint) — replaced FID in March 2024. Target under 200ms. This is the metric that hydration churn and large client-side bundles most hurt. Defer, split, and lazy-load scripts. Avoid long tasks on the main thread.
- CLS (Cumulative Layout Shift) — target under 0.1. Set explicit dimensions on images, use CSS aspect-ratio, and avoid injecting content that shifts existing elements.
Modern frameworks address this with streaming SSR (sending HTML progressively as data resolves), selective hydration (React 18+, making only interactive components hydrate), islands architecture (Astro’s default — ship HTML with tiny interactive islands instead of a full JS bundle), and partial prerendering (Next.js — static shell with streamed dynamic holes).
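To make streaming concrete, here is a sketch of a Next.js page that flushes its static shell immediately and streams a slow section in behind a Suspense boundary; the Reviews component and its data source are hypothetical.

```tsx
// app/product/[id]/page.tsx: streaming SSR via Suspense (a sketch).
import { Suspense } from "react";

// A slow server component: its data fetch doesn't block the rest of the page.
async function Reviews({ productId }: { productId: string }) {
  const reviews: { id: string; text: string }[] = await fetch(
    `https://api.example.com/reviews?product=${productId}` // hypothetical
  ).then((r) => r.json());

  return (
    <ul>
      {reviews.map((review) => (
        <li key={review.id}>{review.text}</li>
      ))}
    </ul>
  );
}

export default async function ProductPage({
  params,
}: {
  params: Promise<{ id: string }>;
}) {
  const { id } = await params;

  return (
    <main>
      <h1>Product {id}</h1>
      {/* HTML above this boundary is flushed immediately; the fallback is
          swapped out in-place when Reviews finishes rendering on the server. */}
      <Suspense fallback={<p>Loading reviews…</p>}>
        <Reviews productId={id} />
      </Suspense>
    </main>
  );
}
```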
Practical implementation checklist
- Render critical content in the initial HTML. Titles, headings, primary body copy, internal links, structured data, meta tags. If it matters for SEO or AI citation, it must exist before any JavaScript runs.
- Use real <a href> links for navigation. JavaScript onClick handlers alone don’t work. Frameworks like Next.js, Nuxt, and SvelteKit do this automatically via their <Link> components.
- Avoid hashbang and client-side-only routing. Use real URL paths with server-rendered content. history.pushState() with SSR-supported routes is the current standard.
- Render structured data server-side. JSON-LD injected via JavaScript is indexed eventually but less reliably than server-rendered schema. Generate JSON-LD during SSR/SSG and embed it in the initial HTML (see the sketch after this list).
- Test with Search Console URL Inspection. The “Live Test” button shows exactly what Googlebot sees after rendering — the authoritative answer to “is Googlebot seeing my content?”
- Test AI crawler visibility separately. Disable JavaScript in Chrome DevTools and reload. What renders is roughly what most AI crawlers will see. If your content is blank without JS, AI crawlers are missing it.
- Monitor Core Web Vitals in the field. Search Console’s Core Web Vitals report uses real-user Chrome UX Report data — not lab scores. Track LCP, INP, and CLS at the 75th percentile.
- Avoid client-side redirects. Use server-side 301s instead. JavaScript redirects are followed by Googlebot but with a delay and reduced link equity transfer.
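The structured-data item above deserves a concrete sketch. Here, hypothetical post fields are turned into JSON-LD during server rendering, so the script tag ships in the initial HTML rather than being injected client-side:

```tsx
// app/blog/[slug]/page.tsx: JSON-LD rendered server-side (a sketch; the
// post object and its fields are hypothetical).

type Post = { slug: string; title: string; author: string; published: string };

export default async function BlogPost({
  params,
}: {
  params: Promise<{ slug: string }>;
}) {
  const { slug } = await params;
  const post: Post = await fetch(
    `https://cms.example.com/posts/${slug}`
  ).then((r) => r.json());

  const jsonLd = {
    "@context": "https://schema.org",
    "@type": "Article",
    headline: post.title,
    author: { "@type": "Person", name: post.author },
    datePublished: post.published,
  };

  return (
    <article>
      {/* Part of the server-rendered HTML: no client-side injection needed. */}
      <script
        type="application/ld+json"
        dangerouslySetInnerHTML={{ __html: JSON.stringify(jsonLd) }}
      />
      <h1>{post.title}</h1>
    </article>
  );
}
```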
Testing tools
- Google Search Console URL Inspection — the authoritative “what Googlebot sees” test. Includes both the rendered HTML and a screenshot of what Chromium rendered.
- Rich Results Test — validates structured data rendered after JavaScript execution.
- Chrome DevTools Lighthouse — measures page-level performance, SEO, and accessibility. Useful for debugging; use field data for ranking decisions.
- Disable JavaScript in Chrome DevTools — approximates the AI-crawler view of your page.
- curl or view-source: — shows the raw HTML response, identical to what non-JS crawlers see.
Frequently asked questions
Does Google still use the AJAX crawling scheme?
No. Google deprecated the AJAX crawling scheme (hashbang URLs, _escaped_fragment_) in October 2015 and stopped crawling _escaped_fragment_ URLs entirely in 2018. Any article recommending this scheme is obsolete. Use server-side rendering, static generation, or a hybrid approach instead.
Is dynamic rendering still recommended?
Google formally deprecated dynamic rendering as a best practice in 2023, calling it a “workaround” rather than a recommended approach. It still works for legacy sites but should not be the target architecture for new builds. Migrate to SSR or SSG where possible.
How long does Google’s second-pass rendering take?
Historically hours to days; in 2026, usually minutes for high-authority sites and hours for lower-crawl-priority pages. The delay is unpredictable and unbounded — which is why critical content should render server-side rather than relying on second-pass rendering.
Do AI crawlers like GPTBot execute JavaScript?
Mostly no. GPTBot, ClaudeBot, PerplexityBot, CCBot, and similar AI crawlers typically fetch raw HTML without running a full browser. Pages that rely entirely on client-side JavaScript for content are invisible to them. Given that AI Overview citations increasingly drive traffic, this is now a first-order SEO concern — not an edge case.
Should I use Next.js, Nuxt, or Astro?
Framework choice follows ecosystem preference more than SEO requirements in 2026 — all three support SSR and SSG well. Next.js is the React default, has the deepest Vercel integration, and pioneered App Router + React Server Components. Nuxt 3 is the Vue equivalent and is comparably mature. Astro is framework-agnostic (supports React, Vue, Svelte, Solid inside Astro components) and ships almost no JavaScript by default — arguably the best SEO-first choice for content sites.
What about Angular — is AngularJS (1.x) still viable?
AngularJS reached end-of-life in December 2021 and receives no further updates. Any site still on AngularJS needs urgent migration — to modern Angular (2+), React/Next.js, Vue/Nuxt, or Svelte. The SEO story for AngularJS was always painful; the fact that it’s now unsupported makes continuing to invest in it untenable.
Bottom line
JavaScript SEO in 2026 is not about prerender services and crawling schemes — it’s about choosing the right rendering mode per route. SSG for stable content, SSR for personalized content, ISR for content that changes periodically, CSR only for authenticated interiors. Frameworks do the heavy lifting; Next.js, Nuxt, Astro, and SvelteKit each make the choice declarative. The new wrinkle is AI crawlers, which don’t render JavaScript and can’t find content that only exists after hydration — so critical content must ship in the initial HTML, not just be eventually-renderable. Test with Search Console URL Inspection, verify with JavaScript disabled in DevTools, and measure Core Web Vitals from field data. Get those right and JavaScript becomes an asset to SEO, not a liability.