How to Do SEO for Single Page Applications (SPA)

TL;DR

Single page applications load once and update dynamically without page refreshes. This creates SEO problems because Googlebot may not execute JavaScript correctly, URLs may not exist until JavaScript runs, and traditional crawling patterns break. The fix isn’t avoiding SPAs. It’s implementing proper rendering strategies: server-side rendering for critical pages, static generation where possible, and dynamic rendering as a fallback. The URL must exist and return content before JavaScript executes, or Google may never see your pages properly.


The Conversation

Derek, CTO of a SaaS startup, rebuilt the company’s marketing site as a React SPA for development efficiency, then watched organic traffic drop 80% after launch, with pages not appearing in search results: We rebuilt our marketing site in React about three months ago. Our developers loved it. Our SEO traffic dropped 80%. When I search for our pages, some don’t appear at all. Others show weird truncated descriptions. What went wrong?

SDC SEO Brain: You’ve hit the classic SPA SEO problem. When Googlebot visits a traditional website, it receives HTML with content. When it visits your React SPA, it receives an empty HTML shell with JavaScript that says “load the actual content.” Whether Google sees your content depends on whether Googlebot executes that JavaScript correctly.

Derek: I thought Google could crawl JavaScript now. That was the whole point of their updates a few years ago.

SDC SEO Brain: Google can execute JavaScript, but it’s not reliable for SEO purposes. There’s a two-phase indexing process. First wave: Google crawls the raw HTML response. Second wave: Google renders JavaScript, sometimes. The rendering queue has limited resources and significant delays. Some pages get rendered quickly. Some take days or weeks. Some never get fully rendered. You can’t rely on it.

Derek: How do I know if Google is rendering our JavaScript?

SDC SEO Brain: Use URL Inspection in Google Search Console (GSC). Enter one of your pages and click “Test Live URL.” Then compare the “HTML” view with the “Screenshot” and “Rendered HTML.” If the rendered HTML shows your actual content, Googlebot executed the JavaScript. If it shows an empty shell or a loading spinner, rendering failed.

Derek: I just tested our homepage. The screenshot shows a loading spinner. That’s bad, right?

SDC SEO Brain: Very bad. That means Google is indexing your homepage as essentially an empty page with a loading spinner. No content, no keywords, nothing to rank for. Your page exists in Google’s index but with none of the actual information that should help it rank.
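
Alongside URL Inspection, you can check what a crawler receives before any JavaScript runs with a small script. A minimal sketch in TypeScript (Node 18+, run as an ES module); the URL and the marker phrase are placeholders for your own page and a sentence that should appear in its copy:

```ts
// Minimal sketch: check what a crawler receives before any JavaScript runs.
// Requires Node 18+ (built-in fetch), run as an ES module.
// The URL and marker phrase below are placeholders.
const url = "https://www.example.com/";
const marker = "pricing that scales"; // a phrase that should appear in the page copy

const res = await fetch(url, {
  headers: { "User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)" },
});
const html = await res.text();

console.log(`Status: ${res.status}`);
console.log(
  html.toLowerCase().includes(marker)
    ? "Marker found in raw HTML: content exists before JavaScript executes."
    : "Marker missing: the initial response is an empty shell."
);
```

If the marker is missing from the raw response, you are relying entirely on Google’s rendering queue, which is exactly the failure mode described above.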

Derek: Why wouldn’t Google render our JavaScript? It works fine in browsers.

SDC SEO Brain: Several possible issues. Time limits: Google’s renderer has execution time limits. If your JavaScript takes too long to load and execute, it times out. Blocked resources: if your JavaScript files or API endpoints are blocked by robots.txt or return errors, rendering fails. Third-party dependencies: if you’re loading content from external APIs that are slow or blocked, the content never appears. Client-side routing: if your URLs only exist after JavaScript runs, Google may not even know those URLs exist.

Derek: We use React Router for client-side navigation. Is that the routing problem?

SDC SEO Brain: Potentially. With client-side routing, when a user types a URL directly into the browser, the server needs to return your React app shell for every route. If your server isn’t configured correctly, direct URL access might return a 404 before React Router even loads. Check what happens when you access an interior page URL directly in an incognito browser with JavaScript disabled.
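
For reference, the server-side fix is a catch-all fallback that returns the app shell for every route. A minimal sketch using Express; the build directory and port are assumptions for illustration, not Derek’s actual setup:

```ts
// SPA fallback sketch (Express, compiled as CommonJS so __dirname exists).
// Without the fallback, a direct request to /pricing would 404
// before React Router ever loads in the browser.
import express from "express";
import path from "path";

const app = express();

// Serve built static assets (JS bundles, CSS, images) normally.
app.use(express.static(path.join(__dirname, "build"), { index: false }));

// Fallback: every other route gets the app shell, so client-side
// routing can take over once JavaScript loads.
app.use((_req, res) => {
  res.sendFile(path.join(__dirname, "build", "index.html"));
});

app.listen(3000);
```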

Derek: I just tried. It shows a blank page. That’s the problem?

SDC SEO Brain: That’s a critical problem. When Googlebot requests that URL, it gets a blank page. Even if Google eventually renders the JavaScript, the initial response contains nothing. Server-side rendering would return the actual content in the initial HTML response, so even if JavaScript fails, the content is still there.

Derek: What’s server-side rendering exactly?

SDC SEO Brain: SSR executes your React code on the server and sends fully-rendered HTML to the browser. The page arrives with content already present. Then React “hydrates,” taking over for interactive functionality. From an SEO perspective, the server returns real content that Googlebot can index immediately, without waiting for JavaScript execution. Next.js and similar frameworks make SSR much easier for React apps.
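
To make that concrete, here is a minimal SSR sketch using the Next.js pages router; the API endpoint and the Feature shape are invented for illustration:

```tsx
import type { GetServerSideProps } from "next";

type Feature = { id: string; name: string; description: string };

// Runs on the server for every request, so the HTML response already
// contains the feature list when it reaches the browser (or Googlebot).
export const getServerSideProps: GetServerSideProps<{ features: Feature[] }> = async () => {
  const res = await fetch("https://api.example.com/features"); // placeholder endpoint
  const features: Feature[] = await res.json();
  return { props: { features } };
};

export default function FeaturesPage({ features }: { features: Feature[] }) {
  // This markup is present in the initial HTML; no client-side JS needed to see it.
  return (
    <ul>
      {features.map((f) => (
        <li key={f.id}>
          <h2>{f.name}</h2>
          <p>{f.description}</p>
        </li>
      ))}
    </ul>
  );
}
```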

Derek: We’re using Create React App. It’s purely client-side.

SDC SEO Brain: Create React App is designed for client-side applications where SEO doesn’t matter, like internal tools or apps behind login. For marketing sites or public-facing content that needs to rank, you have three options: migrate to Next.js or a similar SSR framework, implement static site generation (SSG) for pages that don’t change frequently, or use dynamic rendering as a temporary workaround.

Derek: What’s static site generation?

SDC SEO Brain: SSG pre-builds pages at build time, generating static HTML files for each URL. When Googlebot (or any visitor) requests a page, the server returns pre-built HTML, no JavaScript execution needed. This is ideal for content that doesn’t change with every request: blog posts, product pages, documentation. Tools like Next.js, Gatsby, or Astro support this approach.
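
A minimal SSG sketch, again with the Next.js pages router; the post data is inlined here where a real build would read from a CMS or the filesystem:

```tsx
import type { GetStaticProps } from "next";

type Post = { slug: string; title: string; excerpt: string };

// Runs once at build time; the output is a static HTML file served
// to every visitor, with no JavaScript execution required.
export const getStaticProps: GetStaticProps<{ posts: Post[] }> = async () => {
  // A real build would pull this from a CMS or the filesystem; inlined for the sketch.
  const posts: Post[] = [
    { slug: "spa-seo", title: "SEO for Single Page Apps", excerpt: "Why rendering strategy matters." },
  ];
  return { props: { posts } };
};

export default function BlogIndex({ posts }: { posts: Post[] }) {
  return (
    <ul>
      {posts.map((p) => (
        <li key={p.slug}>
          <a href={`/blog/${p.slug}`}>{p.title}</a>
          <p>{p.excerpt}</p>
        </li>
      ))}
    </ul>
  );
}
```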

Derek: Our marketing pages don’t change often. They’d be good candidates for SSG?

SDC SEO Brain: Perfect candidates. Homepage, about, features, pricing, blog posts: these can all be statically generated. You rebuild when content changes, but visitors and Googlebot always get pre-rendered HTML. You get the developer experience of React with the SEO performance of static HTML.

Derek: What about pages that need dynamic data? Our pricing has a calculator that changes based on user input.

SDC SEO Brain: The initial page can be statically generated with default content that’s SEO-relevant: pricing tiers, base prices, feature descriptions. The calculator functionality hydrates on the client side for interactivity. Users with JavaScript get the full interactive experience. Googlebot gets the static content, which is what you want indexed anyway. The interactive parts don’t need to be in Google’s index.
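
One way to structure that split in the Next.js pages router: render the SEO-relevant tiers statically and load the calculator client-side only. The component path and prices here are hypothetical:

```tsx
import dynamic from "next/dynamic";

// Skipped during static generation; hydrates in the browser only.
// The component path is hypothetical.
const PricingCalculator = dynamic(() => import("../components/PricingCalculator"), {
  ssr: false,
});

export default function PricingPage() {
  return (
    <main>
      {/* Statically generated, indexable content */}
      <h1>Pricing</h1>
      <ul>
        <li>Starter: $19/mo</li>
        <li>Team: $49/mo</li>
        <li>Enterprise: custom pricing</li>
      </ul>
      {/* Interactive, client-only; doesn't need to be in Google's index */}
      <PricingCalculator />
    </main>
  );
}
```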

Derek: What’s dynamic rendering you mentioned earlier?

SDC SEO Brain: Dynamic rendering serves different content to Googlebot versus regular users. When your server detects Googlebot (via user-agent), it serves pre-rendered HTML. For regular users, it serves your normal client-side app. Google explicitly allows this for JavaScript-heavy sites, unlike cloaking, which shows different content to deceive. It’s a legitimate workaround.

Derek: That sounds like more work than just fixing our rendering properly.

SDC SEO Brain: It is. Dynamic rendering is a temporary fix, not a long-term solution. Google recommends moving toward SSR or SSG. Dynamic rendering adds complexity: you need a pre-rendering service, user-agent detection, and separate rendering paths. Migration to Next.js is usually less ongoing work than maintaining dynamic rendering infrastructure.

Derek: If we migrate to Next.js, how hard is that?

SDC SEO Brain: Depends on your codebase complexity. Next.js uses React, so your components can largely transfer. The changes are mainly: routing (file-based instead of React Router), data fetching (Next.js has specific patterns), and deployment configuration. For a marketing site without complex state management, migration might take a few weeks. For a large app with intricate client-side state, longer.
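
The routing change is the most visible part of that migration. As a rough sketch, a React Router route maps to a file path in the Next.js pages router (route names here are illustrative):

```tsx
// Illustrative route mapping (paths and names are placeholders):
//
//   React Router:  <Route path="/blog/:slug" element={<Post />} />
//   Next.js:       pages/blog/[slug].tsx  ->  /blog/:slug
//
// pages/blog/[slug].tsx
import { useRouter } from "next/router";

export default function Post() {
  const { slug } = useRouter().query;
  return <h1>Post: {slug}</h1>;
}
```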

Derek: What’s the fastest fix we can implement while planning a migration?

SDC SEO Brain: Implement pre-rendering for your critical pages now. Services like Prerender.io or Rendertron act as middleware. When Googlebot requests a page, the request goes to the pre-rendering service, which executes JavaScript and returns static HTML. For regular users, normal client-side app. This gets you indexed properly while you plan the proper fix.

Derek: How do I implement that technically?

SDC SEO Brain: Most pre-rendering services provide instructions for common setups. Generally: install middleware that detects Googlebot user-agent, route those requests to the pre-rendering service API, return the pre-rendered HTML. For other requests, serve your normal SPA. You’ll need server-side logic to do the user-agent detection, which means your static hosting might need to become dynamic hosting.
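
In outline, the middleware looks something like the sketch below. The crawler list, service URL, and site origin are placeholders, and real services such as Prerender.io publish their own official middleware, which you should prefer over hand-rolling this:

```ts
import express from "express";
import path from "path";

const app = express();
const BOT_UA = /googlebot|bingbot|duckduckbot|baiduspider/i; // crawlers to pre-render for
const PRERENDER_SERVICE = "https://prerender.service.example"; // placeholder service URL
const SITE_ORIGIN = "https://www.example.com"; // placeholder origin

// Serve built static assets normally.
app.use(express.static(path.join(__dirname, "build"), { index: false }));

app.use(async (req, res) => {
  const ua = req.headers["user-agent"] ?? "";
  if (BOT_UA.test(ua)) {
    // Crawlers: fetch pre-rendered HTML for the requested URL from the service.
    const rendered = await fetch(`${PRERENDER_SERVICE}/${SITE_ORIGIN}${req.originalUrl}`);
    res.status(rendered.status).type("html").send(await rendered.text());
  } else {
    // Regular users: serve the normal client-side app shell.
    res.sendFile(path.join(__dirname, "build", "index.html"));
  }
});

app.listen(3000);
```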

Derek: We’re on Netlify for static hosting.

SDC SEO Brain: Netlify has built-in pre-rendering options for SPAs. Check their documentation for “prerender” settings. You might be able to enable it without custom middleware. Vercel, Netlify, and similar platforms have increasingly good SPA SEO support because this is such a common problem.

Derek: Once we fix the rendering, how long until we recover our rankings?

SDC SEO Brain: After Google starts receiving properly rendered content, expect your pages to be recrawled within days to a couple of weeks. Ranking recovery depends on how much authority was lost during the three months of poor indexing and whether competitors gained ground. If your content and backlinks are still strong, recovery should follow recrawling. Full recovery might take one to three months.

Derek: Is there anything we should do about the existing poorly indexed pages?

SDC SEO Brain: After implementing the fix, use URL Inspection to request indexing for your most important pages. This prompts Google to recrawl and re-render them with the new setup. For a marketing site, maybe ten to twenty key pages need this manual push. The rest will get recrawled naturally.


FAQ

Q: Can Google crawl JavaScript-rendered content?
A: Google can execute JavaScript, but it’s unreliable for SEO. Google uses a two-phase indexing process: first crawling raw HTML, then rendering JavaScript in a separate queue. The rendering queue has delays and resource limits. Some pages render quickly; others take weeks or never fully render. Don’t rely on client-side rendering for SEO-critical content.

Q: What’s the difference between SSR, SSG, and client-side rendering?
A: Client-side rendering (CSR) sends empty HTML; JavaScript builds the page in the browser. Server-side rendering (SSR) builds HTML on the server for each request; pages arrive with content. Static site generation (SSG) pre-builds HTML at build time; pages are served as static files. For SEO, SSR and SSG ensure content exists before JavaScript executes. CSR depends on JavaScript rendering that may fail.

Q: What’s dynamic rendering and is it cloaking?
A: Dynamic rendering serves pre-rendered HTML to search engines while serving the client-side app to regular users. Google explicitly allows this for JavaScript-heavy sites, so it’s not cloaking (which involves deceptive content). It’s a legitimate workaround but adds complexity. Google recommends SSR or SSG as better long-term solutions.

Q: How do I check if Google is rendering my JavaScript correctly?
A: Use URL Inspection in GSC. Test a live URL and compare the screenshot and rendered HTML to what users see. If the screenshot shows a loading spinner or empty page while your site shows content in browsers, Google’s rendering is failing. Also check if your JavaScript files are blocked in robots.txt.

Q: Should I migrate from Create React App to Next.js for SEO?
A: For public-facing content that needs to rank, yes. Create React App is designed for client-side applications where SEO doesn’t matter. Next.js provides SSR and SSG options that ensure content is available before JavaScript executes. Migration complexity depends on your codebase, but for marketing sites, it’s usually worthwhile.


Summary

Single page applications create SEO problems because content may not exist until JavaScript executes. Google can render JavaScript but does so in a separate, delayed queue with resource limits. Some pages render quickly; others take weeks or never complete. Client-side rendering alone is unreliable for SEO.

Google’s two-phase indexing means initial HTML matters. First wave indexes raw HTML response. Second wave renders JavaScript when resources allow. If your initial HTML is empty, Google indexes an empty page. Even if rendering eventually succeeds, delays can last weeks, and rankings suffer during that time.

URL Inspection reveals rendering status. Test live URLs in GSC and compare screenshot and rendered HTML to what users see. Loading spinners or empty screenshots indicate rendering failure. This diagnosis tells you exactly what Google sees versus what you expect.

Server-side rendering (SSR) returns content in initial HTML. React code executes on the server; users receive fully-rendered HTML. JavaScript then hydrates for interactivity. Googlebot indexes real content immediately without waiting for client-side rendering. Next.js makes SSR much easier for React applications.

Static site generation (SSG) pre-builds pages at build time. Ideal for content that doesn’t change frequently: marketing pages, blog posts, documentation. Pages are served as static HTML files. Visitors and Googlebot get pre-rendered content without any JavaScript execution needed.

Client-side routing requires server configuration. With React Router, direct URL access must return your app shell, not a 404. If the server doesn’t recognize SPA routes, Googlebot requests return errors before React even loads. Test URLs directly with JavaScript disabled to verify server configuration.

Dynamic rendering is a temporary workaround, not a solution. Serving pre-rendered HTML to Googlebot while serving CSR to users is legitimate but adds complexity. Google recommends SSR or SSG as long-term approaches. Use dynamic rendering while planning proper migration.

Pre-rendering services provide quick fixes. Prerender.io, Rendertron, and similar services intercept Googlebot requests and return pre-rendered HTML. This gets you indexed while planning proper architectural changes. Platform-specific options (Netlify, Vercel) may offer built-in pre-rendering.

