TL;DR
JavaScript websites can rank well, but only if Google can actually see your content. The core problem: Googlebot renders JavaScript, but with delays, resource limits, and potential failures. Client-side rendered (CSR) apps are highest risk because content doesn’t exist until JavaScript executes. Server-side rendering (SSR) or static site generation (SSG) dramatically improves SEO reliability by serving HTML that Google can read immediately. If you must use CSR, verify rendering works via URL Inspection, fix critical content loading issues, and accept that some JavaScript-dependent features may not be indexed.
Do This Today (3 Critical Checks)
- Test what Google actually sees: GSC → URL Inspection → Enter your URL → Click “Test Live URL” → View “Tested Page” screenshot and HTML. If content is missing, Google can’t index what it can’t see.
- Check JavaScript console errors: In URL Inspection, click “More Info” and check for JavaScript errors. Errors can prevent rendering, meaning blank or partial pages for Googlebot.
- Compare source vs rendered: View your page source (Ctrl+U) and compare to what browser DevTools shows after JavaScript runs. If critical content only appears after JS execution, that content is at risk.
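The source-vs-rendered comparison above can also be scripted. A minimal sketch, assuming Node 18+ for the built-in fetch; the URL and phrase you pass in are placeholders for your own page and critical content:

```javascript
// Does a given HTML string contain a phrase you need indexed?
function phraseInHtml(html, phrase) {
  return html.includes(phrase);
}

// Fetch the raw server response -- the HTML Googlebot first receives,
// before any JavaScript executes -- and check it for the phrase.
async function phraseInRawHtml(url, phrase) {
  const res = await fetch(url);
  return phraseInHtml(await res.text(), phrase);
}
```

If `phraseInRawHtml` returns false for content you can clearly see in the browser, that content only exists after JavaScript runs and is at risk.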
JavaScript-Specific Technical Checks
Hydration mismatch warning:
If your server-rendered HTML differs from what JavaScript produces client-side, you have a hydration mismatch: Google may index the server version while users see the client version. Check for:
- Content that only appears after JavaScript runs
- Different text or elements between server and client render
- Dynamic content that changes on each render
Test in GSC URL Inspection: Compare the “Rendered Page” HTML to your live page. Differences indicate hydration issues.
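The failure mode can be reproduced in miniature without any framework. A sketch of the problem; in a real React app the fix is to pass one server-generated value down as a prop, or to move client-only values into an effect that runs after hydration:

```javascript
// Rendering a value that differs between server and client
// (timestamps, random IDs, locale-formatted dates) causes mismatches.
function renderGreeting(now) {
  return `<p>Generated at ${now.toISOString()}</p>`;
}

// "Server" render at build/request time:
const serverHtml = renderGreeting(new Date('2024-01-01T00:00:00Z'));
// "Client" render, which happens later in the browser:
const clientHtml = renderGreeting(new Date());

console.log(serverHtml === clientHtml); // false: the two renders disagree
```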
Bundle size impact on Core Web Vitals:
Large JavaScript bundles delay rendering and hurt LCP/INP.
Check your bundle:
- Use webpack-bundle-analyzer or source-map-explorer
- Target < 200KB initial JavaScript (compressed)
- Code-split large dependencies
- Lazy load below-fold components
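Code splitting and lazy loading both rest on dynamic `import()`, which bundlers emit as separate chunks. A plain-JavaScript sketch of the mechanism; the module path is hypothetical, and in Next.js you would typically reach for `next/dynamic` instead:

```javascript
// The bundler splits './heavy-chart.js' (hypothetical) into its own
// chunk, so it is excluded from the initial bundle and only downloaded
// when this function runs -- e.g. when the component scrolls into view.
async function loadChartWhenVisible(container) {
  const { renderChart } = await import('./heavy-chart.js');
  renderChart(container);
}
```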
When SSR/SSG isn’t possible – Dynamic Rendering:
If you can’t implement SSR, dynamic rendering serves pre-rendered HTML to bots while serving JavaScript to users. Tools: Rendertron, Puppeteer, Prerender.io.
Caution: Google considers this acceptable but not ideal. It adds complexity and can be seen as cloaking if implemented incorrectly. SSR/SSG is always preferred.
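At its core, dynamic rendering is user-agent detection plus a prerendered snapshot. A framework-agnostic sketch of the decision logic; the bot pattern is simplified (real lists are longer and change over time), and production setups should use a maintained tool rather than hand-rolling this:

```javascript
// Simplified crawler detection -- illustrative only, not exhaustive.
const BOT_PATTERN = /googlebot|bingbot|duckduckbot|yandex|baiduspider/i;

function isBot(userAgent = '') {
  return BOT_PATTERN.test(userAgent);
}

// Request-handling sketch: crawlers get the prerendered snapshot
// (normally produced by a headless browser and cached), while users
// get the normal JavaScript application shell.
function selectResponse(userAgent, prerenderedHtml, appShellHtml) {
  return isBot(userAgent) ? prerenderedHtml : appShellHtml;
}
```

Because bots and users receive different payloads, keep the rendered content equivalent on both paths; divergence is what turns this into cloaking.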
JavaScript Rendering Options for SEO
| Rendering Type | How It Works | SEO Risk Level | Best For |
|---|---|---|---|
| **Static HTML** | Server sends complete HTML | Lowest | Blogs, documentation, landing pages |
| **SSG (Static Site Generation)** | HTML pre-built at build time | Low | Marketing sites, blogs, e-commerce catalogs |
| **SSR (Server-Side Rendering)** | HTML generated per request on server | Low | Dynamic content, personalized pages |
| **CSR (Client-Side Rendering)** | JavaScript builds HTML in browser | High | Apps where SEO doesn't matter (dashboards, internal tools) |
| **Hybrid/ISR** | Mix of SSG + SSR with revalidation | Low | Large sites with frequent updates |
Framework defaults:
- Next.js: SSR/SSG by default (SEO-friendly)
- Nuxt.js: SSR/SSG by default (SEO-friendly)
- Create React App: CSR only (SEO-problematic)
- Gatsby: SSG by default (SEO-friendly)
- Vue CLI: CSR by default (SEO-problematic)
- Angular: CSR by default (SEO-problematic, use Angular Universal for SSR)
The Conversation
Ryan (CTO at a 20-employee SaaS startup; rebuilt the marketing site in React with Next.js; intermediate SEO knowledge but deep technical skills; organic traffic dropped 60% after migration; frustrated): We migrated our marketing site from WordPress to Next.js three months ago. The site is faster, the code is cleaner, developers love it. But organic traffic dropped 60% and hasn’t recovered. Our old WordPress pages are deindexed, and the new pages aren’t ranking. Did we kill our SEO?
SDC SEO Brain: JavaScript migrations gone wrong are one of the most common SEO disasters I see. The good news: Next.js is actually SEO-friendly if configured correctly. The bad news: “correctly” has specific requirements that developers often miss. Let’s diagnose. Are you using SSR, SSG, or client-side rendering for your marketing pages?
Ryan: We’re using Next.js with getStaticProps for most pages. So SSG, I think?
SDC SEO Brain: That’s the right approach for marketing pages. SSG pre-renders HTML at build time, so Googlebot should receive complete HTML without needing to execute JavaScript. Let’s verify it’s actually working. Go to Google Search Console, URL Inspection, and test your homepage. What does the rendered screenshot show?
Ryan: The screenshot looks correct. The page content is there.
SDC SEO Brain: Good, rendering isn’t the issue. Now check if there are any JavaScript errors in the “More Info” section. Also, look at the “Page resources” to see if any critical resources failed to load.
Ryan: No errors showing. Resources loaded. So rendering is fine?
SDC SEO Brain: Rendering is fine for that page. But let’s check the migration itself. When you moved from WordPress to Next.js, did you implement redirects from every old URL to the corresponding new URL?
Ryan: We changed our URL structure. WordPress had /blog/post-name/ and we changed to /resources/post-name/. We set up redirects.
SDC SEO Brain: How did you implement the redirects? In Next.js config, at the server/hosting level, or somewhere else?
Ryan: In next.config.js using the redirects array.
SDC SEO Brain: That should work, but let’s verify. Take one of your old high-traffic WordPress URLs and test it. Does it actually redirect to the new URL? And what HTTP status code does it return?
Ryan: Let me check with curl… It’s returning a 308 redirect.
SDC SEO Brain: 308 is a permanent redirect, similar to 301. That’s fine. But here’s a potential issue: did you redirect every old URL, or just set up pattern-based redirects? If any old URLs don’t have redirects, they’ll return 404s and you lose that page’s ranking history.
Ryan: We did pattern-based. /blog/ redirects to /resources/. But we also renamed some posts and deleted others during the migration.
SDC SEO Brain: That’s likely where traffic leaked. Renamed posts with different slugs won’t match the pattern. Deleted posts with no redirect lose their rankings entirely. Check GSC Coverage report for 404 errors. Filter by pages that were previously indexed. Those are your lost URLs.
Ryan: Checking… There are 47 URLs showing as 404 that were previously indexed. Some are old blog posts we decided to remove.
SDC SEO Brain: 47 previously indexed URLs now returning 404 is significant. For each of those, you need to decide: redirect to the most relevant existing page, or accept the traffic loss. Posts you deliberately removed should redirect to the closest topic match or your resources hub, not just 404. What was the traffic on those 47 posts before migration?
Ryan: I can check Analytics… Combined, those posts were getting about 8,000 sessions per month.
SDC SEO Brain: There’s a chunk of your 60% drop. 8,000 sessions from posts that now 404 with no redirect. The ranking signals those pages accumulated over years are now pointing at error pages. You need to create redirects for those 47 URLs to the most relevant existing pages.
Ryan: Some of those posts don’t have an equivalent on the new site. What do I redirect them to?
SDC SEO Brain: Find the closest topical match. If you had a post about “email marketing automation” and no direct equivalent exists, redirect to your email marketing category page or a related post. It’s not perfect, but a related redirect preserves some value. A 404 preserves nothing.
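In Next.js, this mapping lands in next.config.js. A sketch with hypothetical slugs: explicit one-off redirects for renamed or removed posts sit alongside the catch-all pattern, which only covers posts that kept their slug:

```javascript
// next.config.js sketch -- slugs are hypothetical examples.
module.exports = {
  async redirects() {
    return [
      // Renamed post: the catch-all below would send it to a 404 slug.
      { source: '/blog/email-automation-tips', destination: '/resources/email-marketing-automation', permanent: true },
      // Removed post: redirect to the closest topical match, not a 404.
      { source: '/blog/retired-feature-announcement', destination: '/resources', permanent: true },
      // Catch-all for every post that kept its slug.
      { source: '/blog/:slug*', destination: '/resources/:slug*', permanent: true },
    ];
  },
};
```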
Ryan: Okay, I’ll map those out. What else could be causing the drop?
SDC SEO Brain: Let’s check indexing status of your new pages. In GSC, go to Coverage and look at the “Valid” pages count. How many pages are currently indexed versus how many your sitemap contains?
Ryan: Sitemap has 156 URLs. Only 89 are showing as indexed.
SDC SEO Brain: 67 pages not indexed is a problem. Filter by “Excluded” and look at the reasons. What’s the most common exclusion reason?
Ryan: “Crawled – currently not indexed” is the biggest one. 34 pages.
SDC SEO Brain: That’s a quality signal issue, not a JavaScript issue. Google crawled those pages, saw the content, and decided not to index them. What types of pages are in that bucket?
Ryan: Mostly our feature pages, like /features/analytics, /features/reporting, /features/integrations.
SDC SEO Brain: How much unique content is on each feature page? Are they substantial pages or thin descriptions?
Ryan: They’re pretty short. Maybe 200-300 words each with screenshots.
SDC SEO Brain: Thin feature pages are common in SaaS, and Google often doesn’t index them because they don’t provide enough unique value compared to competitors’ feature pages. This isn’t a JavaScript problem; it’s a content depth problem. You’d have the same issue on WordPress.
Ryan: So the JavaScript migration didn’t cause that specific issue?
SDC SEO Brain: Correct. Some problems you’re seeing are migration-related (broken redirects, URL changes) and some are content-related that existed before or would exist regardless. Let’s separate them. JavaScript-specific issues to check: dynamic content loading, JavaScript-dependent navigation, client-side routing without proper handling.
Ryan: We do have some dynamic content. Testimonials and case study previews load via API calls after the page loads.
SDC SEO Brain: That’s a potential issue. Content loaded asynchronously after initial render might not be seen by Googlebot if it times out before the API responds. Test this: In URL Inspection, look at the rendered HTML source. Are testimonials and case studies present in the HTML that Google sees?
Ryan: Let me check a page with testimonials… They’re not in the Google-rendered HTML. The container is there but empty.
SDC SEO Brain: Found an issue. Your testimonials load client-side via API, and they’re not rendering for Googlebot. If those testimonials contain keywords or content you want indexed, you need to either: include them in the initial server-side render, or use getStaticProps/getServerSideProps to fetch and include them at build/request time.
Ryan: That’s a code change. Our developers pushed the API approach because it’s faster and more “modern.”
SDC SEO Brain: It might be faster for users, but it’s invisible for SEO. You have options: SSR the testimonials so they’re in the initial HTML, lazy load them with a server-rendered fallback, or accept they won’t be indexed. For marketing pages where testimonials support conversion and SEO, I’d recommend including them in the initial render.
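The server-side fix sketched for a Next.js page. The API URL and response shape are assumptions for illustration; the key point is that the fetch happens at build time, so the testimonials are already in the HTML Googlebot receives:

```javascript
// Pure helper: shape API data into the object Next.js expects back
// from getStaticProps.
function toStaticProps(testimonials) {
  return {
    props: { testimonials },
    revalidate: 3600, // rebuild the page at most once an hour (ISR)
  };
}

// In a real page file (e.g. pages/index.js) this would be exported;
// Next.js runs it at build time, so the fetched testimonials land in
// the pre-rendered HTML instead of loading client-side after render.
async function getStaticProps() {
  const res = await fetch('https://api.example.com/testimonials'); // hypothetical API
  return toStaticProps(await res.json());
}
```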
Ryan: What about our navigation? It’s all React components.
SDC SEO Brain: Test whether Googlebot can discover pages through your navigation. In GSC, check the “Links” report under internal links. If pages have zero internal links according to Google, your React navigation might not be crawlable.
Ryan: Checking… Most pages show internal links. But some deeper pages show only 1-2 internal links when they should have more from the main nav.
SDC SEO Brain: Your navigation might use JavaScript events that Googlebot doesn’t fully execute. Check if your nav links are actual <a href=""> tags or if they’re onClick handlers that programmatically navigate. Googlebot follows href attributes reliably but struggles with JavaScript-only navigation.
Ryan: They’re Next.js Link components. Those should generate proper anchor tags, right?
SDC SEO Brain: Next.js Link components do generate proper anchor tags with href attributes. If pages are missing internal links, check if some links are conditionally rendered based on JavaScript state, or if there’s client-side logic hiding links from certain pages. Also verify the links aren’t being blocked by JavaScript errors earlier in the render.
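The distinction in markup form. Both render as clickable text, but only the first gives Googlebot an href to follow (`navigateTo` is a hypothetical client-side routing function):

```html
<!-- Crawlable: a real anchor with an href Googlebot can follow -->
<a href="/features/analytics">Analytics</a>

<!-- Not reliably crawlable: navigation exists only in JavaScript -->
<span onclick="navigateTo('/features/analytics')">Analytics</span>
```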
Ryan: I’ll have the dev team audit that. Anything else specific to Next.js we should check?
SDC SEO Brain: A few more items: Make sure you’re not accidentally noindexing pages. Check if any middleware or headers are adding X-Robots-Tag: noindex. Verify your canonical tags are correct and not all pointing to the homepage. Check that hreflang tags (if you have international versions) are properly implemented. And ensure your sitemap is accurate and being submitted to GSC.
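The noindex checks can be scripted across a list of URLs. A sketch, assuming Node 18+; the meta regex is simplified and the audit URL is a placeholder:

```javascript
// Detect an accidental noindex from either the X-Robots-Tag header
// or a <meta name="robots"> tag in the HTML (simplified pattern).
function hasNoindex(headers, html = '') {
  const headerTag = headers['x-robots-tag'] || '';
  const metaNoindex = /<meta[^>]+name=["']robots["'][^>]*noindex/i.test(html);
  return /noindex/i.test(headerTag) || metaNoindex;
}

// Usage sketch against a live page (URL is a placeholder):
async function auditPage(url) {
  const res = await fetch(url);
  return hasNoindex(Object.fromEntries(res.headers), await res.text());
}
```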
Ryan: We do have canonicals. Let me check one… The canonical looks correct, pointing to itself.
SDC SEO Brain: Good. Last thing: page speed and Core Web Vitals. Next.js should be fast, but if you’ve added heavy JavaScript bundles, client-side analytics, or third-party scripts, you might have performance issues. Check Core Web Vitals in GSC. Any problems?
Ryan: LCP is “Poor” on mobile for several pages. Around 4.5 seconds.
SDC SEO Brain: That’s likely hurting rankings. Next.js should have good LCP with proper image optimization and minimal render-blocking JS. Common culprits: unoptimized images not using next/image, large JavaScript bundles, third-party scripts loading synchronously, web fonts blocking render. Run Lighthouse on a problematic page and check what’s causing the slow LCP.
Ryan: Okay, so to summarize: our rendering is actually fine, but we have redirect gaps, thin content not being indexed, API-loaded content not visible to Google, possible internal linking issues, and Core Web Vitals problems.
SDC SEO Brain: Exactly. The JavaScript framework isn’t the core problem. It’s how the migration was executed and some architectural decisions around dynamic content. Next.js with SSG is SEO-capable, but you need to ensure Google can see everything you want indexed in the initial HTML response.
FAQ
Q: Can Google crawl and index JavaScript websites?
A: Yes, Google renders JavaScript using a Chromium-based renderer. However, rendering happens in a “second wave” after initial crawling, uses limited resources, and can fail. Client-side rendered content is at higher risk of not being indexed. Server-side rendering or static generation is much more reliable for SEO.
Q: Is React/Vue/Angular bad for SEO?
A: The framework itself isn’t bad for SEO; the rendering method matters. Client-side rendered (CSR) apps are problematic because content doesn’t exist until JavaScript runs. Using SSR (Next.js, Nuxt, Angular Universal) or SSG (Gatsby, Next.js static export) makes these frameworks SEO-friendly.
Q: How do I test if Google can see my JavaScript content?
A: Use Google Search Console’s URL Inspection tool. Click “Test Live URL,” then view the screenshot and HTML under “View Tested Page.” If content is missing there, Google can’t index it. Also check for JavaScript errors that might prevent rendering.
Q: Should I use dynamic rendering or pre-rendering?
A: Pre-rendering (SSR/SSG) is preferred over dynamic rendering. Dynamic rendering serves different content to users vs bots, which can be seen as cloaking if done incorrectly. SSR/SSG serves the same HTML to everyone, which is safer and simpler.
Q: Why is content loaded via API not appearing in Google’s index?
A: Content fetched asynchronously after initial page load may not be rendered in time for Googlebot. Google’s renderer has timeouts and resource limits. If your API is slow or the content loads after a delay, Googlebot may capture the page before that content appears. Fetch critical content server-side instead.
Summary
JavaScript rendering is not the enemy; implementation is. Google can render JavaScript, but relies on specific conditions: content must be in the DOM when rendering completes, resources must load successfully, and there must be no blocking JavaScript errors. Server-side rendering (SSR) and static site generation (SSG) are dramatically more reliable than client-side rendering.
Migration issues often masquerade as JavaScript problems. Broken redirects, changed URLs without proper mapping, and lost pages cause ranking drops that get blamed on the new tech stack. Always audit redirect coverage thoroughly after any migration.
Verify rendering with URL Inspection, not assumptions. “It works in my browser” doesn’t mean Googlebot sees the same thing. Test actual Google rendering in Search Console for critical pages, and check that all important content appears in the rendered HTML.
Client-side loaded content is high risk. Testimonials, reviews, or any content fetched via API after initial render may not be indexed. If content matters for SEO, include it in the initial server response, not loaded dynamically.
Framework defaults matter:
- Next.js, Nuxt.js, Gatsby: SEO-friendly defaults with SSR/SSG
- Create React App, Vue CLI, Angular: CSR defaults require additional setup for SEO
Core Web Vitals apply regardless of framework. JavaScript frameworks can be fast, but heavy bundles, unoptimized images, and third-party scripts cause poor LCP/CLS/INP. Performance issues compound with rendering issues for ranking impact.
Sources
- Google Search Central: JavaScript SEO basics – https://developers.google.com/search/docs/crawling-indexing/javascript/javascript-seo-basics
- Google Search Central: Fix rendering problems – https://developers.google.com/search/docs/crawling-indexing/javascript/fix-search-javascript
- Google Search Central: URL Inspection tool – https://developers.google.com/search/docs/monitor-debug/url-inspection-tool
- Next.js Documentation: SEO – https://nextjs.org/learn/seo/introduction-to-seo