TL;DR
Lazy loading delays content loading until users scroll to it, improving initial page speed. But Googlebot doesn’t scroll like humans do. If your lazy loading requires scroll interaction to trigger content loading, Googlebot may never see below-the-fold content. Images, product listings, and text hidden behind scroll-triggered lazy loading can be completely invisible to Google, preventing indexing of content you need to rank.
The Conversation
Ryan, CTO of a furniture marketplace (25,000 product listings, $15M GMV, React-based SPA, advanced technical knowledge), noticed that only the first few products on category pages were being indexed: Our category pages show 60 products each, but Google is only indexing the first 8-10. Search results show our category pages with descriptions that cut off at the first row. What’s blocking the rest?
SDC SEO Brain: Sounds like a lazy loading issue. How are products loaded on your category pages? Does the full list load on initial page render, or do products load as users scroll?
Ryan: We use intersection observer to lazy load products. Initial render shows the first 8 products, then more load as you scroll down. It’s great for performance.
SDC SEO Brain: And that’s exactly why Google only sees 8 products. Googlebot doesn’t scroll. It loads your page, waits for JavaScript to execute, and captures whatever renders in that initial state. If products require scroll interaction to trigger loading, Googlebot never triggers it.
Ryan: But Google says it renders JavaScript. We’ve tested it.
SDC SEO Brain: Google renders JavaScript, but it doesn’t simulate all user interactions. Scrolling, clicking, hovering, typing into forms: none of these happen during Googlebot’s rendering process. Any content that depends on these interactions to load is effectively hidden from Google.
Ryan: Our intersection observer fires when elements enter the viewport. Google must have a viewport.
SDC SEO Brain: Google does have a viewport, but elements below that viewport never enter it because Google doesn’t scroll. Your intersection observer never fires for elements positioned below the initial viewport. Google sees the placeholders or empty space where those products should be.
Ryan: We could increase the initial load count to 60, but that defeats the purpose of lazy loading.
SDC SEO Brain: There’s a middle ground. You don’t need to abandon lazy loading for users. You need to ensure the content exists in the initial HTML for Googlebot. There are several approaches: server-side rendering with hydration, initial DOM population with lazy loading for images only, or different rendering paths for bots versus users.
Ryan: We already do SSR for SEO purposes.
SDC SEO Brain: What does your SSR output include? View page source on a category page. Are all 60 products in the HTML?
Ryan: Let me check… no, the HTML only has the first 8 products. The SSR is replicating what our client-side render does.
SDC SEO Brain: That’s a common SSR misconfiguration. Your server-side render should output all content that should be indexable, regardless of how client-side JavaScript will enhance it. The SSR output is what Googlebot sees before any JavaScript runs. If SSR only outputs 8 products, that’s your indexable content ceiling.
Ryan: So we need SSR to render all 60 products, then client-side takes over with lazy loading for users?
SDC SEO Brain: Exactly. The HTML source should contain all products. Then your client-side JavaScript can implement lazy loading for images within those products, load-more interactivity, or progressive enhancement. But the base content (product titles, descriptions, links) must be in the initial HTML.
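The split described above can be sketched roughly like this: all product data emitted server-side, with only image fetching deferred via loading="lazy". The product shape, class names, and escaping helper are illustrative assumptions, not Ryan's actual code.

```javascript
// Minimal HTML escaping for attribute and text content.
function escapeHtml(s) {
  return s.replace(/&/g, '&amp;').replace(/</g, '&lt;')
          .replace(/>/g, '&gt;').replace(/"/g, '&quot;');
}

// Server-side render: every product card (title, price, link) is in
// the initial HTML, so Googlebot sees the full list without scrolling.
// Only the image fetch is deferred, via the browser-level lazy attribute.
function renderCategoryHTML(products) {
  const cards = products.map(p => `
    <article class="product-card">
      <a href="${escapeHtml(p.url)}">
        <img src="${escapeHtml(p.imageUrl)}" alt="${escapeHtml(p.title)}" loading="lazy">
        <h2>${escapeHtml(p.title)}</h2>
      </a>
      <p class="price">${escapeHtml(p.price)}</p>
    </article>`);
  return `<section class="product-grid">${cards.join('\n')}</section>`;
}

// Example: all 60 products rendered into one HTML string.
const products = Array.from({ length: 60 }, (_, i) => ({
  title: `Sofa ${i + 1}`,
  price: '$499',
  url: `/products/sofa-${i + 1}`,
  imageUrl: `/img/sofa-${i + 1}.jpg`,
}));
const html = renderCategoryHTML(products);
```

Client-side JavaScript can then hydrate these cards for interactivity without being responsible for making the content exist in the first place.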
Ryan: Won’t loading 60 products in HTML hurt our page speed scores?
SDC SEO Brain: Not necessarily, if you’re smart about it. The HTML for product cards is lightweight, maybe 1-2KB each. That’s 60-120KB of HTML, which compresses down significantly. What kills page speed is loading 60 high-res images. Keep the lazy loading for images but put the product data in the initial render.
Ryan: What about images? Google’s documentation says lazy loading images is fine.
SDC SEO Brain: Lazy loading images using the loading="lazy" attribute or intersection observer is fine if the image src URL is present in the HTML. Google can see an img tag with loading="lazy" and the src attribute, and will fetch that image for indexing even if it’s not displayed yet.
Ryan: So the issue is that we’re lazy loading the entire product card, not just the image?
SDC SEO Brain: Exactly. The product title, price, and link aren’t being lazy loaded; they’re simply not being rendered at all until scroll. That’s the difference. Lazy loading defers loading of content that exists in the DOM. Your approach keeps the content out of the DOM entirely until interaction.
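The distinction can be sketched in markup (class names and URLs are illustrative, not from Ryan's codebase):

```html
<!-- Indexable: content is in the DOM; only the image fetch is deferred -->
<article class="product-card">
  <a href="/products/oak-table">Oak Dining Table</a>
  <p class="price">$899</p>
  <img src="/img/oak-table.jpg" alt="Oak Dining Table" loading="lazy">
</article>

<!-- Invisible to Googlebot: an empty placeholder that JavaScript fills
     only after a scroll-triggered intersection observer fires -->
<div class="product-card-placeholder" data-product-id="1234"></div>
```

The first pattern gives Google the title, price, and link immediately; the second gives it nothing but an empty div.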
Ryan: This is going to require some architecture changes.
SDC SEO Brain: It will, but it’s necessary for SEO on content-heavy pages. The pattern should be: all indexable content in initial HTML, images lazy loaded with proper attributes, and interactive enhancements layered via JavaScript. Google gets the content, users get the performance.
Ryan: What about infinite scroll? We have that on search results pages.
SDC SEO Brain: Infinite scroll is problematic for SEO because content beyond the initial load requires interaction. The solution is providing paginated URLs that Google can crawl independently. Your infinite scroll page 1 should be crawlable at /search?page=1, page 2 at /search?page=2, and so on. Each page renders its content in initial HTML. Users get the infinite scroll experience; Google gets paginated access.
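One way to keep the two access paths in sync is to have the infinite scroll client fetch the same paginated URLs Google crawls. A minimal sketch, assuming a page size of 20 and the /search path from the example above (function names are illustrative):

```javascript
const PAGE_SIZE = 20;

// The crawlable URL for a given page of results; Googlebot can reach
// each page directly, e.g. /search?page=2.
function searchPageUrl(basePath, page) {
  return `${basePath}?page=${page}`;
}

// When the user scrolls to the end of the loaded items, the client
// fetches the next page from the same URL Google would crawl, so
// there is one source of truth for both bots and users.
function nextBatchUrl(basePath, itemsAlreadyShown) {
  const nextPage = Math.floor(itemsAlreadyShown / PAGE_SIZE) + 1;
  return searchPageUrl(basePath, nextPage);
}
```

Server-side, each of those URLs renders its 20 results in the initial HTML, exactly as the category pages should.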
Ryan: We don’t have paginated URLs. It’s all JavaScript state.
SDC SEO Brain: Then Google can only index whatever loads initially on your search results page. All products after the first scroll load are invisible to search engines. If you want that content indexed, you need either paginated URLs or initial render of full content.
Ryan: How do I verify what Google is actually seeing on our pages?
SDC SEO Brain: URL Inspection in Search Console is the definitive source. Test a category page and view the rendered HTML. Count the products in that HTML. If it’s 8 instead of 60, you’ve confirmed the problem. Also use the mobile-friendly test tool and view rendered HTML there.
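The counting step can be automated once you have the rendered HTML from URL Inspection. A small sketch, assuming the cards carry a "product-card" class (adjust to your actual markup):

```javascript
// Count product cards in the rendered HTML captured from Search
// Console's URL Inspection. The class name is an assumption about
// the page's markup, not a universal selector.
function countProductCards(renderedHtml) {
  const matches = renderedHtml.match(/class="product-card"/g);
  return matches ? matches.length : 0;
}

// Example: a truncated render with only 8 cards confirms the problem
// on a page that should show 60.
const truncatedRender = Array.from({ length: 8 },
  () => '<article class="product-card">…</article>').join('');
```

If the count matches the number visible before scrolling rather than the full list, the scroll-gated loading is confirmed.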
Ryan: What if we detect Googlebot and serve different content?
SDC SEO Brain: That’s cloaking, and Google explicitly warns against it. Serving different content to users versus Googlebot based on user-agent detection can result in penalties. The solution is serving the same content to everyone, structured in a way that works for both bots and users.
Ryan: Even if the content is the same, just rendered differently?
SDC SEO Brain: It’s a gray area that Google sometimes calls “dynamic rendering” when done properly. If you serve pre-rendered HTML to bots and JavaScript-rendered content to users, but the end result is identical, Google has said this is acceptable. However, any difference in content or intent is cloaking. And dynamic rendering adds complexity. Better to fix the underlying architecture.
FAQ
Q: Does Google scroll pages when rendering for indexing?
A: No. Googlebot loads pages and executes JavaScript but doesn’t simulate scrolling, clicking, or other user interactions. Content that requires these interactions to load will not be seen or indexed by Google.
Q: Is lazy loading images bad for SEO?
A: Lazy loading images using the loading="lazy" attribute or intersection observer is fine if the image src URL is present in the HTML. Google can discover and fetch these images for indexing. The problem occurs when entire content blocks, not just images, are removed from the DOM until scroll.
Q: How do I make infinite scroll pages SEO-friendly?
A: Implement paginated URL structure alongside infinite scroll UI. Each page of content should be accessible via a direct URL that renders its content in initial HTML. Users experience infinite scroll; Google crawls paginated pages.
Q: What’s the difference between lazy loading and content not rendering?
A: Lazy loading defers the fetching of content that’s already in the DOM. Not rendering means the content isn’t in the DOM at all until triggered by interaction. Google can handle the former if implemented correctly; it cannot see the latter.
Summary
Lazy loading that removes content from the DOM until scroll interaction completely hides that content from Googlebot, which doesn’t simulate scrolling. Only content in the initial render is indexed.
The fix is separating content from loading behavior: put all indexable content in initial HTML, then apply lazy loading only to images and heavy media. Product titles, descriptions, and links must render immediately.
SSR configurations often replicate client-side rendering limitations. SSR output should include all indexable content, regardless of how client-side JavaScript will progressively enhance it.
Infinite scroll requires parallel paginated URL structure for SEO. Each page of content needs a crawlable URL that renders its content without interaction.
Dynamic rendering is acceptable when content is identical for bots and users. Content differences, even if unintentional, risk being classified as cloaking.
Sources
- Google Search Central: Lazy loading and SEO
- Google Search Central: Dynamic rendering
- Google Developers: Browser-level lazy loading