How to Fix “Page with Redirect” in Google Search Console

TL;DR

“Page with redirect” in GSC isn’t an error to fix. It’s Google telling you it found a URL, followed the redirect, and indexed the destination instead. The actual problem is usually why Google is still discovering the old URL. Your redirects are working; your link cleanup isn’t. The fix isn’t touching the redirect. It’s finding and updating the sources still pointing to the old URL, whether internal links, sitemaps, canonical tags, or external references you can control.


The Conversation

Daniel, e-commerce SEO manager for a 15,000-SKU Magento site, intermediate technical knowledge, seeing 4,000+ “page with redirect” entries in GSC after a recently completed migration: We migrated to a new URL structure three months ago. All the 301s are in place and working. But GSC shows over 4,000 URLs as “page with redirect,” and the number keeps growing. I thought this would resolve itself.

SDC SEO Brain: It won’t resolve itself if you’re still actively telling Google about the old URLs. Where are those old URLs being discovered from? Click into a few of those entries in GSC and check the “discovered” source.

Daniel: I checked a few. Some say “sitemap” and some say “internal link.”

SDC SEO Brain: There’s your answer. Your sitemap still contains old URLs, and your internal linking still points to old URLs. Google obediently crawls what you tell it to crawl, hits the redirect, follows it to the new URL, and then reports “page with redirect” because that’s technically what happened. The redirect itself is working perfectly.

Daniel: But I updated the sitemap. It only has new URLs now.

SDC SEO Brain: When did you update it?

Daniel: About two weeks ago.

SDC SEO Brain: GSC data lags. More importantly, if Google has already cached 4,000 old URLs as things to check periodically, removing them from the sitemap doesn’t make Google forget about them. It just removes one discovery source. Google will continue recrawling URLs it has in its index until it’s satisfied they’re permanently gone.

Daniel: So what do I do? Just wait?

SDC SEO Brain: Waiting is part of it, but there’s active work to do. First priority: fix internal links. Every internal link pointing to an old URL forces Google to process a redirect chain and wastes crawl budget. Run a Screaming Frog crawl of your current site and export all internal links pointing to 301 redirects.

Daniel: I already did that. There are about 800 internal links to old URLs. Mostly in footer navigation and some hardcoded in CMS content blocks.

SDC SEO Brain: 800 is fixable. Footer navigation is usually template-level, so one fix propagates everywhere. CMS content blocks are more tedious but manageable. Fix those first. The sitemap is already clean, so internal links are your biggest remaining signal telling Google to visit old URLs.
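The triage SDC SEO Brain describes can be automated. Here is a minimal Python sketch that counts redirecting links per source page in a Screaming Frog-style inlinks export, so template-level pages (footer, nav) float to the top. The column names and URLs are assumptions; real export headers vary by Screaming Frog version.

```python
import csv
from collections import Counter
from io import StringIO

# Hypothetical "All Inlinks" export filtered to 3xx targets.
# Column names are assumptions; adjust to your actual export.
export = """source,destination,status_code
https://example.com/,https://example.com/old-widget,301
https://example.com/about,https://example.com/old-widget,301
https://example.com/,https://example.com/old-gadget,301
"""

rows = list(csv.DictReader(StringIO(export)))

# Count redirecting links per source page: pages carrying many such
# links are usually templates, where one fix propagates site-wide.
by_source = Counter(r["source"] for r in rows if r["status_code"].startswith("3"))

for source, count in by_source.most_common():
    print(f"{count} redirecting link(s) on {source}")
```

Sorting by count is the point: fixing the top one or two template pages typically clears the bulk of the 800 links in one change.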

Daniel: What about external backlinks? Those I can’t control.

SDC SEO Brain: You can’t control them, but you can understand them. Pull your backlink profile from Ahrefs or Semrush and filter for links pointing to old URLs. If major referring domains link to old URLs, consider reaching out to update the links. If it’s thousands of random low-authority links, not worth the effort. Let the redirects handle those indefinitely.

Daniel: There’s one thing I don’t understand. If the redirect works and Google ends up at the right page, why does it matter? The “page with redirect” isn’t hurting my rankings, right?

SDC SEO Brain: Correct that it’s not directly hurting rankings. But it’s a symptom of inefficiency. Every redirect is extra work for Google, extra latency for users, and extra complexity in your crawl logs. More importantly, high volumes of redirect entries often correlate with other problems: orphan pages, inconsistent canonicalization, or incomplete migrations. The redirect count itself isn’t the issue. What it represents is.

Daniel: Okay, that makes sense. What about the “number keeps growing” part? If I’m not adding new old URLs, why would the count increase?

SDC SEO Brain: Are you sure you’re not adding new old URLs?

Daniel: I mean, we’re adding new products all the time, but those go directly to the new URL structure.

SDC SEO Brain: Check your product feeds. Google Merchant Center, Facebook catalog, affiliate feeds. Are those pointing to new URLs or generating old-format URLs?

Daniel: Oh. I didn’t think about Merchant Center. Let me check.

SDC SEO Brain: While you check that, also look at any automated systems that generate links. Email templates, transactional emails, PDF invoices with product links. Anywhere a URL gets programmatically generated could be using old patterns if the logic wasn’t updated.

Daniel: Merchant Center is still using old URLs. That’s probably a big source.

SDC SEO Brain: That’s almost certainly your growing count. Merchant Center submits URLs to Google. Google crawls them, hits redirects, indexes the destination, and reports “page with redirect.” Fix Merchant Center feed URL structure and you’ll stop the bleeding.

Daniel: The feed is auto-generated by a plugin. I need to check if there’s a setting or if I need to regenerate it.

SDC SEO Brain: Before you regenerate, verify what URL pattern the plugin uses. Some plugins have a base URL setting that might still be on old structure. Some use canonical URLs from Magento, which might be correct. Some have their own logic entirely.

Daniel: Found it. The plugin had a cached base URL from before migration. I can clear that and regenerate.

SDC SEO Brain: Good. After regenerating, submit the new feed to Merchant Center and give it a few days to process. The “page with redirect” count should stop increasing once Google’s crawling reflects the new feed.

Daniel: Will the existing 4,000 entries ever go away?

SDC SEO Brain: Eventually, yes, but “eventually” could mean months. Here’s the mechanism: Google recrawls URLs based on various signals including change frequency and importance. Old URLs with 301s will get recrawled less often over time as Google learns they always redirect. Each time Google recrawls and confirms the redirect is still there, it reinforces that the old URL is not canonical. Gradually, those entries move from “page with redirect” to effectively forgotten.

Daniel: Is there any way to speed that up?

SDC SEO Brain: Not significantly. The Removals tool in GSC won’t help: it temporarily hides URLs from search results and is designed for content you need delisted, not for cleaning up redirect entries. You could use the URL Inspection tool to request indexing of the new URLs, which might prompt Google to consolidate signals faster, but at 4,000 URLs that’s not practical manually. The real answer is: fix the sources, stop the bleeding, and let time handle the historical cleanup.

Daniel: What about redirect chains? Some of our old URLs redirect to intermediate URLs that then redirect to final URLs. Double redirect.

SDC SEO Brain: How did that happen?

Daniel: We did a smaller restructure last year, then the big migration this year. Some old redirects pointed to URLs that then got redirected again.

SDC SEO Brain: That’s a real problem, not just a reporting issue. Redirect chains pass less equity than direct redirects. Google says it will follow chains, but every hop adds latency and potentially loses PageRank. Export your redirect rules and look for any old destination URLs that now themselves redirect. Flatten those to point directly to the final destination.

Daniel: Is there a tool for that?

SDC SEO Brain: Screaming Frog can identify redirect chains. Crawl your site with “always follow redirects” enabled, then filter for URLs with redirect chains. You’ll see each hop. For the redirect file itself, you’ll need to manually or programmatically cross-reference: for each redirect, check if the destination is also a redirect in your rules.

Daniel: We have about 6,000 redirect rules total. Checking each manually would take forever.

SDC SEO Brain: Write a script. Export your redirect rules as CSV: old URL, new URL. Then for each “new URL,” check if it exists in the “old URL” column. If yes, you have a chain. The script outputs which rules need flattening. In Python, this is maybe 20 lines of code.
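The script SDC SEO Brain describes can be sketched in roughly that many lines. The two-column CSV layout (`old_url`, `new_url`) and the sample paths are assumptions; adapt them to your actual redirect export.

```python
import csv
from io import StringIO

# Hypothetical two-column export of redirect rules.
rules_csv = """old_url,new_url
/a,/b
/b,/c
/x,/y
"""

rules = {r["old_url"]: r["new_url"] for r in csv.DictReader(StringIO(rules_csv))}

def flatten(url, seen=None):
    """Follow the rule map until the destination no longer redirects."""
    seen = set() if seen is None else seen
    while url in rules and url not in seen:
        seen.add(url)  # guard against redirect loops
        url = rules[url]
    return url

# Report rules whose destination is itself a redirect, along with
# the final destination to flatten them to.
for old, new in rules.items():
    if new in rules:
        print(f"{old} -> {new} is a chain; flatten to {old} -> {flatten(new)}")
```

On the sample data this flags `/a -> /b` as a chain and suggests `/a -> /c`. The `seen` set matters in practice: large migrated rule sets occasionally contain accidental loops, and without the guard the script would hang on them.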

Daniel: I can do that. What priority should this be versus fixing internal links?

SDC SEO Brain: Internal links first, then chains. Internal links are active signals you control that happen with every crawl. Chains are passive problems that affect a subset of URLs being discovered externally. Both matter, but internal links have more impact on how Google perceives your current site structure.

Daniel: One more question. Some of the “page with redirect” URLs are weird. They have tracking parameters and stuff that we never created.

SDC SEO Brain: Example?

Daniel: Like “/old-product-url?utm_source=facebook&utm_medium=cpc” showing up as page with redirect.

SDC SEO Brain: That’s external referral traffic with tracking parameters. Someone linked to your old URL with UTM parameters, Google discovered that specific URL string, crawled it, and reported it. The base redirect works, but Google treats each unique URL string as a separate entry in its crawl queue.

Daniel: So the same product could show up multiple times with different parameters?

SDC SEO Brain: Yes. Search Console used to offer a URL Parameters tool for telling Google which parameters don’t change content, but Google retired it in 2022, so there’s no setting to adjust anymore. Honestly, for redirecting URLs this is low priority anyway. The redirect handles every variant correctly, and the parameter variants will age out of Google’s crawl queue over time just like the base URLs.
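To see how much of a “page with redirect” count is parameter noise, you can collapse an exported URL list to unique base URLs by stripping common tracking parameters. A minimal sketch, assuming a plain list of exported URLs; the tracking-parameter list is illustrative, not exhaustive:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Tracking parameters to ignore when grouping; extend as needed.
TRACKING = {"utm_source", "utm_medium", "utm_campaign", "utm_term",
            "utm_content", "gclid", "fbclid"}

def base_url(url):
    """Strip known tracking parameters so variants collapse together."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

# Hypothetical entries from a GSC "page with redirect" export:
entries = [
    "https://example.com/old-product?utm_source=facebook&utm_medium=cpc",
    "https://example.com/old-product?utm_source=newsletter",
    "https://example.com/old-product",
]

unique = {base_url(u) for u in entries}
print(f"{len(entries)} entries collapse to {len(unique)} unique URL(s)")
```

On the sample data, three exported entries collapse to one base URL. Run against a full export, this tells you how many distinct old pages are actually involved versus how many rows are just tracking-parameter variants.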

Daniel: Should I be worried about the total number? 4,000 seems like a lot.

SDC SEO Brain: Relative to what? For a 15,000 SKU site that did a full URL restructure, 4,000 legacy URLs being tracked isn’t alarming. That’s less than 30% of your product count, and it includes parameter variants. A “lot” would be 50,000+ on a site your size, or a number still growing rapidly after you’ve fixed all the sources. You’re in normal post-migration cleanup territory.

Daniel: That’s actually reassuring. I thought I had a serious problem.

SDC SEO Brain: You have a normal problem being handled correctly. The redirects work, Google is following them, and the new URLs are getting indexed. “Page with redirect” is informational, not an error state. Google is telling you what it found, not telling you something is broken. Your job is to minimize how often Google finds old URLs, which you’re now doing by fixing internal links, updating Merchant Center, and flattening chains.

Daniel: Last thing: should I keep all 6,000 redirects forever, or can I eventually remove old ones?

SDC SEO Brain: Keep them indefinitely for any URL that ever had external links or significant traffic. Removing a redirect turns a redirect into a 404, which is worse for any remaining link equity and user experience. Storage for redirect rules is cheap. The risk of breaking something by removing them isn’t worth the tiny benefit of a cleaner redirect file.

Daniel: Even if the old URL hasn’t been accessed in over a year?

SDC SEO Brain: Your server logs might show no access, but Google’s index memory is long. A random blog post from 2019 might still link to your old URL. Someone might bookmark an old page. Redirect rules cost you nothing to maintain. Removing them costs you the insurance policy against broken experiences. Keep them.


FAQ

Q: What does “page with redirect” mean in Google Search Console?
A: It means Google discovered a URL, followed a redirect (usually 301), and indexed the final destination instead of the redirecting URL. This is normal behavior after site migrations or URL changes. The redirect is working correctly. The status is informational, not an error. Google is simply reporting what it found during crawling.

Q: Why does my “page with redirect” count keep growing after a site migration?
A: Growing counts mean Google is still discovering old URLs from active sources. Common culprits: sitemaps still containing old URLs, internal links not updated to new structure, product feeds (Google Merchant Center, Facebook catalog) generating old URL formats, email templates with hardcoded old links, or external sites linking with tracking parameters. Fix the sources to stop the growth.

Q: Should I be worried about having thousands of “page with redirect” entries?
A: Not necessarily. For a large site that underwent URL restructuring, thousands of legacy redirect entries is normal. The metric that matters is whether the count is stable or growing. Stable means you’ve stopped creating new references to old URLs. Growing means there’s an active source still pointing to old URLs that needs fixing.

Q: How long do “page with redirect” entries take to disappear from GSC?
A: Months, potentially. Google recrawls URLs based on importance and change frequency. Old redirecting URLs get recrawled less often over time as Google confirms they consistently redirect. There’s no way to force faster cleanup. The entries will gradually age out of Google’s active crawl queue as it prioritizes URLs that return actual content.

Q: Should I keep redirect rules forever or eventually remove old ones?
A: Keep them indefinitely for any URL that had external links or meaningful traffic. Removing a redirect creates a 404 error, which loses any remaining link equity and creates poor user experience for anyone with old bookmarks or clicking old links. Redirect rules cost nothing to maintain and provide insurance against breaking experiences.


Summary

“Page with redirect” in Google Search Console is not an error requiring a fix. It’s an informational status indicating Google found a URL, followed your redirect, and indexed the destination correctly. The redirects themselves are working as intended.

The actual problem behind growing “page with redirect” counts is active signals still pointing to old URLs. The most common sources are sitemaps not updated post-migration, internal links still using old URL structure, and product feeds (Merchant Center, Facebook, affiliates) generating old URL patterns. Fixing these sources stops the count from growing.

Internal link cleanup provides the biggest impact. Every internal link to an old URL forces Google to process a redirect and wastes crawl budget. Tools like Screaming Frog can identify all internal links pointing to redirecting URLs. Template-level fixes (navigation, footers) propagate site-wide, while CMS content blocks require manual updates.

Redirect chains (A → B → C instead of A → C) are a genuine technical problem, not just a reporting issue. Each hop adds latency and potentially loses PageRank. After multiple migrations, chains accumulate. Export redirect rules and programmatically identify any destination URL that also exists as a source URL in your rules, then flatten to direct redirects.

Parameter variants (URLs with UTM codes, session IDs, etc.) inflate “page with redirect” counts because Google treats each unique URL string separately. These are low priority; the redirect handles them correctly, and they’ll age out of Google’s crawl queue over time.

The “page with redirect” entries themselves will take months to disappear, with no way to accelerate the process. Google recrawls old URLs less frequently over time as it learns they consistently redirect. Focus on stopping new entries rather than trying to clear historical ones.

Redirect rules should be kept indefinitely. Removing a redirect creates a 404, which is worse than maintaining a redirect to a valid page. Storage is cheap; broken user experiences and lost link equity from removed redirects are expensive.
