Page Not Indexed in Google: Complete Troubleshooting Guide
Publishing a page doesn't guarantee Google will index it. When a page fails to appear in search results — even after several weeks — there's usually a diagnosable reason. This guide walks through the full checklist, from blocking signals to content quality issues, in the order you should check them.
Step 1: Check for Explicit Blocking Signals
The first thing to rule out is that you've accidentally told Google not to index the page. These mistakes are common after template changes or CMS migrations:
- noindex meta tag: Search the page's HTML for `<meta name="robots" content="noindex">`. This directly instructs search engines to skip the page.
- X-Robots-Tag HTTP header: Some server configurations add an `X-Robots-Tag: noindex` HTTP header. Check it in browser dev tools → Network tab, or with curl: `curl -I https://yourdomain.com/your-page`.
- robots.txt disallow: If your robots.txt contains a `Disallow: /your-path/` rule matching the page, Googlebot cannot crawl it at all. A disallowed URL can still end up indexed via external links, but without crawled content it will rarely rank. Check at yourdomain.com/robots.txt.
- Password protection or login wall: Pages behind authentication cannot be crawled or indexed. Even a soft paywall may prevent indexing if the content isn't accessible to Googlebot.
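The first three checks above can be automated. Below is a minimal, stdlib-only Python sketch; the helper names `has_noindex` and `robots_allows` are illustrative, not a standard API, and in practice you would feed in the fetched HTML and robots.txt rather than the hard-coded strings shown here.

```python
from html.parser import HTMLParser
from urllib.robotparser import RobotFileParser

class NoindexFinder(HTMLParser):
    """Collects directives from <meta name="robots"> tags in a page's HTML."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.extend(
                d.strip().lower() for d in a.get("content", "").split(","))

def has_noindex(html: str) -> bool:
    finder = NoindexFinder()
    finder.feed(html)
    return "noindex" in finder.directives

def robots_allows(robots_txt: str, url: str) -> bool:
    """Check whether Googlebot may crawl `url` under the given robots.txt."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch("Googlebot", url)

# An accidental noindex left over from a staging template:
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(has_noindex(page))  # True
print(robots_allows("User-agent: *\nDisallow: /drafts/",
                    "https://yourdomain.com/drafts/post"))  # False
```

Note that this only covers the in-HTML meta tag; the `X-Robots-Tag` header must still be checked on the HTTP response itself (e.g. with `curl -I`).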
Step 2: Use the URL Inspection Tool in Search Console
Google Search Console's URL Inspection tool is the most direct way to see exactly how Google views your page. Go to Search Console → URL Inspection → paste your full URL and press Enter.
- Coverage status: Look at the "Coverage" section. It shows the page's indexing status, when Google last crawled it, and which Googlebot (smartphone or desktop) it was crawled as.
- Google-selected canonical: If the tool shows a different canonical than what you declared, Google has chosen a different version to represent the page. You need to reconcile the conflict.
- Test Live URL: Click "Test Live URL" to fetch the current version of the page as Googlebot. This confirms whether the page is currently accessible and whether Google can render it properly.
- Request Indexing: Only use this after you've made fixes. Requesting indexing for an unchanged page rarely helps, and Google throttles URLs that are resubmitted frequently without changes.
Step 3: Verify the Canonical Tag
A misconfigured canonical tag is one of the most common reasons a page isn't indexed. The page may be consolidating into a different URL than you intended.
- Each page should have a self-referential canonical: `<link rel="canonical" href="https://yourdomain.com/exact-page-url">`.
- If the canonical points to a different page, Google will index that other page, not this one.
- Watch for trailing slash mismatches: `/page/` vs `/page` are treated as separate URLs when no canonical is set.
- Ensure the canonical URL in the tag matches exactly what's in your sitemap and what you're requesting indexing for.
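An exact-match canonical check is easy to script. This is a minimal stdlib sketch; `CanonicalFinder` and `canonical_matches` are illustrative names, and the deliberately strict string comparison is what surfaces trailing-slash mismatches:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Extracts the href of the first <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            if self.canonical is None:
                self.canonical = a.get("href")

def canonical_matches(html: str, page_url: str) -> bool:
    """True only when the declared canonical is byte-for-byte the page URL."""
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonical == page_url

html_doc = '<head><link rel="canonical" href="https://yourdomain.com/page/"></head>'
print(canonical_matches(html_doc, "https://yourdomain.com/page/"))  # True
print(canonical_matches(html_doc, "https://yourdomain.com/page"))   # False: trailing slash mismatch
```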
Step 4: Assess Content Quality
Google's indexing decision is partly a quality judgment. If the page provides little unique value, Google may choose not to include it. Common quality issues:
- Thin content: Pages with fewer than 300 words of substantive text are regularly skipped. Navigation, footers, and boilerplate don't count toward useful content.
- Duplicate or near-duplicate content: If 70%+ of your page's content matches another page on your site or elsewhere on the web, Google will pick only the "best" version to index.
- Copied manufacturer/vendor descriptions: E-commerce pages using default supplier copy are flagged as duplicate content across thousands of sites selling the same product.
- No clear search intent match: A page with no obvious search query it's answering gives Google no reason to include it in results.
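The first two quality checks can be approximated with rough heuristics. The sketch below mirrors the article's 300-word and 70%-overlap thresholds using a word count and word-trigram Jaccard similarity; this is an assumption-laden proxy for screening your own pages, not a reconstruction of how Google measures quality:

```python
import re

def word_count(text: str) -> int:
    return len(re.findall(r"\b\w+\b", text))

def shingles(text: str, n: int = 3) -> set:
    """Word n-grams, used as a crude fingerprint of the text."""
    words = re.findall(r"\b\w+\b", text.lower())
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard overlap of word trigrams, in [0.0, 1.0]."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

def looks_thin(text: str) -> bool:
    # Mirrors the article's rough 300-word threshold; count only the
    # main content, not navigation or boilerplate.
    return word_count(text) < 300

def looks_duplicate(a: str, b: str) -> bool:
    # Mirrors the article's rough 70% overlap threshold.
    return similarity(a, b) >= 0.7
```

Run `looks_duplicate` across pairs of product or category pages to flag candidates for rewriting before worrying about anything subtler.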
Step 5: Fix Internal Linking and Discovery
Google discovers and evaluates pages partly through internal links. Orphan pages — those with no inbound internal links from other pages on your site — are deprioritized:
- Add at least 2–3 links to the page from relevant, high-traffic pages on your site.
- Include the page in your main navigation or a relevant category page if appropriate.
- Add the URL to your sitemap.xml and resubmit the sitemap in Search Console under Sitemaps.
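Orphan pages are straightforward to find once you have an internal link map (from a crawler export or your CMS). A minimal sketch, assuming a dict of page → outbound internal links; `find_orphans` is an illustrative helper, and the homepage is excluded because it is discovered via the domain itself:

```python
def find_orphans(links: dict) -> set:
    """Pages in the site map that receive no internal links."""
    pages = set(links)
    linked_to = {target for targets in links.values() for target in targets}
    return pages - linked_to - {"/"}  # homepage needs no inbound link

site = {
    "/": ["/blog/", "/about/"],
    "/blog/": ["/blog/post-a/"],
    "/about/": [],
    "/blog/post-a/": [],
    "/blog/post-b/": [],  # published but never linked from anywhere
}
print(sorted(find_orphans(site)))  # ['/blog/post-b/']
```

Each page this returns needs the 2–3 inbound links described above before it is worth requesting indexing.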
Step 6: Check Technical Accessibility
- The page should return an HTTP `200` status consistently, not intermittent `500` or `503` errors.
- Page load time matters: pages that take more than 10 seconds to load may not be fully crawled.
- If the page's main content is rendered by JavaScript, confirm Googlebot can execute it. Use the URL Inspection tool's Live Test rendering view to check what Google actually sees.
- Do not block CSS or JS files in robots.txt — Googlebot needs to render the page to assess quality.
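Intermittent errors are easy to miss with a single manual check. One hedged approach is to classify a series of observed status codes for the URL, gathered from repeated `curl -I` runs or your access logs; `status_report` and its labels are illustrative, not a standard tool:

```python
def status_report(observed: list) -> str:
    """Summarize repeated HTTP status checks of one URL."""
    errors = [s for s in observed if s >= 500]
    if not errors:
        return "stable" if all(s == 200 for s in observed) else "check redirects"
    if len(errors) < len(observed):
        # Mixed 200s and 5xx: the worst case for crawling, since Googlebot
        # may back off after hitting server errors.
        return "intermittent 5xx: fix before requesting indexing"
    return "hard down"

print(status_report([200, 200, 503, 200]))  # intermittent 5xx: fix before requesting indexing
```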
How Long Until a New Page Gets Indexed?
For established sites with good crawl history, new pages typically appear in Google within 1–2 weeks of being submitted. For newer sites or pages with weak signals, it can take 4–8 weeks. Consistently publishing high-quality content, maintaining a clean sitemap, and having strong internal links all accelerate the process.
Related: Crawled - Currently Not Indexed | Organic Rankings Dropped.