
Citelayer® AI Blind Spots — Find Pages Invisible to AI

The Problem with AI Visibility Gaps

Publishing content isn’t the same as having AI systems find it. An AI crawler might visit your homepage and your ten most-linked posts while completely ignoring a product page, a detailed how-to guide, or a key landing page. Without visibility into which pages are being skipped, you can’t do much about it.

The AI Blind Spots feature cross-references your published content against the bot visit log from Bot Analytics. Any page that hasn’t received a single AI bot visit within your chosen time window shows up here. More usefully, each page also gets an AI Discovery Score — a 0–100 rating that diagnoses why AI crawlers might be skipping it.


What Counts as a Blind Spot

A page is a blind spot when no AI crawler (from either the confirmed_ai or possible_ai categories) has visited it within the selected time period. The feature compares your current list of published posts and pages against the wp_citelayer_bot_visits table.

The time period filter gives you four windows to choose from: 7 Days, 30 Days, 90 Days, and 365 Days. The longer the window, the smaller the blind spot list will typically be — bots are more likely to have visited a page at least once over a full year than in the past week. Start with 30 or 90 days for a realistic operational view.
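The cross-reference described above can be sketched in a few lines. This is an illustrative Python model, not the plugin's actual PHP internals; the function name and the `(url, category, visited_at)` row shape are assumptions standing in for the real query against wp_citelayer_bot_visits.

```python
from datetime import datetime, timedelta

# Categories that count as AI traffic, per the blind-spot definition.
AI_CATEGORIES = {"confirmed_ai", "possible_ai"}

def find_blind_spots(published_urls, bot_visits, window_days=30, now=None):
    """Return published URLs with zero AI bot visits inside the window.

    bot_visits: iterable of (url, category, visited_at) tuples,
    mirroring rows from the wp_citelayer_bot_visits table.
    """
    now = now or datetime.now()
    cutoff = now - timedelta(days=window_days)
    visited = {
        url for url, category, visited_at in bot_visits
        if category in AI_CATEGORIES and visited_at >= cutoff
    }
    return [url for url in published_urls if url not in visited]
```

Note how a narrower window shrinks the `visited` set, which is exactly why the 7-day list is usually longer than the 365-day one.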

At the top of the screen, you’ll see the Coverage percentage: what fraction of your published content has received at least one AI bot visit in the selected window. A site with 40 published posts where 30 have been visited shows 75% coverage.
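The coverage figure is a straightforward ratio; a minimal sketch (function name is illustrative):

```python
def coverage_percent(published_count, visited_count):
    """Share of published pages with at least one AI bot visit in the window."""
    if published_count == 0:
        return 0.0
    return 100.0 * visited_count / published_count

# 40 published posts, 30 of them visited, matches the 75% example above.
```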


The AI Discovery Score

Every page in the blind spots list — and optionally across your entire published content — gets an AI Discovery Score from 0 to 100. The score doesn’t measure how popular a page is with AI bots; it measures how well the page is set up to be discovered and understood by AI systems.

The scoring starts at 100 and deducts points for specific problems:

Content length Short content is harder for AI systems to evaluate and often gets skipped or treated as low-value. The scoring penalizes thin pages progressively:

  • Fewer than 500 words: −10 points
  • Fewer than 300 words: −20 points (cumulative)
  • Fewer than 100 words: −30 points (cumulative)

Title quality A missing or very short page title is a signal that a page isn’t fully built out. If the title is empty or fewer than 10 characters: −15 points.

Excerpt Excerpts serve as structured summaries that AI systems can use without parsing the full content. A page without an excerpt misses this signal. No excerpt: −10 points.

Featured image Missing featured images aren’t a direct factor in AI indexing, but they’re a proxy for content completeness. No featured image: −5 points.

Page age Content that hasn’t been updated in over a year may be stale or outdated. Older than 365 days: −5 points.

noindex This is the most significant factor. If your page is marked as noindex — whether through Yoast SEO, Rank Math, or any other mechanism that sets the robots meta tag — AI crawlers that respect this directive won’t index the page. noindex: −40 points.

The minimum displayed score is 0. A page with very thin content that is also marked noindex and missing its title would compute to a negative number, but the display floors at 0.
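Putting the deductions together, the scoring logic can be modeled as follows. This is a Python sketch of the rules documented above, not the plugin's actual code; the `page` dict fields are hypothetical names for the data the plugin reads from WordPress.

```python
def ai_discovery_score(page):
    """Start at 100 and apply the documented deductions; floor at 0."""
    score = 100
    words = page.get("word_count", 0)
    if words < 500:
        score -= 10
    if words < 300:
        score -= 20  # cumulative with the <500 deduction
    if words < 100:
        score -= 30  # cumulative again: a 50-word page loses 60 total
    if len(page.get("title", "")) < 10:
        score -= 15  # empty or very short title
    if not page.get("excerpt"):
        score -= 10
    if not page.get("featured_image"):
        score -= 5
    if page.get("age_days", 0) > 365:
        score -= 5
    if page.get("noindex"):
        score -= 40  # the single largest deduction
    return max(score, 0)  # display floors at 0
```

A fully built-out, indexed page keeps all 100 points; a 50-word, untitled, noindex page with no excerpt or image computes to −35 and displays as 0.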


Reading the Results

Blind spot pages are listed with the lowest scores first — the pages that need the most attention appear at the top. Each row shows the page title, URL, the AI Discovery Score with a color indicator (red for low scores, yellow for middling, green for acceptable), and the specific issues contributing to the score.

This ordering is intentional. If you have 40 pages that AI bots haven’t visited, starting with the ones that have structural problems is more productive than starting with pages that simply haven’t been crawled yet.
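The ordering and color coding can be sketched like this. The article doesn't specify the color cutoffs, so the thresholds below are assumptions chosen for illustration:

```python
def color_for(score, red_below=50, green_at=80):
    """Map a score to a display color. Thresholds are illustrative only;
    the actual cutoffs used by the plugin are not documented here."""
    if score < red_below:
        return "red"
    if score >= green_at:
        return "green"
    return "yellow"

def ordered_blind_spots(pages_with_scores):
    """Lowest score first, matching the on-screen ordering."""
    return sorted(pages_with_scores, key=lambda p: p["score"])
```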


What to Do About Low Scores

The score is diagnostic, not punitive. Each point deduction corresponds to something you can actually fix.

Short content is the most common issue on young or lightly-developed sites. Adding depth to thin pages — more explanation, more context, more specifics — helps both AI discovery and human readability. There’s no shortcut here; the content genuinely needs to be there.

Missing excerpts are easy to fix. In the WordPress post editor, scroll to the Excerpt field (or enable it from Screen Options if it’s hidden) and add a one- to three-sentence summary. This takes two minutes per page.

Missing featured images are similarly quick to add. Even a relevant stock image is better than nothing, though a purpose-made image specific to the content is preferable.

noindex pages need careful consideration. If a page is intentionally excluded from indexing — a thank-you page, a checkout step, a private resource — leave it as-is. If a page is noindex by accident or because it was never meant to stay that way, removing the noindex flag will likely have the biggest single impact on that page’s score and discoverability.

Old content isn’t always a problem. If the information is still accurate, a content refresh — updating examples, adding recent context, checking for outdated links — signals to crawlers that the page is maintained.


Performance and Caching

Calculating blind spots requires querying both the bot visit table and the posts table. On large sites with thousands of posts, this can be slow. Citelayer® caches the results as a WordPress transient for six hours. If you’ve just published new content or added bot exclusion rules and want to see fresh results immediately, you can force a cache refresh by appending ?citelayer_flush_blind_spots=1 to the admin URL.
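The caching pattern described above (compute once, reuse for six hours, allow a manual flush) can be sketched generically. The plugin uses a WordPress transient for this; the Python class below is only a language-neutral model of the same behavior, with all names invented for illustration:

```python
import time

SIX_HOURS = 6 * 60 * 60

class BlindSpotCache:
    """Transient-style cache: results are reused until the TTL expires
    or a refresh is forced (the equivalent of the flush query parameter)."""
    def __init__(self, compute, ttl=SIX_HOURS, clock=time.time):
        self.compute = compute    # expensive blind-spot calculation
        self.ttl = ttl
        self.clock = clock        # injectable for testing
        self._value = None
        self._expires = 0.0

    def get(self, force_refresh=False):
        now = self.clock()
        if force_refresh or now >= self._expires:
            self._value = self.compute()
            self._expires = now + self.ttl
        return self._value
```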


How Blind Spots Relates to Bot Analytics

Blind Spots and Bot Analytics use the same underlying data but answer different questions. Bot Analytics asks: “Which bots are coming, and what are they looking at?” Blind Spots asks: “What content are bots not looking at, and why might that be?”

Use them together. Bot Analytics shows you your AI traffic patterns; Blind Spots shows you the gaps in that coverage. The AI Readiness Scanner rounds out the picture with site-level technical checks that affect AI discoverability across the board.