
Citelayer® Crawler Logs — Full AI Bot Visit History


What the Crawler Logs Show

The Crawler Logs tab is the raw, unfiltered record behind the Bot Analytics dashboard summaries. While the Overview tab shows you aggregated numbers, Crawler Logs shows you the individual events: which bot, which page, and when.

Navigate to Citelayer® → Bot Analytics → Crawler Logs to access the log.


Log Columns

Each row in the log represents one bot visit:

| Column | What It Shows |
| --- | --- |
| Date/Time | Timestamp of the visit, in your WordPress timezone |
| Bot Name | The identified bot (e.g., GPTBot, ClaudeBot, PerplexityBot) |
| Category | confirmed_ai or possible_ai |
| URL Path | The page path that was visited |
| IP Address | The bot’s IP address (visible only if IP logging is enabled in settings) |

Filtering the Log

The log includes three filter controls at the top:

Bot Name — a dropdown listing all bots that have logged visits. Select one to show only that bot’s entries. This is the fastest way to answer “Is GPTBot actually visiting my site?”

URL Path — a text search field. Enter any part of a URL path to filter to visits matching that pattern. Searching for /blog/ returns all visits to pages under that path. Searching for .md shows only visits to Markdown endpoints.

Date range — from/to date pickers. Use these to focus on a specific time window without changing the global time period selector.

Filter selections combine: for example, you can view only GPTBot visits to /product/ pages within the last 30 days.
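The same combine-the-filters logic can be sanity-checked against your raw server access logs. This is a minimal sketch: the log location and line format below are stand-ins, not anything Citelayer® produces.

```shell
# Sample access-log lines standing in for a real server log (format assumed):
cat > /tmp/access_sample.log <<'EOF'
203.0.113.7 - - [01/May/2025:10:00:00 +0000] "GET /blog/post-1 HTTP/1.1" 200 "Mozilla/5.0 (compatible; GPTBot/1.0)"
198.51.100.4 - - [01/May/2025:10:05:00 +0000] "GET /about HTTP/1.1" 200 "Mozilla/5.0 (compatible; ClaudeBot/1.0)"
203.0.113.7 - - [02/May/2025:09:00:00 +0000] "GET /blog/post-2 HTTP/1.1" 200 "Mozilla/5.0 (compatible; GPTBot/1.0)"
EOF

# Chain a bot-name filter and a URL-path filter, just as the log UI does:
grep "GPTBot" /tmp/access_sample.log | grep "/blog/"
```

If the plugin log and a grep of the server log disagree, that gap is itself useful: it usually points at caching serving pages without PHP running.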


Pagination

The log displays 50 entries per page, sorted newest first. Navigation controls at the bottom let you page through the full history. On active sites with frequent AI crawler traffic, the log can contain thousands of entries — use the filters to narrow to what you’re investigating rather than paging through everything.


Automatic Cleanup

Records older than 90 days are deleted automatically by a WP-Cron job that runs daily. This means the Crawler Logs view always covers, at most, the last 90 days of history. If you need records beyond 90 days, export via CSV before they age out.

The 90-day window is adjustable. See the citelayer_bot_analytics_cleanup_days filter in the Hooks & Filters reference for instructions.
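Conceptually, the cleanup job is just a date-cutoff filter over the log. The sketch below applies the same retention rule to an exported CSV; the column layout and the hardcoded cutoff are assumptions for illustration only.

```shell
# Illustrative version of the 90-day retention rule, applied to an exported CSV.
# Column layout and the fixed cutoff date are assumptions for this sketch.
cat > /tmp/log_export.csv <<'EOF'
date,bot,path
2025-01-15,GPTBot,/old-post
2025-04-20,ClaudeBot,/new-post
EOF

cutoff="2025-02-01"   # the real cron job would compute "today minus 90 days"
# ISO dates compare correctly as strings, so keep the header plus recent rows:
awk -F',' -v c="$cutoff" 'NR == 1 || $1 >= c' /tmp/log_export.csv
# Keeps the header and the 2025-04-20 row; the 2025-01-15 row is dropped.
```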


Exporting the Log

To export what you see in the log, use the CSV Export feature. It applies the same bot-name and date-range filters as the on-screen view, and each download covers up to 10,000 rows.
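Once downloaded, the CSV is easy to summarize from the command line. The file below is a small stand-in for a real export; its column order is an assumption, so check your own file's header row first.

```shell
# A small stand-in for an exported crawler-log CSV (column order is an assumption):
cat > /tmp/crawler_export.csv <<'EOF'
date,bot,category,path
2025-05-01 10:00:00,GPTBot,confirmed_ai,/blog/post-1
2025-05-01 11:00:00,ClaudeBot,confirmed_ai,/about
2025-05-02 09:30:00,GPTBot,confirmed_ai,/product/widget
EOF

# Count visits per bot, sorted alphabetically for stable output:
awk -F',' 'NR > 1 {count[$2]++} END {for (b in count) print b, count[b]}' /tmp/crawler_export.csv | sort
```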


Common Debugging Scenarios

“Is GPTBot crawling my site at all?” Open Crawler Logs, select GPTBot in the bot filter, and set the date range to the last 30 days. If you see zero entries, either GPTBot hasn’t visited, or your caching plugin is serving cached pages to the bot without PHP running. Check the Cache Config tab for diagnosis.

“Is Citelayer® actually logging bot visits after I set up the cache exclusion?” After configuring your cache plugin’s bot-exclusion rules, watch the Crawler Logs for the next few hours. Even low-traffic sites typically see at least one or two AI bot visits per day from major crawlers. If entries appear, the setup worked.

“Which pages is ClaudeBot reading on my site?” Filter the log by bot name (ClaudeBot) and leave the URL filter blank. Scan the URL Path column to see the distribution of pages. If ClaudeBot is repeatedly visiting the same five pages, consider whether other content might benefit from more internal linking or sitemap prominence.
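The same distribution question can be answered offline from a CSV export. As above, the file contents and column order here are assumptions standing in for a real export.

```shell
# Stand-in for an exported crawler-log CSV (column order assumed):
cat > /tmp/claudebot_export.csv <<'EOF'
date,bot,category,path
2025-05-01 10:00:00,ClaudeBot,confirmed_ai,/docs/setup
2025-05-01 12:00:00,ClaudeBot,confirmed_ai,/docs/setup
2025-05-02 09:00:00,ClaudeBot,confirmed_ai,/pricing
2025-05-02 10:00:00,GPTBot,confirmed_ai,/docs/setup
EOF

# Rank the pages one bot visits most often:
awk -F',' '$2 == "ClaudeBot" {print $4}' /tmp/claudebot_export.csv | sort | uniq -c | sort -rn
```

A heavily skewed ranking (one or two paths dominating) is the signal described above that other content may need more internal linking.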

“Did a bot visit my site right after I published a new post?” Set the date filter to the publication date and the day after. Look for bot visits to the new post’s URL. Some crawlers pick up new content quickly; others may take days or weeks.