IndexNow: Definition and Use Cases

TL;DR:

IndexNow is a protocol (Bing + Yandex) for sites to notify search engines of URL updates. POST to api.indexnow.org with URL + key. Reindexing in minutes instead of hours. Supported by Bing, Yandex, Seznam, Naver.

What is IndexNow

IndexNow is an open protocol, introduced by Microsoft Bing and Yandex in 2021, that lets a website tell participating search engines the moment a URL is added, updated, or deleted. Instead of waiting for a crawler to rediscover the page, the site sends a request to api.indexnow.org (a GET with a single URL and key, or a POST with a JSON body for batches); the engine verifies ownership via a key file hosted on the site and can recrawl within minutes. A submission to one participating engine is shared with the others. Supported engines include Bing, Yandex, Seznam.cz, and Naver; Google does not currently participate.
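A minimal sketch of a batch submission in Python. The host, key, and URLs below are made-up examples; the field names (host, key, keyLocation, urlList) and the shared endpoint follow the IndexNow specification. Only the JSON body is built here; the actual POST is left as a comment.

```python
import json

# Shared IndexNow endpoint; submissions are relayed to all participating engines
INDEXNOW_ENDPOINT = "https://api.indexnow.org/indexnow"

def build_indexnow_payload(host, key, urls, key_location=None):
    """Build the JSON body for an IndexNow batch submission.

    `key` is a value you generate yourself; it must also be served as a
    plain-text file (by default https://<host>/<key>.txt) so the search
    engine can verify you own the site.
    """
    payload = {"host": host, "key": key, "urlList": list(urls)}
    if key_location:  # only needed when the key file is not at the site root
        payload["keyLocation"] = key_location
    return payload

payload = build_indexnow_payload(
    "example.com",
    "e4a5b2c1d3f6",  # made-up key for illustration
    ["https://example.com/new-post", "https://example.com/updated-page"],
)
body = json.dumps(payload)
# The actual submission is a POST with Content-Type: application/json, e.g.
# via urllib.request or requests; a 200/202 response means the ping was accepted.
```

Submitting one URL at a time also works as a plain GET: `https://api.indexnow.org/indexnow?url=<url>&key=<key>`.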

robots.txt Parsing: full Allow/Disallow directive parsing
URL Test: check if a specific URL is allowed for a bot
Sitemap Links: all Sitemap: directives in the file
AI Crawlers: GPTBot, ClaudeBot, and other AI bots

Why teams trust us

Live robots.txt check
Any User-Agent
Sitemap links
Free, no signup

How it works

1. Enter site URL
2. Parse robots.txt
3. Check crawl rules

Why check robots.txt?

robots.txt controls which pages search bots can see. Incorrect directives can accidentally block the entire site from indexing or expose administrative sections.

Full Parsing

Parses robots.txt per RFC 9309: every User-agent group with its Allow/Disallow rules, plus the widely used Crawl-delay and Sitemap extensions.

URL Tester

Enter a specific URL and User-agent to find out whether that bot is allowed to fetch it.
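The same URL test can be reproduced with Python's standard-library urllib.robotparser. The robots.txt content, domain, and bot names below are made-up examples; note that the stdlib matcher applies rules in file order rather than RFC 9309's longest-match precedence, so results can differ on edge cases.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for example.com
robots_txt = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/

User-agent: GPTBot
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Ordinary bots fall under the "User-agent: *" group
print(rp.can_fetch("SomeBot", "https://example.com/blog/post"))      # True
print(rp.can_fetch("SomeBot", "https://example.com/admin/users"))    # False
print(rp.can_fetch("SomeBot", "https://example.com/admin/public/x")) # True

# GPTBot has its own group and is blocked everywhere
print(rp.can_fetch("GPTBot", "https://example.com/blog/post"))       # False
```

Putting the Allow line before the broader Disallow keeps file-order matching and longest-match matching in agreement for this example.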

AI Crawlers

Automatically shows allow/block status for GPTBot, ClaudeBot, PerplexityBot, and Googlebot.

Sitemap List

All Sitemap: directives in one place with quick links for verification.
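Extracting Sitemap: directives can be sketched with the same standard-library parser (Python 3.8+ for site_maps(); the robots.txt content and URLs are made-up examples):

```python
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/news-sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# site_maps() returns the listed sitemap URLs, or None if there are none
print(rp.site_maps())
```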

Who uses this

SEO: crawl directive audit
Developers: post-deploy check
Marketers: indexation control
Site owners: block unwanted crawlers

Common Mistakes

Disallow: / for the entire site: blocks the whole site from indexing. Check robots.txt after every change.
Blocking AI without understanding: blocking GPTBot removes your site from ChatGPT and Perplexity citations.
Not specifying a Sitemap: without a Sitemap: directive, bots must guess the sitemap URL. Always specify it explicitly.
Conflicting rules: Allow and Disallow on the same URLs for different User-agents create unpredictable behavior.
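The first mistake is easy to demonstrate with Python's standard-library robotparser (the URLs and bot name are made-up examples): a lone Disallow: / blocks every path for every bot.

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# Every path on the site is now off-limits to any bot
for url in ("https://example.com/",
            "https://example.com/about",
            "https://example.com/blog/post"):
    print(url, rp.can_fetch("AnyBot", url))  # all False
```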

Best Practices

Test after every change: one wrong character in robots.txt can block an entire section from indexing.
Use * carefully: User-agent: * applies to all bots, including AI crawlers.
Always specify a Sitemap: Sitemap: https://example.com/sitemap.xml helps bots find all pages.
Verify with Google Search Console: GSC shows how Google sees your robots.txt, including parsing errors.

Frequently Asked Questions

Do I need IndexNow?

IndexNow is optional. If your site adds or updates content frequently and traffic from Bing, Yandex, Seznam.cz, or Naver matters to you, pinging on each change can cut rediscovery time from hours or days to minutes. If your content rarely changes, or Google is the only engine you care about, regular sitemaps are enough.