Verified bots policy
To be listed by Cloudflare as a verified bot, your bot must conform to the requirements below. To provide the best possible protection to our customers, this policy may change as we adapt to new bot behaviors.
A bot or proxy must generate a minimum amount of traffic for Cloudflare to be able to find it in the sampled data: more than 1,000 requests per day across multiple domains.
The service must be built for widespread use across zones; a bot that crawls only a single site is not valid.
The user-agent or message signature must meet the following requirements:
- Have at least 5 characters.
- Must not contain special characters.
- Must not reuse the user-agent of another verified service.
For example, GoogleBot/1.0 is a valid user-agent.
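The requirements above can be sketched as a simple validator. This is a minimal illustration, not Cloudflare's actual check: the function name is hypothetical, and the exact character set allowed (here letters, digits, and the `/`, `.`, `-`, and space seen in user-agents like GoogleBot/1.0) is an assumption.

```python
import re

def is_valid_user_agent(ua: str) -> bool:
    """Hypothetical check against the listed requirements:
    at least 5 characters and no special characters."""
    if len(ua) < 5:
        return False
    # Assumed allowed set: alphanumerics plus '/', '.', '-', and space,
    # matching the shape of UAs like "GoogleBot/1.0".
    return re.fullmatch(r"[A-Za-z0-9/.\- ]+", ua) is not None

print(is_valid_user_agent("GoogleBot/1.0"))  # True
print(is_valid_user_agent("Bot!"))           # False: too short, special character
```

Uniqueness against other verified services cannot be checked locally; that comparison happens on Cloudflare's side during review.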
Domains should only be crawled with the explicit or implicit consent of the zone's owner or its terms of use. Search engine crawlers must read robots.txt to learn which paths the owner has excluded from crawling. A tool that scalps inventory from different websites might be breaking terms of use, while a search engine bot that indexes websites and complies with robots.txt is a valid service.
The purpose of the service should be benign or helpful to both the owner of a zone and the users of the service. The service must not engage in any of the following:
- Bot tooling
- Scalping
- Credential stuffing
- Directory-traversal scanning
- Excessive data scraping
- DDoS botnets
Price scraping of direct e-commerce competitors is not a valid use case.
Crawling etiquette requires checking robots.txt when crawling an entire website, and the bot should not attempt to crawl sensitive paths. A search engine crawler that skips robots.txt will be rejected.
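A compliant crawler can honor both Disallow rules and the crawl-delay directive with Python's standard urllib.robotparser. The robots.txt content and crawler name below are hypothetical; a real crawler would fetch the file from the target site.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt; a real crawler would fetch
# https://example.com/robots.txt before crawling.
robots_txt = """\
User-agent: *
Disallow: /admin/
Crawl-delay: 10
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Respect Disallow rules before fetching any path.
print(parser.can_fetch("MyCrawler/1.0", "https://example.com/admin/settings"))  # False
print(parser.can_fetch("MyCrawler/1.0", "https://example.com/blog/post"))       # True

# Respect the crawl-delay directive (seconds between requests), if present.
print(parser.crawl_delay("MyCrawler/1.0"))  # 10
```

Sleeping for the returned delay between requests keeps the crawler within the site owner's stated limits, which matters for the removal criteria below as well.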
The bot must have publicly documented expected behavior or user-agent format.
If any of the validation requirements is breached, the service will be removed from the global allowlist. Reasons for removal include:
- Adding a set of IPs that are not used solely by the verified service.
- The service's IPs are breached by an attacker.
- The service has vulnerabilities that have not been patched.
- A block of IPs not disclosed during onboarding is added to the list.
- The disclosed purpose of the service does not match the observed traffic.
- An AI crawler does not respect the crawl-delay directive in robots.txt.