Overview
Manages non-public bot traffic that shows clear signs of automated tooling or that attempts to impersonate public bots. This includes Command Line HTTP Tools, Browser Automation Testing Tools, HTTP Libraries, Scan Tools, Crawler Tools, Proxy Tools, and Forged Spiders. These tools are commonly associated with malicious bot activity, so blocking them is recommended.
Detection Logic
When the Definite Bots Policy is enabled (Log or Deny), the system detects the User-Agent signatures of common automation tools as well as forged crawler behaviors. The specific detection dimensions are as follows:
- HTTP Libraries: Python HTTP libraries such as requests and urllib, commonly used for automation scripts and web crawler development.
- Command Line HTTP Tool: Tools such as curl and wget send HTTP requests directly from the terminal. They are commonly used for debugging, script invocation, simple web crawling, or probing.
- Browser Automation Testing Tool: Tools such as Selenium and Puppeteer are used to simulate real user operations in a browser. They are primarily used for functional testing, but may also be abused for automated fake transactions, bulk registrations, and similar activities.
- Scan Tool: Tools such as Nmap and Nessus are used for security testing and vulnerability scanning, and are also frequently used by attackers.
- Crawler Tool: Tools such as Heritrix and Scrapy are used for automatically fetching web content. They serve legitimate data collection purposes, but high-frequency requests or circumventing anti-crawling mechanisms may be considered malicious.
- Proxy Tool: Tools such as Proxychains and Luminati relay or intercept network traffic to hide the real client IP or analyze communication content. They are often combined with other tools to circumvent restrictions.
- Forged Spider: Requests that impersonate public crawlers (such as Googlebot) by faking User-Agent and other identifiers to bypass anti-crawling strategies, typically used for stealth scraping or attacks.
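For the Forged Spider case in particular, a common verification technique (a general method, not necessarily the one this product implements) is a reverse-then-forward DNS check: a genuine Googlebot IP reverse-resolves to a `*.googlebot.com` or `*.google.com` hostname, and that hostname forward-resolves back to the same IP. A minimal sketch, with the function name chosen for illustration:

```python
import socket

def is_genuine_googlebot(client_ip: str) -> bool:
    """Reverse-resolve the client IP, confirm the hostname belongs to
    Google's crawler domains, then forward-resolve it back to the IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(client_ip)
    except OSError:
        return False  # no PTR record: cannot be a genuine Googlebot
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        forward_ips = socket.gethostbyname_ex(hostname)[2]
    except OSError:
        return False
    return client_ip in forward_ips
```

A request whose User-Agent claims Googlebot but fails this check can be tagged as a Forged Spider.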
If a request matches any of the categories above, the system classifies and tags it directly as a Definite Bot. A Bot Score of 100 is also assigned, and the corresponding Bot Labels are added to support further analysis and tracing. For more details, see the Bot Score and Bot Tags sections.
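The classification step above can be sketched as a User-Agent signature lookup. The patterns below are illustrative examples only; the product maintains its own signature sets:

```python
import re

# Illustrative signature table, keyed by detection category.
SIGNATURES = {
    "HTTP Library": re.compile(r"python-requests|python-urllib|go-http-client", re.I),
    "Command Line HTTP Tool": re.compile(r"\bcurl/|\bwget/", re.I),
    "Browser Automation Testing Tool": re.compile(r"headlesschrome|phantomjs|selenium", re.I),
    "Scan Tool": re.compile(r"\bnmap\b|nessus", re.I),
    "Crawler Tool": re.compile(r"scrapy|heritrix", re.I),
}

def classify(user_agent: str):
    """Return (bot_score, bot_labels): a fixed score of 100 plus one
    label per matched category, or (None, []) when nothing matches."""
    labels = [name for name, pat in SIGNATURES.items() if pat.search(user_agent)]
    return (100, labels) if labels else (None, [])
```

A matched request keeps its labels in the log record, which supports the later analysis and tracing mentioned above.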
Response Actions
You can continuously log Definite Bot traffic, or deny it with a single click.
| Action | Description |
| --- | --- |
| Not Used | This policy is not used for traffic inspection; the traffic still flows to other inspection modules. |
| Log | Only logs requests of this type; the requests are forwarded as usual. |
| Deny | Denies the request and responds with HTTP 403. |
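The semantics of the three actions can be summarized in a small dispatch sketch (the names and return shape are illustrative, not the product's API):

```python
from enum import Enum

class Action(Enum):
    NOT_USED = "Not Used"
    LOG = "Log"
    DENY = "Deny"

def apply_action(action: Action):
    """Map a configured action to (status_code, forwarded, logged),
    mirroring the table above."""
    if action is Action.NOT_USED:
        return None, True, False   # skipped here; other modules still inspect
    if action is Action.LOG:
        return None, True, True    # record the hit, forward the request as usual
    return 403, False, True        # Deny: block with HTTP 403
```

Note that only Deny interrupts delivery; Not Used and Log both let the request through.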
Steps
- Log in to the console and go to the subscribed security product page.
- Go to Security Settings > Policies.
- Select the domain for which you want to configure the security policy, then click to enter the Security Policy editing page.
- Open the Bot Management tab and enable the master switch if it is turned off.
- Go to Definite Bots and choose the desired action: Not Used, Log, or Deny.
- Click Publish Changes at the bottom to publish the configuration. Changes take effect within 1–3 minutes.
Protection Recommendations
- These crawlers typically originate from malicious activity initiated by black-market or gray-market entities, so we recommend enabling Deny directly.
- If your website uses in-house or authorized automation tools, we recommend a Custom Bots policy that skips the corresponding User-Agent signatures, to avoid false denials.
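Such a skip rule amounts to an allowlist check that runs before Definite Bots inspection. A minimal sketch, where the User-Agent prefixes are hypothetical examples of in-house tools:

```python
# Hypothetical allowlist of in-house automation User-Agent prefixes.
# A Custom Bots rule matching these would bypass the Definite Bots policy.
ALLOWED_UA_PREFIXES = ("internal-healthcheck/", "acme-sync-bot/")

def should_skip_definite_bots(user_agent: str) -> bool:
    """Return True when the request comes from a known in-house tool
    and Definite Bots inspection should be skipped."""
    return user_agent.lower().startswith(ALLOWED_UA_PREFIXES)
```

Keeping the allowlist narrow (exact prefixes rather than broad substrings) reduces the chance that a malicious tool spoofs its way past the policy.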