Document Center > Flood Shield 2.0 User Guide > Set Bot Management Policies

Set Bot Management Policies

Last updated: 2024-10-16 16:08:13

Bots can be categorized into two major types:

  • Good bots: Mainly used to facilitate people’s lives and work, such as search engine crawlers and website monitoring tools.
  • Bad bots: Mainly used to undermine the security and stability of the network, through activities such as L7 DDoS attacks, credential stuffing, malicious scanning, and automated ticket grabbing.

In practice, not all bots can be directly and clearly identified as good or bad; a large number of unknown bots remain on the network. CDNetworks’s bot intelligence database covers a wide range of known bots and, based on whether a bot usually has a positive impact on websites, classifies them into two categories: good bots and bad bots. Unknown bots are those not included in CDNetworks’s bot intelligence database; they need to be verified through a series of detection policies.
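The three-way split described above (known good, known bad, unknown) can be sketched as a simple lookup. This is a hypothetical illustration, not the CDNetworks API; the database entries and function name are invented for the example:

```python
# Hypothetical sketch: classify a client's User-Agent against a small
# bot "intelligence database". Bots found in the database are labeled
# good or bad; anything else is "unknown" and would need further
# verification (e.g., JavaScript challenges or behavior analysis).

BOT_DATABASE = {
    "Googlebot": "good",       # search engine crawler
    "UptimeRobot": "good",     # website monitoring tool
    "python-requests": "bad",  # common scripting/scanning client
}

def classify_bot(user_agent: str) -> str:
    """Return 'good', 'bad', or 'unknown' for a User-Agent string."""
    for signature, category in BOT_DATABASE.items():
        if signature.lower() in user_agent.lower():
            return category
    return "unknown"

print(classify_bot("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # good
print(classify_bot("curl/8.4.0"))                               # unknown
```

Real detection relies on far more than the User-Agent string, but the resulting categories map directly onto the policy actions described in the scenarios below.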

Go to Bot Management

  1. Log in to the CDNetworks Console and find the security product in use under Subscribed Products.
  2. Go to the Security section, then Configurations > Policies.
  3. Find the hostname for which you want to configure security policies, then click [Feature Upgrade] Advanced Access Control.
  4. Go to the Bot Management tab. If this policy is off, turn it on.

Scenario 1: Mitigating Website Access Pressure from Crawler Tools

  • Configure Basic Detection Strategy:
    • Allow the site’s monitoring tools to pass (if any). Based on the characteristics of the monitoring tools, create rules in Custom Bots and set the action to Skip. Learn more.
    • Allow search engine crawlers to preserve the website’s SEO. In Known Bots, set the action for the search engine category to Allow; if SEO is not a concern, you can instead set the action to Deny.
    • Block common commercial or open-source tools. In User-Agent Based Detection, set the action to Deny for categories such as Low Version User-Agent, Fake User-Agent, Automated Tools, and Crawler Tools.
  • Enable Client-based Detection:
    • Check whether the website serves any business beyond web/H5 pages. If so, add the relevant business characteristics to Bypass Traffic from Specific Clients so that enabling the Web Bot Detection strategy does not affect the normal access of that business. Learn more.
    • Enable the JavaScript-based web enhancement protection solution. Set the action for Web Bot Detection to Deny. Learn more.
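The User-Agent based rules above can be pictured as pattern matching over the request’s User-Agent header. The following is a minimal sketch with assumed patterns; it does not reflect CDNetworks’s actual detection rules or category definitions:

```python
import re

# Illustrative sketch: map a request's User-Agent string to an action.
# The patterns below are assumptions standing in for the console's
# Low Version User-Agent, Automated Tools, and Fake User-Agent categories.

DENY_PATTERNS = [
    r"MSIE [1-6]\.",               # very old browsers (low version UA)
    r"curl|wget|python-requests",  # automated / crawler tools
    r"^$",                         # empty UA, often stripped or faked
]

def ua_action(user_agent: str) -> str:
    """Return 'Deny' if the UA matches a blocked category, else 'Allow'."""
    for pattern in DENY_PATTERNS:
        if re.search(pattern, user_agent, re.IGNORECASE):
            return "Deny"
    return "Allow"

print(ua_action("curl/8.4.0"))                            # Deny
print(ua_action("Mozilla/5.0 (Windows NT 10.0) Chrome"))  # Allow
```

Because the User-Agent header is trivially spoofed, this layer is a coarse first filter; the client-based (JavaScript) detection above exists precisely to catch tools that forge a browser-like User-Agent.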

Before publishing to production, it is recommended to pre-publish to staging via the Publish Changes > Publish to Staging button at the bottom of the page, in order to validate the compatibility of Web Bot Detection’s JS SDK with your website.

  • After confirming the configuration is correct, click the Publish Changes button at the bottom of the page, then click Publish to Production to make the configuration take effect.

Scenario 2: Blocking Behaviors that Do Not Comply with Normal Business Access Logic

For behaviors that do not comply with normal business access logic, such as automated tools skipping page visits and directly launching continuous attacks against a specific interface, it is recommended to configure a Workflow Detection strategy on top of Scenario 1 to strengthen the protection. For configuration examples, please refer to Workflow Detection Details.
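The workflow idea can be sketched as follows: a client that calls a protected interface without first visiting the page that normally precedes it is treated as out-of-workflow. The endpoints and state tracking below are hypothetical, invented for illustration, and not the console’s actual Workflow Detection configuration:

```python
# Minimal sketch of workflow-style detection: deny a request to a
# protected interface when the client never visited the entry page
# that normal business logic requires first.

visited_pages: dict[str, set[str]] = {}  # client_id -> paths seen so far

def check_workflow(client_id: str, path: str) -> str:
    """Return 'Allow' or 'Deny' based on a single page->interface rule."""
    pages = visited_pages.setdefault(client_id, set())
    # Hypothetical rule: /checkout/submit must follow a /checkout visit.
    if path == "/checkout/submit" and "/checkout" not in pages:
        return "Deny"  # interface hit without the expected page visit
    pages.add(path)
    return "Allow"

print(check_workflow("client-1", "/checkout"))         # Allow
print(check_workflow("client-1", "/checkout/submit"))  # Allow
print(check_workflow("client-2", "/checkout/submit"))  # Deny
```

A real deployment would key clients on something stronger than an ID string (cookies, device fingerprints) and expire the state, but the ordering constraint is the core of the technique.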
