This document provides quick onboarding and security configuration guidance.
New Domain Onboarding
Steps
- Log in to the console and open the subscribed security product.
- Go to Security Settings -> Policies.
- Click +Protected Hostname.
- Select the domain to be protected from the list.
- Choose an initial policy:
• Recommended default policies: Use the system’s default policy for out-of-the-box protection and adjust it later if needed.
• Duplicate policies from an existing hostname: Apply the policy of an already onboarded domain to the new domain.
Initial Policy of Bot Management
If you choose Recommended default policies, the default configuration set below is applied. You can change the actions here or later on the Policies -> Bot Management page. The default policies are:
- AI Bots: Log by default; supports one-click blocking of AI large language model crawlers.
- Public Bots: Skip by default; supports one-click skip/deny for active crawlers on the Internet, such as search engines, data scrapers, and website monitoring bots.
- Definite Bots: Log by default; supports one-click blocking of traffic with clear automated-tool characteristics.
- Likely Bots: Log by default; supports JavaScript Challenge, Interactive Challenge, or one-click blocking for traffic with abnormal behavior patterns or characteristics that differ from those of normal users.
Note: If your business involves automated access from internal enterprise tools or third-party vendor automation programs, you can manage this traffic after domain onboarding by customizing Bot signatures.
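The default policy set above boils down to a mapping from bot category to default action. The sketch below is purely illustrative: the category names mirror the console, but the mapping and helper function are hypothetical constructs, not part of the product.

```python
# Illustrative summary of the default Bot Management policy set.
# Category names mirror the console; this mapping and helper are hypothetical.
DEFAULT_BOT_POLICIES = {
    "AI Bots": "Log",        # one-click Deny available for AI/LLM crawlers
    "Public Bots": "Skip",   # one-click Skip/Deny for known public crawlers
    "Definite Bots": "Log",  # one-click Deny for clear automated-tool traffic
    "Likely Bots": "Log",    # JS Challenge / Interactive Challenge / Deny available
}

def default_action(category: str) -> str:
    """Return the default action for a bot category, or 'Log' if unknown."""
    return DEFAULT_BOT_POLICIES.get(category, "Log")
```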
Bot Policy Adjustments for Onboarded Domains
Steps
- Log in to the console and open the subscribed security product.
- Go to Security Settings -> Policies.
- Select the domain for which you want to configure the security policy and click it to enter the Security Policy page.
- Open the Bot Management tab and enable the master switch if it is turned off.
- Adjust Bot Shield policies according to the Application Scenario.
- Click <Publish Changes> at the bottom to publish the configuration. Changes take effect within 1–3 minutes.
Application Scenario
Scenario 1: Protecting Website Intellectual Property from AI Large Model Crawlers
- When monitoring detects a large volume of AI Bots traffic on your website, you can set the action for the AI Bots policy to ‘Deny’ to block such requests and prevent issues such as copyright infringement and sensitive data leaks caused by AI Bots. Learn more.
- If you only need to block certain specific AI crawler tools, you can manage them individually through custom bots. Learn More.
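To see which requests an AI Bots ‘Deny’ action would typically cover, you can scan your access logs for well-known AI crawler User-Agent tokens. The token list below is a partial, illustrative sample (these strings are published by the respective crawler operators); the product's own AI Bots signatures, not this list, determine what is actually matched.

```python
# Partial, illustrative list of User-Agent tokens used by well-known
# AI/LLM crawlers. The product's own AI Bots signatures are authoritative.
AI_CRAWLER_TOKENS = ("GPTBot", "ClaudeBot", "CCBot", "PerplexityBot", "Google-Extended")

def looks_like_ai_crawler(user_agent: str) -> bool:
    """Rough check: does the User-Agent contain a known AI-crawler token?"""
    ua = user_agent.lower()
    return any(token.lower() in ua for token in AI_CRAWLER_TOKENS)
```

For example, `looks_like_ai_crawler("Mozilla/5.0 ... GPTBot/1.0")` returns True, while an ordinary browser User-Agent does not match.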
Scenario 2: Managing Crawlers Beneficial for Your Business
- If your website relies on traffic acquisition and you want to increase organic traffic and exposure, you can allow beneficial crawlers, such as search engines and market analysis bots, through the Public Bots policy. Learn More.
- You may also use a custom bots policy to individually skip specific public bots. Learn More.
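Before skipping a specific public bot, it is worth confirming the traffic is genuine rather than a spoofed User-Agent. Major search engines document a reverse-DNS verification method: resolve the client IP to a hostname and confirm the hostname ends with the engine's official domain. The sketch below shows only the suffix check (the DNS lookup itself, e.g. via `socket.gethostbyaddr`, is omitted); the suffix list is a small illustrative sample, not exhaustive.

```python
# Suffix check used in reverse-DNS crawler verification: after resolving the
# client IP to a hostname (lookup omitted here), confirm the hostname belongs
# to the crawler operator's official domain. Sample suffixes only.
TRUSTED_CRAWLER_SUFFIXES = (".googlebot.com", ".google.com", ".search.msn.com")

def hostname_is_trusted(hostname: str) -> bool:
    """True if the reverse-DNS hostname ends with a trusted crawler suffix."""
    return hostname.lower().endswith(TRUSTED_CRAWLER_SUFFIXES)
```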
Scenario 3: Mitigating Automated Access Pressure on Websites
- Under the Definite Bots policy, the one-click Deny feature is generally used to deny malicious automated tools, including development frameworks, HTTP libraries, vulnerability scanners, crawlers, proxy tools, spoofed spiders, and more. Learn More.
- Under the Likely Bots policy, one-click Deny/Secondary Verification can be used to deny or further verify these highly suspicious requests, which are most likely initiated by malicious automated tools. Learn More.
- If your website does not require traffic acquisition, you can also block Public Bots with one click to reduce bandwidth pressure on your website. Learn More.
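Definite Bots signatures typically key off the default User-Agent strings that development frameworks and HTTP libraries send when left unmodified. The classifier below is an illustrative sketch of that idea only; the token list is a small sample, and the product's own signature set is authoritative.

```python
# Sample default User-Agent tokens of common automation tools and HTTP
# libraries (illustrative only; the product's signatures are authoritative).
TOOL_TOKENS = ("python-requests", "curl/", "Scrapy", "Go-http-client", "libwww-perl")

def is_definite_bot(user_agent: str) -> bool:
    """Rough check for an unmistakable automated-tool User-Agent."""
    ua = user_agent.lower()
    return any(token.lower() in ua for token in TOOL_TOKENS)
```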
Scenario 4: Enhanced Protection Against Advanced Malicious Activity
If the general protection measures in Scenario 3 are ineffective, take the following enhanced actions to mitigate automated attacks:
- Check whether your website includes services other than Web/H5 pages. If yes, configure the application request allowlist with their corresponding characteristics to prevent the Web Bot Detection policy from affecting normal access to those services.
- Enable the web enhancement protection solution based on JavaScript technology. Set the action for Web Bot Detection to block. [Learn More](https://documents.cdnetworks.com/document/cloudsecurity2/1694-webbotdetection).
- After confirming the configuration, click Publish Changes at the bottom of the page. On the Confirm Changes page, click Publish to Production to apply the changes.
Note: Before official deployment, it is recommended to run a pre-deployment test using “Publish to Staging”. This helps verify compatibility between the Web Bot Detection JS SDK and your website in advance.
- Enable Workflow Detection
For activities that do not comply with normal business access logic, such as automated tools bypassing page navigation to continuously attack a specific interface, configure a Workflow Detection policy to enhance protection and block behaviors that do not follow expected business workflows. Learn more.
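The idea behind Workflow Detection can be sketched as a per-session check that a sensitive endpoint is only reached after its prerequisite navigation steps. The endpoint paths and the prerequisite rule below are hypothetical examples, not the product's configuration format.

```python
# Hypothetical workflow rule: /api/submit may only be called after the client
# has visited /form (the expected navigation step). Paths are illustrative.
PREREQUISITES = {"/api/submit": {"/form"}}

def check_request(session_history: set, path: str) -> str:
    """Return 'allow' or 'deny' based on whether prerequisite pages were visited."""
    required = PREREQUISITES.get(path, set())
    if required and not required <= session_history:
        return "deny"          # endpoint hit without expected navigation
    session_history.add(path)  # record this step for later checks
    return "allow"
```

A session that calls `/api/submit` directly is denied, while one that first visits `/form` is allowed through.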