Forum Discussion
Bot Defense causing a lot of false positives
Hello DevCentral Community,
While configuring a Bot Defense profile for our websites, we noticed a lot of false positives, where legitimate browsers are flagged as Malicious Bots, to the point where we cannot safely enable Malicious Bot blocking.
The detected anomalies are mostly:
- Device ID Deletion (can be worked around by raising the threshold from 3 to ~10)
- Resource request without browser verification cookie
- Session Opening
- Browser Verification Timed out (more rarely)
We have tried various configurations, none of which worked properly.
Currently, our test bot defense profile is as follows:
- DoS Attack Mitigation Mode : Enabled
- API Access for Browsers and Mobile Applications : Enabled
- Exceptions:
- Device ID Deletions : Block for 600s, detect after 10 (instead of 3) access attempts in 600s
- No microservice
- Browser Access : Allow
- Browser Verification : Verify After Access (Blocking) / 300s grace period (we also tried Verify Before Access, but the white challenge page isn't acceptable for our users)
- Device ID mode : Generate After Access (we also tried Generate Before access)
- Single page application : Enabled (we also tried to disable it)
- Cross Domain Requests : Allow configured domains; validate upon request (with all of our websites added in related site domains)
We also tried with "Allow all requests".
After a bit of digging around, we noticed the following :
- The false positives often happen after visiting a website that loads various resources from other domains, and we believe the issue might be linked to cross-domain requests
- Google Chrome (and derivatives) drops the TS* cookies for cross-domain requests, even with the domains added to the related site domains list
- After creating an iRule that updates the TS* cookies with SameSite=None; Secure, some previously blocked requests were allowed, but not all
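As a sanity check of the cross-domain behavior described above, here is a minimal Python sketch (illustrative only, not F5 code) of the rule a SameSite-enforcing browser applies: a cookie is only eligible to be sent on a cross-site request when it carries both SameSite=None and Secure. The naive attribute parsing is an assumption made for brevity.

```python
def sent_cross_site(set_cookie: str) -> bool:
    """Return True if a cookie with these Set-Cookie attributes is
    eligible to be sent on a cross-site request by a browser that
    enforces SameSite (Chrome defaults missing SameSite to Lax)."""
    # Drop the name=value pair, keep the attributes, compare case-insensitively
    attrs = [part.strip().lower() for part in set_cookie.split(";")[1:]]
    return "samesite=none" in attrs and "secure" in attrs
```

A Bot Defense cookie emitted without any SameSite attribute defaults to Lax and is dropped on cross-site requests; patching it with SameSite=None; Secure makes it eligible again, which matches the observation that the iRule unblocked some requests.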
Disabling the checks for the detected anomalies feels like it would severely reduce the bot defense's effectiveness.
We opened a support ticket related to this issue over a year ago and haven't found any solution yet.
Has anyone faced a similar problem before and managed to solve it?
If so, how?
Thank you for any help.
Regards
5 Replies
- Jeff_Granieri
Employee
Hi renaud-gaspard ,
As you mentioned, the TS cookies are domain-scoped, which poses a problem when one site loads resources from another domain. I think the "SameSite" iRule can help with a modification to where the "Set-Cookie" handling is placed: if we can adjust it to cover all TS cookies on all responses, that should help. Bot Defense injects its cookies after HTTP_RESPONSE, so changing the event handler should cover this. I used F5's AI Assistant for iRules to help validate the iRule code for this scenario. I have not tested this, so please do so in a non-production environment.
when HTTP_RESPONSE_RELEASE {
    foreach cookie [HTTP::header values "Set-Cookie"] {
        if { [string match "TS*" $cookie] || [string match "BIGipServer*" $cookie] } {
            if { !([string match "*SameSite=*" $cookie]) } {
                HTTP::header replace "Set-Cookie" $cookie "$cookie; SameSite=None; Secure"
            }
        }
    }
}

Here is what F5's AI Assistant for iRules analyzed and explained for this iRule:
- Objective
• This iRule runs at the HTTP_RESPONSE_RELEASE event to inspect all outgoing Set-Cookie headers.
• It targets cookies whose names start with "TS" or "BIGipServer" and, if they lack any SameSite attribute, appends "; SameSite=None; Secure" to enforce cross-site usage and require secure transport.
- Execution Flow
- Event HTTP_RESPONSE_RELEASE fires just before the BIG-IP releases the HTTP response to the client (see docs).
- The code retrieves every Set-Cookie header value via
foreach cookie [HTTP::header values "Set-Cookie"]
- It checks whether the cookie string begins with "TS" or "BIGipServer" ([string match "TS*" $cookie] || [string match "BIGipServer*" $cookie]).
- For each matching cookie, it tests absence of a SameSite attribute (!([string match "*SameSite=*" $cookie])).
- If no SameSite is found, it replaces that specific Set-Cookie header with a version that appends “; SameSite=None; Secure” using
HTTP::header replace "Set-Cookie" $cookie "$cookie; SameSite=None; Secure".
- Potential Logic Concerns
• If a matching cookie already has a SameSite attribute but lacks the Secure flag, this rule will skip it entirely and not add Secure. (high confidence)
Summary
The iRule correctly identifies and modifies the specified persistence cookies to include SameSite=None and Secure in most cases, but it will not add Secure to cookies that already declare a SameSite attribute.
Under the Cross Domain Requests config, make sure this is set: Allow configured domains; validate upon redirect.
All FQDNs that serve cross-domain content should be in the related site domains list. For example, look where you see <script src> and other reference tags that contain domains.
- renaud-gaspard
Nimbostratus
Hello,
Thank you for your response.
We already had a similar iRule :
when HTTP_RESPONSE_RELEASE {
    foreach cookie_name [HTTP::cookie names] {
        if { $cookie_name starts_with "TS" } {
            HTTP::cookie attribute $cookie_name remove SameSite
            HTTP::cookie attribute $cookie_name insert SameSite None
            HTTP::cookie secure $cookie_name enable
        }
    }
}

While it helped reduce the occurrence of the anomalies, we still see a lot of "Resource request without browser verification cookie".
We also noticed the anomaly "High Number of HTML transactions since JavaScript verification". But this one happened only on one single website.
This happens even after adding the full list of all the domain names linked to cross-domain requests.
- Jeff_Granieri
Employee
Hi renaud-gaspard ,
The anomalies could be attributable to Chrome's third-party cookie deprecation. If that's in play, you might want to consider using Cookies Having Independent Partitioned State (CHIPS). This would be a modification to the iRule, and Chrome would need this flag enabled to test the phaseout behavior: chrome://flags/#test-third-party-cookie-phaseout.
Aside from testing the iRule, there are some tweaks to try for the "High Number of HTML transactions since JS verification" issue.
Start with the "HTML transactions since JS verification" threshold: the default is typically low (5-10). For SPAs returning HTML fragments, try setting it to 50-100 based on your traffic pattern.
Static resource paths won't carry cookies. Make sure these are exempt in the Bot Defense profile by adding URL allowlist entries:
Pattern      Type
*.css        Glob
*.js         Glob
*.woff       Glob
*.woff2      Glob
*.ttf        Glob
*.eot        Glob
*.svg        Glob
*.png        Glob
*.jpg        Glob
*.jpeg       Glob
*.gif        Glob
*.webp       Glob
*.ico        Glob
*.map        Glob
/static/*    Glob
/assets/*    Glob
/cdn-cgi/*   Glob

Adjust the path-based patterns (/static/*, /assets/*) to match your actual resource directory structure. You can also use regex if you need more refinement.
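To sanity-check which requests such an allowlist would exempt, here is a small Python sketch using fnmatch, whose *-style wildcards behave like the Glob pattern type (the Bot Defense profile performs the real matching; the shortened ALLOWLIST below is a hypothetical subset for illustration):

```python
from fnmatch import fnmatch

# Hypothetical subset of the glob allowlist from the profile
ALLOWLIST = ["*.css", "*.js", "*.woff2", "/static/*", "/assets/*"]

def is_exempt(path: str) -> bool:
    """True if a request path matches any allowlist glob, i.e. would be
    exempted from the browser-verification cookie check."""
    return any(fnmatch(path, pattern) for pattern in ALLOWLIST)
```

Note that fnmatch's * also crosses "/" boundaries, so /static/* exempts the whole subtree; verify whether the profile's Glob type behaves the same way before relying on deep paths.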
#---------------------------------------------------------------
# Purpose: Append SameSite=None; Secure; Partitioned to Bot
#          Defense (TS*) and persistence (BIGipServer*) cookies
#          to support cross-domain PBD under Chrome 3P cookie
#          deprecation (CHIPS).
#---------------------------------------------------------------
when RULE_INIT {
    # Toggle debug logging: 1 = enabled, 0 = disabled (production)
    set static::cookie_debug 0
}

when HTTP_RESPONSE_RELEASE {
    # Collect all Set-Cookie headers. "HTTP::header values" is needed
    # to capture multi-value Set-Cookie headers; "HTTP::header value"
    # only returns the first one.
    set cookies [HTTP::header values "Set-Cookie"]

    # Short-circuit if no cookies present
    if { [llength $cookies] == 0 } { return }

    # Build a list of modified cookies, then replace all at once.
    # This avoids modifying the header collection while iterating.
    set new_cookies [list]
    set modified 0

    foreach cookie $cookies {
        # Match TS* (Bot Defense) and BIGipServer* (persistence) cookies
        if { [string match "TS*" $cookie] || [string match "BIGipServer*" $cookie] } {
            # Skip if already fully patched (idempotency guard)
            if { [string match "*Partitioned*" $cookie] } {
                lappend new_cookies $cookie
                continue
            }
            # Case 1: has SameSite=None but may be missing Secure and/or Partitioned
            if { [string match "*SameSite=None*" $cookie] } {
                if { !([string match "*Secure*" $cookie]) } {
                    append cookie "; Secure"
                }
                append cookie "; Partitioned"
                set modified 1
            # Case 2: no SameSite attribute at all - add the full attribute chain
            } else {
                append cookie "; SameSite=None; Secure; Partitioned"
                set modified 1
            }
            if { $static::cookie_debug } {
                log local0.debug "CHIPS_IRULE: Patched cookie: $cookie"
            }
        }
        lappend new_cookies $cookie
    }

    # Only rewrite headers if we actually changed something
    if { $modified } {
        # Remove all existing Set-Cookie headers
        HTTP::header remove "Set-Cookie"
        # Re-insert all cookies (modified + unmodified)
        foreach c $new_cookies {
            HTTP::header insert "Set-Cookie" $c
        }
    }
}
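Since iRules cannot easily be unit-tested off-box, here is a Python mirror (an illustrative sketch, not F5 code) of the per-cookie branching the CHIPS iRule above applies to each Set-Cookie value, which makes the idempotency guard and the add-Secure path easy to verify:

```python
def patch_cookie(cookie: str) -> str:
    """Mirror of the iRule's per-cookie logic: patch TS*/BIGipServer*
    Set-Cookie values for cross-site use under CHIPS."""
    if not (cookie.startswith("TS") or cookie.startswith("BIGipServer")):
        return cookie                       # not a Bot Defense/persistence cookie
    if "Partitioned" in cookie:
        return cookie                       # idempotency guard: already patched
    if "SameSite=None" in cookie:
        if "Secure" not in cookie:
            cookie += "; Secure"            # SameSite present, Secure missing
        return cookie + "; Partitioned"
    # No SameSite attribute at all: add the full attribute chain
    return cookie + "; SameSite=None; Secure; Partitioned"
```

Running cookies through this function reproduces what the iRule would emit, e.g. a bare TS cookie gains all three attributes while an already-patched one passes through unchanged.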
- carlbidwell268
Altostratus
I’ve run into something similar; Bot Defense can get pretty noisy if it hasn’t “learned” your traffic patterns properly yet. A lot of the false positives tend to come from legitimate users who don’t behave like a typical browser (mobile apps, certain extensions, corporate proxies, etc.). Instead of loosening everything at once, it helps to review the triggered violations and identify patterns, like specific URLs, user agents, or geos, and then apply targeted relaxations there. You can also temporarily switch some actions to log/alarm mode to observe behavior before enforcing blocks again. It’s a bit of an iterative process, but once you fine-tune it around real traffic, the noise usually drops off quite a bit without compromising protection.
- renaud-gaspard
Nimbostratus
Thank you for your answers,
Using the iRule that updates the cookies with "SameSite=None; Secure; Partitioned" didn't solve most of our false positive issues.
However, enabling it slightly changes how browsers handle third-party cookies in Incognito mode, which means the protected websites are now usable in Incognito mode.
We managed to isolate the false positives to specific cases:
- Browser Verification Timed out :
- Happened on a website that has a malformed HTML page: from what I understand, because of the malformed HTML, the challenge wasn't correctly injected and was never solved by the browser, causing this anomaly.
- Also happened after a specific series of actions: a new browser tab was opened by mistake after clicking on a link and, because it was opened by mistake, was immediately closed (before the injected challenge had time to be solved). Later browsing to the site was blocked.
One possible workaround would be to show a CAPTCHA challenge instead of blocking the user entirely.
- Resource request without browser verification cookie
- This happens more frequently; we can probably just disable that specific check, or add the list of static files to the whitelist as Jeff_Granieri suggested.
- Session Opening
- This was reported by someone else and wasn't reproduced during our testing.
If it happens again, we probably can slightly raise the detection threshold.
- High number of HTML transactions since verification challenge:
- This only happens on a specific website/domain.
This website uses various iframes embedding the website into itself, and also has an API endpoint where the returned Content-Type header is "text/html" even if the returned data isn't HTML.
For this specific site we could create a microservice, with challenge-free browser verification.
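For context on why the mislabeled API responses inflate this counter: if the anomaly counts transactions by the response's Content-Type header (a simplification; the exact detection logic is F5-internal), a JSON reply declared as "text/html" is counted as an HTML transaction. A minimal Python sketch of that classification:

```python
def counted_as_html(content_type: str) -> bool:
    """True if a response with this Content-Type would be counted as an
    HTML transaction, assuming classification keys on the media type."""
    # Strip parameters like "; charset=utf-8" and normalize case
    media_type = content_type.split(";")[0].strip().lower()
    return media_type == "text/html"
```

Under this assumption, fixing the API to return application/json (or exempting the endpoint via a microservice, as described above) stops those calls from counting toward the threshold.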
- Device ID Deletion
- We raised the detection value slightly and it hasn't caused any issues since.
As you said, carlbidwell268 , we'll need to tweak this over time, but at least we now have a mostly working profile.
Thank you both for your help