Forum Discussion
John_Geddes_295
Nimbostratus
Sep 11, 2006
iRule optimization
The iRule below was created by a former employee. After reading the manual and some posts in this forum, I gather that this rule is not really optimized: because it runs in HTTP_REQUEST, it is processed again for every subsequent hit after the first one, including requests for images.
I basically want to route bot traffic to a separate pool and let everything else go to the primary pool. Any suggestions?
when HTTP_REQUEST {
    if { [matchclass [IP::remote_addr] equals $::blacklisted_clients] } {
        pool Bots
    } elseif { [matchclass [HTTP::header User-Agent] contains $::blacklisted_useragents] } {
        pool Bots
    } elseif { [string first -nocase "bot" [HTTP::header User-Agent]] >= 0 } {
        pool Bots
    } else {
        pool MainPool
    }
}
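A quick way to see that per-request behavior is a throwaway logging rule attached to the same virtual server. This is only an illustrative sketch, not part of the rule above, and the log message text is arbitrary:

when HTTP_REQUEST {
    # Every keep-alive request on the same connection (the page, its images, CSS, etc.)
    # re-enters HTTP_REQUEST, so any checks placed in this event run once per request.
    log local0. "HTTP_REQUEST from [IP::client_addr] for [HTTP::uri]"
}

Loading a single page with images then produces one line in /var/log/ltm per object fetched, which is the overhead the rule above pays on every request.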
1 Reply
Deb_Allen_18 (Historic F5 Account)
It seems this rule will accomplish your goal by filtering off the three suspicious groups of traffic while allowing other traffic through to the main pool. To trim the per-request overhead, move the client-address check into CLIENT_ACCEPTED so it runs only once per TCP connection, and add "event disable all" so blacklisted clients skip any further iRule processing:
when CLIENT_ACCEPTED {
    # Check the client address once per TCP connection instead of on every request
    if { [matchclass [IP::remote_addr] equals $::blacklisted_clients] } {
        pool Bots
        # Skip all further iRule events for this connection
        event disable all
    }
}
when HTTP_REQUEST {
    if { [matchclass [HTTP::header User-Agent] contains $::blacklisted_useragents] } {
        pool Bots
    } elseif { [string first "bot" [string tolower [HTTP::header User-Agent]]] >= 0 } {
        # string first has no -nocase option, so lowercase the header before the substring search
        pool Bots
    } else {
        pool MainPool
    }
}
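To verify the difference, a companion logging-only rule could be assigned alongside it on the same virtual server. Again, this is just an illustrative sketch with arbitrary message text, not part of the rule above:

when CLIENT_ACCEPTED {
    # Fires once per TCP connection, so the address blacklist check is paid only here
    log local0. "New connection from [IP::client_addr]"
}
when HTTP_REQUEST {
    # Fires for every request on that connection, so only the User-Agent checks repeat
    log local0. "Request for [HTTP::uri] with User-Agent [HTTP::header User-Agent]"
}

Watching /var/log/ltm while a browser fetches a page and its images should show one CLIENT_ACCEPTED line per connection and one HTTP_REQUEST line per object.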