Forum Discussion
Bob_10976
Feb 18, 2011 · Nimbostratus
iRule to Block Google and other Search Engines
Hello all,
We would like to use an iRule to block Google and other search engines from crawling our sites and was hoping someone could point me in the right direction. In the past we woul...
Steve_Brown_882
Feb 18, 2011 · Historic F5 Account
I agree with Aaron that a datagroup is another good way to do this; it would allow you to keep a much larger list in a text file that you simply upload and apply.
Something like this would work.
when HTTP_REQUEST {
    # Lowercase the User-Agent header and check it against the bots-list data group
    if { [class match [string tolower [HTTP::header User-Agent]] contains bots-list] } {
        drop
    }
}
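For the iRule above to work, a string data group named bots-list must exist on the BIG-IP, containing lowercase substrings of the crawler User-Agent strings you want to block. A minimal sketch of creating one via tmsh (the data group name and the bot substrings here are illustrative, not from the original thread):

create ltm data-group internal bots-list type string records add { googlebot { } bingbot { } slurp { } }

Because the iRule uses the "contains" operator, each record only needs to be a substring of the full User-Agent header (e.g. "googlebot" matches "Mozilla/5.0 (compatible; Googlebot/2.1; ...)" once lowercased).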