Forum Discussion
Mike_62629
Jul 16, 2008 · Nimbostratus
Rate limiting Search Spiders
We're currently having problems with some web spiders hammering our web servers, tying up available sessions in our application and consuming a large amount of our bandwidth. We're interested in rate-limiting them.
I found what appeared to be a very relevant iRule at http://devcentral.f5.com/Default.aspx?tabid=109 (third place winner), but when I try to load it up in the iRule editor it complains. I believe it complains because HTTP headers are not available within CLIENT_ACCEPTED and CLIENT_CLOSED logic. That makes sense, because CLIENT_ACCEPTED and CLIENT_CLOSED are (I believe) associated with building and tearing down TCP connections, so no HTTP data (headers or request URIs) has been transferred at that point.
Does anyone have any suggestions on how to accomplish this or something similar?
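One way around the event limitation, sketched below rather than offered as a finished rule: do the counting in HTTP_REQUEST, where the User-Agent header is available, and keep per-crawler counters in the session table (the table command, available in v10.x and later). The User-Agent substrings, 60-second window, and 100-request budget are illustrative placeholders, not recommendations.

    when HTTP_REQUEST {
        # Only inspect requests whose User-Agent looks like a crawler.
        # The substrings below are placeholders for the bots you care about.
        set ua [string tolower [HTTP::header "User-Agent"]]
        if { ($ua contains "googlebot") or ($ua contains "bingbot") or ($ua contains "slurp") } {

            # Count this crawler's requests in a session-table subtable.
            set count [table incr -subtable spider_limits $ua]

            # On the first request of a window, cap the entry's lifetime at
            # 60 seconds so the counter resets roughly once a minute.
            if { $count == 1 } {
                table lifetime -subtable spider_limits $ua 60
            }

            # Over the per-minute budget: answer here and ask the bot to
            # back off, so the request never reaches the application.
            if { $count > 100 } {
                HTTP::respond 503 content "Crawl rate limit exceeded" "Retry-After" "60"
                return
            }
        }
    }

Answering over-budget requests directly with a 503 and a Retry-After header keeps them from consuming application sessions, which addresses the session-exhaustion part of the problem as well as the bandwidth.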
- aneilsingh_5064 (Nimbostratus): @Colin, I am on version 10.2.1. I would be very interested in some working examples.
- hooleylist (Cirrostratus): I think you can tell Google and Bing to crawl your sites at a slower rate; see the robots.txt sketch after this thread.
- aneilsingh_5064 (Nimbostratus): Hi there, we are a SaaS company and therefore don't have control over all of our communities' submissions to search engines. We also allow each of our communities to have its own robots.txt file. These robots.txt files are handled and served up by code rather than sitting in the root of a typical web server.
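For reference, the crawl-rate suggestion above typically translates into a Crawl-delay directive in robots.txt; Bing's and Yahoo's crawlers have honored it, while Google ignores Crawl-delay and expects the crawl rate to be set in Google Webmaster Tools instead. The bot names and the ten-second delay below are example values only, and since the robots.txt files here are generated by code, that code would need to emit these lines.

    # Example robots.txt crawl-rate hints (values are illustrative)
    User-agent: bingbot
    Crawl-delay: 10

    User-agent: Slurp
    Crawl-delay: 10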