Forum Discussion
Mike_62629
Jul 16, 2008 · Nimbostratus
Rate limiting Search Spiders
We're currently having some problems with some web spiders beating up our webservers, sucking up available sessions in our application, and slurping up a whole bunch of our bandwidth. We're interested i...
Mike_62629
Jul 16, 2008 · Nimbostratus
I was thinking about that; it would limit to N pending requests.
I was worried about how to respond when I want to reject a request. It's easy at the TCP level: just reject the connection. At the request level, I have to think about how to respond without negatively impacting our page rank.
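One crawler-friendly option at the request level (not from the thread, just the conventional approach) is to answer with 503 Service Unavailable plus a Retry-After header, which tells a well-behaved bot to back off without implying the page is gone. A minimal sketch of such a response, with an illustrative `throttle_response` helper; on a BIG-IP the equivalent would be done inside an iRule:

```python
def throttle_response(retry_after_seconds: int = 120) -> str:
    """Build a minimal HTTP/1.1 503 response asking the bot to retry later.

    503 + Retry-After is the standard "temporarily overloaded" signal;
    search engines treat it as transient rather than as a dead page.
    """
    return (
        "HTTP/1.1 503 Service Unavailable\r\n"
        f"Retry-After: {retry_after_seconds}\r\n"
        "Content-Length: 0\r\n"
        "Connection: close\r\n"
        "\r\n"
    )

print(throttle_response(120))
```

The response body is empty; the status code and header carry all the meaning a crawler needs.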
I'm currently working on a rule that would rate limit to N requests per M seconds, and it has the same problem: how do I tell msn-search to buzz off without pissing them off too much? Though this won't be as big a problem in the future, since we've added Crawl-delay to our robots.txt for future crawling.
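The "N requests per M seconds" rule can be sketched as a sliding window of timestamps per client. This is just the algorithm in plain Python, not the iRule itself; the names (`LIMIT`, `WINDOW`, `allow`) and the choice of keying on a client string are illustrative assumptions:

```python
from collections import defaultdict, deque
import time

LIMIT = 10     # N requests ...
WINDOW = 5.0   # ... per M seconds

# client key (e.g. source IP or User-Agent) -> timestamps of recent requests
_history = defaultdict(deque)

def allow(client, now=None):
    """Return True if the client is under the limit, recording the request.

    Timestamps older than WINDOW are dropped, so the count always reflects
    only the last M seconds (a true sliding window, not fixed buckets).
    """
    now = time.monotonic() if now is None else now
    q = _history[client]
    while q and now - q[0] >= WINDOW:
        q.popleft()
    if len(q) >= LIMIT:
        return False  # over the limit: answer with 503 + Retry-After
    q.append(now)
    return True
```

On the cooperative side, `Crawl-delay: 10` in robots.txt asks compliant crawlers (msn-search and Yahoo honored it at the time) to space their own requests, so the rate limiter only has to catch the bots that ignore it.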