Forum Discussion
iRule to limit request rate of search engines?
Is there sample code I could look at to help write an iRule to throttle search engines as they crawl? I think I can do this by user-agent. The goal is to allow Google and others to crawl, while at the same time, limit them from taking all of our server memory. I appreciate any suggestions or guidance to sample code or other ideas. Many thanks! Dianna
1 Reply
- rob_carr
Cirrocumulus
Take a look at this rule: https://devcentral.f5.com/wiki/iRules.ControllingBots.ashx
You create two pools containing the same servers, but one is a 'slow' pool whose members have connection limits configured. Once the slow pool's members reach their aggregate connection limit, additional connections are dropped, which throttles the crawlers without blocking them outright.
Hope that helps.
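For reference, a minimal sketch of that approach might look like the iRule below. The pool names `web_pool` and `web_pool_slow` and the user-agent patterns are assumptions for illustration; the slow pool would contain the same members as the default pool, each with a connection limit set in its pool member configuration.

```tcl
# Sketch: route known crawlers to a connection-limited pool.
# Pool names and user-agent patterns are hypothetical examples.
when HTTP_REQUEST {
    switch -glob [string tolower [HTTP::header "User-Agent"]] {
        "*googlebot*" -
        "*bingbot*" -
        "*slurp*" {
            # Crawlers go to the 'slow' pool, whose members have
            # connection limits; excess connections are dropped.
            pool web_pool_slow
        }
        default {
            # Everyone else uses the normal pool.
            pool web_pool
        }
    }
}
```

Matching on `[string tolower ...]` keeps the comparison case-insensitive, and the `-glob` patterns can be extended with whatever bots show up in your logs.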