Forum Discussion
ASM: throttle automated sessions so humans get priority
Tuning Web Scraping protection can be quite a difficult task, and the right settings will depend heavily on your web application. You will inevitably go through many cycles of trial and error until you get this right.
However, in your case I think the Web Scraping protection feature might not work at all. I might be wrong, but it appears to me that you do not want to BLOCK the bots; you just want to slow them down, because your bots are legitimate.
Web Scraping protection in ASM won't let you do that - it drops connections from what it identifies as a bot when pre-configured thresholds are exceeded. If you can't drop and block connections (because it would affect the business, for example), then Web Scraping protection is not for you. Instead, you will need an iRule that identifies your legitimate bots somehow (for example, using the User-Agent header) and then applies a rate-shaping profile to spoon-feed them the data they are after.
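A minimal sketch of that iRule approach might look like this. Note that the User-Agent substrings and the rate class name `my_bot_rate_class` are assumptions for illustration - you would substitute the UA strings of your own legitimate bots and create the rate-shaping class on the BIG-IP first:

```tcl
# Hypothetical iRule sketch: identify known-good bots by User-Agent
# and apply a pre-created rate class to throttle (not block) them.
# "my_bot_rate_class" is an assumed rate-shaping class name - create
# it first on the BIG-IP before attaching this iRule.
when HTTP_REQUEST {
    set ua [string tolower [HTTP::header "User-Agent"]]
    # Match your legitimate bots here; these substrings are examples only.
    if { ($ua contains "googlebot") or ($ua contains "my-partner-bot") } {
        # Spoon-feed the bot: traffic on this connection is limited
        # to whatever the rate class allows, instead of being dropped.
        rateclass my_bot_rate_class
    }
}
```

Human traffic that doesn't match the User-Agent check falls through untouched, which is what gives real users priority.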
If you are OK with dropping connections and only need to do it for overzealous bots, then you can use Client Side Integrity Defense - it injects a piece of JavaScript into your pages that tracks things like mouse movements and key presses (which humans produce but scripts don't).
Re: tuning the Bot Detection settings - there is a good article on DevCentral that explains the settings in detail: https://devcentral.f5.com/articles/more-web-scraping-bot-detection
Hope this helps,
Sam