Forum Discussion
SharePoint Crawler Traffic Segregation
Hi Team,
I am working on a SharePoint 2013 deployment with F5 LTM.
I am facing an issue with segregating crawler traffic on the basis of the User-Agent header.
The current iRule setup is:

when HTTP_REQUEST {
    set uagent [string tolower [HTTP::header User-Agent]]
    if { $uagent contains "ms search" } {
        pool pool_crawler
    } else {
        pool pool-NonCrwaler
    }
}
The iRule is not working.
When taking captures from the IIS server, we see:

GET / HTTP/1.0
Cache-Control: no-cache
Connection: Keep-Alive
Accept: */*
From: no-reply@x.ab.cc
If-Modified-Since: Mon, 01 Jan 1601 00:00:00 GMT
User-Agent: Mozilla/4.0 (compatible; MSIE 4.01; Windows NT; MS Search 6.0 Robot)
Host: XXXXXXXXXXXXXXXXXXXXXXX
Authorization: NTLM TlRMTVNTUAABAAAAB4IIogAAAAAAAAAAAAAAAAAAAAAGAbEdAAAADw==
We can see that the User-Agent contains "MS Search".
Could you please help with the correct iRule condition to segregate this traffic?
- Kevin_Stewart (Employee)
Your code looks correct. What indication are you getting that it's not working? Try adding some logging:
when HTTP_REQUEST {
    set uagent [string tolower [HTTP::header User-Agent]]
    log local0. "incoming uagent = $uagent"
    if { $uagent contains "ms search" } {
        log local0. "found a robot"
        pool pool_crawler
    } else {
        log local0. "not a robot"
        pool pool-NonCrwaler
    }
}
- er_sandy_27437 (Nimbostratus)
Hi Kevin,
Thanks for your response.
The above iRule would not solve the problem by itself, as it only sends information to /var/log/ltm. I can already tell the rule is not matching by checking the number of connections sent to the crawler pool, which is zero.
My concern is that the iRule should segregate the traffic based on the string "MS Search", which we can see in the User-Agent header: User-Agent: Mozilla/4.0 (compatible; MSIE 4.01; Windows NT; MS Search 6.0 Robot)
Could you please help us write the correct script, either based on the User-Agent or on some other criterion, for SharePoint 2013?
Let me know if more information is required on the same.
- Kevin_Stewart (Employee)
The log statements are there for troubleshooting. If you watch the LTM log while testing, you'd see something like this:
: incoming uagent = blah; blah; blah; ms search 6.0 robot
: found a robot
: incoming uagent = blah; blah; blah
: not a robot
So if you don't see any "found a robot" messages, then you know that either a) no requests are arriving with this User-Agent, or b) your search criterion isn't correct.
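As a sketch of an alternative way to write the match (reusing the pool names from the original rule, which are assumptions from this thread), `string match -nocase` performs a case-insensitive glob match directly, avoiding the separate `string tolower` step, and a `HTTP::header exists` guard handles requests that carry no User-Agent at all:

```tcl
when HTTP_REQUEST {
    # Guard against requests with no User-Agent header, then do a
    # case-insensitive substring match for "ms search"
    if { [HTTP::header exists User-Agent] &&
         [string match -nocase "*ms search*" [HTTP::header User-Agent]] } {
        pool pool_crawler
    } else {
        pool pool-NonCrwaler
    }
}
```

This is functionally equivalent to the original rule for requests that do carry a User-Agent, so if the original rule never matches, this sketch is a cross-check rather than a guaranteed fix; the logging approach above remains the best way to see what the LTM is actually receiving.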