Forum Discussion
SharePoint 2013 crawler traffic/localhost issue
Hi Guys,
I have set up F5 LTM to load balance SharePoint 2013 SP1. I have three servers in the farm: one acts as an app server, and the other two act as WFEs and are load balanced to serve traffic.
I have noticed that if a crawl is started from the search application, it fails. So I added a HOSTS file entry on each server, pointing the site to be crawled at 127.0.0.1, so that crawler traffic does not go through F5 again. This works a charm. (The other option is to use Set-SPSiteUrl to set a URL in the default zone that does not go through F5; I am using host-named site collections, so AAM is not supported.) However, can I do something similar to the HOSTS file trick on the F5 LTM itself?
So my requirement is: if traffic to the site URL is initiated by the crawler, keep it on the local server and make sure it never leaves the server (and thus never hits the VIP and goes back through F5 to the load-balanced WFEs).
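For reference, one way this could look on the LTM side is an iRule that recognizes connections originating from the farm servers themselves and pins them back to the originating node instead of load balancing them. This is only a sketch: the addresses 10.0.0.11/10.0.0.12 and port 80 are placeholders for the WFE self IPs and pool member port, and it assumes crawler traffic leaves the server from its own IP.

```tcl
# Sketch: keep crawler-originated connections on the server they came from.
# 10.0.0.11 / 10.0.0.12 and port 80 are assumed placeholder values.
when CLIENT_ACCEPTED {
    if { [IP::addr [IP::client_addr] equals 10.0.0.11] or
         [IP::addr [IP::client_addr] equals 10.0.0.12] } {
        # Bypass the load-balancing decision and send the connection
        # straight back to the server that initiated it.
        node [IP::client_addr] 80
    }
}
```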
Thanks!
- Lucas_Thompson_Historic F5 Account
Assuming this is the SharePoint Alternate Access Mapping + redirect headache, I wonder if you could rewrite the Host header in the client's request to trick SP into not doing the redirect for your specific client IP?
Something like:
when HTTP_REQUEST {
    if { [IP::addr [IP::client_addr] equals a.b.c.d] } {
        HTTP::header replace Host "hostnamethatspwants.example.com"
    }
}