robots.txt rule?
Hello,
Bear with me if this has been answered elsewhere; I've found threads that seem similar, but nothing exactly like what I'm trying to do, and none of the related examples seem to work.
I'm trying to serve a robots.txt file in front of a VIP using an uploaded ifile and a simple iRule. The VIP in question is really just a collection of other iRules, with no document root on a server behind it that I could otherwise drop this file into. In fact, I'd like to use this robots.txt rule in front of other web services as well, whether or not the VIP points at an actual document root, and whether or not it has other iRules in place.
It seems like this should be simple:
when HTTP_REQUEST {
    if { [HTTP::uri] == "/robots.txt" } {
        HTTP::respond 200 content [ifile get robots.txt]
    }
}
And the above works if it's the only iRule on the VIP: if I point a browser at https://host.foo.org I get the real server behind the F5, and if I point it at https://host.foo.org/robots.txt I get the contents of the robots.txt file.
But if I put it on a VIP alongside other iRules, the later iRules work while the first rule in the list fails. For example, if I add it with a second rule like:
when HTTP_REQUEST {
    if { [HTTP::uri] == "/gopher.jpg" } {
        HTTP::respond 200 content [ifile get gopher.jpg]
    }
}
and then point my browser at https://host.foo.org/ I get the real server behind the F5. If I point it at https://host.foo.org/gopher.jpg I get the gopher. But if I point it at https://host.foo.org/robots.txt I get ERR_CONNECTION_RESET.
I'd like to do this with one rule that can sit on multiple VIPs, whether or not it's combined with other iRules. It seems like it should be simple. What am I missing?
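For reference, this is roughly the consolidated rule I'd like to end up with. The lowercased URI match, the quoted ifile name, the Content-Type header, and the explicit return after the respond are just my guesses at a tidier form; I haven't confirmed that any of them make a difference to the reset:
when HTTP_REQUEST {
    # serve the uploaded robots.txt ifile for any request to /robots.txt
    if { [string tolower [HTTP::uri]] eq "/robots.txt" } {
        HTTP::respond 200 content [ifile get "robots.txt"] "Content-Type" "text/plain"
        # stop processing the rest of this event in this iRule
        return
    }
}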
Thanks,
Randy in Seattle