Hi Aaron,
I want to give you the full story. The reason I'm writing this iRule is to redirect bots like Googlebot to the correct URL. www3 and www10 are our backdoor virtual hosts, and I don't want bots indexing them (which they currently are). That said, I did take both of your suggestions in my latest attempt. Here it is:
when HTTP_REQUEST {
    switch -glob [string tolower [HTTP::host]] {
        "www2.*" -
        "www3.*" -
        "www10.*" {
            switch -glob [string tolower [HTTP::header User-Agent]] {
                "googlebot*" -
                "mediapartners*" -
                "msnbot*" -
                "slurp*" -
                "ia_archiver*" -
                "zyborg*" -
                "askjeeves*" {
                    HTTP::respond 301 Location "http://www.[domain [HTTP::host] 2][HTTP::uri]"
                }
            }
        }
    }
}
So basically, if the request is for www2, www3, or www10 and the user agent is an identified bot, do a 301 redirect to www; otherwise, don't do anything.
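To double-check the logic outside of the BIG-IP, here is a rough Python sketch of what the two nested switches are doing. The helper names and the example.com domain are mine, not part of the iRule; the `domain [HTTP::host] 2` call in iRules keeps the last two dot-separated labels of the host, which I approximate below.

```python
# Hypothetical sketch of the iRule's matching/redirect logic.
BOT_PREFIXES = ("googlebot", "mediapartners", "msnbot", "slurp",
                "ia_archiver", "zyborg", "askjeeves")
ALT_HOST_PREFIXES = ("www2.", "www3.", "www10.")

def last_n_labels(host, n):
    """Approximate the iRule `domain` command: keep the last n labels."""
    return ".".join(host.split(".")[-n:])

def redirect_location(host, uri, user_agent):
    """Return the 301 Location for bot requests to the backdoor hosts,
    or None when the request should be served as-is."""
    host = host.lower()
    ua = user_agent.lower()
    if host.startswith(ALT_HOST_PREFIXES) and ua.startswith(BOT_PREFIXES):
        return "http://www." + last_n_labels(host, 2) + uri
    return None  # not a backdoor host, or not an identified bot
```

For example, a Googlebot request for www3.example.com/page would redirect to http://www.example.com/page, while a regular browser on the same host would be left alone.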
I manage the DNS for my domains, and I know I only use www2, www3, and www10, so the first switch should work perfectly.
We just want to increase our www pages' rank in Google and the other search engines by redirecting all the www2, www3, and www10 hosts.
I hope that makes more sense. Sorry for not filling you in on the full story earlier... I just didn't want to add confusion to my actual problem: extracting the domain from HTTP::host.
Any comments would be greatly appreciated.
Hai