Forum Discussion
Remove server banner when blocking robots
Hi all,
I have a requirement to block robots. I have the following code, which works just fine:
when HTTP_REQUEST {
    if { .... } {
        # some actions....
    } elseif { [string tolower [HTTP::uri]] equals "/robots.txt" } {
        HTTP::respond 200 content "User-agent: *\r\nDisallow: /"
        log local0. "Attempt by [IP::client_addr] crawling our site"
    }
}
However, the response reveals the server banner as "Server: BigIP".
Any suggestions on how to remove the server banner?
Thanks,
9 Replies
- Michael_61033
Nimbostratus
Use the "noserver" option inside the respond, that should do the trick
when HTTP_REQUEST {
    if { .... } {
        # some actions....
    } elseif { [string tolower [HTTP::uri]] equals "/robots.txt" } {
        HTTP::respond 200 noserver content "User-agent: *\r\nDisallow: /"
        log local0. "Attempt by [IP::client_addr] crawling our site"
    }
}
- W__Ho_172333
Nimbostratus
Hi Michael, I did what you suggested. However, when pointing to the URL https://mysite.domain.com/robots.txt I now get a "Problem loading page" error: "Secure Connection Failed. The connection to the server was reset while the page was loading. The page you are trying to view cannot be shown because the authenticity of the received data could not be verified. Please contact the website owners to inform them of this problem." Any thoughts? Thank you.
- Michael_61033
Nimbostratus
OK, I placed the noserver option after the content, along with an explicit Content-Type header... now it is working:
when HTTP_REQUEST {
    if { .... } {
        # some actions....
    } elseif { [string tolower [HTTP::uri]] equals "/robots.txt" } {
        HTTP::respond 200 content "User-agent: *\r\nDisallow: /" noserver "Content-Type" "text/plain"
        log local0. "Attempt by [IP::client_addr] crawling our site"
    }
}

A quick test:
[root@f5vm01:] ~ openssl s_client -quiet -connect f5vmvs01.local:443
...
GET /robots.txt HTTP/1.1
Host: f5vmvs01.local

HTTP/1.0 200 OK
Content-Type: text/plain
Connection: Keep-Alive
Content-Length: 26

User-agent: *
Disallow: /
- W__Ho_172333
Nimbostratus
Hi Michael, It works! Thank you so much!
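For reference, a minimal standalone sketch of the working rule, with the placeholder condition from the thread dropped and only the robots.txt handling kept (the URI check, response body, noserver option, and Content-Type header are taken from the posts above):

when HTTP_REQUEST {
    # Serve a blocking robots.txt directly from the BIG-IP and suppress the
    # "Server: BigIP" banner on the locally generated response.
    if { [string tolower [HTTP::uri]] equals "/robots.txt" } {
        HTTP::respond 200 content "User-agent: *\r\nDisallow: /" noserver "Content-Type" "text/plain"
        log local0. "Attempt by [IP::client_addr] crawling our site"
    }
}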
- JWhitesPro_1928
Cirrostratus
I've seen somewhere in the config that you can customize the server type returned... it may be in the HTTP profile that you assign to the virtual server.
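The setting JWhitesPro is likely referring to is the HTTP profile's Server Agent Name (server-agent-name in tmsh), which controls the Server header value BIG-IP uses for responses it generates itself and defaults to "BigIP". A minimal sketch, assuming a hypothetical profile name http_custom_banner and banner string "webserver" (neither comes from this thread); the profile still has to be assigned to the virtual server:

# Create a custom HTTP profile with a non-default server banner.
# "http_custom_banner" and "webserver" are placeholders, not values from this thread.
tmsh create ltm profile http http_custom_banner defaults-from http server-agent-name "webserver"
# Persist the configuration change.
tmsh save sys config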