Forum Discussion

W__Ho_172333
Nimbostratus
Aug 17, 2015

Remove server banner when blocking robots

Hi all,

 

I have a requirement to block robots, and the following code works just fine:

 

when HTTP_REQUEST {
    if { .... } {
        some actions....
    } elseif { [string tolower [HTTP::uri]] equals "/robots.txt" } {
        HTTP::respond 200 content "User-agent: *\r\nDisallow: /"
        log local0. "Attempt by [IP::client_addr] crawling our site"
    }
}

 

However, this code will reveal the server banner as: server: BigIP

 

Any suggestion to remove the server banner?

 

Thanks,

 

9 Replies

  • Use the "noserver" option inside the respond, that should do the trick

    when HTTP_REQUEST { 
          if { .... }{   
                some actions.... 
    
         } elseif {[string tolower [HTTP::uri]] equals "/robots.txt"}{
    
                 HTTP::respond 200 noserver content "User-agent: *\r\nDisallow: /" 
                 log local0. "Attempt by [IP::client_addr] crawling our site" 
          }
    }
    
    • W__Ho_172333
      Nimbostratus
      Hi Michael, I did what you suggested. However, when pointing to the URL https://mysite.domain.com/robots.txt I now get a "Problem loading page" error:

      "Secure Connection Failed. The connection to the server was reset while the page was loading. The page you are trying to view cannot be shown because the authenticity of the received data could not be verified. Please contact the website owners to inform them of this problem."

      Any thoughts? Thank you.
  • ok, I placed the noserver option after the content ... now it is working

    when HTTP_REQUEST { 
          if { .... }{   
                some actions.... 
    
         } elseif {[string tolower [HTTP::uri]] equals "/robots.txt"}{
    
                 HTTP::respond 200 content "User-agent: *\r\nDisallow: /" noserver "Content-Type" "text/plain"
                 log local0. "Attempt by [IP::client_addr] crawling our site" 
          }
    }
    

    a quick test:

    [root@f5vm01:] ~  openssl s_client -quiet -connect f5vmvs01.local:443
    ...
    GET /robots.txt HTTP/1.1
    Host: f5vmvs01.local
    
    HTTP/1.0 200 OK
    Content-Type: text/plain
    Connection: Keep-Alive
    Content-Length: 26
    
    User-agent: *
    Disallow: /
    
  • I've seen somewhere in the config where you can customize the server type returned... it may be the Server Agent Name setting in the HTTP profile that you assign to the virtual server.
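
    If that setting is the one meant here, it should also be reachable from tmsh via the HTTP profile's server-agent-name property. A hedged sketch (the profile and virtual server names are placeholders, and blanking the value is assumed to suppress the header at the profile level):

    ```shell
    # Hypothetical example: create an HTTP profile with an empty
    # Server Agent Name, then attach it to the virtual server.
    # my_http_noserver and my_virtual are made-up names.
    tmsh create ltm profile http my_http_noserver server-agent-name ""
    tmsh modify ltm virtual my_virtual profiles replace-all-with { my_http_noserver }
    ```

    Unlike the iRule's noserver option, which only affects responses generated by HTTP::respond, a profile-level change would apply to all traffic through that virtual server.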