Forum Discussion

Matt_59095
Nimbostratus
Aug 18, 2011

IP to DNS reverse lookup + Geo location? Delimited format?

LTM 1500 9.4.8

Hi - we'd like to see where our website traffic is coming from, and from which companies. I'd prefer to log that in a format I can read into a database table, so I can create some friendly reports for management, if doable.

I'm sure there are iRules like this already set up in the forum, and many different ways to go about this, but I'm not having any luck finding one.

Can someone point me in the right direction to an iRule or forum topic? My iRule skills are still beginner -> medium, so please be gentle...

I have a WinSCP connection to the BIG-IP, so I'm able to pull logs to my SQL server.

  • Hi Aaron,

    Here is the exact rule, copied from my LTM. I haven't even checked logging yet - I'm just testing to see whether my webpage comes up after I apply the iRule, and it does not. I get the "no data received" message in my Chrome browser - error 324, ERR_EMPTY_RESPONSE. As I said, I have this iRule and a default pool - I wasn't sure whether I need to add the pool into the iRule itself somewhere for this all to process correctly? I'm content with logging this locally, since traffic load will be really light, and this will just process one log entry per session, so there shouldn't be a whole lot of data being logged.

     
     when CLIENT_ACCEPTED {
        # Add some logic for determining which clients to log for
        if { [matchclass [IP::client_addr] equals $::filteredAddresses] } {
           # Get time for start of TCP connection in milliseconds
           set tcp_start_time [clock clicks -milliseconds]
           # Log the start of a new TCP connection
           log "New TCP connection from [IP::client_addr]:[TCP::client_port] to [IP::local_addr]:[TCP::local_port]"
        } else {
           # Disable all events for this rule and any other rule for this connection
           event disable all
        }
     }
     when HTTP_REQUEST {
        # Get time for start of HTTP request
        set http_request_time [clock clicks -milliseconds]
        # Log the start of a new HTTP request
        set LogString "Client [IP::client_addr]:[TCP::client_port] -> [HTTP::host][HTTP::uri]"
        log local0. "$LogString (request)"
     }
     when HTTP_RESPONSE {
        # Received the response headers from the server. Log the pool name, IP and port, status and time delta
        log local0. "$LogString (response) - pool info: [LB::server] - status: [HTTP::status] (request/response\
           delta: [expr {[clock clicks -milliseconds] - $http_request_time}]ms)"
     }
     when CLIENT_CLOSED {
        # Log the end time of the TCP connection
        log "Closed TCP connection from [IP::client_addr]:[TCP::client_port] to [IP::local_addr]:[TCP::local_port]\
           (open for: [expr {[clock clicks -milliseconds] - $tcp_start_time}]ms)"
     }
     
  • That iRule shouldn't affect request or response handling as it's just logging. Do you see an error in /var/log/ltm when you see the 324 error? If you take the iRule off the virtual server, do you still see that error?

    Aaron
  • Aaron - if I enable the rule, the site does not come up; when I remove the iRule, the site comes back up.

    I'll check the log

  • I'm finally getting back around to this - here is the error in the logs when I try to apply the iRule:

    Nov 22 16:56:37 tmm tmm[1727]: 01220001:3: TCL error: MyIrule - can't read "::filteredAddresses": no such variable while executing "matchclass [IP::client_addr] equals $::filteredAddresses"

    Nov 22 16:56:37 tmm tmm[1727]: 01220001:3: TCL error: MyIrule - can't read "tcp_start_time": no such variable while executing "expr [clock clicks -milliseconds] - $tcp_start_time"

    This is on 9.4.6.
  • Do you have a datagroup named exactly filteredAddresses? If not, the runtime error is expected as the datagroup referenced from the iRule doesn't exist.

    It won't affect this error, but you should remove the $:: prefix from the iRule's reference to the datagroup name if you're on 9.4.4 or higher:

    http://devcentral.f5.com/wiki/iRules.cmpcompatibility.ashx
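
    For example, with a datagroup named filteredAddresses, the CMP-compatible reference on 9.4.4+ is the same logic as above, just without the $:: prefix:

     when CLIENT_ACCEPTED {
        # CMP-compatible on 9.4.4+: reference the datagroup by bare name
        if { [matchclass [IP::client_addr] equals filteredAddresses] } {
           log local0. "Logging enabled for [IP::client_addr]"
        }
     }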

    Aaron
  • Hi Aaron - no, I don't have a datagroup named filteredAddresses - do I need to make one, or modify the iRule? Sorry, I'm a little clueless.
  • You can create an address type data group in the GUI under Local Traffic >> iRules >> Data Group List tab >> Create. Select a type of 'Address' and then add the hosts and/or subnets you want to log for.

    Or if you want to log for all clients, you can skip using a datagroup and use an iRule like this:

    
    when CLIENT_ACCEPTED {
       # Get time for start of TCP connection in milliseconds
       set tcp_start_time [clock clicks -milliseconds]

       # Log the start of a new TCP connection
       log local0. "New TCP connection from [IP::client_addr]:[TCP::client_port] to [IP::local_addr]:[TCP::local_port]"
    }
    when HTTP_REQUEST {
       # Get time for start of HTTP request
       set http_request_time [clock clicks -milliseconds]

       # Log the start of a new HTTP request
       set LogString "Client [IP::client_addr]:[TCP::client_port] -> [HTTP::host][HTTP::uri]"
       log local0. "$LogString (request)"
    }
    when HTTP_RESPONSE {
       # Received the response headers from the server. Log the pool name, IP and port, status and time delta
       log local0. "$LogString (response) - pool info: [LB::server] - status: [HTTP::status] (request/response\
          delta: [expr {[clock clicks -milliseconds] - $http_request_time}]ms)"
    }
    when CLIENT_CLOSED {
       # Log the end time of the TCP connection
       log local0. "Closed TCP connection from [IP::client_addr]:[TCP::client_port] to [IP::local_addr]:[TCP::local_port]\
          (open for: [expr {[clock clicks -milliseconds] - $tcp_start_time}]ms)"
    }
    

    Aaron
  • Aaron - that works; however, I get an entry in the log for each page in the connection, and even for just the images on a page - so I get way too much log writing. I was trying to go with the filteredAddresses iRule because it seemed to log only one entry per client connection.
  • Hi Aaron - I know you're just helping people out of the goodness of your heart, and you have a ton of posts you reply to - but do you have any other comments on the first iRule I was trying to get working? Is there any way to set up filteredAddresses to equal the client's IP during the connection, so I can simply log one IP per "session", or is there a better way to do this?

    I'm really just trying to get one entry per IP per day, or the closest I can come to that, so I don't have to do unique lookups in a database table after importing the data each day (and so I keep log writing on the LTM to a minimum), then set up some reports for upper management.

    Thanks for getting back to me on the previous posts.
  • Hi Matt,

    Sorry, I missed your second question last week.

    To log one entry per client IP per day, you'd need to track whether you've logged already. Here are a few options I can think of:

    - If your clients generally support cookies, you could try setting a cookie which expires in 24 hours or at midnight your time. The downside to this is that you'd log every HTTP request for clients who don't support cookies.
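
    As a rough sketch of that cookie idea (the cookie name logged24h is made up; HTTP::cookie expires with the relative flag sets the expiry in seconds from now):

     when HTTP_REQUEST {
        # Only log clients that don't already carry the marker cookie
        if { not [HTTP::cookie exists "logged24h"] } {
           set mark_client 1
           log local0. "Client [IP::client_addr] -> [HTTP::host][HTTP::uri]"
        } else {
           set mark_client 0
        }
     }
     when HTTP_RESPONSE {
        if { [info exists mark_client] && $mark_client } {
           # Marker cookie expires 24 hours (86400s) after it's set
           HTTP::cookie insert name "logged24h" value "1" path "/"
           HTTP::cookie expires "logged24h" 86400 relative
           set mark_client 0
        }
     }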

    - You could try to track each unique client IP per day in a memory subtable on LTM. The downside to this is that you could use up a lot of LTM memory just for this logging.
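
    As a sketch only - the table command below is v10.1+, so it wouldn't run on 9.4.x, and the subtable name daily_ips is made up:

     when HTTP_REQUEST {
        # Log only if this client IP hasn't been seen in the last 24 hours
        if { [table lookup -subtable daily_ips [IP::client_addr]] eq "" } {
           # The 86400s lifetime bounds how long each entry uses memory
           table set -subtable daily_ips [IP::client_addr] 1 indefinite 86400
           log local0. "First request today from [IP::client_addr] -> [HTTP::host][HTTP::uri]"
        }
     }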

    - You could log everything but not to the local LTM filesystem. On 9.4.x, your most efficient option would be to use the log command with a remote syslog server:

    From: http://devcentral.f5.com/wiki/iRules.log.ashx

    log <remote_ip>:<remote_port> <facility>.<level> <message>

    With this option, you could run the syslog server on the SQL server, parse the syslog messages from a file into a SQL table, extract the unique IPs per day, and then run your IP lookup only once per IP.
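
    As a rough sketch of the parse step - plain Tcl run with tclsh on the log host, not an iRule; the log path and message format are assumptions based on the logging rule above:

     #!/usr/bin/env tclsh
     # Collect the unique client IPs from the syslog copy of the iRule's
     # "Client <ip>:<port> -> <host><uri> (request)" messages
     set seen [dict create]
     set f [open "/var/log/remote/ltm.log" r]
     while {[gets $f line] >= 0} {
         if {[regexp {Client (\d+\.\d+\.\d+\.\d+):\d+} $line -> ip]} {
             dict set seen $ip 1
         }
     }
     close $f
     # One line per unique IP; reverse DNS / geo lookup can then run once per IP
     foreach ip [lsort [dict keys $seen]] {
         puts $ip
     }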

    In v10.1 - 10.2.x, you can use High Speed Logging to do this even more efficiently: http://devcentral.f5.com/wiki/iRules.hsl.ashx
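
    For reference, a minimal HSL sketch for v10.1+ (the pool name syslog_pool is made up - it would be a pool of your syslog servers):

     when HTTP_REQUEST {
        # Open (or reuse) a high-speed logging handle to the syslog pool
        set hsl [HSL::open -proto UDP -pool syslog_pool]
        # <190> = local7.info syslog priority
        HSL::send $hsl "<190> Client [IP::client_addr] -> [HTTP::host][HTTP::uri]\n"
     }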

    In v11, you can use a new logging profile to configure this via the GUI. You can check the WA implementation guide for details on this (the info is valid for LTM as well): http://support.f5.com/content/kb/en-us/products/wa/manuals/product/wa_implementations_11_0_0/_jcr_content/pdfAttach/download/file.res/wa_implementations_11_0_0.pdf

    I think the last option is the best from an LTM perspective as it offloads all of the logic for what to log and parse to a separate host. If you want details on any of these approaches let me know and I can provide some more info.

    Aaron