Forum Discussion

WildWeasel
Jun 08, 2016

How could I use iRule and Data Group vs Query Logging on External GTM's

We have an environment where we have an external view on our Infoblox servers that syncs with our external (public/internet-facing) GTMs.

 

I have about 300+ domains that I need to validate are no longer being used, and my old DNS days tell me to turn on query logging on my 4 external-facing GTMs to capture any attempts on those GTMs for a minimum of 30 days, which would cover well over 1000+ domains, then search that data for the 300+ domains I am researching. If I go that route, I'll try to utilize Splunk to parse the data, but there are a few issues with this approach: I'm not sure our Splunk server(s) have the capacity for a month's worth of queries across 1000+ domains, and the GTMs are already highly utilized in memory and CPU, so I'm afraid adding query logging will kill them.

 

A better option, I believe, would be to utilize an iRule and a Data Group, but I'm not quite sure how to assemble this solution.

 

My thought is to create a Data Group (DG) with either the 300+ specific domains or, hopefully, partial entries acting as wildcards, so instead of listing blah.com I could use something like la.com to cover multiple domains I want to capture data on.
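For reference, a string data group can be built from tmsh along these lines. This is a sketch only: the name `unused_domains` is a placeholder, and data groups have no wildcard syntax of their own; broad matching comes from how the iRule queries them (e.g. `ends_with` or `contains` in `[class match]`):

```
# Hypothetical: internal string data group holding the domains to watch.
# "unused_domains" is a placeholder name.
create ltm data-group internal unused_domains type string records add { blah.com { } la.com { } }
```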

 

Create an iRule that looks at each incoming DNS request, converts the query name to all lower case (since everything seems so case sensitive), and checks whether it matches a domain in the DG; if so, send something like "domain blah.com is in use, record type is [DNS::rrtype]" to the log.
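Something along these lines might work as a starting point. It is a sketch, not tested on a GTM: the data group name `unused_domains` is an assumption, and I've used `[DNS::question type]` for the record type, which is the documented command in the DNS iRule namespace:

```
when DNS_REQUEST {
    # Normalize case so mixed-case queries still match the data group
    set qname [string tolower [DNS::question name]]

    # "ends_with" lets one entry like la.com cover any name ending in it
    if { [class match $qname ends_with unused_domains] } {
        log local0. "domain $qname is in use, record type is [DNS::question type]"
    }
}
```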

 

Anyone have any thoughts on this approach and what the iRule may look like?

 

3 Replies

  • RFC 1035 (section 2.3.3) states that domain name comparisons must be case-insensitive, i.e., domain.com and DOMAIN.COM should be treated the same. Are you seeing different behavior on the GTM?

     

  • This should log all DNS requests and you can probably grep for the domain name and then do a diff with the domain list that you have:

    when DNS_REQUEST {
        log local0. "QUERY from [IP::client_addr] for [DNS::question name]"
    }
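Once you have a month of those log lines, the grep-and-diff step could look something like this shell sketch. The log line format and file paths are assumptions; sample data stands in for the real log extract:

```shell
# Assumed log line format: QUERY from (10.1.2.3) for www.blah.com
# Sample data in place of the real GTM log extract.
printf 'QUERY from (10.1.2.3) for www.blah.com\nQUERY from (10.1.2.4) for api.other.net\n' > /tmp/queries.log
printf 'blah.com\nunused.org\n' | sort > /tmp/candidates.txt

# Reduce each queried name to its last two labels, dedupe, then
# intersect with the candidate list: any domain printed is still in use.
awk '{print $NF}' /tmp/queries.log \
  | awk -F. '{print $(NF-1)"."$NF}' \
  | sort -u > /tmp/seen.txt
comm -12 /tmp/candidates.txt /tmp/seen.txt
# prints: blah.com
```

Domains from your list that never show up in the intersection over the capture window are your candidates for decommissioning.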
    
  • I'd say put all the 300 domains in question into a data group, then for each request where the domain matches an entry in the data group, send a log message to your Splunk. Just make sure you do not write anything into the default logs on the BIG-IP box itself; you can use high-speed logging (HSL) for your purpose.
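    A high-speed-logging version might look like the sketch below. The pool name `splunk_hsl_pool` (a pool pointing at your Splunk collector) and the data group name `unused_domains` are assumptions:

    ```
    when DNS_REQUEST {
        set qname [string tolower [DNS::question name]]
        if { [class match $qname ends_with unused_domains] } {
            # HSL sends via TMM to a pool of log collectors, bypassing
            # local syslog so nothing lands in the BIG-IP's own logs
            set hsl [HSL::open -proto UDP -pool splunk_hsl_pool]
            HSL::send $hsl "domain=$qname type=[DNS::question type] client=[IP::client_addr]\n"
        }
    }
    ```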