Forum Discussion

JackM
Dec 18, 2013

Need help tracking down duplicate request objects; thought iRules may be helpful

I thought iRules might be a good utility for this. On very rare occasions (unreproducibly), we believe we are getting duplicate request objects, which then seems to cause the response not to get back to the browser. Wireshark and IIS logs are not helping; there is too much traffic to isolate.

We thought perhaps irule could tag and monitor for duplicates. Or perhaps another mechanism we may not have thought of.

Any help is greatly appreciated.

3 Replies

  • Can you clarify what you suspect is happening? What are these duplicate request objects?
  • JackM:
    Sure. Keep in mind that this is what we are seeing, not necessarily what is happening exactly. This has been a real head scratcher, and our biggest problem is the inability to trap it happening so we can examine it, which is what I'm hoping to get some ideas for. On an HTTPS order form submission, it seems like roughly 1 in 500 orders gets submitted twice, maybe even less frequently. Given the regularity, and the timestamps of the orders being off by milliseconds to seconds, we don't believe it is a double click. We think something is causing the request to get sent to two servers in the pool.
  • You may have already figured something out given the delayed response, but it would certainly be possible to capture, log, and match on duplicate POST events in an iRule. Also, given that you're talking about at most a few seconds of separation between events, a relatively simple session table structure might be fitting. Un-vetted example:

    when HTTP_REQUEST {
        # Only collect the payload on POST requests
        if { [HTTP::method] equals "POST" } {
            HTTP::collect [HTTP::header Content-Length]
        }
    }
    when HTTP_REQUEST_DATA {
        if { [table lookup -subtable "POSTCHECK" [HTTP::payload]] ne "" } {
            # Found a match: the same payload was seen within the last 10 seconds
            # ...now what? Log it, reject it, or flag it for analysis, e.g.:
            log local0. "Duplicate POST payload from [IP::client_addr]"
        } else {
            # First sighting: remember this payload for about 10 seconds
            table add -subtable "POSTCHECK" [HTTP::payload] 1 indef 10
        }
    }

    Again, a completely untested example, but the idea is that a POST is collected and examined. If the POST payload (i.e. username=foo&password=bar&buy=this) already exists in the table, do something. Otherwise, add the POST payload entry to the table for about 10 seconds. There's no mechanism in this example to prevent storing a very large POST payload, so you might need to take that into consideration; even better if you only need to filter on specific pieces of the data. It's also important to realize that this only captures client-side events. If the browser is doing this (the most likely culprit in my opinion), then you should be able to see it. You might also want to store the client IP, browser version, and/or username in the table for comparison. If this is happening on the server side, somewhere in the web server code, then you won't see it here. It's highly unlikely that the load balancer is producing this effect.
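
    As a sketch of those tweaks (also completely untested; the 1 MB collection cap, the md5-digest key, and the log format are my own assumptions, not anything from the example above): hashing the client IP plus the payload keeps the table key small no matter how large the POST is, and the URI and User-Agent are stashed in HTTP_REQUEST so they're available for logging when a duplicate shows up:

    when HTTP_REQUEST {
        if { [HTTP::method] equals "POST" } {
            # Stash request details now so they can be logged from HTTP_REQUEST_DATA
            set post_uri [HTTP::uri]
            set post_ua [HTTP::header User-Agent]
            # Only collect bounded payloads; the 1 MB cap is arbitrary, and chunked
            # requests (no Content-Length) are simply skipped
            set clen [HTTP::header Content-Length]
            if { [string is integer -strict $clen] && $clen > 0 && $clen <= 1048576 } {
                HTTP::collect $clen
            }
        }
    }
    when HTTP_REQUEST_DATA {
        # Key on a fixed-length md5 digest of client IP + payload instead of
        # storing the raw payload in the table
        binary scan [md5 "[IP::client_addr]:[HTTP::payload]"] H* digest
        if { [table lookup -subtable "POSTCHECK" $digest] ne "" } {
            # Duplicate within the 10-second window: log enough to investigate
            log local0. "Duplicate POST: client=[IP::client_addr] uri=$post_uri ua=$post_ua"
        } else {
            table add -subtable "POSTCHECK" $digest 1 indef 10
        }
    }

    Note that folding the client IP into the digest means only repeats from the same source address will match; drop the IP from the hashed string if you want to catch duplicates arriving from different addresses.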