Forum Discussion

Adam_Fradley_16
Nimbostratus
Oct 06, 2010

Multiple actions from single URL with iRule

Hi,

I need to give my customers the ability to run a server-side action (flushing our custom software's cache) from a URL.

I have written a page that, when accessed on a server via HTTP, flushes the cache, but it only does so on the specific server that the load balancer happens to choose, so I need a way of sending four URLs to clear all four servers.

My idea is to give the customer a unique URL, say "http://customer.com/flush", and then write an iRule that looks for "/flush" and fires off:

http://1/serverflush
http://2/serverflush
http://3/serverflush
http://4/serverflush

We don't care what the responses from the individual servers are, just that they receive the request from the F5.

It's not exactly a fantastic plan, I know, but it only has to work for the next month while we rewrite our software to handle this properly.

Does anyone have an example of an iRule that can do this type of action?

Thanks,

Adam

  • Hey Adam,

    I suspect that there are better solutions than this, but the first thing that comes to mind is to use the node command ( http://devcentral.f5.com/wiki/default.aspx/iRules/node.html ) to send the requests to the specific servers. This would be fairly simple. The iRule to do this might look something like:

    when HTTP_REQUEST {
        switch -glob [string tolower [HTTP::path]] {
            "/flush/1/" { node 10.0.0.1 80 }
            "/flush/2/" { node 10.0.0.2 80 }
            "/flush/3/" { node 10.0.0.3 80 }
            "/flush/4/" { node 10.0.0.4 80 }
        }
    }

    I don't believe that there is a method to turn a single HTTP request from a client into multiple requests to nodes. The HTTP::retry command might be able to accomplish this, but I'm not having much luck figuring out the best way to iterate through all of the pool members in a retry attempt - or even if this would work.
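
    If you do want to experiment with that idea anyway, here is a rough, untested sketch of what an HTTP::retry loop might look like. The pool name flush_pool, the /flush trigger path, and the variable names are just placeholders, and it assumes the members -list command is available on your version:

    when HTTP_REQUEST {
        if { [string tolower [HTTP::path]] eq "/flush" } {
            # Remember the request and the pool members that still need to be hit
            set flush_request [HTTP::request]
            set remaining [members -list flush_pool]
        }
    }
    when HTTP_RESPONSE {
        # While members remain, point the connection at the next one and
        # replay the saved request against it
        if { [info exists remaining] && [llength $remaining] > 0 } {
            set next [lindex $remaining 0]
            set remaining [lrange $remaining 1 end]
            node [lindex $next 0] [lindex $next 1]
            HTTP::retry $flush_request
        }
    }

    The client would only see the last response, and the member picked by the initial load-balancing decision could get hit twice, so treat it purely as a starting point.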


    What might actually be a better solution is to modify the switch statement above to respond with a page that would call itself several times to do this. Something like...

    when HTTP_REQUEST {
        switch -glob [string tolower [HTTP::path]] {
            "/flush/" {
                # Return a small page whose img tags make the browser
                # automatically request each per-server flush URL
                HTTP::respond 200 content {
                    <html>
                    <head><title>Cache Flushing</title></head>
                    <body>
                    <h1>Currently Flushing Cache</h1>
                    <img src="/flush/1/" />
                    <img src="/flush/2/" />
                    <img src="/flush/3/" />
                    <img src="/flush/4/" />
                    </body>
                    </html>
                }
            }
            "/flush/1/" { node 10.0.0.1 80 }
            "/flush/2/" { node 10.0.0.2 80 }
            "/flush/3/" { node 10.0.0.3 80 }
            "/flush/4/" { node 10.0.0.4 80 }
        }
    }

    I haven't tested this, but it at least parses. You could use any type of link that a browser will follow automatically: img tags, javascript links, etc. Anything that gets the browser to request those URLs on its own.

    Hope this helps some!

    // Ben