Forum Discussion

Ger_Hackett_323
Aug 25, 2011

Limit number of resource intensive pages being served

Hi All,

I have a website that contains a mixture of static and dynamic pages. Some of the dynamic pages are very resource intensive and can be identified by examining the URI.

I would like to limit the number of these resource intensive pages that can be served at any one time. Does the rule below look like it might do the trick? It looks too simple to be true. Having reviewed other examples, they tend to decrement the counter when the connection is closed, i.e. when the CLIENT_CLOSED event fires. I have chosen to decrement the counter when the HTTP_RESPONSE event fires, because at that stage all of the hard work of generating the response should have been completed.

rule limit_specific_pages {

    when RULE_INIT {
        set ::requests_in_progress 0
        set ::max_pages 20
    }

    when HTTP_REQUEST {
        if { [HTTP::uri] starts_with "/abc" } {
            # This is a resource intensive page, so check how many
            # are currently in progress before we serve it
            if { $::requests_in_progress >= $::max_pages } {
                # There are too many resource intensive pages being served
                # at the moment, so send the user to the too-busy page.
                # Note: the redirect is answered by the BIG-IP itself, so
                # HTTP_RESPONSE never fires for it; the counter must not
                # be incremented on this path or it would never be
                # decremented again.
                set was_resource_intensive 0
                HTTP::redirect "http://mydomain.com/too.busy"
                return
            } else {
                # We are under the limit, so count this request and serve it
                set was_resource_intensive 1
                incr ::requests_in_progress
                pool A
            }
        } else {
            # This is just a request for a regular page
            set was_resource_intensive 0
            pool A
        }
    }

    when HTTP_RESPONSE {
        # Now that the page has been processed, if it was a resource
        # intensive one, decrement the counter
        if { [info exists was_resource_intensive] && $was_resource_intensive } {
            incr ::requests_in_progress -1
            set was_resource_intensive 0
        }
    }
}
  • Hamish

    Looks OK at first glance. Decrementing when HTTP_RESPONSE is received is a good idea, assuming the CPU consumption is complete by the time HTTP_RESPONSE fires. That event fires before the data has finished being sent, so if the page streams back slowly while CPU is still being consumed, the counter will be decremented a little too soon.
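If that early decrement ever matters in practice, the alternative the original post mentions (decrementing in CLIENT_CLOSED, after the connection is torn down and the last byte has left) would look roughly like the sketch below. This is untested and reuses the same variable names as the rule above; note the trade-off that CLIENT_CLOSED fires once per connection, not per request, so on a keep-alive connection that served several /abc pages it would only balance one increment:

```tcl
when CLIENT_CLOSED {
    # Runs when the client connection is torn down, i.e. after the
    # whole response has been sent. With keep-alive connections this
    # fires once per connection, not per request, so it only balances
    # the counter if at most one /abc request was counted on it.
    if { [info exists was_resource_intensive] && $was_resource_intensive } {
        incr ::requests_in_progress -1
    }
}
```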

     

     

    Instead of sending to the too.busy page, I'd do something a bit clever. Assuming this is always for a full page (or an iframe etc.) and never for a piece of a page, you could send a quick piece of .js that simply counts down for X seconds and then retries automatically. I'd suggest queuing, but that way lies depletion of resources as the queue grows... a two-tier approach, perhaps? A small queue plus auto-retry? (Sorry, dreaming out loud here :)

    H
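Hamish's auto-retry idea could be sketched directly in the over-limit branch of the rule: instead of HTTP::redirect, answer with a small page whose script reloads itself after a delay, which re-requests the same /abc URI. This is an untested sketch; the 503 status, the 5-second delay, and the page wording are placeholder choices, not anything from the thread:

```tcl
# In the over-limit branch, instead of HTTP::redirect:
HTTP::respond 503 content {
    <html><head>
    <script>setTimeout(function () { location.reload(); }, 5000);</script>
    </head><body>The site is busy; retrying in 5 seconds...</body></html>
} "Content-Type" "text/html" "Retry-After" "5"
```

Since the BIG-IP answers this response itself, HTTP_RESPONSE does not fire for it, so (as with the redirect) the counter must not be incremented on this path.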
  • Hi Hamish,

    Thanks for the detailed reply. I will be doing something a little more sophisticated for the “too busy” page; I just wanted to keep the code example simple.

    Your observation on when the HTTP_RESPONSE event fires is something I was also concerned about. I have done some tests, and the latency between the first header and the last packet of the response is typically less than a second. So while the CPU will still be a little busy when the event fires, I can adjust the max_pages value to compensate for this.

    Thanks again,

    Ger.