We have a web application that is expected to receive incredibly high traffic at several points throughout the year. We currently have third-party load balancing software that redirects users to a 'holding' page during busy periods, to prevent our web application servers from being suffocated by the volume of requests coming in.
Going forward we would like to have more control over this process and implement a virtual queue of some kind. The current load balancer has no queuing functionality; it simply allows traffic through based on a rate limit. This is random and is pot luck on when you refresh the page (or get auto-refreshed).
I've done some reading online but found little implementation detail on how to implement a basic virtual HTTP request queue. There are, of course, companies that offer a fully fledged service for this, such as Queue-it and Netprecept, but these seem like overkill for our current needs (and are expensive).
The web application in question is written in ASP.NET MVC. Bearing in mind that we do not need advanced features like 'queue priority' etc. at the moment, I have created a basic proof-of-concept using a static queue manager class, using a ConcurrentQueue&lt;T&gt; etc. But I am wondering: is this a valid, scalable approach? Can it be part of the main application layer? Or should it be kept separate? Does anyone have the technical know-how on how to implement this kind of feature in an ASP.NET MVC app?
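For reference, here is a stripped-down sketch of the sort of thing I mean – illustrative only; the class name, capacity, and polling approach below are placeholders rather than our actual proof-of-concept:

    using System;
    using System.Collections.Concurrent;

    // Illustrative sketch only: a static gatekeeper that admits a fixed number
    // of concurrent visitors and queues the rest. Names and limits are placeholders.
    public static class QueueManager
    {
        private const int MaxActiveVisitors = 500;   // assumed capacity
        private static readonly object Sync = new object();
        private static int _activeVisitors;
        private static readonly ConcurrentQueue<Guid> Waiting = new ConcurrentQueue<Guid>();

        // Called when a visitor first arrives: admit them immediately if there
        // is room and nobody is already waiting, otherwise queue them and
        // redirect them to the holding page.
        public static bool TryEnter(Guid visitorId)
        {
            lock (Sync)
            {
                if (_activeVisitors < MaxActiveVisitors && Waiting.IsEmpty)
                {
                    _activeVisitors++;
                    return true;
                }
                Waiting.Enqueue(visitorId);
                return false;
            }
        }

        // Polled by the holding page (e.g. via AJAX or an auto-refresh) to
        // check whether it is this visitor's turn yet.
        public static bool TryAdmit(Guid visitorId)
        {
            lock (Sync)
            {
                Guid next;
                if (_activeVisitors < MaxActiveVisitors
                    && Waiting.TryPeek(out next) && next == visitorId)
                {
                    Waiting.TryDequeue(out next);
                    _activeVisitors++;
                    return true;
                }
                return false;
            }
        }

        // Called when a visitor completes their order (or their session expires).
        public static void Leave()
        {
            lock (Sync) { _activeVisitors--; }
        }
    }

Is something along these lines sensible, or does it fall over as soon as we scale beyond a single web server (the static state obviously lives in one process)?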
Edit: thanks for the answers so far. Some of the answers go into a lot of detail about caching. This is already (very) heavily employed on our website, using ASP.NET web caching: full-page requests are cached at the load balancer level, and object caching is done using AppFabric.
The reason we need the ability to manage a queue is because the process is database-write heavy. We're effectively creating orders for a product via the website, which means these DB transactions are taking things into account like last-minute stock checking etc. This is where the performance issues arise, and it is the reason for wanting to implement a queuing system of some kind.
Throwing more resources at the database server is not a realistic option. I'm looking for details of technical implementations of a queuing system of this nature (in C# or otherwise). Sorry if this wasn't made clear originally.
Are you considering the following points while measuring the performance of your application?
- Caching
- Sessionless controllers
- Async controllers
Output caching:
Perhaps the most useful feature of MVC3 (performance-wise) is output caching. The biggest performance hits occur when your application has to fetch data, perform calculations on it, and return the data. Output caching can cache these results so they can be returned directly without touching the database. When executing complex queries this can drop the load on your server significantly (in fact, you can drop the load on your server by a whopping 90% by implementing caching in your web application).
    using System;
    using System.Web.Mvc;

    namespace MvcApplication1.Controllers
    {
        public class DataController : Controller
        {
            [OutputCache(Duration = 10)]
            public string Index()
            {
                return DateTime.Now.ToString("T");
            }
        }
    }
Sessionless controllers:
Controllers with session state disabled provide an optimization for controllers that do not require session state. Stateless controllers are meant for situations where you do not require the concept of a session.
By default the ASP.NET pipeline will not process requests belonging to the same session concurrently. It serialises them, i.e. it queues them in the order they were received so that they are processed serially rather than in parallel. This means that if a request is in progress and another request from the same session arrives, it will be queued and will only begin executing when the first request has finished.
Let's look at an example: a page making 3 asynchronous AJAX requests to the server, with session state enabled (also note that the session must actually be used; ASP.NET is smart enough not to serialise requests if you never use session state, even if it's enabled).
jQuery
    $(document).ready(function () {
        // Make 3 concurrent requests to /AjaxTest/Test
        for (var i = 0; i < 3; i++) {
            $.post("/AjaxTest/Test/" + i, function (data) {
                // Do something with the data...
            }, "json");
        }
    });
Controller – action method
    public class AjaxTestController : Controller
    {
        [HttpPost]
        public JsonResult Test(int? id)
        {
            Thread.Sleep(500);
            return Json(/*some object*/);
        }
    }
You can see the effect of the serialised requests in the network profile: each request takes 500ms longer than the previous one. This means we're not getting the benefit of making these AJAX calls asynchronously. Let's look at the profile again with session state disabled on our AjaxTestController (using the [SessionState] attribute).
    [SessionState(SessionStateBehavior.Disabled)]
    public class AjaxTestController : Controller
    {
        // ...as above
    }
Much better! You can see how the 3 requests are now being processed in parallel, and they take a total of 500ms to complete, rather than the 1500ms we saw in our first example.
Async controllers:
First, the controller begins one or more external I/O calls (e.g., SQL database calls or web service calls). Without waiting for them to complete, it releases the thread back into the ASP.NET worker thread pool so that it can deal with other requests.
Later, when all of the external I/O calls have completed, the underlying ASP.NET platform grabs another free worker thread from the pool, reattaches it to your original HTTP context, and lets it complete handling the original request.
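As a minimal sketch of this pattern (the controller name and the external call below are my own placeholders, not from your application), the classic AsyncController convention looks like this:

    using System;
    using System.Net;
    using System.Web.Mvc;

    // Illustrative sketch: the IndexAsync/IndexCompleted pair is the standard
    // AsyncController convention; the external call here is a placeholder.
    public class StockController : AsyncController
    {
        public void IndexAsync()
        {
            AsyncManager.OutstandingOperations.Increment();

            var client = new WebClient();
            client.DownloadStringCompleted += (sender, e) =>
            {
                // Hand the result to the Completed method, then signal
                // that the outstanding operation has finished.
                AsyncManager.Parameters["result"] = e.Result;
                AsyncManager.OutstandingOperations.Decrement();
            };
            // Hypothetical external service URL.
            client.DownloadStringAsync(new Uri("http://example.com/stock-levels"));
        }

        // Invoked by ASP.NET on a (possibly different) worker thread once
        // OutstandingOperations reaches zero.
        public ActionResult IndexCompleted(string result)
        {
            return Content(result);
        }
    }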
How do you measure response time under heavy traffic?
To understand how asynchronous controllers respond to differing levels of traffic, and how this compares with a straightforward synchronous controller, you can create a pair of sample MVC 2 controllers. To simulate a long-running external operation, both perform a SQL query that takes 2 seconds to complete (using the SQL command WAITFOR DELAY '00:00:02') and then return the same fixed text to the browser. One of them does it synchronously; the other asynchronously.
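A sketch of the two controllers under those assumptions (the connection strings and class names are placeholders; the asynchronous one reuses the AsyncManager pattern shown above):

    using System.Data.SqlClient;
    using System.Web.Mvc;

    // Synchronous version: the worker thread is blocked for the full 2 seconds.
    public class SyncTestController : Controller
    {
        public ActionResult Index()
        {
            using (var conn = new SqlConnection("...connection string..."))
            using (var cmd = new SqlCommand("WAITFOR DELAY '00:00:02'", conn))
            {
                conn.Open();
                cmd.ExecuteNonQuery();   // blocks a thread-pool thread
            }
            return Content("Done");
        }
    }

    // Asynchronous version: the worker thread is released while SQL waits.
    // Requires "Asynchronous Processing=true" in the connection string.
    public class AsyncTestController : AsyncController
    {
        public void IndexAsync()
        {
            AsyncManager.OutstandingOperations.Increment();

            var conn = new SqlConnection("...;Asynchronous Processing=true");
            var cmd = new SqlCommand("WAITFOR DELAY '00:00:02'", conn);
            conn.Open();
            cmd.BeginExecuteNonQuery(ar =>
            {
                cmd.EndExecuteNonQuery(ar);
                conn.Dispose();
                AsyncManager.OutstandingOperations.Decrement();
            }, null);
        }

        public ActionResult IndexCompleted()
        {
            return Content("Done");
        }
    }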
You can check this with a simple C# console application that simulates heavy traffic hitting a given URL. It requests the same URL over and over, calculating a rolling average of the last few response times. First it does so on one thread, then it gradually increases the number of concurrent threads to 150 over a 30-minute period. If you want to try running this tool against your own site, you can download the C# source code.
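If the download is not available, a stripped-down version of such a tool might look like this (the target URL, ramp rate, and rolling-window size are my assumptions):

    using System;
    using System.Collections.Concurrent;
    using System.Diagnostics;
    using System.Net;
    using System.Threading;

    // A simple load generator: ramps from 1 to 150 concurrent threads, each
    // repeatedly requesting the same URL, and reports a rolling average of
    // the last 100 response times.
    class LoadTester
    {
        static readonly ConcurrentQueue<long> RecentTimings = new ConcurrentQueue<long>();

        static void Main()
        {
            const string url = "http://localhost/SyncTest/Index"; // hypothetical target

            for (int threads = 1; threads <= 150; threads++)
            {
                new Thread(() => HammerUrl(url)) { IsBackground = true }.Start();
                Thread.Sleep(12000); // one new thread every 12s ≈ 30 minutes total
                Console.WriteLine("{0,3} threads, avg response: {1:F0} ms",
                                  threads, RollingAverage());
            }
        }

        static void HammerUrl(string url)
        {
            using (var client = new WebClient())
            {
                while (true)
                {
                    var sw = Stopwatch.StartNew();
                    client.DownloadString(url);
                    RecentTimings.Enqueue(sw.ElapsedMilliseconds);

                    long discarded;
                    while (RecentTimings.Count > 100)
                        RecentTimings.TryDequeue(out discarded);
                }
            }
        }

        static double RollingAverage()
        {
            long sum = 0;
            int count = 0;
            foreach (var t in RecentTimings) { sum += t; count++; }
            return count == 0 ? 0 : (double)sum / count;
        }
    }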
The results illustrate a number of points about how asynchronous requests perform. Check out the graph of average response times versus the number of concurrent requests (lower response times are better):
To understand this, I first need to tell you that I had set the ASP.NET MVC application's worker thread pool to an artificially low maximum limit of 50 worker threads. The server has a default max thread pool size of 200 – a more sensible limit – but the results are made clearer if it is reduced. As you can see, the synchronous and asynchronous requests performed exactly the same as long as there were enough worker threads to go around. And why shouldn't they? But once the thread pool was exhausted (> 50 clients), the synchronous requests had to form a queue to be serviced. Basic queuing theory tells us that the average time spent waiting in a queue is given (roughly) by the formula

W = (L × Ts) / N

where L is the number of queued requests, Ts is the average service time, and N is the number of service channels.

And that is exactly what we see in the graph: the queuing time grows linearly with the length of the queue. (Apologies for the indulgence in using a formula – I can't suppress my inner mathematician. I'll get therapy if it becomes a problem.) The asynchronous requests didn't need to start queuing so soon, though. Because they don't need to block a worker thread while waiting, the thread pool limit wasn't an issue for them. So why did they start queuing when there were more than 100 clients? It's because the ADO.NET connection pool is limited to 100 concurrent connections by default.
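To put a number on that, using the figures from this test: with Ts = 2 seconds, N = 50 worker threads, and L = 100 queued requests, the expected queuing time is W = (100 × 2) / 50 = 4 seconds, on top of the 2-second service time itself.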
Hope this helps you.