I'm reading about the different approaches that node.js, Ruby, Jetty, and others take to scaling request-handling capacity on a single machine.
As an application developer with very little understanding of kernel/networking internals, I'm curious about the different approaches each implementation takes (kernel select, polling the socket for connections, event-based I/O, and so on).
Please note that I'm not asking about special handling features (such as Jetty continuations (request -> wait -> request), a pattern typical for AJAX clients) but, more generally: if you wanted to implement a server that can respond with "Hello World" to the maximum number of concurrent clients, how would you do it, and why?
Information / References to reading material would be great.
Take a look at The C10K problem page.
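To make the event-driven approach that page discusses concrete, here is a minimal sketch (not production code) of a single-threaded "Hello World" server built on Python's selectors module, which wraps epoll/kqueue/select depending on the platform. The port, backlog, and buffer sizes are arbitrary choices for illustration; a real server would also parse requests and buffer partial writes.

```python
import selectors
import socket

RESPONSE = (
    b"HTTP/1.1 200 OK\r\n"
    b"Content-Type: text/plain\r\n"
    b"Content-Length: 11\r\n"
    b"Connection: close\r\n"
    b"\r\n"
    b"Hello World"
)

sel = selectors.DefaultSelector()

def accept(listener):
    # A client connected: make its socket non-blocking and register it
    # so the event loop wakes us only when it has data to read.
    conn, _addr = listener.accept()
    conn.setblocking(False)
    sel.register(conn, selectors.EVENT_READ, handle)

def handle(conn):
    # Read whatever request bytes have arrived, reply, and close.
    # (A real server would parse the request and handle partial writes.)
    conn.recv(4096)
    conn.sendall(RESPONSE)
    sel.unregister(conn)
    conn.close()

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
listener.bind(("0.0.0.0", 8080))
listener.listen(1024)
listener.setblocking(False)
sel.register(listener, selectors.EVENT_READ, accept)

# Single-threaded event loop: block in the kernel until some socket is
# ready, then dispatch the callback registered for it. No thread or
# process per connection, which is the core of the C10K-style designs
# used by node.js and similar servers.
while True:
    for key, _mask in sel.select():
        key.data(key.fileobj)
```

The alternative family of designs (thread-per-connection or process-per-connection, as in classic Apache prefork or a naive Java servlet container) is simpler to write but pays per-client memory and scheduling costs, which is exactly the trade-off the C10K page walks through.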