Suppose we want to design a Java EE web application that must respond with high performance and efficiency, since a very large number of users may send requests to its services concurrently.
I'm thinking of deploying the application instances onto different application servers (JBoss, Tomcat, etc.), so we have the notion of a "cluster of app servers". The application also requires some of its data to be processed by different engines, which in turn run on different servers.
So we also have the notion of a "cluster of engine servers". Now suppose the application is to be accessed from a single point, e.g. a single URL. How do we map this URL to the cluster of app servers? What strategies should be used to design such a system, consisting of app servers and other servers that need to communicate with each other as efficiently as possible?
Any solutions and suggestions are welcome. If you have dealt with similar real-world problems, can you help me? Suggestions for tutorials, e-books, and samples are also welcome.
To answer (some of) your questions:
- How to relate this URL with the cluster of app servers? Put a load balancer in front of your web servers -- preferably a hardware one (faster), but otherwise you can build one on normal commodity hardware (e.g. haproxy); see the haproxy sketch after this list.
- What kind of strategies should be implemented to design such a system consisting of app servers and other servers which are required to communicate as efficiently as possible? It depends on the type of data you are exchanging and how. If you don't need synchronous calls, then you can probably look at messaging -- JMS, ActiveMQ or RabbitMQ: send a message to the backend servers and have another thread waiting for replies; when a reply arrives, pick it up and dispatch it (see the JMS sketch after this list). If you need synchronous communication, there are lots of options -- from simple socket-based communication to RMI, SOAP, etc. It's worth considering the amount of data exchanged between the two layers.
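To make the load-balancer point concrete, here is a minimal haproxy configuration sketch for round-robin balancing with sticky sessions. The backend names, addresses, and ports are placeholders, not details from your setup:

```
global
    maxconn 4096

defaults
    mode http
    timeout connect 5s
    timeout client  30s
    timeout server  30s

# Single entry point: the application URL's DNS name resolves to this listener
frontend www
    bind *:80
    default_backend app_servers

backend app_servers
    balance roundrobin
    # Sticky sessions, useful if the app keeps state in the HTTP session
    cookie SERVERID insert indirect nocache
    server app1 10.0.0.11:8080 check cookie app1
    server app2 10.0.0.12:8080 check cookie app2
```

The single URL simply points (via DNS) at the load balancer, which spreads requests across the app-server cluster and drops failed nodes from rotation via the health checks.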
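For the asynchronous messaging option, here is a minimal Java sketch of request/reply over JMS using ActiveMQ. The broker URL, queue name, and message payload are assumptions for illustration only:

```java
import javax.jms.*;
import org.apache.activemq.ActiveMQConnectionFactory;

public class EngineClient {
    public static void main(String[] args) throws JMSException {
        // Assumed broker URL; in a clustered setup you would typically use a failover: URL
        ConnectionFactory factory =
                new ActiveMQConnectionFactory("tcp://localhost:61616");
        Connection connection = factory.createConnection();
        connection.start();

        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        Destination requests = session.createQueue("engine.requests"); // assumed queue name
        Destination replies = session.createTemporaryQueue();          // where the engine replies

        // Send the request without blocking on the engine's processing
        MessageProducer producer = session.createProducer(requests);
        TextMessage request = session.createTextMessage("process:order-42");
        request.setJMSReplyTo(replies);
        producer.send(request);

        // The JMS delivery thread picks the reply up whenever it arrives and dispatches it
        MessageConsumer consumer = session.createConsumer(replies);
        consumer.setMessageListener(message -> {
            try {
                System.out.println("Reply: " + ((TextMessage) message).getText());
            } catch (JMSException e) {
                e.printStackTrace();
            }
        });
    }
}
```

On the engine side, a consumer on engine.requests reads each message, processes it, and sends the result to the destination given in JMSReplyTo, correlating by message ID if several requests are in flight.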