
How to avoid flooding in my very simple web application and make it scalable?


I'm trying to develop a very simple Java web application using JSP and Servlets.

1) There is a textbox and a submit button on the page,

2) The user enters his name, say John, into the textbox and clicks the button,

3) The string is forwarded to my servlet,

4) At the doPost method of my servlet, I access the posted string variable,

5) The web service I'll use has a sayHello method that takes an argument and returns "Hello " concatenated with the argument,

6) So, I call the sayHello method of the web-service, get the returned variable and forward this to a JSP, which basically writes Hello John.
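For reference, here is roughly what I have in mind for steps 3) to 6). This is only a sketch; HelloServlet, HelloService, and hello.jsp are placeholders for my real classes and view:

```java
// Sketch of steps 3) to 6): read the posted name, call the service, forward to a JSP.
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class HelloServlet extends HttpServlet {

    @Override
    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // 4) access the posted string
        String name = request.getParameter("name");

        // 5) call the web service's sayHello method (stubbed below)
        String greeting = new HelloService().sayHello(name);

        // 6) forward the result to a JSP that prints "Hello John"
        request.setAttribute("greeting", greeting);
        request.getRequestDispatcher("/hello.jsp").forward(request, response);
    }
}

// Stand-in for the real web-service client
class HelloService {
    String sayHello(String name) {
        return "Hello " + name;
    }
}
```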

I have 2 questions:

1) Flooding: How do I avoid flooding of requests to my servlet? How do I deal with it? I thought of creating a thread to contact the web service and say hello: when a request arrives, I check whether that thread is running/busy, and only process the request if it is free. That way I would be answering at most one request at a time (a rough sketch follows after question 2). How does that sound?

2) Scalability: Assume that tonight 1,000,000 people will reach my web app and make it say hello ten times each. How do I make sure that this app will scale well? Is there anything I can do on the JSP/Servlet side, other than the hardware the server runs on?
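To make question 1 concrete, here is roughly the throttling idea described above. It is only a sketch: it uses a single-permit Semaphore instead of checking a worker thread by hand, and the class and JSP names are placeholders:

```java
import java.io.IOException;
import java.util.concurrent.Semaphore;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class ThrottledHelloServlet extends HttpServlet {

    // one permit = at most one request being processed at any moment
    private final Semaphore busy = new Semaphore(1);

    @Override
    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        if (!busy.tryAcquire()) {
            // another request is already being served; reject this one immediately
            response.sendError(HttpServletResponse.SC_SERVICE_UNAVAILABLE, "Busy, please retry");
            return;
        }
        try {
            String greeting = "Hello " + request.getParameter("name"); // stand-in for the web-service call
            request.setAttribute("greeting", greeting);
            request.getRequestDispatcher("/hello.jsp").forward(request, response);
        } finally {
            busy.release();
        }
    }
}
```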

I know that the questions are kind of generic so I tried to provide as much detail as I can. I would really appreciate a thorough and to-the-point answer =)

Thanks in advance.


You can do lots of things to make it "scalable."

For one thing, the servlet itself does so little work that it's unlikely to be where the load becomes a problem; the web service is. There are two types of scaling you can provide here - horizontal and vertical.

Horizontal scaling is when you serve each request at the same speed but can handle more of them. This is provided through load balancing, via F5 or some other load balancer, and having the web application served by multiple web servers.

Vertical scaling is when you throw a faster processor at the problem. Horizontal is better.

Handling the service itself is your scalability failure point, as it's where the actual work is done. In this case, you have a simple service, so you could do the same thing with the web service that you do with the servlet. But if the problem is "harder" - well, this is where you start using JMS to provide asynchronous services, such that your service providers consume requests and provide answers as soon as they can. This gives you a natural place to add more consumers: if you find your JMS service isn't able to handle the requests, you add another consumer (another server listening to a queue); if that's still not able to keep up, wash, rinse, repeat.
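To illustrate the JMS idea, here is a rough sketch of a consumer that drains a request queue. The JNDI names (jms/ConnectionFactory, jms/helloRequests) are just placeholders for whatever your provider exposes; to scale, you simply start more copies of this consumer:

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.JMSException;
import javax.jms.Message;
import javax.jms.MessageListener;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.InitialContext;

// One consumer of the request queue. To add capacity, start another copy of this
// class on another machine; all copies share the same queue.
public class HelloRequestConsumer implements MessageListener {

    public static void main(String[] args) throws Exception {
        InitialContext jndi = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) jndi.lookup("jms/ConnectionFactory"); // placeholder JNDI name
        Queue requests = (Queue) jndi.lookup("jms/helloRequests");                            // placeholder JNDI name

        Connection connection = factory.createConnection();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        session.createConsumer(requests).setMessageListener(new HelloRequestConsumer());
        connection.start();

        Thread.sleep(Long.MAX_VALUE); // keep the consumer alive
    }

    @Override
    public void onMessage(Message message) {
        try {
            String name = ((TextMessage) message).getText();
            String greeting = "Hello " + name;
            // ... publish the greeting to a reply queue, write it to a store, etc.
            System.out.println(greeting);
        } catch (JMSException e) {
            e.printStackTrace();
        }
    }
}
```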

Of course, there are also cloud solutions to the problem; I work for GigaSpaces Technologies, which provides a service for horizontal scalability for web services, and we do a much better job than the solution I just outlined. To see something of how it works, see http://www.youtube.com/watch?v=YTEqFzrfVss .


  1. Your solution might save the day against flooding, but it will have a very bad impact on the availability of the application. Flooding is usually caused by too many requests coming from the same machine with the malicious intention of blocking your application. So I'd suggest associating session IDs with thread IDs: in other words, process only one request per user at a time, and deal with the next one only when the first is done (see the sketch below these two points).

  2. I'd rather go with the solution by @Joseph Ottinger
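A rough sketch of what I mean in point 1, as a servlet filter that allows each session only one request in flight at a time; the filter name and the 429 status code are just examples:

```java
import java.io.IOException;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class OneRequestPerSessionFilter implements Filter {

    // sessions that currently have a request in flight
    private final Set<String> activeSessions = ConcurrentHashMap.newKeySet();

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        String sessionId = ((HttpServletRequest) req).getSession(true).getId();

        // if this session already has a request being processed, reject the new one
        if (!activeSessions.add(sessionId)) {
            ((HttpServletResponse) res).sendError(429, "Only one request at a time, please");
            return;
        }
        try {
            chain.doFilter(req, res);
        } finally {
            activeSessions.remove(sessionId);
        }
    }

    @Override
    public void init(FilterConfig config) { }

    @Override
    public void destroy() { }
}
```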
