I currently have a web application online with around 500 users. What I was not expecting was that the web app performs horribly under such a load: visiting any page results in minutes of waiting. Essentially, whenever a user invokes the application's main feature, a process starts in which I read a few bytes from another web server and forward that data to the user's browser. The site performs normally until it reaches a load of around 200-300 concurrent download processes of the kind described above.
I am using IIS 7.5 and ASP.NET (4). For database access I use LINQ to SQL. I have already tweaked IIS and removed restrictions like the maximum concurrent requests per CPU, so I have lost hope that the IIS configuration is the problem in this case. What I fear is that my code produces such results for IIS, or that such an approach is itself the main cause of this issue.
Currently it works the following way:
User visits EG.aspx
EG.aspx uses .NET HTTP POST and GET requests to authorize against another web server.
EG.aspx opens a stream to read from the second web server (file download)
EG.aspx reads 1024 bytes from the second web server and writes those 1024 bytes to the browser
All headers etc. are properly set. I am not spawning any new threads during this process; it is just one long method performing all of those steps. The file downloads are pretty large (around 100 MB), and for the entire time it takes to transfer such a file to the user, the page is running the process described above. Each user can invoke this process multiple times (download multiple files).
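Roughly, the relaying part of that method looks like the sketch below (simplified; class and variable names are illustrative). In EG.aspx the source stream comes from the web response of the second server and the destination is `Response.OutputStream`, which is why one worker thread stays blocked for the whole 100 MB transfer:

```csharp
using System.IO;

public static class Relay
{
    // Read 1 KB at a time from the remote stream and write it
    // straight through to the destination (the browser).
    public static long RelayInChunks(Stream source, Stream destination)
    {
        var buffer = new byte[1024];
        long total = 0;
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            // Blocks until the client has accepted the bytes.
            destination.Write(buffer, 0, read);
            total += read;
        }
        return total;
    }
}
```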
Is the number of streams an issue? Is there any way I could achieve the same result (your web server downloads files and sends them to the client without caching them on disk) with another approach? What would that approach be?
Web servers -- especially thick web servers like IIS or Apache -- actually max out on concurrent users very quickly, and ASP.NET exacerbates this quite a bit: each in-flight synchronous request ties up a worker thread for its whole duration. What you need to do is offload this process so it happens outside of the client's web request to your server. The easiest way to get there would be to use some sort of message bus, such as MassTransit or NServiceBus. Then you could make it work like this:
- User requests EG.ASPX
- EG.ASPX publishes a message to the service bus and returns a "processing request" page of some sort.
- Message bus processes message and downloads file or gets an error or whatever.
- User is notified somehow that their file is ready -- there are lots of ways to do this.
- User visits appropriate page to get their file.
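In its simplest form, the hand-off in the steps above can be sketched without any bus library at all, with a shared queue and a background worker standing in for what MassTransit or NServiceBus would give you (plus durability, retries and multi-server support, which this toy version lacks). All names here are illustrative:

```csharp
using System.Collections.Concurrent;
using System.Threading;

public static class DownloadQueue
{
    private static readonly BlockingCollection<string> Requests =
        new BlockingCollection<string>();

    // Visible so the "file ready" page (or a test) can check results.
    public static readonly ConcurrentQueue<string> Processed =
        new ConcurrentQueue<string>();

    static DownloadQueue()
    {
        var worker = new Thread(() =>
        {
            foreach (string url in Requests.GetConsumingEnumerable())
            {
                // The slow 100 MB download would happen here,
                // storing the result somewhere the user can fetch it.
                Processed.Enqueue(url);
            }
        });
        worker.IsBackground = true;
        worker.Start();
    }

    // Called from EG.aspx: cheap, returns immediately,
    // so the page request no longer blocks on the transfer.
    public static void Enqueue(string url)
    {
        Requests.Add(url);
    }
}
```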
For notifications, you could handle it a few ways:
- If it is a pretty short transaction, you could probably have a "please wait" page doing an ajax pingback.
- If it is longer-running, you might want to email users when it is ready.
- If you have a user portal page, you could probably work it in there.
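For the "please wait" option, the server side of the AJAX pingback can be as small as a tracker the worker updates plus a tiny JSON body the polled endpoint (an .ashx handler, say) writes back. Everything here -- the class, the id scheme, the JSON shape -- is an assumption, not an established API:

```csharp
using System.Collections.Concurrent;

public static class DownloadTracker
{
    private static readonly ConcurrentDictionary<string, bool> Ready =
        new ConcurrentDictionary<string, bool>();

    // Called by the background worker when a download finishes.
    public static void MarkReady(string id) { Ready[id] = true; }

    public static bool IsReady(string id)
    {
        bool done;
        return Ready.TryGetValue(id, out done) && done;
    }

    // Body the polled endpoint returns to the browser's AJAX call.
    public static string StatusJson(string id)
    {
        return "{\"ready\": " + (IsReady(id) ? "true" : "false") + "}";
    }
}
```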
But, bottom line, handling this directly within your ASP.NET page is pretty much folly.
Why don't you just redirect the client to the file it is downloading? Currently the file moves from the remote server to your server and then to the client, so you have extra latency and bandwidth in play there.
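Whether a redirect works depends on the second server: the client must be allowed to fetch the file directly, e.g. via a short-lived token appended to the URL. A sketch under that assumption (the base URL and token handling are purely illustrative):

```csharp
using System;

public static class DirectDownload
{
    // Build the URL on the second server that the browser is sent to.
    public static string BuildRedirectUrl(string baseUrl, string file, string token)
    {
        return baseUrl.TrimEnd('/') + "/" + Uri.EscapeDataString(file)
             + "?token=" + Uri.EscapeDataString(token);
    }
}

// In EG.aspx:
//   Response.Redirect(DirectDownload.BuildRedirectUrl(
//       "http://second-server/files", fileName, token));
// The browser then streams the file straight from the second server,
// taking your server (and its bandwidth) out of the data path.
```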
Are you using load balancing? Several hundred 100MB files being transferred at the same time is a lot of bandwidth for a single server.
Can't you cache the bytes for a period of time?
Store them in a static dictionary and use locking; you will see huge performance gains.
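A minimal sketch of that suggestion, with illustrative names: a static dictionary guarded by a lock, with a time-to-live per entry. Note the caveat that this only pays off for small or frequently repeated files -- caching many distinct 100 MB downloads in memory would exhaust RAM quickly -- and that two concurrent misses for the same key may both fetch (the fetch deliberately runs outside the lock so it cannot block every other request):

```csharp
using System;
using System.Collections.Generic;

public static class ByteCache
{
    private class Entry
    {
        public byte[] Data;
        public DateTime Expires;
    }

    private static readonly Dictionary<string, Entry> Cache =
        new Dictionary<string, Entry>();
    private static readonly object Gate = new object();

    public static byte[] GetOrAdd(string key, Func<byte[]> fetch, TimeSpan ttl)
    {
        lock (Gate)
        {
            Entry e;
            if (Cache.TryGetValue(key, out e) && e.Expires > DateTime.UtcNow)
                return e.Data;   // fresh hit
        }

        byte[] data = fetch();   // slow fetch outside the lock

        lock (Gate)
        {
            Cache[key] = new Entry { Data = data, Expires = DateTime.UtcNow + ttl };
        }
        return data;
    }
}
```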