I have a Tomcat server with 250 connection threads. When I simulate concurrent uploads of 30 files (each 100 MB), the CPU and RAM usage of the server machine peaks at around 95%.
I use the following block of code to read the file data from the HTTP POST.
// request is an instance of HttpServletRequest
int nDataLength = request.getContentLength();
byte[] dataBytes = new byte[nDataLength];
int bytesRead;
int totalBytesRead = 0;
int bytesLimit = 1024;
// InputStream is abstract and cannot be instantiated directly;
// use the request's stream as-is
InputStream in = request.getInputStream();
try
{
    while (totalBytesRead < nDataLength)
    {
        // never read past the declared content length,
        // and stop if the stream ends early
        bytesRead = in.read(dataBytes, totalBytesRead,
                Math.min(bytesLimit, nDataLength - totalBytesRead));
        if (bytesRead == -1)
        {
            break;
        }
        totalBytesRead += bytesRead;
    }
}
catch (Exception ex)
{
    throw ex;
}
finally
{
    in.close();
}
My doubts are:
- What is the maximum number of concurrent file uploads (each 100 MB) that a Tomcat server can handle?
- Is there any optimization required in my code to make use of all 250 connection threads?
- Introducing sleep can cause lengthy uploads. How do I write efficient code?
Thanks in advance.
regards, Kingsley Reuben J
NOTE: I won't be able to use third-party applications to resolve this problem.
You're running out of memory with the original approach, so stacker's suggestion (storing the data in a file) should work. Uploading files over HTTP to Tomcat is simply not ideal for bulk, massive uploads.
You could try using a simpler server or a more efficient protocol (e.g. FTP), or profile your application server and application to see where the bottleneck is. One thing that comes to mind is that HTTP uploads have to be MIME-decoded.
You should write the received data to a temporary file (in smaller blocks, say 8 KB), since 30 × 100 MB = 3 GB, and your machine probably starts paging memory out. Throughput is ultimately limited by your network interface adapter.
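A minimal sketch of that idea: stream the request body to a temporary file in 8 KB chunks, so memory use stays constant regardless of upload size. The helper name saveToTempFile is my own; in a servlet you would pass it request.getInputStream().

```java
import java.io.*;

public class UploadSketch {
    static final int BUFFER_SIZE = 8 * 1024; // 8 KB chunks

    // Copies the stream to a temp file without ever holding
    // more than one buffer of data in memory.
    static File saveToTempFile(InputStream in) throws IOException {
        File tmp = File.createTempFile("upload-", ".tmp");
        try (OutputStream out =
                new BufferedOutputStream(new FileOutputStream(tmp))) {
            byte[] buf = new byte[BUFFER_SIZE];
            int n;
            while ((n = in.read(buf)) != -1) { // -1 signals end of stream
                out.write(buf, 0, n);
            }
        }
        return tmp;
    }

    public static void main(String[] args) throws IOException {
        // Simulate an upload with an in-memory stream
        byte[] data = new byte[100_000];
        File f = saveToTempFile(new ByteArrayInputStream(data));
        System.out.println(f.length()); // 100000
        f.delete();
    }
}
```

With 30 concurrent uploads this uses roughly 30 × 8 KB of buffer memory instead of 3 GB of byte arrays; the temp files can then be moved or processed after the request completes.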