I am trying to send a file to S3 via a PUT request URL that Amazon S3 has already generated for me.
My code works fine for small files, but it errors out on large files (>100 MB) after a few minutes of sending.
The error is:

The request was aborted: The request was canceled.
   at System.Net.ConnectStream.InternalWrite(Boolean async, Byte[] buffer, Int32 offset, Int32 size, AsyncCallback callback, Object state)
   at System.Net.ConnectStream.Write(Byte[] buffer, Int32 offset, Int32 size)
Can someone please tell me what is wrong with my code that is stopping it from sending large files? It is not due to the Amazon PUT request URL expiring, because I have that set to 30 minutes and the problem occurs after just a few minutes of sending.
The code eventually throws an exception on this line: dataStream.Write(byteArray, 0, byteArray.Length);
Once again, it works great for smaller files that I am sending to S3. Just not large files.
// PUT_URL_FINAL is the pre-signed Amazon S3 URL that I am sending the file to
WebRequest request = WebRequest.Create(PUT_URL_FINAL[0]);
request.Timeout = 360000; // 6 minutes
request.Method = "PUT";

// result3 is the filename that I am sending
request.ContentType = MimeType(GlobalClass.AppDir + Path.DirectorySeparatorChar +
    "unzip" + Path.DirectorySeparatorChar +
    System.Web.HttpUtility.UrlEncode(result3));

// Read the entire file into memory
byte[] byteArray = File.ReadAllBytes(GlobalClass.AppDir + Path.DirectorySeparatorChar +
    "unzip" + Path.DirectorySeparatorChar +
    System.Web.HttpUtility.UrlEncode(result3));
request.ContentLength = byteArray.Length;

Stream dataStream = request.GetRequestStream();
// This is the line it eventually quits on.
// Works fine for small files, not for large ones.
dataStream.Write(byteArray, 0, byteArray.Length);
dataStream.Close();

// This will return "OK" if successful.
WebResponse response = request.GetResponse();
Console.WriteLine("++ HttpWebResponse: " +
    ((HttpWebResponse)response).StatusDescription);
You should set the Timeout property of the WebRequest to a higher value; as it stands, the request times out before the upload completes.
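Worth knowing: on HttpWebRequest, Timeout covers GetRequestStream() and GetResponse(), while writes to the request stream are governed by a separate ReadWriteTimeout property, which defaults to 300,000 ms (5 minutes). That default would line up with the upload dying "after a few minutes". A minimal sketch of raising both, assuming the same PUT_URL_FINAL from the question (the 30-minute value is illustrative, chosen to match the pre-signed URL's lifetime):

// using System.Net;
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(PUT_URL_FINAL[0]);
request.Method = "PUT";
// Timeout applies to GetRequestStream()/GetResponse();
// ReadWriteTimeout applies to each Write() on the request stream (default 300,000 ms).
request.Timeout = 30 * 60 * 1000;          // 30 minutes, matching the URL expiry
request.ReadWriteTimeout = 30 * 60 * 1000; // 30 minutes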
Use Fiddler or Wireshark to compare what goes over the wire when it works (a third-party tool) and when it doesn't (your code). Once you know the differences, you can change your code accordingly.
I would try writing it in chunks and splitting up the byte array. It may be choking on one large chunk.
Something like this:
const int chunkSize = 500; // bytes per write
for (int i = 0; i < byteArray.Length; i += chunkSize)
{
    // Write at most chunkSize bytes; the final chunk may be shorter.
    int count = Math.Min(chunkSize, byteArray.Length - i);
    dataStream.Write(byteArray, i, count);
}
You may want to double-check that it writes everything; I only did very basic testing on it.
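Beyond chunking the writes, it may also help not to hold the whole 100+ MB file in memory at all. A sketch along those lines, assuming the same PUT_URL_FINAL and file path from the question: stream from disk with a FileStream, and set AllowWriteStreamBuffering to false so HttpWebRequest doesn't buffer the entire body internally (an explicit ContentLength is required when buffering is off; the 64 KB buffer size is an arbitrary choice):

// using System.IO; using System.Net;
// Same path expression as in the question.
string filePath = GlobalClass.AppDir + Path.DirectorySeparatorChar + "unzip" +
                  Path.DirectorySeparatorChar +
                  System.Web.HttpUtility.UrlEncode(result3);

HttpWebRequest request = (HttpWebRequest)WebRequest.Create(PUT_URL_FINAL[0]);
request.Method = "PUT";
request.ContentLength = new FileInfo(filePath).Length; // required when buffering is off
request.AllowWriteStreamBuffering = false;             // don't copy the body into memory first

using (FileStream fileStream = File.OpenRead(filePath))
using (Stream dataStream = request.GetRequestStream())
{
    byte[] buffer = new byte[64 * 1024]; // 64 KB per write; size is arbitrary
    int bytesRead;
    while ((bytesRead = fileStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        dataStream.Write(buffer, 0, bytesRead);
    }
}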
Just a rough guess, but shouldn't you have:
request.ContentLength = byteArray.LongLength;
instead of:
request.ContentLength = byteArray.Length;
On second thought, 100 MB = 100 * 1024 * 1024, which is well below int.MaxValue (2^31 - 1), so that probably isn't the problem.