
Uploading large files to S3 with ruby (aws:s3) - connection reset by peer on UBUNTU

devze.com https://www.devze.com 2023-04-13 00:01 (source: web)

I am trying to store some large files on S3 using ruby aws:s3 using:

S3Object.store("video.mp4", open(file), 'bucket', :access => :public_read)

For files of 100 MB or so everything is great, but with files of over 200 MB I get a "Connection reset by peer" error in the log.

Has anyone come across this weirdness? From the web, it seems to be an issue with large files, but I have not yet come across a definitive solution.

I am using Ubuntu.

EDIT:

This seems to be a Linux issue as suggested here.


No idea where the original problem might be, but as a workaround you could try a multipart upload.

filename = "video.mp4"
min_chunk_size = 5 * 1024 * 1024  # S3's minimum part size (5 MB)

@object.multipart_upload do |upload|
  io = File.open(filename)
  parts = []

  # Upload in 5 MB parts, or in a single part if the file is small enough.
  bufsize = (io.size > 2 * min_chunk_size) ? min_chunk_size : io.size

  while buf = io.read(bufsize)
    parts << upload.add_part(buf)

    # Widen the final read so the remainder is absorbed into the last
    # part, keeping every part at or above the 5 MB minimum.
    if (io.size - (io.pos + bufsize)) < bufsize
      bufsize = (io.size - io.pos) if (io.size - io.pos) > 0
    end
  end

  io.close
  upload.complete(parts)
end

S3 multipart upload is a little tricky because each part must be at least 5 MB, but the code above takes care of that by growing the final read to absorb the remainder.
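The part-sizing logic above can be exercised without touching S3 at all. The sketch below is illustrative only (the method name `part_sizes` and the 13 MB example size are assumptions, not from the original): it replays the same buffer arithmetic on a plain byte count and shows that every part comes out at or above the 5 MB minimum.

```ruby
MIN_CHUNK = 5 * 1024 * 1024  # S3's minimum part size (5 MB)

# Replays the answer's buffer-size arithmetic on a byte count,
# returning the size of each part that would be uploaded.
def part_sizes(file_size)
  sizes = []
  pos = 0
  bufsize = file_size > 2 * MIN_CHUNK ? MIN_CHUNK : file_size
  loop do
    n = [bufsize, file_size - pos].min
    break if n <= 0
    sizes << n
    pos += n
    # Same end-of-file adjustment as above: widen the final read so the
    # remainder folds into one last part of at least 5 MB.
    if (file_size - (pos + bufsize)) < bufsize
      bufsize = file_size - pos if file_size - pos > 0
    end
  end
  sizes
end

mb = 1024 * 1024
p part_sizes(13 * mb).map { |s| s / mb }  # 13 MB file → one 5 MB part and one 8 MB part
```

For a 200 MB file this yields forty 5 MB parts, and a 3 MB file goes up as a single part, since multipart's per-part minimum only applies when the file is split.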

