Prevent timeout when opening large files from URL

I am writing a Ruby 1.8.7 script which has to request really large XML files (1-5 MB) from a server that is quite slow (about 1 min 30 sec for 1 MB). The requested file is written to disk.

I set the timeout in my script to a ridiculously large number of seconds, since I really want to get the file rather than just move on if it takes too long. Yet even with that huge value I keep getting timeouts.

Is there a best practice for this?

Right now I use:

  require 'open-uri'
  require 'timeout'

  open(DIR + "" + number + "" + ".xml", 'wb') do |file|
    begin
      status = Timeout::timeout(6000000) do
        file << open(url).read
      end
    rescue Timeout::Error => e
      Rails.logger.info "Timeout for:" + number.to_s
    end
  end

Now, I thought the timeout was set in seconds, which would make 6000000 way more than 1 min 30 sec, but somehow it isn't using my timeout in seconds. Note again that I'm restricted to using Ruby 1.8.7.


Unfortunately, this is problematic. In Ruby 1.9.x, the open-uri-extended open can take a read_timeout parameter, which it passes on to the HTTP library. But in Ruby 1.8.x, which you're using, this parameter is not available.
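
On 1.9.x that would look something like the following sketch (based on the read_timeout option mentioned above; url is the variable from the question):

  # Ruby 1.9.x only: open-uri forwards :read_timeout to the underlying net/http call
  xml = open(url, :read_timeout => 600) { |f| f.read }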

So you need to use net/http directly: call start/get there and set read_timeout to your liking. If you just use the open-uri wrapper, the read_timeout stays at its 60-second default, which is far shorter than what you need.
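
A minimal sketch of that approach for 1.8.7 (url, DIR and number are the variables from the question; the timeout values are placeholders you can raise as needed):

  require 'net/http'
  require 'uri'

  uri = URI.parse(url)
  http = Net::HTTP.new(uri.host, uri.port)
  http.open_timeout = 60      # seconds to wait for the connection to open
  http.read_timeout = 600     # seconds to wait on each read from the slow server

  # start/get returns a Net::HTTPResponse whose body is the full XML document
  response = http.start { |h| h.get(uri.request_uri) }

  File.open(DIR + "" + number + "" + ".xml", 'wb') do |file|
    file << response.body
  end

Note that read_timeout applies to each read on the socket rather than to the whole download, so a slow but steady server will not trip it; you can still wrap the call in Timeout::timeout as an outer safety net if you want one.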
