Port files from Cloudfiles to S3

I have to migrate off Rackspace to Amazon. I have a big Rails app that has saved lots of files on Cloudfiles and I'll have to export them to S3. Are you aware of any script or process to do that migration?

Thank you


Just a tip: if you have a lot of files (or a few big ones), it makes sense to rent an EC2 instance for this. You still pay for bandwidth on both ends of the Rackspace-to-EC2 leg, but transfer between EC2 and S3 is free, which saves the bandwidth cost of pushing everything through your own server.


Should be fairly straightforward to do something like this using the respective gems (cloudfiles and aws-s3) and a rake task:

require 'cloudfiles'
require 'aws/s3'

# connect to Cloud Files & AWS (credentials and names below are placeholders)
cf_connection = CloudFiles::Connection.new(:username => CF_USERNAME, :api_key => CF_API_KEY)
cf_container  = cf_connection.container('name_of_cf_container')
AWS::S3::Base.establish_connection!(:access_key_id => AWS_ACCESS_KEY_ID, :secret_access_key => AWS_SECRET_ACCESS_KEY)

# pull each object out of Cloud Files and push it into the S3 bucket
cf_container.objects.each do |object_name|
  cf_object = cf_container.object(object_name)
  AWS::S3::S3Object.store(object_name, cf_object.data, 'name_of_s3_bucket')
end

The biggest downside to something like this is that you are passing every file through your server/local machine. S3 allows you to make a bucket writeable from another source, but the Rackspace CloudFiles API doesn't offer any kind of "post to" service (understandably so).
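
For illustration, here is roughly what "making a bucket writeable from another source" can look like, sketched with the fog gem (the same library used in the answer below). The bucket name and credential constants are placeholders, and a world-writeable canned ACL is only sensible for a short-lived migration, if at all:

require 'fog'

# Sketch: apply a canned 'public-read-write' ACL so an external process
# can push objects straight into the bucket. Scope this down (or remove
# it entirely) as soon as the migration is done.
s3 = Fog::Storage.new(
  :provider              => 'AWS',
  :aws_access_key_id     => AWS_ACCESS_KEY_ID,
  :aws_secret_access_key => AWS_SECRET_ACCESS_KEY
)
s3.put_bucket_acl('name_of_s3_bucket', 'public-read-write')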


Here's what I used:

require 'fog'

def move_to_s3_from_rackspace
  cf_connection = Fog::Storage.new({
    :provider           => 'Rackspace',
    :rackspace_username => USERNAME,
    :rackspace_api_key  => RACKSPACE_API_KEY,
    :rackspace_region   => RACKSPACE_REGION
  })
  s3_connection = Fog::Storage.new({
    :provider              => 'AWS',
    :aws_access_key_id     => AWS_ACCESS_KEY_ID,
    :aws_secret_access_key => AWS_SECRET_ACCESS_KEY
  })
  cf_directory = cf_connection.directories.get(RACKSPACE_CONTAINER_NAME)
  s3_directory = s3_connection.directories.get(S3_BUCKET_NAME)
  # remember which keys are already in the bucket so the task can be re-run safely
  s3_file_keys = s3_directory.files.map { |file| file.key }
  cf_directory.files.each do |file|
    if s3_file_keys.include?(file.key) # already exists
      p "file already exists, skipping: '#{file.key}'"
      next
    end
    # note: file.body reads the whole object into memory before uploading
    s3_directory.files.create(key: file.key, body: file.body)
  end
end
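
If it helps, a minimal way to run this from the app is a rake task. The task name and file location here are made up, and the credential constants are assumed to be defined elsewhere (e.g. via ENV):

# lib/tasks/storage.rake (hypothetical path and task name)
namespace :storage do
  desc 'Copy every Cloud Files object into the S3 bucket'
  task :move_to_s3 => :environment do
    move_to_s3_from_rackspace
  end
end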


I recently had to do this myself and wrote a nice Ruby script to do it as efficiently as I could (forking processes to avoid the GIL). I spun up a 16-core EC2 instance and was able to transfer 175,000 files in just under an hour and a half. Cost $1.32 for the instance. https://github.com/cannikin/great-migration
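
The linked script does the heavy lifting, but the forking idea itself is simple. Here is a rough sketch (not the script's actual code) where all_keys and copy_object are hypothetical stand-ins for the full key list and a single-object transfer:

WORKERS = 16 # one worker per core on a 16-core instance

# MRI's GIL prevents true thread parallelism, so fork a process per slice.
all_keys.each_slice((all_keys.size.to_f / WORKERS).ceil) do |slice|
  Process.fork do
    # each child must open its own Cloud Files/S3 connections;
    # network sockets can't safely be shared across a fork
    slice.each { |key| copy_object(key) }
  end
end
Process.waitall # block until every forked worker exits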
