The main idea is that I have several worker instances of a Rails app, and then a main "aggregate" Rails app that collects their data.
I want to do something like this with the following pseudo-code:
posts = Post.all.to_json( :include => { :comments => { :include => :blah } })
# send data to another, identical, exactly the same Rails app
# ...
# Fast forward to the separate but identical Rails app:
# ...
# remote_posts is the posts results from the first Rails app
posts = JSON.parse(remote_posts)
posts.each do |post|
  p = Post.new(post)
  p.save
end
I'm shying away from Active Resource because I have thousands of records to create, which would mean one request per record, i.e. thousands of requests. Unless there is a simple way to do it all in one request with Active Resource, I'd like to avoid it.
- Format doesn't matter. Whatever makes it convenient.
- The IDs don't need to be sent, because the other app will just be creating records and assigning new IDs in the "aggregate" system.
- The hierarchy would need to be preserved (e.g. "Hey other Rails app, I have genres, and each genre has an artist, and each artist has an album, and each album has songs," etc.)
There are several options you could implement to get this to work:
Active Resource
As others have answered, you could make use of ActiveResource. After reading your comments, though, this seems like a solution you'd like to steer clear of because of the multiple-request aspect.
Receiving Controller
You could have a controller in your second rails application that receives data and creates records out of it.
class RecordReceiversController < ApplicationController
  def create
    params[:data][:posts].each do |p|
      Post.create(p)
    end
  end
end
You could namespace this controller inside an "API" namespace, which is a rather clean solution if implemented properly.
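For example, a minimal sketch of the namespaced routing and controller (the "api" namespace and the record_receivers resource name are assumptions, not something from your apps):

# config/routes.rb (Rails 2-style routing, a sketch)
map.namespace :api do |api|
  api.resources :record_receivers, :only => [:create]
end

# app/controllers/api/record_receivers_controller.rb
module Api
  class RecordReceiversController < ApplicationController
    # same #create action as above
  end
end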
Share the Database
You could share one database across two applications. This means you won't need to send the data from one model to another, it will already be there. This is the least amount of work for you as a developer, but may not be possible depending on the system architecture you have.
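In that case, both applications' config/database.yml would simply point at the same database. A minimal sketch (the database name and credentials here are assumptions):

# config/database.yml in both applications
production:
  adapter: mysql
  database: shared_production
  username: root
  password:
  host: localhost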
Two databases in each application
You could connect to multiple databases in each application, like so:
# Add to database.yml
other_development:
  adapter: mysql
  database: otherdb_development
  username: root
  password:
  host: localhost

other_production:
  adapter: mysql
  database: otherdb_production
  username: root
  password:
  host: localhost
Then, define your models like so:
class Post < ActiveRecord::Base
end

class PostClone < ActiveRecord::Base
  establish_connection "other_#{RAILS_ENV}"
end
Now, your Post model will point to the current database, and the PostClone model will point to the other database. With access to both, you can copy the data over whenever you need to with basic model methods, as in the sketch below.
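A minimal sketch of such a copy (the attribute handling here is an assumption; adapt it to your schema):

PostClone.all.each do |remote_post|
  # Drop the remote ID so the local database assigns a fresh one
  Post.create(remote_post.attributes.except("id"))
end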
Conclusion
Since you don't want to use ActiveResource, I would recommend that you simply share the database between the applications. If that isn't a possibility, then try having two models, each pointing to a different database. Finally, the receiving controller is a valid, albeit slower, option (since it has to make an HTTP request on top of the database writes).
Use Active Resource to directly create your posts in the remote app.
http://railscasts.com/tags/19
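In its simplest form that could look something like this (the Remote::Post class and site URL are assumptions), though it does mean one request per record:

module Remote
  class Post < ActiveResource::Base
    self.site = "http://aggregate-host:3000/"
  end
end

Post.all.each do |post|
  Remote::Post.new(post.attributes.except("id")).save
end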
Not exactly an answer, but a couple of ideas:
- Instead of to_json, you can call Marshal.dump with your posts.
- You can create a controller on the remote Rails instance which receives such serialized data over HTTP, Marshal.loads it, and saves the records (probably with some code to resolve all kinds of collisions).
I'm not sure how marshaling would handle the included data, or how much work would be needed on the remote side to ensure clean importing (what about records that would break some uniqueness constraint, etc.), but I'd experiment a bit and see.
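A rough sketch of that idea (the endpoint is an assumption; also keep in mind Marshal data is tied to your Ruby/Rails versions, which should be fine here since the apps are identical):

# Sending side
payload = Marshal.dump(Post.all(:include => :comments))
# ... POST payload as the raw request body to the remote app ...

# Receiving side
class ImportsController < ApplicationController
  def create
    Marshal.load(request.raw_post).each do |post|
      Post.create(post.attributes.except("id"))
    end
    head :ok
  end
end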
BTW, since you asked the question in the first place, I guess standard database replication solutions don't work for you?
I have a similar use case and I use ActiveResource. If you want to preserve the contained objects, this is a good choice. ActiveResource gives you a choice of JSON or XML as the wire format.
You can pack all your records in one request. At the receiving end you can process the request in one transaction.
Source App
class PostsController < ApplicationController
  def synch
    @posts = Post.all(:conditions => {...})
    #
    # Multiple posts and their child objects uploaded in one HTTP call.
    #
    Remote::Post.upload(@posts)
  end
end
# ActiveResource model for remote Post
module Remote
  class Post < ActiveResource::Base
    self.site = "http://host:3000/"

    def self.upload(posts)
      # Pass the :include option to to_xml to select the hierarchy.
      body = posts.to_xml(:root => 'posts', :except => [:id],
        :include => { :genres => { :except => [:id],
          :include => { :artists => { :except => [:id],
            :include => { :albums => { :except => [:id],
              :include => { :songs => { :except => [:id] } }
            } }
          } }
        } }
      )
      post(:upload, {}, body)
    end
  end
end
Destination App
class PostsController < ApplicationController
  def upload
    #
    # Multiple posts and their child objects are saved in one call.
    #
    Post.create(params[:posts])
  end
end
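If you want the whole upload to succeed or fail as a unit (the single-transaction point above), a minimal sketch of wrapping the create in a transaction:

class PostsController < ApplicationController
  def upload
    Post.transaction do
      # create! raises on failure, rolling the whole batch back
      Post.create!(params[:posts])
    end
  end
end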
class Post < ActiveRecord::Base
  has_many :genres
  accepts_nested_attributes_for :genres
end

class Genre < ActiveRecord::Base
  has_many :artists
  accepts_nested_attributes_for :artists
end

class Artist < ActiveRecord::Base
  has_many :albums
  accepts_nested_attributes_for :albums
end

class Album < ActiveRecord::Base
  has_many :songs
  accepts_nested_attributes_for :songs
end

class Song < ActiveRecord::Base
end
Another option for faster processing at the destination is the ARExtensions gem, which supports bulk inserts.
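A bulk insert with its import method might look roughly like this (the column names are made up, and the exact import signature depends on the gem version, so treat this as a sketch):

# Hypothetical columns; map them to your real Post attributes
columns = [:title, :body]
values  = params[:posts].map { |p| [p[:title], p[:body]] }
Post.import(columns, values)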
Destination Route
map.resources :posts, :collection => { :upload => :post }