I'm developing a "script generator" to automate some processes at work. It is built around a Rails application running on a server that stores all the data needed to build the script and generates the script itself at the end of the process.
The problem I am having is how to export the data from ActiveRecord to Plain Old Ruby Objects (POROs) so I can work with them in my script with a pure-Ruby implementation and no database support.
I thought about exporting the data as YAML, CSV, or something similar, but it would be painful to update those structures whenever the process changes. Is there a simpler way?
Thanks!
By "update these structures if the process changes", do you mean changing the code that reads and writes the CSV or YAML data when the fields in the database change?
The following code writes and reads any AR object to/from CSV (requires the FasterCSV gem):
require 'fastercsv'

def load_from_csv(csv_filename, poro_class)
  headers_read = []
  first_record = true
  num_headers = 0
  FCSV.foreach(csv_filename) do |row|
    if first_record
      # the first row holds the column names
      headers_read = row
      num_headers = headers_read.length
      first_record = false
    else
      hash_values = {}
      for col_index in 0...num_headers
        hash_values[headers_read[col_index]] = row[col_index]
      end
      # assumes that your PORO has a constructor that accepts a hash; if not,
      # you can call new_poro_obj.send("#{headers_read[col_index]}=", row[col_index]) in the loop above
      new_poro_obj = poro_class.new(hash_values)
      # work with your new_poro_obj
    end
  end
end
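For reference, here is a minimal sketch of a PORO that satisfies the hash-constructor assumption above; the class name Task and its attributes are just placeholders for whatever your script actually needs:

# Hypothetical PORO: accepts a hash of attribute-name => value pairs,
# matching the CSV headers written by dump_to_csv below.
class Task
  attr_accessor :name, :command, :priority

  def initialize(attributes = {})
    attributes.each do |attr_name, value|
      setter = "#{attr_name}="
      send(setter, value) if respond_to?(setter)
    end
  end
end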
# objects is a list of ActiveRecord objects of the same class
def dump_to_csv(csv_filename, objects)
  FCSV.open(csv_filename, 'w') do |csv|
    # get column names and write them as headers
    col_names = objects[0].class.column_names
    csv << col_names
    # one CSV row per object, values in the same order as the headers
    objects.each do |obj|
      col_values = []
      col_names.each do |col_name|
        col_values.push obj[col_name]
      end
      csv << col_values
    end
  end
end
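Since the question also mentions YAML: if hand-rolling the CSV mapping feels too brittle, a similar dump/load pair can be written with the YAML module from Ruby's standard library. This is only a sketch; it reuses the same hash-constructor assumption as load_from_csv above, and dump_to_yaml / load_from_yaml are made-up names:

require 'yaml'

# objects is a list of ActiveRecord objects of the same class
def dump_to_yaml(yaml_filename, objects)
  File.open(yaml_filename, 'w') do |f|
    # obj.attributes returns a plain hash of column name => value
    YAML.dump(objects.map { |obj| obj.attributes }, f)
  end
end

def load_from_yaml(yaml_filename, poro_class)
  # build one PORO per stored attribute hash
  YAML.load_file(yaml_filename).map { |hash| poro_class.new(hash) }
end

The nice part is that new columns don't require code changes here: whatever is in attributes is what gets dumped and reloaded.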