Rails Database Back-up Script

I currently use the script below to back up a website, but it could be improved dramatically! Could you please suggest any improvements, or perhaps alternative solutions?

Currently, I only delete backups after a massive number has accumulated, and this is not good. Does anyone know how I can delete backups that are a month old, or start deleting the oldest ones once there are fifty?

require 'fileutils'

namespace :db do
  desc "Backup the database to a file. Options: DIR=base_dir RAILS_ENV=development MAX=20"
  task :backup => [:environment] do
    datestamp = Time.now.strftime("%d-%m-%Y_%H-%M-%S")
    base_path = ENV["DIR"] || "db"
    backup_base = File.join(base_path, 'backup')
    backup_folder = File.join(backup_base, datestamp)
    backup_file = File.join(backup_folder, "#{RAILS_ENV}_dump.sql.gz")
    FileUtils.mkdir_p(backup_folder)
    db_config = ActiveRecord::Base.configurations[RAILS_ENV]
    sh "mysqldump -u #{db_config['username']} #{'-p' if db_config['password']}#{db_config['password']} --opt #{db_config['database']} | gzip -c > #{backup_file}"
    puts "Created backup: #{backup_file}"
    # Prune the oldest backups beyond MAX (newest first after the sort/reverse).
    all_backups = (Dir.entries(backup_base) - ['.', '..']).sort.reverse
    max_backups = (ENV["MAX"] || 10_000_000).to_i
    unwanted_backups = all_backups[max_backups..-1] || []
    unwanted_backups.each do |unwanted_backup|
      FileUtils.rm_rf(File.join(backup_base, unwanted_backup))
      puts "deleted #{unwanted_backup}"
    end
    puts "Deleted #{unwanted_backups.length} backups, #{all_backups.length - unwanted_backups.length} backups available"
  end
end


We use this script, which isn't quite as complex as yours but does more or less the same thing:

#!/usr/bin/env ruby
require "date"

DBS = %w( list the databases to back up )
USER = "" # Username with rights to all those databases, might be root
PW = "" # Password for that username

today_s = Date.today.to_s
yesterday_s = (Date.today - 2).to_s # actually two days ago, so two days of backups are kept

DBS.each do |db|
  # Dump each database to a datestamped file, then remove the dump from two
  # days ago. (~/dbs/ and /path/to/backups/dbs/ should be the same directory.)
  system "/usr/bin/mysqldump --user=#{USER} --password=#{PW} --add-drop-database --opt -icC #{db} > ~/dbs/#{today_s}-#{db}.sql"
  if File.exist?("/path/to/backups/dbs/#{yesterday_s}-#{db}.sql")
    File.unlink("/path/to/backups/dbs/#{yesterday_s}-#{db}.sql")
  end
end

We then run that with cron on a regular basis (4x/day, though we effectively keep only the most recent run from each day, because later runs for a given day overwrite earlier ones). It keeps two days' worth of backups; a remote server uses scp to copy the entire /path/to/backups/dbs/ directory twice daily, and that one keeps backups until we have time to burn them to DVD-ROM.
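
For reference, a hypothetical crontab entry for that schedule (the script path here is an assumption, not our actual setup):

# Run the backup script at 00:00, 06:00, 12:00 and 18:00 every day.
0 0,6,12,18 * * * /usr/bin/env ruby /home/user/bin/db_backup.rb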

Notice that if a deletion is missed, the file will hang around for quite a while; the script only deletes the file from a specific earlier day, not "all files older than X" the way yours does. But you can probably take some ideas from this and incorporate them into your script; see the sketch below.
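
Here is a minimal sketch of the "all files older than X" variant, assuming the dumps live in /path/to/backups/dbs and going by each file's mtime rather than its name (KEEP_DAYS is an assumed knob, not part of the script above):

#!/usr/bin/env ruby
# Sketch: remove any .sql dump whose mtime is older than KEEP_DAYS days.
KEEP_DAYS = 2 # assumed retention window
cutoff = Time.now - KEEP_DAYS * 24 * 60 * 60

Dir.glob("/path/to/backups/dbs/*.sql").each do |path|
  if File.mtime(path) < cutoff
    File.unlink(path)
    puts "deleted #{path}"
  end
end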


Why not use git with a cron job?

git setup:

cd /PATH/TO/EXPORTFILE/
git init .
git add .
git commit -am "init commit"

cron job:

mysqldump -uUSER -pPASSWORD --skip-extended-insert DBNAME > /PATH/TO/EXPORTFILE/FILENAME.SQL && \
cd /PATH/TO/EXPORTFILE/ && \
git add . && \
git commit -am "MYSQL BACKUP" | mail -s "MYSQL BACKUP CRON JOB" your@emailaddress.com

No files to delete, and you get history for ALL MySQL dumps, with granularity set by your cron job's execution times...
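
A nice side effect is that any old dump can be pulled back out of history. A sketch (the commit hash here is only an example):

# List the dump's history, then restore any earlier version of it:
git log --oneline -- FILENAME.SQL
git show abc1234:FILENAME.SQL > /tmp/restored.sql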


Since you already put a timestamp in your backup folder names, why not parse the folder names and delete whatever has a timestamp older than 30 days? For example:
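
A minimal sketch of that idea, assuming the folders live under db/backup (as in your task) and use your "%d-%m-%Y_%H-%M-%S" datestamp format:

require 'date'
require 'fileutils'

backup_base = File.join('db', 'backup') # assumed location, as in the rake task
cutoff = DateTime.now - 30              # thirty days ago

(Dir.entries(backup_base) - ['.', '..']).each do |name|
  begin
    stamp = DateTime.strptime(name, "%d-%m-%Y_%H-%M-%S")
  rescue ArgumentError
    next # skip anything that isn't a datestamped backup folder
  end
  if stamp < cutoff
    FileUtils.rm_rf(File.join(backup_base, name))
    puts "deleted #{name}"
  end
end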
