  1. #1 Junior Member (joined Apr 2009, 24 posts)

    A Script to Automatically Backup Your Databases

    Hey all,

    In light of the recent events, I realized how woefully inadequate my backup habits are. So I wrote a simple script that dumps all of my MySQL databases, tars them up, and emails them to me. I've set it up on my account with a cron job so that it runs each day. I'll probably change it to something a bit more manageable, like weekly, but that's entirely your choice.

    It probably won't work too well with very large databases. You be the judge.

    Code:
    #!/bin/sh
    
    # backup each mysql db into a different file, rather than one big file
    # as with --all-databases - will make restores easier
    
    #requires: basename,date,md5sum,sed,sendmail,uuencode
    # appends $2 as a new line to the file $1
    # (POSIX function syntax; the "function" keyword is a bashism under #!/bin/sh)
    fappend () {
        echo "$2" >> "$1"
    }
    YYYYMMDD=`date +%Y%m%d`
    
    # CHANGE THESE
    TOEMAIL="youraddress@yourdomain.com"; #where you want it sent to
    FREMAIL="$USER@$HOSTNAME";
    SUBJECT="Daily MySQL Backup - $YYYYMMDD";
    MSGBODY="Hello, here is the backup of your MySQL databases from $YYYYMMDD.";
    ATTACHMENT="/home/$USER/db_backup.tgz" 
    MIMETYPE="application/x-compressed" 
    OUTPUTPATH="/home/$USER" 
    OUTPUTDIR="db_backup"
    MYSQLDUMP="/usr/local/bin/mysqldump"
    MYSQL="/usr/local/bin/mysql"
    LOGPATH="/var/log/db_backup.log"
    
    echo -e "\n##########\nStarting Backup Script" >> $LOGPATH
    date >> $LOGPATH
    echo -e "\nDumping Databases" >> $LOGPATH
    
    # make directory if it doesn't exist
    if [ ! -d "$OUTPUTPATH/$OUTPUTDIR" ]; then
        mkdir $OUTPUTPATH/$OUTPUTDIR
    fi
    
    # clean up any old backups - save space
    rm $OUTPUTPATH/$OUTPUTDIR/*.sql >> $LOGPATH 2>&1
    
    # get a list of databases
    databases=`$MYSQL -e "SHOW DATABASES;" | tr -d "| " | grep -v Database`
    
    # dump each database in turn
    for db in $databases; do
        echo $db >> $LOGPATH
        $MYSQLDUMP $db > "$OUTPUTPATH/$OUTPUTDIR/$db.sql"
    done
    
    # now tar it all up
    echo -e "Creating archive file" >> $LOGPATH
    cd $OUTPUTPATH
    # delete existing
    if [ -e "$OUTPUTDIR.tgz" ]; then
        rm $OUTPUTDIR.tgz
    fi
    tar -zvcf $OUTPUTDIR.tgz $OUTPUTDIR/ >> $LOGPATH
    
    # let's send it out
    echo -e "Emailing it to $TOEMAIL" >> $LOGPATH
    # DON'T CHANGE ANYTHING BELOW
    TMP="/tmp/tmpfil_123"$RANDOM;
    BOUNDARY=`date +%s|md5sum`
    BOUNDARY=${BOUNDARY:0:32}
    FILENAME=`basename $ATTACHMENT`
    
    rm -rf $TMP;
    cat $ATTACHMENT|uuencode --base64 $FILENAME>$TMP;
    sed -i -e '1,1d' -e '$d' $TMP;#removes first & last lines from $TMP
    DATA=`cat $TMP`
    
    rm -rf $TMP;
    fappend $TMP "From: $FREMAIL";
    fappend $TMP "To: $TOEMAIL";
    fappend $TMP "Reply-To: $FREMAIL";
    fappend $TMP "Subject: $SUBJECT";
    fappend $TMP "Content-Type: multipart/mixed; boundary=\""$BOUNDARY"\"";
    fappend $TMP "";
    fappend $TMP "This is a MIME formatted message.  If you see this text it means that your";
    fappend $TMP "email software does not support MIME formatted messages.";
    fappend $TMP "";
    fappend $TMP "--$BOUNDARY";
    fappend $TMP "Content-Type: text/plain; charset=ISO-8859-1; format=flowed";
    fappend $TMP "Content-Transfer-Encoding: 7bit";
    fappend $TMP "Content-Disposition: inline";
    fappend $TMP "";
    fappend $TMP "$MSGBODY";
    fappend $TMP "";
    fappend $TMP "";
    fappend $TMP "--$BOUNDARY";
    fappend $TMP "Content-Type: $MIMETYPE; name=\"$FILENAME\"";
    fappend $TMP "Content-Transfer-Encoding: base64";
    fappend $TMP "Content-Disposition: attachment; filename=\"$FILENAME\";";
    fappend $TMP "";
    fappend $TMP "$DATA";
    fappend $TMP "";
    fappend $TMP "";
    fappend $TMP "--$BOUNDARY--";
    fappend $TMP "";
    fappend $TMP "";
    #cat $TMP>out.txt
    cat $TMP|sendmail -t;
    rm $TMP;
    
    echo -e "\nComplete\n##########\n" >> $LOGPATH
    So, now the instructions:

    1. Put this in your home directory, /home/USERNAME/ (or anywhere you want really, just remember the path.)
    2. Make sure you change the specific variables at the top of the script. Everything that needs to be changed is commented.
    3. CHMOD it to 755
    4. Create the cron job to run it. Here's WestHost's documentation about crons: http://helpdocs.westserver.net/v3/si...Using_Cron.htm
      1. From the SSH command line enter this command: crontab -e
      2. Add this to the bottom line on the file:
        Code:
        31 22 * * * /home/USERNAME/database_backup.sh
      3. Press ctrl+o and Enter to save, then ctrl+x to quit


    That will make this script run every night at 10:31 PM. Modify the time entries in the cron to adjust how often you want to run the backup.
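    For reference, the five cron fields are minute, hour, day-of-month, month, and day-of-week. A few variations on the same line (the path assumes step 1 above):

```
# min hour dom mon dow  command
31  22   *   *   *   /home/USERNAME/database_backup.sh   # nightly at 10:31 PM
31  22   *   *   0   /home/USERNAME/database_backup.sh   # weekly, Sunday nights
0   3    1   *   *   /home/USERNAME/database_backup.sh   # monthly, 1st at 3:00 AM
```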

    In a nutshell, this script first uses mysqldump to make an SQL file for each of your databases and puts them all in the db_backup directory. It then compresses that whole directory and emails it somewhere. What you do from there is up to you.

    Oh, and it will log it all to /var/log/db_backup.log. There's no sort of file size checking or log rotating, so watch out for the size of that file if you run this every day.
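    If that log does become a problem, a small logrotate rule would cap it. This is just a sketch, and it assumes your account can write to /etc/logrotate.d:

```
# /etc/logrotate.d/db_backup  (hypothetical file)
/var/log/db_backup.log {
    weekly
    rotate 4
    compress
    missingok
    notifempty
}
```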

    I wrote this script fairly quickly, so I'm sure it could use some polishing; there are probably some checks that I need to add before some of the commands. But it works, so I'm satisfied for now.
    Last edited by Fliggerty; 02-27-2010 at 03:10 PM.

  2. #2 Moderator wildjokerdesign (Kansas City Mo, joined Jun 2003, 5,721 posts)

    Nice work! I want to look at it more and maybe work with you on adding some checking and such. I'll look closer tomorrow.
    Shawn
    Please remember your charity of choice: http://www.redcross.org

    Handy Links: wildjokerdesign.net | Plain Text Editors: EditPlus | Crimson

  3. #3

    Thanks for the script! I've got it set up now to run every other day. My initial test worked great. And just a note for folks who are on a 3.0 environment: the way to edit a cron script is editcron -e.

  4. #4 Junior Member (joined Apr 2009, 24 posts)

    I intend to work on a different revision of this today. I'm going to add all of the proper checks that it should have had to begin with. I also intend to add the option to have it scp (or sftp, haven't decided which yet) the databases directly to another server. To be honest though, I've never written a script that takes flag input, so that is going to be a grand adventure!
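    For what it's worth, getopts makes flag input fairly painless. Here is a minimal sketch of how an email flag and an scp-target flag might be parsed; the -e/-r flag names are made up for the example and are not part of the script above:

```shell
#!/bin/sh
# parse hypothetical flags: -e EMAIL, -r REMOTE (user@host:path)
parse_flags () {
    EMAIL=""; REMOTE=""; OPTIND=1
    while getopts "e:r:" opt; do
        case $opt in
            e) EMAIL=$OPTARG ;;
            r) REMOTE=$OPTARG ;;
            *) echo "usage: $0 [-e email] [-r user@host:path]" >&2; return 1 ;;
        esac
    done
}

parse_flags -e you@example.com -r backup@host:/backups
echo "email=$EMAIL remote=$REMOTE"
```

    The two variables would then feed TOEMAIL and an scp command inside the script.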

    Oh, and editcron and crontab are synonymous in 3.0; one is an alias of the other.

  5. #5 Moderator wildjokerdesign (Kansas City Mo, joined Jun 2003, 5,721 posts)

    Fliggerty,

    Just a quick note on the script. In the stuff I was working with before, I had been playing with the idea of using the default $USER variable, which is the same as having people edit the file and put in their own username. I was actually striving for something that would still work even if the person did not edit the script. I am going to play with the script today also, and perhaps we can compare versions at some point. I figure even if we work on the same things, that is not a bad thing.
    Shawn

  6. #6 Junior Member (joined Apr 2009, 24 posts)

    That is a fantastic idea. It would indeed limit the need for much of the customization of the script. However, they would still need to input an email address. I wonder if it would be simpler to add that as an argument that can be put in the cron instead?
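    Passing the address on the cron line would look something like this sketch: $1 is the first argument to the script, with the account default as the fallback when none is given:

```shell
#!/bin/sh
# take the destination address from the first argument,
# falling back to the account default when none is given
TOEMAIL="${1:-$USER@$HOSTNAME}"
echo "sending to $TOEMAIL"
```

    The cron entry would then read 31 22 * * * /home/USERNAME/database_backup.sh someone@example.com.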

  7. #7 Moderator wildjokerdesign (Kansas City Mo, joined Jun 2003, 5,721 posts)

    At the moment I have this working.
    Code:
    TOEMAIL="$USER@$HOSTNAME"; #where you want it sent to
    FREMAIL="$USER@$HOSTNAME"; #where it's sending from, doesn't have to be a real address
    Sends to the default email address created on accounts. I am making slow progress on this, but I have moved the mailing process to a subroutine/function. My idea is twofold: I would like to add the ability to zip and send the /var/www/html directory, but I think trying to send that and the databases at one time might exceed the maximum attachment size. I already ran into that problem. Even the databases themselves could exceed the limits. In that case there might be a need to split things up even further and email the pieces.
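    On the size limits: one workaround is to cut the archive into fixed-size pieces with split(1) and mail each piece separately. A runnable sketch on a scratch file; the 5 MB figure in the comment is a guess at a typical attachment cap, and in the real script $ATTACHMENT would be the input:

```shell
#!/bin/sh
# demo on a 12 KB scratch file; real usage would be:
#   split -b 5m "$ATTACHMENT" "$ATTACHMENT".
ARCHIVE=/tmp/db_backup_demo.tgz
dd if=/dev/zero of=$ARCHIVE bs=1024 count=12 2>/dev/null
rm -f $ARCHIVE.*                              # clear pieces from any earlier run
split -b 4k $ARCHIVE $ARCHIVE.                # makes $ARCHIVE.aa .ab .ac
cat $ARCHIVE.* > /tmp/db_backup_joined.tgz    # the receiver reassembles like this
cmp $ARCHIVE /tmp/db_backup_joined.tgz && echo "pieces match"
```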
    Shawn

  8. #8 Junior Member (joined Apr 2009, 24 posts)

    I gotta tell you, I need to get in the habit of making better use of default environment variables. Always been a weakness in my scripting.
    So yeah, I've updated the script above with those changes. I've also put in a check for the tgz file before trying to delete it. That will remove a potential error.
    I'm sure a lot of people have html dirs that are way too big to be emailed, no matter how we compress them. My thought for that is to use scp or something to copy it directly to another server.
    It occurs to me that there are a few other things that might be useful to have backed up. Directories such as /var/log, /var/lib/awstats, and some config files like /etc/php.ini, /etc/httpd/conf/httpd.conf (actually probably should do the entire /etc/httpd dir, especially if you are using SSL.)
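    Rolling those into their own archive is just more arguments to tar. A sketch demonstrated on scratch paths so it can be run anywhere; swap in the real /etc and /var directories:

```shell
#!/bin/sh
# stand-ins for /etc/php.ini and /etc/httpd; replace with the real paths
mkdir -p /tmp/demo_etc/httpd/conf
echo "config" > /tmp/demo_etc/php.ini
echo "conf"   > /tmp/demo_etc/httpd/conf/httpd.conf
# one archive holding every extra file/directory you list
tar -zcf /tmp/config_backup.tgz -C /tmp demo_etc
tar -ztf /tmp/config_backup.tgz               # list what went in
```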
    Last edited by Fliggerty; 02-27-2010 at 03:25 PM.

  9. #9 Moderator wildjokerdesign (Kansas City Mo, joined Jun 2003, 5,721 posts)

    Yes, the more I work with this, the more ideas I get. Shell scripts are not my area at all; I stumble through many things and learn from the examples of others. I could most likely do something in PHP, but I just don't think it would be as efficient.

    I think we need to find a way to limit the directories that are zipped up. It would make it so that the end user has to make some choices, but my thought is that I really don't need a backup of everything in /var/www/html. What I need are the directories that change often. For example, I have Gallery on one account; I don't need the base files, just the uploaded images. That is a community site where I allow my users to upload stuff.
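    That choice could live in a single whitelist variable of the directories that actually change, or go the other way with tar's --exclude. A sketch on made-up Gallery-like paths:

```shell
#!/bin/sh
# scratch stand-in for /var/www/html with a Gallery-like layout
mkdir -p /tmp/html_demo/gallery/albums /tmp/html_demo/gallery/lib /tmp/html_demo/uploads
touch /tmp/html_demo/gallery/albums/pic.jpg /tmp/html_demo/uploads/file.txt

# whitelist: archive only the directories that change
INCLUDE="gallery/albums uploads"
tar -zcf /tmp/site_backup.tgz -C /tmp/html_demo $INCLUDE

# blacklist: archive everything except the static base files
tar -zcf /tmp/site_backup2.tgz -C /tmp/html_demo --exclude="./gallery/lib" .
```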
    Shawn
