  1. #1

    Default Backup -- better way of doing it than "the WH way"

    Hello,

    I love the ease of use of the backup solution WH has to offer, but there is one thing I do not like about it: your whole VPS is down until you have completed the download of your backup file. Is there any better free backup solution out there?

    I do not know *nix commands very well, but I wondered if there was a way to zip your whole VPS and then download it via FTP or what have you... that way I'm assuming your VPS will not be down during the file transfer.

    Thanks for any advice,

    Richard
    Join the fight against cancer -- http://www.cancer.org

    Wear Yellow -- http://www.nike.com/wearyellow

  2. #2
    Senior Member jalal's Avatar
    Join Date
    May 2003
    Location
    Germany
    Posts
    1,377

    Default

    Which parts of your server do you want backed up?

  3. #3
    Senior Member
    Join Date
    Aug 2004
    Location
    The land of the nerds
    Posts
    102

    Default

    You can use the tar and gzip commands over SSH to compress a specific directory.
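    A minimal sketch of that approach (the directory below is a scratch example created just for illustration; on a VPS you would point tar at your real directories):
    ```shell
    # Create a small example directory, archive it with tar+gzip, then list the
    # archive contents as a sanity check. All paths here are scratch examples.
    mkdir -p /tmp/site/html
    echo '<h1>hello</h1>' > /tmp/site/html/index.html

    tar czf /tmp/site-backup.tgz /tmp/site/   # c=create, z=gzip, f=archive filename
    tar tzf /tmp/site-backup.tgz              # t=list contents without extracting
    ```
    You could then download the resulting .tgz over FTP while the site keeps running.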
    I am just a customer of WestHost trying to help others. I own Wikinerds.org and you can also see my personal homepage and my blog!

  4. #4

    Default

    Most importantly, everything under:

    /var/www/
    /ftp/pub/
    /etc/mail/

    .. if the whole VPS can't be done "easily" ..

    Thanks,

    Richard

  5. #5
    Senior Member jalal's Avatar
    Join Date
    May 2003
    Location
    Germany
    Posts
    1,377

    Default

    Well, you'll find that something like (untested):

    $ tar czf /site-backup.tgz /var/www/ /ftp/pub/ /etc/mail/

    will do that quite easily. You'll end up with a file in the root called 'site-backup.tgz'.

  6. #6
    Moderator wildjokerdesign's Avatar
    Join Date
    Jun 2003
    Location
    Kansas City Mo
    Posts
    5,720

    Default Cool

    Well, I just gave Jalal's suggestion a try and it worked pretty well. Now, I only did it on a single directory for a subdomain I have been working on, but it zipped everything in no time and saved the file in root.

    Now my question is... I wonder if you could add alias commands to the .bashrc to make things even easier. My thought is you could have something like this:
    Code:
    alias back_www="tar czf /www-backup.tgz /var/www/" 
    alias back_pub="tar czf /pub-backup.tgz /ftp/pub/" 
    alias back_mail="tar czf /mail-backup.tgz /etc/mail/" 
    alias back_all="tar czf /site-backup.tgz /var/www/ /ftp/pub/ /etc/mail/"
    I haven't edited a .bashrc file for quite some time. Do you have to restart if you do? I can't remember from the last time I did it.

    The reason I like this is because, as I mentioned above, I tested it on a new subdomain I have been working on, and last night I was making changes via my laptop. This morning it was very easy to zip those changes up and download them to my main computer.

    Thanks jalal for the idea and rbayless for asking the question.
    Shawn
    Please remember your charity of choice: http://www.redcross.org

    Handy Links: wildjokerdesign.net | Plain Text Editors: EditPlus | Crimson

  7. #7
    Senior Member jalal's Avatar
    Join Date
    May 2003
    Location
    Germany
    Posts
    1,377

    Default

    Hi Shawn

    Your idea will work. You don't need the '$' in the command line though.

    Just:
    Code:
    alias back_www="tar czf /www-backup.tgz /var/www/"
    should do the job.

    You should log out of the shell and back in to have the aliases re-read from the .bashrc. No need to restart the server.
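    Logging out and back in works because the new shell re-reads .bashrc; sourcing the file does the same thing in the current session. A sketch using a scratch file (/tmp/aliases.sh is a stand-in; on the VPS the definitions would live in ~/.bashrc):
    ```shell
    # Define an alias in a file, then load it into the current shell by sourcing
    # the file. The file and paths below are stand-ins for this example only.
    echo 'alias back_www="tar czf /tmp/www-backup.tgz /tmp/www/"' > /tmp/aliases.sh
    . /tmp/aliases.sh    # same as 'source' in bash; no logout needed
    alias back_www       # prints the definition if it loaded
    ```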

  8. #8
    Moderator wildjokerdesign's Avatar
    Join Date
    Jun 2003
    Location
    Kansas City Mo
    Posts
    5,720

    Default

    Thanks Tim,
    I'll fix my post so folks won't get confused. That's what I get for using cut and paste and not proofreading closely enough.
    Shawn

  9. #9
    Senior Member torrin's Avatar
    Join Date
    May 2003
    Location
    Vista, CA
    Posts
    534

    Default

    One thing: I don't know if you plan on putting this in a cron job or not. If you are, the aliases probably will not work, because your .bashrc is not sourced when a cron job runs. So you'd probably be better off writing a script.
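    A sketch of such a script (the directories below are stand-ins created so the example is self-contained; on the VPS you would list /var/www/ /ftp/pub/ /etc/mail/ instead):
    ```shell
    #!/bin/sh
    # Back up a fixed list of directories into a dated archive. Cron runs this
    # without reading .bashrc, so no aliases are needed. Paths are stand-ins.
    mkdir -p /tmp/demo/www /tmp/demo/mail

    ARCHIVE="/tmp/site-backup-$(date +%Y%m%d).tgz"   # one archive per day
    tar czf "$ARCHIVE" /tmp/demo/www /tmp/demo/mail
    ```
    Save it as, say, site-backup.sh, make it executable with chmod +x, and schedule it with a crontab line such as `30 3 * * * /path/to/site-backup.sh`.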

  10. #10

    Default

    Quote Originally Posted by jalal
    Well, you'll find that something like (untested):

    $ tar czf /site-backup.tgz /var/www/ /ftp/pub/ /etc/mail/

    will do that quite easily. You'll end up with a file in the root called 'site-backup.tgz'
    Thank you jalal and others for your response!

    I am assuming, like the example above, that if I put in /ftp/pub/ it will back up all the child subdirectories underneath /pub/? Also, is there a limit to the number of directories the tar command line will take?

    Thanks,

    Richard

