Automatic Backup of Files, Websites and Databases with Cron Jobs and tar

I’m going to share with you the way I do backups on my servers.

First of all, most of the servers I work with run Debian Lenny, but this should work on all Debian releases and Debian-based systems (such as Ubuntu).

First, let's say you have your website at

/var/www/mywebsite

create a file named www_backup.sh in

/etc/cron.daily

and paste in (and edit) the following contents:

#!/bin/sh
# script by olivermgrech.com

BAK=/backup/folder

umask 022

if cd "$BAK"; then
    date=$(date -I)

    # mywebsite: archive the document root
    tar -zcf "$BAK/mywebsite_bak_$date.tgz" /var/www/mywebsite

    # dump and compress the database
    mysqldump -uYOURUSERNAME -pYOURPASSWORD YOURDATABASENAME | gzip > "$BAK/mywebsite_db_mywebsite_$date.sql.gz"

    # delete backups older than 30 days
    find "$BAK" -mtime +30 -type f -exec rm -f {} \;
fi
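A backup is only useful if it restores. A quick way to sanity-check an archive is to list and extract it; the sketch below uses throwaway temporary directories rather than the real /var/www/mywebsite:

```shell
#!/bin/sh
# Build a tiny stand-in for the website and back it up the same way
# as the script above (scratch directories, not the real paths).
src=$(mktemp -d)
bak=$(mktemp -d)
echo "hello" > "$src/index.html"

date=$(date -I)
tar -zcf "$bak/mywebsite_bak_$date.tgz" -C "$src" .

# List the archive to confirm it is readable.
tar -ztf "$bak/mywebsite_bak_$date.tgz"

# Restore into a fresh directory and compare with the original.
restore=$(mktemp -d)
tar -zxf "$bak/mywebsite_bak_$date.tgz" -C "$restore"
diff "$src/index.html" "$restore/index.html" && echo "restore OK"
```

The database dump can be checked the same way: feed it back with gunzip and mysql into a scratch database before you trust it.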

Set restrictive permissions on the script; 700 (owner read/write/execute only) works well, since the file contains your database password.
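For example, demonstrated on a scratch file (on the server the target would be /etc/cron.daily/www_backup.sh):

```shell
#!/bin/sh
# Apply mode 700 to a throwaway file; substitute the real script path.
script=$(mktemp)
chmod 700 "$script"
# -rwx------ : only the owner can read, write or execute the script,
# which matters because it contains the database password.
ls -l "$script"
```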

Now open your crontab with the

crontab -e

command and add the following entry to the file that opens. (On Debian, run-parts skips files in /etc/cron.daily whose names contain a dot, such as .sh, so this explicit crontab entry is what actually schedules the script.)

00 00 * * * /etc/cron.daily/www_backup.sh
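The five fields before the command are minute, hour, day of month, month, and day of week. A couple of illustrative variants (the second schedule is an example, not from the original):

```
# m  h   dom mon dow  command
00   00  *   *   *    /etc/cron.daily/www_backup.sh    # every day at midnight
30   03  *   *   0    /etc/cron.daily/www_backup.sh    # Sundays at 03:30
```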

That's all. With this set up, you will get automatic backups every day at midnight. It's worth running the script once by hand (sh /etc/cron.daily/www_backup.sh) to confirm the archives actually appear in your backup folder.