Backup a website using the tar command

If a control panel such as cPanel or Plesk is installed on the server, it provides an option to back up everything (web content, databases, etc.). If the server has no control panel (as is common for small-company and personal servers), we need to take the backup manually; large companies usually have dedicated backup storage such as EMC or NetApp. For taking a manual backup of everything on the server, the best option is the tar command.

Common Syntax for tar

Syntax : # tar -zcvpf /[backup-location]/[backup-filename] /[webcontent-location]

z : Compress the backup file with gzip to reduce its size.
c : Create a new backup archive.
v : Verbosely list the files being processed.
p : Preserve the permissions of the archived files for later restoration.
f : Use the following argument as the archive file name.

1) To create tar Archive File

The below example backs up the “2daygeek” directory content and stores it in the backup location /backup/site-backup/

# tar -cvf /backup/site-backup/2daygeek-backup-17-Dec-2013.tar /home/2daygeek
# ls -lh /backup/site-backup/
-rw-r--r-- 1 root root  58M Dec 17 05:33 2daygeek-backup-17-Dec-2013.tar

As seen above, the backup file was stored at /backup/site-backup/
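Before relying on the archive, it is worth confirming that it can be read back. A quick sketch, reusing the archive path from the example above (-t lists the contents without extracting, -x restores them):

```shell
# List the contents of the archive without extracting anything
tar -tvf /backup/site-backup/2daygeek-backup-17-Dec-2013.tar

# Restore it into a scratch directory for inspection (-C changes there first)
mkdir -p /tmp/restore-test
tar -xvpf /backup/site-backup/2daygeek-backup-17-Dec-2013.tar -C /tmp/restore-test
```

Restoring into a scratch directory first lets you verify the content before overwriting anything live.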

2) To create tar.gz Archive File and Exclude a particular directory

The below example excludes the whole “demo” directory and archives the rest of the files and folders into a “.tar.gz” file.

# tar --exclude='/home/2daygeek/demo' -zcvpf /backup/site-backup/2daygeek-backup-17-Dec-2013.tar.gz /home/2daygeek
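To confirm the exclusion actually worked, list the compressed archive and search for the excluded path; a small check, assuming the archive name from the example:

```shell
# -t lists the gzip archive; grep prints nothing if "demo" was left out
tar -tzf /backup/site-backup/2daygeek-backup-17-Dec-2013.tar.gz | grep demo \
  || echo "demo directory is not in the archive"
```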

3) To create tar.gz Archive File and Exclude a group of files

The below example excludes the “.mp3” and “.avi” files from the “demo” directory and archives the rest of the files into a “.tar.gz”.

# tar --exclude='/home/2daygeek/demo/*.avi' --exclude='/home/2daygeek/demo/*.mp3' -zcvpf /backup/site-backup/2daygeek-backup-23-nov-2013.tar.gz /home/2daygeek
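When the list of patterns grows, GNU tar can also read exclude patterns from a file via -X (long form --exclude-from). A sketch, with exclude.txt as a hypothetical file name:

```shell
# One shell-glob pattern per line in the exclude file
cat > /tmp/exclude.txt <<'EOF'
*.avi
*.mp3
EOF

tar -X /tmp/exclude.txt -zcvpf /backup/site-backup/2daygeek-backup-23-nov-2013.tar.gz /home/2daygeek
```

This keeps long exclude lists maintainable without rewriting the tar command each time.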

4) Backup shell script

I have created the file called “” in the bin directory and put the below code in it; the file permission should be 755 so that you can execute the file.

The below shell script takes a backup of each website into a separate file.

#!/bin/bash

# Location where the backups are stored #
BACKUP_DIR=/backup/site-backup
DATE=$(date +%d-%m-%Y)

# Create a dated directory inside the backup location #
mkdir -p $BACKUP_DIR/$DATE

# Take each website's backup under a separate name; repeat this line per site #
tar -zcvpf $BACKUP_DIR/$DATE/2daygeek-$DATE.tar.gz /home/2daygeek

# Delete backups older than 10 days #
find $BACKUP_DIR/ -type f -mtime +10 -exec rm -f {} \;
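The retention line deserves a closer look: find selects every file under the backup directory whose modification time is more than 10 days old and removes it. A minimal sketch of how it behaves, using /tmp paths and GNU touch -d to fake file ages (the paths here are illustrative):

```shell
BACKUP_DIR=/tmp/site-backup
mkdir -p $BACKUP_DIR

# Fake one old backup and one recent backup
touch -d "15 days ago" $BACKUP_DIR/old-backup.tar.gz
touch $BACKUP_DIR/new-backup.tar.gz

# Same rule as in the script: remove files older than 10 days
find $BACKUP_DIR -type f -mtime +10 -exec rm -f {} \;

ls $BACKUP_DIR    # only new-backup.tar.gz remains
```

Note that -mtime +10 matches files older than 10 full 24-hour periods, so a backup made exactly 10 days ago survives one more day.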

5) Cron job for scheduling backup

To schedule the job at a convenient time, use cron. I have set the cron entry to run every day at 7 o'clock in the evening to take the backup.

0 19 * * * /bin/

Hope this article was useful to you. Kindly provide your valuable feedback/comments in the comment section.

Stay tuned with us !!

Magesh Maruthamuthu

Loves to play with all Linux distributions
