
1. Goal

You can find many other backup solutions in this Wiki, each of them fitting a given context.
As I did not find one that fit my case, I decided to create my own.

My context

I tested this procedure with my Linksys NSLU2, with Openslug 2.7 beta.

  • 2 attached disks on the slug,
  • 1 disk (500 GB) is storing all live data, the other (200 GB) is dedicated for backups,
  • the main disk (500 GB) is mounted on /home,
  • the backup drive (200 GB) is mounted on /backup.

The challenge

  • I have only 200 GB to back up a 500 GB disk,
  • I want to keep many dated backups of my data (not easy with only 200 GB of backup space),
  • I want each backup to be a snapshot :
    • no incremental backups : I want to be able to browse each dated backup and find all data, whether they were modified or not (a true snapshot — which sounds more and more challenging with a 200 GB backup drive),
    • no compression : I want to be able to browse backed-up data the same way as live data.

My requirements

  • I want to choose/change easily which data (directories) I back up,
  • I want to specify the backup frequency for each directory,
  • I don't want to install any additional software on my Windows PC (KIS : Keep It Simple),
  • I want the backup to occur even if all Windows workstations are shut down, i.e. I want the slug to be the master of the backup :
    • As a consequence, I will not back up the Windows system files with this procedure (I prefer to keep ghost images of all my PCs),
    • All I want to back up is the data stored on the 500 GB disk, onto the 200 GB disk,
  • I want to specify, directory by directory, the number of dated snapshots to keep. All additional snapshots must be automatically deleted,
  • I want to receive a mail with the result of the backup.

2. An overview of the solution

Taken all together, my requirements could seem insane :
how to back up 500 GB on 200 GB, with multiple snapshots (dated backups) and no incremental implementation ?

The idea is the following (thanks to all the people who proposed it on the Internet) :

  • when you create a snapshot, you compare it only with the previous snapshot,
  • when a file has not changed since the last snapshot, you do not copy it into the new snapshot, you create a hard link instead :
    • a hard link is a Unix feature that lets you create another directory entry for an existing file,
    • you can access the hard link as if it were the file itself,
    • you must delete the file and all its hard links to really delete the data : the filesystem keeps a link counter, and the data are deleted only when the counter drops to 0.
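The behaviour of hard links is easy to verify from a shell. A throwaway demo in a temporary directory (`stat -c %h` prints the link count) :

```shell
# Create a file, hard-link it, and watch the link count.
tmp=$(mktemp -d)
echo "some data" > "$tmp/original"
ln "$tmp/original" "$tmp/copy"   # a hard link, not a copy: no extra data on disk
stat -c %h "$tmp/original"       # link count is now 2
rm "$tmp/original"               # the data survive: "copy" still references them
cat "$tmp/copy"
rm -rf "$tmp"
```

This is exactly why a snapshot made mostly of hard links costs almost no disk space, yet every snapshot directory looks like a complete copy.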

The rest is only scripting...

3. Prerequisites

I tested this procedure with my Linksys NSLU2, with Openslug 2.7 beta.

You need to have some optware packages installed :

 ipkg update

to have a real scripting capability

 ipkg install bash
 ipkg install coreutils
 ipkg install gawk

to schedule the backups

 ipkg install cron

to receive the result of the backup by mail

 ipkg install mailx    # available in OpenSlug-2.7
 ipkg install nail
 ipkg install ssmtp    # available in OpenSlug-2.7

to perform the backup

 ipkg install rsync

4. Overview of the procedure

  1. Create the backup script
  2. Create the configuration file to specify which directories to backup
  3. Add cron jobs to schedule the backups
  4. Configure the mail sending

5. Step 1 : Create the backup script

  • Create a script file called backup.sh in /home/scripts.
The script is too long to be included in this procedure. Please follow this link to see the script.

NOTE that you will have to make several corrections when using the script in a pre-OpenSlug-2.7 environment. Up until OpenSlug-2.7, various system utilities were placed in the `/opt/bin` directory. OpenSlug-2.7 probably changes this standard and places them (more appropriately) in `/usr/bin`. Therefore make sure that you change `/usr/bin` to `/opt/bin` in the backup script if you want to use it on OpenSlug-2.6 or earlier distributions.

  • Do not forget to chmod 700 backup.sh
  • there are very few things to change to adapt it to your configuration. When necessary, change the following constants (first lines of the script), using vi for example :

Constants you are likely to change

  • BACKUP_DIRECTORY : the mounting point of the backup drive (/backup for me)
  • BACKUP_PARTITION : the reference of the backup drive in /dev (/dev/sdb2 for me). Used to report the available free space

Constants you are not likely to change

  • BACKUP_DIRECTORY_PATTERN : if I back up a directory called "Data", the backup directory will be named "BackupData". The word "Backup" that is prepended is contained in the BACKUP_DIRECTORY_PATTERN constant
  • STANDARD_CONFIG_FILE : the file name of the config file that specifies which directories to back up (use it if you have only one configuration file, i.e. you back up all directories at the same frequency).
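For reference, the top of backup.sh would look something like this — the constant names come from this page, but the values are examples to adapt to your own setup :

```shell
# First lines of backup.sh -- example values, adapt to your configuration
BACKUP_DIRECTORY="/backup"                        # mount point of the backup drive
BACKUP_PARTITION="/dev/sdb2"                      # used to report free space
BACKUP_DIRECTORY_PATTERN="Backup"                 # prefix of each backup directory
STANDARD_CONFIG_FILE="/home/scripts/DailyBackups" # default config file (assumed path)
```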

Script usage

With no parameters

 backup.sh

The script will print a short description of how to use it.

With 1 parameter

 backup.sh DailyBackups

The parameter is the name of a file that lists the directories to back up (see Step 2 : Create the configuration file to specify which directories to backup).

With 3 parameters (mainly for debugging purposes)

 backup.sh /home/Outlook Outlook 12

The 3 parameters are exactly equivalent to one line of the config file (see Step 2 : Create the configuration file to specify which directories to backup) :

  • the directory to backup,
  • the keyword that will be appended to the backup directory name (in this example, the backup directory will be named BackupOutlook),
  • the number of backups I want to keep (all additional backups will be deleted).

6. Step 2 : Create the configuration file to specify which directories to backup

One configuration file for a given backup frequency

You must now create one config file per backup frequency.

For example, if you want to back up some directories daily, others weekly, and some others monthly, you have to create 3 config files :

The first file will contain all directories to back up daily.
The second file will contain all directories to back up weekly.
The third file will contain all directories to back up monthly.

I propose to create them in /home/scripts (the same directory that contains backup.sh).

The syntax of the configuration file

Whatever the file, the syntax is the same.

A configuration file is a set of lines.
Each line gives 3 pieces of information, separated by a tab :

  • the directory to backup,
  • the keyword that will be appended to BACKUP_DIRECTORY_PATTERN (in the examples below, the backup directory will be called BackupUsers for the first line and BackupOutlook for the second line),
  • the number of backups I want to keep (all additional backups will be deleted).
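A tab-separated file like this is trivial to consume from bash. A hypothetical sketch of such a loop (not the actual backup.sh, whose parsing may differ) :

```shell
# parse_config FILE : print one line per configured backup entry
parse_config() {
    while IFS=$'\t' read -r dir keyword keep; do
        [ -z "$dir" ] && continue    # skip blank lines
        echo "backup $dir -> Backup$keyword (keep $keep)"
    done < "$1"
}
```

For example, `parse_config /home/scripts/DailyBackups` would print one line per directory listed in that file.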

Here is an example of DailyBackups :

/home/Users     Users   365

Here is an example of WeeklyBackups :

/home/Outlook   Outlook 52
/home/Mp3       Mp3     52
/home/Photos    Photos  52

Here is an example of MonthlyBackups :

/home/Utl       Utl     12
/home/Backup    Backup  12
/etc            Etc     12
/home/scripts   Scripts 12
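The third column drives the automatic deletion : once a backup directory holds more dated snapshots than the configured count, the oldest ones go. A minimal sketch of that rotation (hypothetical helper; it relies on the date-based names sorting chronologically, and `head -n -N` comes from GNU coreutils, which the prerequisites install) :

```shell
# prune_snapshots DEST KEEP : delete the oldest snapshots in DEST beyond KEEP
prune_snapshots() {
    dest=$1; keep=$2
    # dated names sort chronologically, so everything before the last KEEP goes
    ls -1d "$dest"/*/ 2>/dev/null | sort | head -n -"$keep" | while read -r old; do
        rm -rf "$old"
    done
}
```

For example, `prune_snapshots /backup/BackupUtl 12` would keep only the 12 most recent snapshots, matching the MonthlyBackups entries above.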

7. Step 3 : Add cron jobs to schedule the backups

Create the crontab file

  • Create the /etc/crontab file :
0 4 1 * * /home/scripts/backup.sh /home/scripts/MonthlyBackups | nail -s "Monthly backup status" myemail@myisp.fr
0 4 * * 1 /home/scripts/backup.sh /home/scripts/WeeklyBackups | nail -s "Weekly backup status" myemail@myisp.fr
0 4 * * * /home/scripts/backup.sh /home/scripts/DailyBackups | nail -s "Daily backup status" myemail@myisp.fr
  • myemail@myisp.fr : the mail address at which you want to receive the backup result,
  • the string after nail -s will be the title of the mail you will receive.

Add a script to load correctly the crontab file at startup

The issue is that the /etc/crontab file will not be correctly loaded at startup.
That's why a script is needed, that will be executed at each startup.

  • Create /etc/init.d/_configure_cron.sh :
#! /bin/bash
# load the relevant cron file

# adjust these paths to your distribution if needed (e.g. /opt/bin/crontab)
OK=0
KO=1
BIN_CRONTAB="/usr/bin/crontab"
CFG_CRONTAB="/etc/crontab"

if [ ! -e "$BIN_CRONTAB" ]; then
        echo "Missing $BIN_CRONTAB";
        exit $KO;
elif [ ! -e "$CFG_CRONTAB" ]; then
        echo "Missing $CFG_CRONTAB";
        exit $KO;
else
        $BIN_CRONTAB -u root $CFG_CRONTAB;
        $BIN_CRONTAB -l
fi

exit $OK;
  • Do not forget to chmod 700 _configure_cron.sh.
  • Create a link to automatically start the script at startup
 ln -s /etc/init.d/_configure_cron.sh /etc/rcS.d/S99configure_cron.sh

8. Step 4 : Configure the mail sending

  • It seems that nail looks for sendmail in /usr/lib, whereas sendmail is actually installed in /usr/sbin.
Therefore, create a symbolic link :
 ln -s /usr/sbin/sendmail /usr/lib/sendmail
  • Edit /etc/ssmtp/ssmtp.conf and set the three values below (the values shown are examples, consistent with the mail address used in the crontab) :
# /etc/ssmtp.conf -- a config file for sSMTP sendmail.
# The person who gets all mail for userids < 1000
root=myemail@myisp.fr
# The place where the mail goes. The actual machine name is required;
# no MX records are consulted. Commonly mailhosts are named mail.domain.com.
# The example will fit if you are in domain.com and your mailhub is so named.
mailhub=mail.myisp.fr
# Where will the mail seem to come from?
# The full hostname
hostname=myisp.fr
  • root : the sender name of the mail
  • mailhub : the name of the SMTP server of your ISP
  • hostname : the domain name of your ISP

9. Step 5 : Reboot and check that everything is ok

Reboot the slug

 reboot

Check cron jobs

 crontab -l

Check the scripts

 cd /home/scripts
 backup.sh DailyBackups

If necessary, check the backup.sh script

 cd /home/scripts
 backup.sh /home/Outlook Outlook 12
Last edited by JNC.
Based on work by JNC and Darek Sliwa.
Originally by JNC.
Page last modified on January 21, 2009, at 02:02 PM