Self-Hosting Your Own Cloud – Part 11: Monthly Optical Data Backups

In my last post, I discussed how to store and manage photos and videos. Next, I’ll talk about why and how I do monthly optical data backups.

This is the 11th post in a series about protecting your privacy by self-hosting while attempting to maintain the conveniences of public cloud services. See the bottom of this post for a list.

A 3-2-1 backup strategy — at least three copies of your data, on two different types of storage media, with at least one copy kept offsite — is widely considered good practice. If you’re self-hosting a lot of services and data, this strategy becomes especially important.

For my particular setup, I think I’ve satisfied this rule:

  • Three ZFS pools set up across three different systems — one primary copy and two used as backups.
  • Two different storage media – magnetic disk (hard drives) and optical (Blu-ray Disc)
  • One copy offsite — one of my ZFS pools containing a full copy of my data, synchronized nightly, is hosted on my own hardware in a relative’s house in another city. (A sketch of what such a sync can look like follows this list.)
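
I won’t rehash the offsite replication here (earlier posts in this series cover rsync and ZFS in more detail), but purely as an illustration, a nightly sync can be as simple as an rsync over SSH run from cron. The hostname, user, and paths below are made up:

~$ rsync -aH --delete /vault/ backup@offsite.example.com:/vault/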

Having optical media in addition to hard drives is an extra level of insurance for me in the event of hardware failure.

Data Selection

First, we’ll select the data we want to back up. Blu-ray is the highest-capacity consumer optical media at the moment, yet it still only holds up to 100 GB per disc (I’m using 50 GB discs), so we’ll have to be selective about which files we back up. A simple way to do this is to create a backup source directory and then symlink files and/or other directories into this new directory.

In this example, we’ll call this directory /vault/Backup/Optical.

Let’s make this directory, and then symlink some directories:

~$ mkdir /vault/Backup/Optical
~$ cd /vault/Backup/Optical
Optical$ ln -s /vault/Containers/Data Container\ Data
Optical$ ln -s /vault/Personal/Vital\ Records
Optical$ ln -s /vault/Personal/Scans

Now, let’s verify that our symlinks are in place:

Optical$ ls -la
total 12
drwxrwxr-x 3 jordan jordan 4096 Apr 18 20:44 ./
drwxrwxr-x 3 jordan jordan 4096 May  9 02:34 ../
lrwxrwxrwx 1 jordan jordan   22 Feb 21 11:39 Container Data -> /vault/Containers/Data
lrwxrwxrwx 1 jordan jordan   21 Apr 18 20:44 Scans -> /vault/Personal/Scans
lrwxrwxrwx 1 jordan jordan   29 Feb 21 11:41 Vital Records -> /vault/Personal/Vital Records
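
Since everything has to fit on a single disc, it’s also worth checking the total size of the selection before burning; du’s -L flag makes it follow the symlinks:

Optical$ du -shL .

If the total comes in above your disc’s capacity, trim the selection before moving on.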

Software Installation

I am using genisoimage to generate my disc image and growisofs to burn my images to disc. On Ubuntu, both come from the standard repositories: genisoimage is in the package of the same name, and growisofs is provided by the dvd+rw-tools package.
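
On a recent Ubuntu release, installing both looks like this:

~$ sudo apt install genisoimage dvd+rw-tools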

Next, we’ll create a script to prepare and burn the data.

Backup Script

We’ll build a script that does the following:

  1. Generates ISO image of our data
  2. Burns the image to disc
  3. Removes the ISO image

Using your favorite editor, create an optical-backup.sh script:

#!/bin/bash

# Define the path to our data
SOURCE_DIRECTORY=/vault/Backup/Optical

# Define a temporary place for our ISO image
# I'd recommend pointing this to a place running on an SSD (not a slow hard drive)
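# Note that on many distros /tmp is a RAM-backed tmpfs, which may not have
# room for a 25-50 GB image, so adjust this path to suit your system.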
ISO_FILE_NAME=/tmp/backup-`date +%Y-%m-%d`.iso

# Define the path to our burner device
# I prefer the ``by-id'' names because they are easy to identify.
OPTICAL_DRIVE=/dev/disk/by-id/ata-HL-DT-ST_BD-RE_WH14NS40_KLSK6HH0325

# Generate an ISO image with today's date as the label. Some pretty standard options
# that can be tweaked as needed. An especially important one is the -f argument which
# will follow the symbolic links that we set up earlier.
genisoimage -udf -V "`date +%Y-%m-%d`" -J -r -iso-level 3 -f -allow-multidot -allow-leading-dots -joliet-long -o $ISO_FILE_NAME $SOURCE_DIRECTORY

# Burn the ISO to disc.
growisofs -speed=2 -Z $OPTICAL_DRIVE=$ISO_FILE_NAME

# Remove the unneeded ISO
rm -f $ISO_FILE_NAME
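
If you’re not sure what your burner’s by-id path is, listing /dev/disk/by-id/ shows stable names for every drive in the system, and the optical drive is usually easy to spot by its model name:

~$ ls -l /dev/disk/by-id/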

You’ll want to make this script executable and then test it with a blank disc. Once you are confident that it is working properly, we can schedule it to run unattended.
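
Assuming the script lives in your home directory, that looks something like:

~$ chmod +x optical-backup.sh
~$ sudo ./optical-backup.sh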

Next, we’ll set up a cron job to automatically run our backup.

Cron Job

On the first day of every month, at 3 a.m., I have a cron job configured to run my backup script.

To set up the job, edit the crontab as root. Why root? This ensures that files of every owner/permission are included in our backup and allows easy access to the optical device.

Launch the crontab editor:

$ sudo crontab -e
[sudo] password for jordan:

Add a line at the bottom of the file:

0 3 1 * * /path/to/optical-backup.sh >> /var/log/optical-backup.log 2>&1
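
For reference, the five time fields in that line break down like this:

# ┌───────── minute (0)
# │ ┌─────── hour (3 a.m.)
# │ │ ┌───── day of month (the 1st)
# │ │ │ ┌─── month (every month)
# │ │ │ │ ┌─ day of week (any)
# 0 3 1 * *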

crontab guru is a great tool for helping with the date/time format of the crontab. It’s also nice to log the backup process to a file so you can examine the results/troubleshoot problems.
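
Since the log file will grow a little every month, you may eventually want to rotate it. A minimal sketch of a logrotate config, dropped into /etc/logrotate.d/optical-backup and keeping a year of history, could look like:

/var/log/optical-backup.log {
    monthly
    rotate 12
    compress
    missingok
    notifempty
}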

Monthly Backup

On the first of every month, I simply:

  1. Go to my server system
  2. Eject the newly burned disc and label it with today’s date
  3. Place a new blank disc into the burner
  4. Place my newly burned and labeled disc into my fireproof safe
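
Before a disc goes into the safe, it’s worth a quick spot check that it actually mounts and contains what you expect. Assuming the burner shows up as /dev/sr0:

~$ sudo mount /dev/sr0 /mnt
~$ ls /mnt
~$ sudo umount /mnt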

Conclusion

When it comes to backups, covering even just a subset of your data — your most important files — is better than having no backup at all.


Self-Hosting Your Own Cloud

  1. Setting up OpenVPN
  2. SMB File Server with Automated Backups using Rsync/Rclone
  3. Note-taking with Nextcloud & Syncthing
  4. Movies and Music using Emby
  5. Protect Yourself Online with Privacy Tools
  6. Ad and Tracker Blocking with Pi-Hole
  7. Building a Personal Private Network with WireGuard
  8. Monitoring Your Internet Traffic with ntopng
  9. Building a NAS with ZFS
  10. Photos and Videos
  11. Monthly Optical Data Backups (this post)