Incremental Backup Script

I am Paranoid! (When it comes to Backups)

There, I said it. I am a paranoid person, but thankfully most of my paranoia is restricted to backups of important files. When you think about it, an academic truly is their files… my main document directory represents thousands of hours of time and years of my life. Scary thought, right?

So I pursue a multi-layered approach to backing up my critical files. My current approach utilizes manual backups, p2p backups / synchronization among my systems, and backups to Google Drive. However, even that wasn’t enough for me to be comfortable. What if I overwrote a file by accident? Only the p2p network (which uses Syncthing and includes rudimentary versioning) could save me. But what if the conflict resolution system made an incorrect decision and preserved the wrong copy of the file? What then?

Like I said, I'm paranoid. So I decided that each major work system that I use should make incremental backups on a daily basis. Essentially, from the command line you can easily search for changed files and then tar them. This produces a snap-shot style backup which could, in theory, be used to regenerate your files as they were at any given time. Write a quick cron task and now it is scheduled and automated.
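The core idea can be sketched in a couple of lines. A quick hedge: the paths here are throwaway examples for demonstration, not my actual directories.

```shell
# Find regular files changed in the last day and tar them into a
# timestamped archive -- demo paths, not my real setup.
mkdir -p /tmp/demo_docs /tmp/demo_backups
echo "draft" > /tmp/demo_docs/chapter1.txt
find /tmp/demo_docs -type f -mtime -1 -print0 |
  tar czf "/tmp/demo_backups/backup-$(date +%s).tar.gz" --null -T -
```

Piping the list through `-print0` and `--null -T -` lets tar read NUL-delimited names, so file names with spaces survive intact.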

However, just to make things more complicated I had three more requirements:

  1. While I want no more than one incremental, snap-shot style backup per day, I cannot guarantee that each system will be on at a given time.
  2. I want the snap-shots to be easy to navigate.
  3. As this is a script that I intend to run automatically for years, it has to have robust error handling and logging.
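For the second requirement, nesting snapshots in year/month/day directories keeps them easy to browse. A minimal sketch of that layout (the base directory is a placeholder):

```shell
# Build a year/month/day directory for today's snapshot -- the base
# directory is a placeholder, not my real backup location.
backupdir=/tmp/demo_snapshots
daydir="$backupdir/$(date +%Y)/$(date +%m)/$(date +%d)"
mkdir -p "$daydir"
echo "$daydir"
```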

So this is now more complex than a single line of Bash; time for some scripting!

I am attaching the solution I came up with, but I'll explain the reasoning behind each piece in another post.

Here is the script I came up with. There is nothing innovative about it, but it has been running once an hour (scheduled with cron) for four months without a single hiccup or fault, which is my definition of success! If it is useful to you, feel free to use/modify/abuse it… consider it under the MIT License.
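For reference, an hourly schedule looks like the entry below; the script's own 24-hour check ensures at most one archive per day even though cron fires every hour. The install path is a placeholder, not where I actually keep it.

```shell
# Example crontab entry (edit with `crontab -e`): run at the top of
# every hour. The script path is a placeholder.
0 * * * * /home/user/bin/incremental-backup.sh
```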

Thanks for reading, and I'd love to hear any thoughts on how to improve it.

#!/bin/bash

set -ue

function build_vars {
  # Current Date / Time
  epoch=$(date +%s)
  year=$(date +%Y)
  month=$(date +%m)
  day=$(date +%d)

  # Building Environment: Directories
  yeardir="$backupdir/$year"
  monthdir="$yeardir/$month"
  daydir="$monthdir/$day"

  # Building Environment: File Names
  logfile="$configdir/backup.log"
  lastepochfile="$configdir/lastepoch"
  backupfile="$daydir/backup-$epoch.tar.gz"
}

function check_epoch {
  # Checking Time Since Last Run
  if [ -n "$(find "$lastepochfile" -mtime -1)" ]; then
    echo "Less than 24 hours since last archive"
    exit 0
  fi
}

function check_env {
  # Checking Configuration Directory
  if [ ! -d "$configdir" ]; then
    mkdir -p "$configdir"
    echo "$epoch" Making "$configdir" >> "$logfile"
  fi

  # First run: create the last-epoch marker with an ancient timestamp
  # so check_epoch passes and the first archive captures everything
  if [ ! -f "$lastepochfile" ]; then
    touch -d "@0" "$lastepochfile"
  fi

  # Checking Year Directory
  if [ ! -d "$yeardir" ]; then
    mkdir -p "$yeardir"
    echo "$epoch" Making "$yeardir" >> "$logfile"
  fi

  # Checking Month Directory
  if [ ! -d "$monthdir" ]; then
    mkdir -p "$monthdir"
    echo "$epoch" Making "$monthdir" >> "$logfile"
  fi

  # Checking Day Directory
  if [ ! -d "$daydir" ]; then
    mkdir -p "$daydir"
    echo "$epoch" Making "$daydir" >> "$logfile"
  fi
}

function write_file {
  # Incremental Tar & Compression
  echo "$epoch" Writing "$backupfile" >> "$logfile"
  find "$target_dir" -type f ! -name ".*" -newer "$lastepochfile" -print0 |
    tar czvf "$backupfile" --null -T -

  # Log Success
  echo "$epoch" Success "$backupfile" >> "$logfile"
}

function write_epoch {
  # Unprotect Last Epoch File
  chmod 600 "$lastepochfile"

  # Write Last Epoch File
  echo "$epoch" > "$lastepochfile"

  # Protect Last Epoch File
  chmod 400 "$lastepochfile"
}

# Configuration -- these values were stripped from the published copy;
# the paths below are placeholders, so adjust them for your own system
target_dir="$HOME/Documents"
backupdir="$HOME/backups"
configdir="$HOME/.incremental-backup"

build_vars
check_env
check_epoch
write_file
write_epoch

exit 0


