Automated folder backups

This is a very simple way to back up folders in an incremental way, and it can be tweaked to make more or fewer copies while keeping a defined number of copies on disk. It starts by creating one copy of one or more folders and saves it in its own folder, named after the current date and time. The next day it creates a second copy, the day after that a third, and so on. By the sixth day it will make another copy, but the script will also remove the oldest copy from disk, so you always have a predefined number of copies on disk (in this example it's 5). If for any reason the admin runs the script manually, it won't delete the two oldest folders, only the single oldest one, so keep that in mind.

Another thing: the websites_backup.service file has its last two lines commented out. This is on purpose. Initially my idea was to run the script once whenever the system reboots, but that would accumulate unnecessary copies and probably make your disk run out of space if you're not paying attention. I kept the lines there just to remind myself of that.

I’m using this script to keep copies of my websites. These are full folder copies.
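To give an idea of the end result, after a few daily runs the backup folder looks something like this (the dates here are just an illustration; the folder names follow the date format used in the script below):

/media/backup/websites_backups/
    2025-03-01_07-04-12
    2025-03-02_07-18-40
    2025-03-03_07-02-55
    2025-03-04_07-31-09
    2025-03-05_07-11-27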

websites_backup.sh

#!/bin/bash

SOURCE_DIR="/var/www/*"                                    # trailing glob so every folder under /var/www/ gets copied
BACKUP_DIR="/media/backup/websites_backups"
LOG_FILE="/home/admin/CUSTOM_SERVER_SCRIPTS/backup.log"
DATE=$(date +"%Y-%m-%d_%H-%M-%S")                          # used to name each backup folder
NUMBER_OF_BACKUPS="5"                                      # how many dated copies to keep on disk

        {
                echo "Websites Backup Script execution started at: $(date)"
                mkdir -p "$BACKUP_DIR/$DATE"
                echo "Copying all contents of /var/www/ to $BACKUP_DIR/$DATE"
                cp -ra $SOURCE_DIR "$BACKUP_DIR/$DATE"   # SOURCE_DIR stays unquoted so the glob expands
                echo "Websites Backups completed at: $(date)"
                # If more than NUMBER_OF_BACKUPS copies exist, remove the single oldest one
                BACKUP_COUNT=$(ls -1 "$BACKUP_DIR" | wc -l)
                if [ "$BACKUP_COUNT" -gt "$NUMBER_OF_BACKUPS" ]; then
                        OLDEST_FOLDER=$(ls -t1 "$BACKUP_DIR" | tail -n 1)
                        rm -rf "${BACKUP_DIR:?}/$OLDEST_FOLDER"
                        echo "Deleted oldest backup folder ($OLDEST_FOLDER), no more than $NUMBER_OF_BACKUPS backups are kept."
                fi
        } >> "$LOG_FILE" 2>&1  # Redirecting both stdout and stderr to the log file
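Before wiring it into systemd, the script has to be executable. Assuming the path used in the service file below, something like this should do it, and a quick manual run lets you check the log (keeping in mind the note above about manual runs):

sudo chmod +x /home/admin/CUSTOM_SERVER_SCRIPTS/websites_backup_script/websites_backup.sh
sudo /home/admin/CUSTOM_SERVER_SCRIPTS/websites_backup_script/websites_backup.sh
tail /home/admin/CUSTOM_SERVER_SCRIPTS/backup.log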

websites_backup.service

To automate this process, a service file can be saved in /etc/systemd/system

[Unit]
Description=Backup process for websites in /var/www/ folder.
After=network-online.target

[Service]
Type=simple
ExecStart=/home/admin/CUSTOM_SERVER_SCRIPTS/websites_backup_script/websites_backup.sh
Restart=on-failure
StartLimitInterval=0

# Running this script during system boot has been disabled.
# Uncomment if you want it enabled. Keep storage space management in mind.
#[Install]
#WantedBy=multi-user.target
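After saving (or later editing) the unit file, systemd needs to reload its unit definitions:

sudo systemctl daemon-reload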

Let's enable the service and verify it's working properly by typing:

sudo systemctl enable websites_backup.service
sudo systemctl status websites_backup.service

websites_backup.timer

[Unit]
Description=websites_backup.service Timer

[Timer]
OnCalendar=*-*-* 7:00:00
AccuracySec=1h
RandomizedDelaySec=30m
Persistent=true

[Install]
WantedBy=timers.target

Optional: set OnCalendar=Mon 7:00:00 for a weekly backup instead.

This timer triggers websites_backup.service, which in turn runs the websites_backup.sh script. The process repeats every day at 7 AM: AccuracySec=1h allows the run to start within one hour of the scheduled time, and RandomizedDelaySec=30m adds up to 30 extra minutes of random delay to avoid eventual process spikes (if any). I believe it's good practice to add a randomized delay just in case you forget about it and later create a lot of services starting at the same time.
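Assuming the timer is also saved in /etc/systemd/system as websites_backup.timer, it can be enabled and checked with:

sudo systemctl daemon-reload
sudo systemctl enable --now websites_backup.timer
systemctl list-timers websites_backup.timer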

In the near future I’d like to expand the functionality by also keeping 3 weekly copies at all times.
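One possible way to do that (just a rough sketch, not something I'm running yet) would be to reuse the same script with a different BACKUP_DIR and NUMBER_OF_BACKUPS=3, paired with a copy of the service file and a second timer along these lines:

[Timer]
OnCalendar=Mon *-*-* 07:00:00
AccuracySec=1h
RandomizedDelaySec=30m
Persistent=true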

