Forumgarden backup arrangements

spot
Posts: 40531
Joined: Tue Apr 19, 2005 5:19 pm
Location: Brigstowe

Forumgarden backup arrangements

Post by spot »

I'm currently responsible for backing up five servers. It might be a good idea to write up my current process here.

TL;DR: the text below shows commands and prices for using Contabo storage with 3x RAID redundancy: $2 a month for 250GB of user data.

There's this thing called the cloud. If you have an account with a company which owns a bit of the cloud, you can run processes locally which create, amend and delete remote resources for which you make payment.

The granularity of payment differs from one supplier to another. Amazon Web Services (AWS), for instance, bills an agreed rate per minute - or second, I wasn't sure - which means you can (if you need one) create a virtual computer - a 500-CPU resource with 8TB of memory, say - run your process for a quarter of an hour, and then delete the instance. Your monthly bill might show that later as $15. That's cheaper than owning one. Right, I checked: today you can rent a virtual 448-core CPU with 6TB of memory and a 100Gbit network connection for $54.60 an hour, billed by the second. Or you can make a dozen smaller machines for the same price. So long as you destroy them afterwards, they're comparatively cheap.
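Just to sanity-check that quarter-hour figure against the hourly rate quoted above:

```shell
# a quarter of an hour at $54.60/hour, billed by the second
awk 'BEGIN { printf "%.2f\n", 54.60 * 0.25 }'   # prints 13.65
```

So roughly $13.65 before any taxes or rounding, which squares with a bill showing about $15.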

And AWS storage for data backups is $5 a month per 100GB with 1GB granularity.

Contabo turns out to be cheaper for data backups. Contabo charges $2 a month per 250GB, with 250GB and 1-month granularity, up to an account limit of 240TB. I'm paying for one unit, and that one unit is where the data from all five of the machines I'm handling is backed up.

I'm using "block storage", which is fine for a few large files. It's awful for storing a file system one file at a time - for that you need a file-system backup plan. AWS offers one, for example; I'm not sure what Contabo offers in that regard.

So, on each machine I'm backing up, I have a script which gathers all the data into one place, merges it into a single backup file, compresses the file, encrypts the file, and copies it to my Contabo block storage account. As backups accumulate I'll work out an automated deletion process which keeps just a few long-term and a few recent copies; I've not done that yet.

Code: Select all

#!/bin/bash
# remote archive to contabo, requires:
#     ~/.config/rclone/rclone.conf     (contabo permission)
#     .my.cnf                          (mysql permission)
#     passphrase                       (gpg permission)
#     contabo-backup.sh                (this script)
# jh april 2020
if (( $EUID != 0 )); then
    echo "Please run as root!"
    exit 1
fi
cd /root/contabo || exit 1
ARCHIVE=$(</etc/hostname)
BACKDIR=$ARCHIVE.Back.$(date "+%Y.%m.%d-%H.%M.%S")
echo "Contabo archive $BACKDIR started"
mkdir "/tmp/$BACKDIR"
service apache2 stop
echo "apache down"
date +%T
mysqldump --all-databases >"/tmp/$BACKDIR/$ARCHIVE.sql"
service apache2 start
date +%T
echo "apache back"
rsync -a /usr/local/bin /usr/local/sbin /var/www /etc /root "/tmp/$BACKDIR"
tar -cJ --exclude "tmp/$BACKDIR/root/.gnupg/*" -f "$BACKDIR.txz" -C / "tmp/$BACKDIR"
rm -rf "/tmp/$BACKDIR"
gpg --cipher-algo aes256 --output "$BACKDIR.txz.gpg" --passphrase-file ./passphrase --batch --yes --symmetric "$BACKDIR.txz"
rm "$BACKDIR.txz"
rclone sync -P "$BACKDIR.txz.gpg" eu2:private/$ARCHIVE --s3-no-head
rm "$BACKDIR.txz.gpg"
date +%T
echo "Contabo archive $BACKDIR completed"
All the data is gathered under a temporary directory. "tar -cJ -f" uses xz compression (the "J") to create ("c") an archive file ("f"). "gpg" outputs a symmetrically-encrypted file using a passphrase. "rclone" copies it into rented Contabo storage under my account.
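For reference, the remote the script calls "eu2" needs an entry in ~/.config/rclone/rclone.conf. The keys and endpoint below are placeholders - the exact values depend on your Contabo account - but this is the general shape of an rclone entry for an S3-compatible store:

```ini
# ~/.config/rclone/rclone.conf - hypothetical example values
[eu2]
type = s3
provider = Other
access_key_id = YOUR_ACCESS_KEY
secret_access_key = YOUR_SECRET_KEY
endpoint = eu2.contabostorage.com
```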

That runs from /etc/cron.weekly
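One gotcha worth noting: run-parts, which executes everything in /etc/cron.weekly, skips filenames containing dots, so the script has to be installed under a name without the ".sh" extension. Assuming the paths from the script's header comment, something like:

```shell
# put the backup script where the weekly cron sweep will pick it up;
# run-parts ignores names containing dots, hence no ".sh" on the destination
install -m 755 /root/contabo/contabo-backup.sh /etc/cron.weekly/contabo-backup
```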

To get the data restored, I download the backup file to the local machine and:

Code: Select all

gpg -o example.com.Back.2022.04.14-19.19.41.txz --passphrase-file ./passphrase --batch -d example.com.Back.2022.04.14-19.19.41.txz.gpg
unxz example.com.Back.2022.04.14-19.19.41.txz
tar xf example.com.Back.2022.04.14-19.19.41.tar
The -d option in gpg decrypts the specific backup file using the same passphrase it was encrypted with. "unxz" decompresses the decrypted file. "tar x" expands the tar file to recreate the temporary directory, retrieving all the selected files.
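The three restore steps can also be collapsed into a single pipeline, avoiding the intermediate .txz and .tar files, since tar's -J handles the xz stream itself. Here is a self-contained toy round trip in a scratch directory - it archives and encrypts a dummy file the same way the backup script does, then restores it in one go:

```shell
# toy round trip: archive -> encrypt -> decrypt | untar, all in a temp dir
cd "$(mktemp -d)"
echo testpass > passphrase
mkdir -p tmp/demo && echo hello > tmp/demo/file.txt
tar -cJ -f back.txz tmp/demo
gpg --cipher-algo aes256 --output back.txz.gpg --passphrase-file ./passphrase --batch --yes --symmetric back.txz
rm -rf tmp back.txz
# the three restore steps collapsed into one pipeline
gpg --passphrase-file ./passphrase --batch -d back.txz.gpg | tar xJ
cat tmp/demo/file.txt    # prints hello
```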

It took me a while to work all that out. This thread will be a helpful reminder after I've forgotten stuff.
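As for the deletion policy I haven't written yet, the core of it might look like this sketch: a filter that keeps the newest few copies, relying on the date stamp in the filename making lexical order chronological. The prune_list name and KEEP count are made up; wired to storage it would be something like rclone lsf eu2:private/$ARCHIVE | prune_list | while read -r f; do rclone deletefile eu2:private/$ARCHIVE/"$f"; done.

```shell
# hypothetical pruning rule: read backup filenames on stdin, print those
# beyond the newest $KEEP copies. Lexical sort equals chronological sort
# because the names carry +%Y.%m.%d-%H.%M.%S date stamps.
KEEP=4
prune_list() {
    sort | head -n -"$KEEP"    # GNU head: all but the last $KEEP lines
}

printf '%s\n' host.Back.2022.01.02-03.00.00.txz.gpg \
              host.Back.2022.02.06-03.00.00.txz.gpg \
              host.Back.2022.03.06-03.00.00.txz.gpg \
              host.Back.2022.04.03-03.00.00.txz.gpg \
              host.Back.2022.04.10-03.00.00.txz.gpg \
              host.Back.2022.04.17-03.00.00.txz.gpg | prune_list
# prints the two oldest names (the 2022.01.02 and 2022.02.06 copies)
```

This only keeps "the newest N"; splitting the keep set into a few recent weeklies plus a few long-term monthlies would need a slightly smarter filter on the date fields.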
Nullius in verba
To Fate I sue, of other means bereft, the only refuge for the wretched left.

Who has a spare two minutes to play in this month's FG Trivia game!
My other operating system is Slackware
Re: Forumgarden backup arrangements

Post by spot »

The other thing which might be useful is a VPS (Virtual Private Server), while I'm on the subject.

https://www.ionos.co.uk/servers/vps is the cheapest I've seen: £1.20 a month ($1.60) including taxes for one complete online Linux server with root access, and nobody else on it unless you invite them. That's around what my electricity costs to keep a Raspberry Pi permanently online at 3.5W, but with RAID drive security and commercially reliable availability.
