As part of my 100 Days of Code Challenge, this is something I worked on in week 2.
I started the week with the sniffles, as my wife calls it. I had a cold and spent a lot of time reading on my phone and, unfortunately, not a lot of time working.
One of the things I did work on in that downtime was improving a backup setup I had been using.
I wrote a script to run through each of the sites I wanted backed up, and while writing it I decided I'd add a little color for a change, to make the status of the execution easier to see.
I run a lot of WordPress sites using Docker and multi-container setups. It's mainly based on similar ideas I wrote about when I first put WordPress containers into production.
There are two things we need to back up for a WordPress site: the site files and the database. Each runs inside a different container.
The content of these sites isn't updated often – many of them are essentially static at this point. Slow weekly/monthly backups are enough to secure what they store.
All of the sites and their databases get included in different system-level backups at different times of the month. It's not ideal but it's been enough so far.
In a more tailored system the file and database backups would occur more regularly – and be more targeted towards each individual site.
The container that runs the WordPress files holds them inside of a directory on the host that is mounted inside of the container. Technically – and in the default home directory backups I run as part of a different system – they can be backed up from the host by backing up those directories.
I wanted to perform the backup without relying on the host system. Instead, I came up with this one-liner that fires up a separate container using the ubuntu image and archives the files through it for the backup. It's a little complex but I'll explain each bit below.
docker run --rm --volumes-from pattonwebzinfo_wordpress_1 -v $(pwd)/backup:/backup/ ubuntu tar -cvf /backup/pattonwebz_info.tar /var/www/
docker run --rm – Run a container that will be removed when it exits.
--volumes-from pattonwebzinfo_wordpress_1 – Mount the volumes FROM another container INTO this container too.
-v $(pwd)/backup:/backup/ – Create a volume that's found at [current working directory]/backup on the host and make it accessible inside the container at /backup/.
ubuntu – The container we're using is a base Ubuntu instance, which will have the command we need available.
tar -cvf /backup/pattonwebz_info.tar /var/www/ – Using the tar command we back up the /var/www/ directory (which is mounted from the WordPress container) and store the archive inside this container's mounted volume – /backup/.
To back up the database I considered spinning up a separate mysql container and mounting the data directories into it read-only, to prevent two systems from accessing the same set of files.
I worried that file locking might be a problem with this, and it seemed like a custom image might be needed – or custom configs written in advance and mounted into the db backup container.
There were also authentication issues to deal with. I don't want to bundle credentials into scripts if I don't need to.
I opted to take a different approach: use docker exec to run a mysqldump command inside the container and redirect the output to a file on the host filesystem, at [current working directory]/backup/wp_database.sql – right next to the site's files archive.
docker exec pattonwebzinfo_mysql_1 sh -c 'exec mysqldump -u"$MYSQL_USER" -p"$MYSQL_PASSWORD" "$MYSQL_DATABASE"' > $(pwd)/backup/wp_database.sql
The database name, user and password are available inside the container as environment variables, so they can be used easily.
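The single quotes in that docker exec line matter: they stop the host shell from expanding $MYSQL_USER and friends, so the variables resolve inside the container's environment where they actually exist. A quick local illustration (no Docker needed; here env stands in for the container's environment):

```shell
# On the host this variable has a different (or no) value.
MYSQL_USER="host_value"

# Single quotes pass $MYSQL_USER through untouched, so the inner shell
# – which sees env's value, like a container would – expands it instead.
env MYSQL_USER="container_value" sh -c 'echo "$MYSQL_USER"'
# prints: container_value
```

With double quotes the host would have substituted host_value before docker exec ever ran, and the dump would fail to authenticate.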
Script it up so it cycles through all the containers I need to back up, then trigger that script via cron: targeted, scheduled backups for WordPress Docker containers.
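The loop could be sketched something like this. The site list, the <site>_wordpress_1 / <site>_mysql_1 naming pattern, and the paths are assumptions based on the examples above; DRY_RUN is a hypothetical switch that prints the commands instead of running them:

```shell
#!/bin/bash
# Sketch of a wrapper that backs up files + database for each site.
set -eu

SITES="pattonwebzinfo"      # hypothetical: one entry per site to back up
BACKUP_ROOT="$(pwd)/backup"
DRY_RUN="${DRY_RUN:-1}"     # default to printing commands; set to 0 to execute

run() {
  # Echo the command in dry-run mode, otherwise execute it.
  if [ "$DRY_RUN" = "1" ]; then echo "$*"; else eval "$*"; fi
}

backup_all() {
  for SITE in $SITES; do
    mkdir -p "$BACKUP_ROOT/$SITE"
    # Archive the site files through a throwaway ubuntu container.
    run "docker run --rm --volumes-from ${SITE}_wordpress_1 -v $BACKUP_ROOT/$SITE:/backup/ ubuntu tar -cf /backup/${SITE}_files.tar /var/www/"
    # Dump the database using the credentials already in the container's env.
    run "docker exec ${SITE}_mysql_1 sh -c 'exec mysqldump -u\"\$MYSQL_USER\" -p\"\$MYSQL_PASSWORD\" \"\$MYSQL_DATABASE\"' > $BACKUP_ROOT/$SITE/wp_database.sql"
  done
}

backup_all
```

A crontab entry along the lines of `0 3 * * 0 /home/user/bin/wp-backup.sh` (weekly, Sunday at 3am; the path is hypothetical) would then handle the scheduling.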