Backup WordPress Database And Filesystem Data On Linux With Scripts

If you’re like me, you run a WordPress blog and are terrified at the thought of something going wrong. With core updates, theme updates, plugin updates, and server component updates, there is a lot of room for error. This is where a WordPress backup can help ease your mind.
WordPress recommends taking a backup of your blog before any of these updates, and there are even some popular plugins that will do this for you. For example, you could use UpdraftPlus or something similar, but I believe there is room for error in those as well. While I could be wrong, I think WordPress itself must be in working order for a backup plugin to run successfully.
The alternative would be to create your own backup scripts that run on a cron schedule.  We’re going to see how to do this for WordPress instances running on a Linux machine.

Creating the Backup Script

There are two core components that need to be backed up in case of a catastrophe: the WordPress files, which include plugins, themes, and uploads, and the data that resides in your database.
Create the following backup.sh script somewhere on your server:

#!/bin/bash

# Date stamp used in the backup file name, e.g. 20161230
TODAY=$(date '+%Y%m%d')

# Temporary working directory that will hold the two backup components
TEMP_DIR=/home/nraboy/backups/temp

BACKUP_NAME="blog"
DB_NAME="DATABASE_NAME_HERE"
DB_USER="DATABASE_USERNAME_HERE"
DB_PASS="DATABASE_PASSWORD_HERE"
SITE_PATH=/var/www

echo "Starting Backup..."

mkdir -p "$TEMP_DIR"

# Dump the table structure and data into a SQL file
mysqldump -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" > "$TEMP_DIR/database.sql"

# Archive the WordPress files, skipping directories we don't need
tar --exclude="updraft" -zcf "$TEMP_DIR/files.tar.gz" "$SITE_PATH"

# Bundle the SQL dump and the file archive into a single dated archive
tar -zcf "$BACKUP_NAME-$TODAY.tar.gz" -C "$TEMP_DIR" .

# Clean up the temporary directory
rm -rf "$TEMP_DIR"

echo "Backup Complete [$(du -sh "$BACKUP_NAME-$TODAY.tar.gz" | awk '{print $1}')]"


So what is happening in the above script?
First we obtain the date, which will be used when naming our backups; in my scenario I never planned to take more than one backup per day. We also define a temporary directory that will hold each of the backup components.
Next we define the backup name, which you can use to identify the backup, along with the database credentials. The end result will be a file named something like blog-20161230.tar.gz, based on the values in the script.
The SITE_PATH should be the location where your WordPress blog or website resides on the server. If you’re unsure, a common location is the /var/www directory. Within that path there should be a file called wp-config.php, which contains the database information.
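If you don’t remember those credentials, you can usually pull them straight out of wp-config.php. A quick sketch, assuming the blog lives in /var/www:

grep -E "DB_NAME|DB_USER|DB_PASSWORD" /var/www/wp-config.php

The values printed by this command are what belong in the DB_NAME, DB_USER, and DB_PASS variables of the script.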
Everything so far was initialization.
When the script runs, the temporary directory is created and the MySQL database is dumped into a SQL file. This dump contains both the table structure and the data. Once the database is dumped, all of the WordPress files are archived into a tar file, excluding any directories that we define. Excluded directories could be other backup directories, cache directories, etc.
With both files in our temporary directory, we can bundle them into a single, final tar archive.
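If you want to confirm that the bundle actually contains both pieces, you can list the archive contents without extracting anything. A quick check, assuming a backup named blog-20161230.tar.gz:

tar -tzf blog-20161230.tar.gz

You should see database.sql and files.tar.gz in the listing.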
Now that you have a script that will create and bundle a file and database backup, you need to configure it to run on a schedule using crontab.
Execute crontab -e on your server and add the following line, adjusting the path to wherever you saved backup.sh:

0 2 * * * /home/nraboy/backup.sh

The above line will execute our backup script every day at 2:00am. You’ll end up with a backup in the working directory of the cron job unless you specify an output directory in the script, which we did not.
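Since cron runs silently in the background, it can also be worth capturing the script’s output in a log file so you can confirm the backups are actually happening. One possible variation, assuming the script and log both live in /home/nraboy:

0 2 * * * /home/nraboy/backup.sh >> /home/nraboy/backup.log 2>&1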
We could easily do something like this in our script:

tar -zcf /home/nraboy/$BACKUP_NAME-$TODAY.tar.gz -C $TEMP_DIR .

In the event that something bad happens and you need to restore your WordPress blog from this backup, you could extract the files and replace what you currently have, then import the SQL file into MySQL.  This is a full snapshot, not an incremental backup.
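As a rough illustration of that restore process, here is a minimal sketch. The archive name, the working directory, and the database credentials are assumptions you would swap for your own:

# Unpack the dated backup into a working directory
mkdir -p /home/nraboy/restore
tar -xzf blog-20161230.tar.gz -C /home/nraboy/restore

# Restore the WordPress files; tar stripped the leading / when archiving,
# so extracting from / puts everything back under /var/www
tar -xzf /home/nraboy/restore/files.tar.gz -C /

# Import the SQL dump back into MySQL, entering the password when prompted
mysql -u DATABASE_USERNAME_HERE -p DATABASE_NAME_HERE < /home/nraboy/restore/database.sql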

Conclusion

While there are many free and paid WordPress backup plugins available, sometimes it takes a good old fashioned Linux script to make you feel at ease about your website or blog. I personally only take backups on a weekly or monthly basis, but your needs may be different from mine. Just note that because these backups are not incremental, they can take up a fair amount of space on your hard drive.
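If disk space does become a problem, old snapshots are easy to prune automatically. One possible approach, assuming the backups live in /home/nraboy and thirty days of history is enough, would be another cron entry or a line at the end of backup.sh:

find /home/nraboy -name "blog-*.tar.gz" -mtime +30 -delete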
