Welcome to The Infinity! If you're anything like me and spend your nights wondering, "Did I remember to back up the database? What would I do if something happened?", then you've come to the right place.
Backing up a database, especially one like PostgreSQL that holds precious data, is absolutely critical. But let's be honest, it can sometimes be a tedious and complicated task. In this article, we'll walk you through how to use a fantastic tool called Rclone to automatically and securely back up your data to Google Drive—for free. Don't be intimidated; by the end of this guide, you'll be one of those people who sleep soundly, knowing your data is safe. Let's get started!
Step 1: Getting Rclone Ready and Introducing It to Google Drive
First, let's get Rclone installed on your server. Open up your terminal and paste in this magic command:
curl https://rclone.org/install.sh | sudo bash
This command will install the latest version of Rclone on your system. Now comes the most crucial part: getting Rclone to talk to your Google Drive account. Relax, we won't get lost in Google's complex API panels. We'll take the simplest route.
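Before we do, a ten-second sanity check that the install actually landed:

# Should print the installed rclone version
rclone version

If that prints a version string, rclone is on your PATH and ready for the next step.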
Type `rclone config` in your terminal to start the wizard. Answer the questions just like this:
n) New remote
q) Quit config
n/q> n // We're telling it we want to create a new connection.
Enter name for new remote.
name> gdrive_backups // Give your connection a name you'll remember.
...
Choose a number from below...
...
22 / Google Drive
\ (drive)
...
Storage> 22 // We're choosing Google Drive.
Option client_id.
...
client_id> // THIS IS THE MOST IMPORTANT PART: Leave it blank and just press Enter!
Option client_secret.
...
client_secret> // LEAVE THIS BLANK AS WELL and press Enter! Rclone will use its own key.
Option scope.
...
1 / Full access all files...
\ (drive)
...
scope> 1 // Let's grant full access for now to keep things simple.
Option service_account_file.
...
service_account_file> // Leave it blank and press Enter.
Edit advanced config?
y/n> n // We don't need any advanced settings.
Use web browser to automatically authenticate rclone with remote?
y/n> y // Say yes, this is the easiest way.
After you type `y`, a little magic will happen. Rclone will give you a link in the terminal. Copy that link and paste it into your computer's web browser. Google will ask you, "An app called rclone wants to access your account, do you approve?" Go ahead and click "Allow."
After granting permission, you'll see a "Success!" page. Depending on your rclone version, the token is handed back to the terminal automatically, or you'll be shown a code to copy and paste back in. Finally, it will ask if this is a "Team Drive?"; answer `n`, confirm the settings with `y`, and exit with `q`.
That's it! To test the connection, type `rclone lsd gdrive_backups:`. If it lists the folders from your Google Drive, you're all set!
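A side note for the script-minded: once you know the answers, recent rclone versions can create the same remote without the wizard. A minimal sketch (the browser authorization step still pops up):

# Create the remote non-interactively; rclone still walks you through the OAuth consent
rclone config create gdrive_backups drive scope drive
# Double-check it exists, then list your Drive folders
rclone listremotes
rclone lsd gdrive_backups: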
Step 2: The Star of the Night: The Automated Backup Script
Now that Rclone is ready, let's write a small but mighty script that will dump our PostgreSQL backup and send it to Google Drive.
Create a file named `backup.sh` and paste the following into it:
#!/bin/bash
# --- ADJUST THE SETTINGS BELOW TO YOUR NEEDS ---
DB_NAME="my_database_name"
DB_USER="my_database_user"
RCLONE_REMOTE_NAME="gdrive_backups" # The name you gave the remote in rclone
GDRIVE_FOLDER="Database_Backups" # The folder in Drive where backups will go
# --- The Rest of the Script ---
TIMESTAMP=$(date +"%Y-%m-%d_%H-%M-%S")
BACKUP_FILE="/tmp/${DB_NAME}_${TIMESTAMP}.dump.gz" # custom-format dump, not plain SQL, so .dump instead of .sql
echo "Taking PostgreSQL backup of: ${DB_NAME}"
# We dump the database with pg_dump (custom format) and immediately compress it with gzip.
# Pro Tip: Using a ~/.pgpass file is more secure than putting passwords in scripts (see the example right after this script).
pg_dump -U "$DB_USER" -d "$DB_NAME" -h localhost --format=c --blobs | gzip > "$BACKUP_FILE"
# Let's check if pg_dump was successful.
if [ ${PIPESTATUS[0]} -ne 0 ]; then
echo "ERROR: pg_dump command failed!"
exit 1
fi
echo "Backup created successfully: ${BACKUP_FILE}"
echo "Uploading to Google Drive with Rclone..."
# We copy the file to the folder inside our Drive remote.
rclone copy "$BACKUP_FILE" "${RCLONE_REMOTE_NAME}:${GDRIVE_FOLDER}/" --progress
# Was rclone successful?
if [ $? -ne 0 ]; then
echo "ERROR: Rclone upload failed!"
exit 1
fi
echo "Upload complete. Deleting temporary file from server."
rm "$BACKUP_FILE"
echo "Process complete! See you at the next backup."
Don't forget to make `backup.sh` executable with the command `chmod +x backup.sh`!
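And since a backup you've never restored is just a hope, it's worth rehearsing the restore once. A sketch, assuming one particular night's file name (yours will differ):

# Pull one backup down from Drive...
rclone copy "gdrive_backups:Database_Backups/my_database_name_2025-01-01_02-00-00.dump.gz" /tmp/
# ...and feed it to pg_restore; --clean --if-exists drops existing objects first
gunzip -c /tmp/my_database_name_2025-01-01_02-00-00.dump.gz | pg_restore -U my_database_user -d my_database_name --clean --if-exists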
Step 3: Let's Hire the Robots: Automation with a Cron Job
Our script is great, but are we going to run it manually every night? Of course not! Linux's trusty scheduler, `cron`, was made for this.
Type `crontab -e` in the terminal and add this line to the very bottom of the file:
0 2 * * * /home/youruser/backup.sh >> /var/log/backup.log 2>&1
What does this line mean?

- `0 2 * * *`: This means "at 2:00 AM, every single night."
- `/home/youruser/backup.sh`: The full path to your script. This is very important; you must use the correct path to your script.
- `>> /var/log/backup.log 2>&1`: This part means "append everything the script says or does, errors included, to this log file." This way, if something goes wrong, you can check the log to see what happened. (One small gotcha: your user needs write permission for /var/log/backup.log; a log file in your home directory works just as well.)

Save the file and exit. You now have a robot. Every night at 2 AM, it will wake up, take your backup, upload it to Google Drive, and quietly complete its mission.
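One chore our robot skips: it never deletes old backups, so the Drive folder grows forever (and the free tier tops out at 15 GB). If you'd like a simple retention policy, a second cron line can prune anything older than 30 days. A sketch, assuming the folder name from the script:

# Hypothetical retention job: at 3:00 AM, remove backups older than 30 days
0 3 * * * rclone delete "gdrive_backups:Database_Backups" --min-age 30d >> /var/log/backup.log 2>&1

The `--min-age 30d` filter tells rclone to only match files older than 30 days, so fresh backups are left alone.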
Closing
And that's it! You now have an automation system that works diligently every night, moving your valuable data safely to the cloud. Thank you for reading this guide to the end.
If you find practical and life-saving DevOps solutions like this useful, don't forget to register with The Infinity! That way, you can become part of our community and be the first to know about new articles and tips.
Happy automating!
Resources & Further Reading
- The official documentation for the `pg_dump` command, detailing all the flags and backup formats.