1. Synology NAS operation

If your Synology NAS is already directly reachable from outside your network, the setup is very simple:

Control Panel > Shared Folder > Create a new shared folder to hold the backups; here I named it vpsbackup.

Control Panel > External Access > Router Configuration > add a port-forwarding rule for FTP.

To confirm that the router port forwarding works, connect with any FTP client and check.
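For example, from a machine outside your network you can try the plain command-line ftp client (a quick check only; ftp.yourNAS.com and port 21 stand in for your own DDNS hostname and whatever external port you mapped):

# ftp ftp.yourNAS.com 21

If you get a login prompt, the port forwarding is working.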

Of course, if you plan to back up to some other server instead, just skip ahead to the next section.

2. Operation on the VPS server side

First, create a new directory to hold the backups (the script below uses /backup):


# mkdir /backup

Then create a script under /home on the VPS; here I named it backup.sh:

# cd /home
# vi backup.sh

Edit this script:

#!/bin/bash
. /etc/profile
MYSQL_USER=root #mysql username
MYSQL_PASS=passba #mysql password
MYSQL_NAME=wordpress #database name
FTP_USER=USER #ftp username
FTP_PASS=passba #ftp password
FTP_IP=ftp.yourNAS.com #ftp address or domain
FTP_backup=vpsbackup #ftp backup directory
WEB_DATA=/var/www/html #your website dir
#change as your need before this line
OldWeb=Web_$(date -d -5day +"%Y%m%d").tar.gz
WebBakName=Web_$(date +%Y%m%d).tar.gz
OldData=Data_$(date -d -5day +"%Y%m%d").sql
DataBakName=Data_$(date +"%Y%m%d").sql
rm -rf /backup/Data_$(date -d -3day +"%Y%m%d").sql /backup/Web_$(date -d -3day +"%Y%m%d").tar.gz
#delete local backups from 3 days ago
cd /backup
echo "You are in backup dir"
# export SQL data uncompressed; of course, you can compress it
mysqldump -u$MYSQL_USER -p$MYSQL_PASS $MYSQL_NAME > $DataBakName
echo "Your database backup successfully completed"
#compress website files
tar zcf /backup/$WebBakName $WEB_DATA
#upload to FTP and delete remote backups from 5 days ago
ftp -v -n $FTP_IP << END
user $FTP_USER $FTP_PASS
type binary
cd $FTP_backup
delete $OldData
delete $OldWeb
put $DataBakName
put $WebBakName
bye
END


Test whether the script runs:

# sh backup.sh


Next, have the script run automatically every day at 3 am.
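A crontab entry along these lines should do it (just a sketch, assuming the script sits in /home; adjust the path to wherever you actually put it):

# crontab -e
0 3 * * * /bin/sh /home/backup.sh

The five fields mean: minute 0, hour 3, every day of the month, every month, every day of the week.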

At first I ran into an error.


Because I am using CentOS 7 and the database is MariaDB, the export command in the script is slightly different from the original one. Also, when I first used the original script it would not run automatically from cron, and I spent half a day tracking down an absolute-path problem: crontab's execution environment does not include the environment variables set for the corresponding user, so you have to set them yourself. To make them take effect, I added this line to the script:

. /etc/profile
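As an alternative to sourcing /etc/profile inside the script, you can also set the variables cron needs at the top of the crontab itself, for example (a sketch; trim the PATH to what your system actually has):

PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
0 3 * * * /bin/sh /home/backup.sh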

It doesn't matter which directory you put the script in; just remember to make the corresponding changes to the paths.

In fact, I am thinking of splitting this script into two: back up the database every day, and back up the website files weekly or at longer intervals. That comes down to personal preference; a rough crontab sketch for such a split follows below.
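For example, something like this in crontab would back up the database daily at 3 am and the website files every Sunday at 4 am (db_backup.sh and web_backup.sh are hypothetical names for the two halves of the script):

0 3 * * * /bin/sh /home/db_backup.sh
0 4 * * 0 /bin/sh /home/web_backup.sh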

The script could also send the database backup file to a specified mailbox, but I still have not figured out my Postfix mail settings, so I have not gotten that working yet. The VPS rabbit hole is too deep to fill in all at once.
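For reference, once outgoing mail does work (through Postfix or anything else that provides a working mail command), one classic approach is to uuencode the dump and pipe it to mail; this is only a sketch, and you@example.com is a placeholder:

uuencode $DataBakName $DataBakName | mail -s "VPS database backup" you@example.com

On CentOS, uuencode comes from the sharutils package, and whether the receiving side shows the file as an attachment depends on the mail client.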