Rclone on QNAP - Back up cloud storage to your ancient NAS


Out Of Date Warning

This article was published on 05/09/2016, which means the content may be out of date or no longer relevant.
You should verify that the technical information in this article is still up to date before relying upon it for your own purposes.

Recently, I came across rclone, a powerful "rsync for cloud storage". It can copy/sync files from and to various cloud providers, such as Amazon S3, Google Drive, Dropbox, OneDrive and so forth. Additionally, if you are using rclone to push your local backups, you can encrypt them beforehand.
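
To give a feel for the workflow, here is a minimal sketch of day-to-day usage; the remote names s3 and secret are placeholders that you would set up via rclone config (the latter as a "crypt" remote):

## sketch only - "s3" and "secret" are example remote names
# push a local folder into an S3 bucket
rclone copy /share/photos s3:my-bucket/photos
# push through a "crypt" remote, so the files land encrypted on the cloud side
rclone sync /share/documents secret:documents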

Personally, I use S3 and Google Drive for different purposes, but had no automatic, regular backup into my own infrastructure. So I decided to install rclone on my local QNAP NAS. Since rclone is written in Go, it comes precompiled and without dependencies for most platforms. That comes in handy, as the QNAP Linux is very minimalistic, with only a few, rather old packages installed.

Install Rclone

Rclone is available as precompiled ARM or x86/x64 versions.

First, log into your NAS with ssh:

ssh admin@192.168.178.101

(Same password as the web UI)

Now, try to find out your system architecture:

[~] # uname -a
Linux NAS 3.4.6 #1 Wed Jun 1 13:01:34 CST 2016 armv5tel unknown

The above is an ARM 32-bit system. If your NAS is a more powerful model, it might be x64 or similar.
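
If you are unsure how the uname output maps to a download, a rough helper like this works; the arm/386/amd64 suffixes follow the naming used on rclone's download page, but double-check against the actual file list:

case "$(uname -m)" in
  arm*)   echo "rclone-v1.33-linux-arm.zip" ;;
  x86_64) echo "rclone-v1.33-linux-amd64.zip" ;;
  i?86)   echo "rclone-v1.33-linux-386.zip" ;;
  *)      echo "check the download page for $(uname -m)" ;;
esac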

Download the right version from the rclone downloads page, e.g.:

mkdir -p /share/backup/sync
cd /share/backup/sync

wget http://downloads.rclone.org/rclone-v1.33-linux-arm.zip
unzip rclone-v1.33-linux-arm.zip
mv rclone-v1.33-linux-arm/rclone .

For the rest of the example, I assume you install rclone and the following scripts to /share/backup/sync/.

(SHA1 sum for version 1.33 arm: 1dee0f264bd822b77a227855978d0dd8279d9fc5)
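
If you want to check the download (I assume the sum above refers to the zip archive) and make sure the binary is executable:

## compare the output against the sum above
sha1sum rclone-v1.33-linux-arm.zip
chmod +x /share/backup/sync/rclone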

SSL

Unfortunately, QNAP ships no bundled root certificates, which means no system software (such as curl or rclone) can easily access SSL websites. Rclone has a command line switch for disabling SSL validation, but that is neither very nice nor does it work with the Google Drive app authorization flow.
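
For reference, the switch in question is --no-check-certificate; I only mention it for completeness, since it skips certificate validation entirely:

## insecure - disables all certificate checks
rclone --no-check-certificate --config rclone.conf ls gdrive: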

[UPDATE] Generate certificate without installing Perl

One commenter, Elad Eayl, pointed out that you don't need Perl to generate rehashed certificates, as openssl is installed on the NAS.

cd /share/backup/sync
wget --no-check-certificate https://curl.haxx.se/ca/cacert.pem
mkdir certs
# split the bundle into one certificate per file
cat cacert.pem | awk 'split_after==1{n++;split_after=0} /-----END CERTIFICATE-----/ {split_after=1} {print > "certs/cert" n ".pem"}'
cd certs
# rename every certificate to its subject hash, as c_rehash would do
for filename in cert*pem; do mv $filename `openssl x509 -hash -noout -in $filename`.0; done
cp * /etc/ssl/certs/
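
A quick sanity check - this assumes the system OpenSSL reads its CA store from /etc/ssl/certs, which is exactly what the copy above relies on:

## should end in "Verify return code: 0 (ok)"
openssl s_client -connect accounts.google.com:443 -CApath /etc/ssl/certs < /dev/null 2>/dev/null | grep "Verify return code"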

[OLD] 1. Install "Perl" from QNAP App Manager

Web-UI -> App-Store -> Perl. That will install the perl binary to /opt/bin.

2. Download & install SSL bundle

Download the cert bundle from curl, unwrap each cert into its own file, rehash with a Perl tool and copy everything over to /etc/ssl/certs (found here).

cd /share/backup/sync
wget --no-check-certificate https://curl.haxx.se/ca/cacert.pem
mkdir certs
cat cacert.pem | awk 'split_after==1{n++;split_after=0} /-----END CERTIFICATE-----/ {split_after=1} {print > "certs/cert" n ".pem"}'
wget --ca-certificate cacert.pem https://raw.githubusercontent.com/ChatSecure/OpenSSL/master/tools/c_rehash
/opt/bin/perl c_rehash certs
cp certs/* /etc/ssl/certs/

(After a reboot the certs will be reset!)

Example: configuring Google Drive

Here is a transcript of rclone's step-by-step configuration dialogue. Be sure to open the Google Drive authorization URL in a web browser and paste back the verification code that you receive.

[/share/backup/sync] # ./rclone -v --config rclone.conf config
2016/09/05 16:50:40 rclone: Version "v1.33" starting with parameters ["./rclone" "-v" "--config" "rclone.conf" "config"]
Current remotes:

Name                 Type
====                 ====
s3                   s3

e) Edit existing remote
n) New remote
d) Delete remote
s) Set configuration password
q) Quit config
e/n/d/s/q> n
name> gdrive
Type of storage to configure.
Choose a number from below, or type in your own value
 1 / Amazon Drive
   \ "amazon cloud drive"
 2 / Amazon S3 (also Dreamhost, Ceph, Minio)
   \ "s3"
 3 / Backblaze B2
   \ "b2"
 4 / Dropbox
   \ "dropbox"
 5 / Encrypt/Decrypt a remote
   \ "crypt"
 6 / Google Cloud Storage (this is not Google Drive)
   \ "google cloud storage"
 7 / Google Drive
   \ "drive"
 8 / Hubic
   \ "hubic"
 9 / Local Disk
   \ "local"
10 / Microsoft OneDrive
   \ "onedrive"
11 / Openstack Swift (Rackspace Cloud Files, Memset Memstore, OVH)
   \ "swift"
12 / Yandex Disk
   \ "yandex"
Storage> 7
Google Application Client Id - leave blank normally.
client_id>
Google Application Client Secret - leave blank normally.
client_secret>
Remote config
Use auto config?
 * Say Y if not sure
 * Say N if you are working on a remote or headless machine or Y didn't work
y) Yes
n) No
y/n> n
If your browser doesn't open automatically go to the following link: https://accounts.google.com/o/oauth2/auth?client_id=xxxxxxxxxxxxxxxxxxxxxxx
Log in and authorize rclone for access
Enter verification code> 4/uFcSu-Wyc4ylPxSBo7zLnBCNo0iztzoI6PwP3Lf7riw
2016/09/05 16:51:18 gdrive: Saving new token in config file
--------------------
[gdrive]
client_id =
client_secret =
token = {"access_token":"xxxxxx","token_type":"Bearer","refresh_token":"xxxxxx","expiry":"2016-09-05T17:51:18.744308415+02:00"}
--------------------
y) Yes this is OK
e) Edit this remote
d) Delete this remote
y/e/d> y

You can now list your Google Drive content:

## List the content
$ ./rclone -v --config rclone.conf ls gdrive:
... output snipped

## Download it all
$ ./rclone -v --config rclone.conf sync gdrive: ./google-drive-backup
...
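
If you want to preview what a sync would do before letting it loose, rclone also has a --dry-run flag:

## preview only - nothing gets copied or deleted
$ ./rclone -v --config rclone.conf sync --dry-run gdrive: ./google-drive-backup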

I've also added an S3 bucket without too much trouble. Just create an IAM user with read/write access to the bucket; the easiest way (a rough sketch of the resulting config section follows the list):

  1. Create an S3 bucket (the name needs to be unique across all S3 buckets); remember the bucket name + region
  2. Create an IAM user under Security Credentials, download the credentials
  3. Under the Permissions tab, attach the policy AmazonS3FullAccess
  4. Use the credentials from the downloaded file when configuring rclone.
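
For reference, the resulting section in rclone.conf looks roughly like this; the key names follow rclone's S3 backend, all values are placeholders:

[s3]
type = s3
access_key_id = AKIAXXXXXXXXXXXXXXXX
secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
region = eu-west-1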

Persistent Cronjob with QNAP NAS

As mentioned, QNAP resets most of its directories when rebooting; this includes cronjobs added via crontab -e, which will be gone after a reboot. Fortunately, there is a workaround as described here:

# run the script every day at 23:01:
$ echo "1 23 * * * /share/backup/sync/backup.sh" >> /etc/config/crontab
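
If you run the setup more than once, the entry gets appended again; a small guard (my own addition) keeps the crontab free of duplicates:

# only append the job if it is not in there yet
grep -q "/share/backup/sync/backup.sh" /etc/config/crontab || echo "1 23 * * * /share/backup/sync/backup.sh" >> /etc/config/crontab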

Create the backup script at /share/backup/sync/backup.sh (using vi, or via NFS/Samba - you might need to fix the file permissions in that case).

#!/bin/sh
cd /share/backup/sync/
# the certs get reset on reboot, so copy them back before every run
cp certs/* /etc/ssl/certs/
./rclone --config rclone.conf sync gdrive: ./google-drive-backup
# ...any more backup stores that you have configured, like:
./rclone --config rclone.conf sync s3:mybucketname ./s3-mybucketname

... and make it executable, plus reload the crontab into the running system, too.

chmod +x /share/backup/sync/backup.sh
crontab /etc/config/crontab

Happy Backupping!