Back up data from an SSH server directly to Dropbox using Dropbox Uploader

Posted at 2017-10-10

What's this?

It is vital to back up your data to avoid losing it. On a personal computer you can simply install the Dropbox client and keep everything synced. When it comes to remote servers, however, things get a little more complicated if you don't have a backup server. You may not want to back up data from the server to your personal device, because the data can be huge. Moreover, there's no guarantee that your backup script runs every time: you may shut down your personal computer, or, if it's a mobile device, it may not have a WiFi connection. Here's an easy way to back up your data directly and automatically from the server to Dropbox.

Prerequisite

  1. A Dropbox account (a Dropbox Plus subscription if you back up a large amount of data)
  2. Dropbox Uploader (https://github.com/andreafabrizi/Dropbox-Uploader) installed somewhere on the server.

How to do it?

Save the following script as backup.sh somewhere; here we assume ~/backup.sh. Edit the variables in the # Settings section for your environment.

backup.sh
#!/bin/bash
# bash (not plain sh) is required for the $SECONDS timer used below.

# Settings. Edit the variables for your environment.
DIR_TARGET=/home/mydata
DIR_TEMP=/data/username/mydata
DIR_LOG=/data/username/mylog
DIR_UPLOADER=/home/username/tools/Dropbox-Uploader

# Avoid running multiple instances at once
if [ "$$" != "`pgrep -fo $0`" ]; then
    echo "Found myself already running. Exiting. `date +%F_%T`"
    exit 1
fi

# Start timing
SECONDS=0
echo "---Starting backup at `date +%F_%T`" >> "$DIR_LOG"/backup.log

# Upload
echo "------Starting to compress at `date +%F_%T`" >> "$DIR_LOG"/backup.log
zip -r "$DIR_TEMP"/backup.zip "$DIR_TARGET"
echo "------Zipping completed. Starting to upload at `date +%F_%T`" >> "$DIR_LOG"/backup.log
"$DIR_UPLOADER"/dropbox_uploader.sh upload "$DIR_TEMP"/backup.zip /
echo "------Uploading completed. Starting to download from the server to verify at `date +%F_%T`" >> "$DIR_LOG"/backup.log

# Verify (although this can't detect a zip file that was already corrupted before the transfer to Dropbox)
"$DIR_UPLOADER"/dropbox_uploader.sh download /backup.zip "$DIR_TEMP"/backup_downloaded.zip
echo "------Download completed. Now checking md5sum at `date +%F_%T`" >> "$DIR_LOG"/backup.log
CHECKSUM_ORIGINAL=`md5sum -b < "$DIR_TEMP"/backup.zip`
CHECKSUM_ONSERVER=`md5sum -b < "$DIR_TEMP"/backup_downloaded.zip`

if [ "$CHECKSUM_ORIGINAL" = "$CHECKSUM_ONSERVER" ]; then
    echo "------Checksum matched at `date +%F_%T`" >> "$DIR_LOG"/backup.log
    IF_SUCCESSFUL=SUCCESSFUL
    rm "$DIR_TEMP"/backup_downloaded.zip

else
    echo "------ERROR: Checksum mismatch. Original: $CHECKSUM_ORIGINAL Server: $CHECKSUM_ONSERVER at `date +%F_%T`" >> "$DIR_LOG"/backup.log
    IF_SUCCESSFUL=FAILED
fi

# Get the size of the directory and compressed file
cd "$DIR_TARGET"
SIZE_DIR=`du -hd 0|cut -f 1`
cd "$DIR_TEMP"
SIZE_COMPRESSED=`ls -lah backup.zip|awk '{print $5}'|head -n 1`
SIZE_REMAINING=`"$DIR_UPLOADER"/dropbox_uploader.sh space|grep Free:|cut -f 2|sed -e 's/ Mb//'`

if [ "$SIZE_REMAINING" -ge 1000 ]; then
    UNIT_REMAINING="G"
    SIZE_REMAINING=`expr $SIZE_REMAINING / 1000`
else
    UNIT_REMAINING="M"
fi

# Get current and elapsed time
TIME_CURRENT=`date +%F_%T`
TIME_ELAPSED="$(($SECONDS / 3600))h $(($(($SECONDS % 3600)) / 60))m $(($SECONDS % 60))s"

# Log
touch "$DIR_LOG"/backup.log
echo "`date +%F_%T` $IF_SUCCESSFUL, $SIZE_DIR compressed into $SIZE_COMPRESSED ($SIZE_REMAINING$UNIT_REMAINING remaining) in $TIME_ELAPSED, " >> "$DIR_LOG"/backup.log
"$DIR_UPLOADER"/dropbox_uploader.sh upload "$DIR_LOG"/backup.log /
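The verification step above boils down to a simple round-trip checksum comparison. Here is a standalone sketch of that pattern, using two throwaway files in place of the real archive and its downloaded copy:

```shell
#!/bin/sh
# Minimal sketch of the verify step: compare the local archive with the
# copy downloaded back from Dropbox by md5 checksum. Two temporary files
# stand in for backup.zip and backup_downloaded.zip.
tmpdir=`mktemp -d`
printf 'backup payload' > "$tmpdir/backup.zip"
cp "$tmpdir/backup.zip" "$tmpdir/backup_downloaded.zip"

CHECKSUM_ORIGINAL=`md5sum < "$tmpdir/backup.zip"`
CHECKSUM_ONSERVER=`md5sum < "$tmpdir/backup_downloaded.zip"`

if [ "$CHECKSUM_ORIGINAL" = "$CHECKSUM_ONSERVER" ]; then
    IF_SUCCESSFUL=SUCCESSFUL
else
    IF_SUCCESSFUL=FAILED
fi
echo "$IF_SUCCESSFUL"
rm -r "$tmpdir"
```

Because the checksums are computed from stdin, the file names themselves don't affect the comparison; only the contents matter.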

Run it automatically

To have cron run backup.sh automatically, save the following line as mycrontab (or any name you like). It uploads to Dropbox at 3 AM every day.

0 3 * * * ~/backup.sh

Then run crontab mycrontab to install the schedule. Note that any previously installed crontab is overwritten when you import a new one, so check the current one with crontab -l first.
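For reference, the five time fields of a crontab line are minute, hour, day of month, month, and day of week, in that order. A sketch of the mycrontab file with the fields annotated:

```shell
# mycrontab — field order: minute hour day-of-month month day-of-week command
# "0 3 * * *" fires once at 03:00 every day; a leading "*" instead of "0"
# would fire every minute from 03:00 to 03:59.
0 3 * * * ~/backup.sh
```

If you want to keep existing entries, a common idiom is to append rather than replace, e.g. `(crontab -l; cat mycrontab) | crontab -`.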

Show backup status in your shell

Add the following lines to ~/.bashrc (or the equivalent configuration file of the shell you use) so that the most recent backup result is printed every time you log in. Adjust the path of the log file to match what you set in backup.sh.

~/.bashrc
# Show the last backup status (only when logging in on terminal)
# The if statement is necessary to avoid interfering with scp data transfers.
# Edit the path of the log file so that it is consistent with backup.sh.
if [[ $- =~ "i" ]]; then
    echo "Last backup: `tail -n 1 /home/username/log/backup.log`"
fi
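Note that `[[ $- =~ "i" ]]` is a bashism. If your login shell is plain sh or dash, a POSIX case statement performs the same interactivity test; a minimal sketch (when run as a non-interactive script, it reports non-interactive):

```shell
#!/bin/sh
# POSIX-portable version of the interactivity check: the special
# parameter $- contains "i" only in interactive shells.
case $- in
    *i*) shell_mode=interactive ;;
    *)   shell_mode=non-interactive ;;
esac
echo "Shell is $shell_mode"
```

Either form prevents the echo from corrupting non-interactive sessions such as scp or rsync-over-ssh.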

Note

The Dropbox client on your PC downloads the backup data from the server, which may take a long while every day and block the syncing of other files in the meantime. To avoid this, I recommend enabling the "selective sync" feature in your Dropbox client and excluding the "Apps" directory where the backup data from your server is stored.

This script compresses your data into a zip file before backing it up. You can, of course, omit that step and back up the original files. Note, however, that transferring to Dropbox may take a long while if the files are numerous. In my environment, compressing about 500,000 files (44 GB) into a 14 GB archive and transferring it to Dropbox takes roughly an hour and a half in total.
