Backup your website using curlftpfs and rsync

This entry was posted on Monday, 28 March, 2011.

If you have a website, your hosting provider may offer some backup options; sometimes it is even possible to back up to a remote FTP server. That is all very nice, but what if all you have is FTP access to your webspace?

This is where Linux command-line scripting provides a great solution. The idea is quite simple: on your home computer, backup server, NAS or whatever machine you have, a cron job does the following:

  1. Mount the FTP site
  2. Rsync the website with the backup
  3. Unmount the FTP site

And there you have it: only files that have been modified are rsynced to your backup, so there is no need to transfer every file on your website. Many of them will not change often anyway.

As a bash script, this looks as follows:

#!/bin/bash
# Mount the FTP site via FUSE (uid/gid make the mounted files appear as local user 1000)
curlftpfs -o umask=0777,uid=1000,gid=1000,allow_other ftp://username:password@ftp.mywebsite.org /mnt/mywebsite
# Sync the remote www directory into the local backup
rsync --recursive --times --perms --links --delete /mnt/mywebsite/www /mnt/backups/mywebsite
# Unmount the FTP site again
umount /mnt/mywebsite

This copies recursively, preserves modification times and permissions, recreates symlinks, and deletes files from the backup if they have been deleted on the remote system.
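
If you want to check what a run would do before trusting it with --delete, rsync's dry-run mode is handy. This is not part of the script above, just a sanity check to run by hand while the FTP site is mounted:

# Preview the sync: prints what would be copied or deleted without changing anything
rsync --dry-run --verbose --recursive --times --perms --links --delete /mnt/mywebsite/www /mnt/backups/mywebsite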

Note that your username and password are stored in the script, so make sure it is chmod 700 so that only root can read (and execute) it!
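
If you prefer to keep the credentials out of the script itself, a possible variation is to source them from a separate root-only file and to abort when the mount fails, so rsync never runs against an empty mount point. This is just a sketch, not part of the original script; the /etc/backup-mywebsite.conf path and the FTP_USER/FTP_PASS variable names are made up for illustration:

#!/bin/bash
# Hypothetical chmod 600 config file containing two lines:
#   FTP_USER=username
#   FTP_PASS=password
. /etc/backup-mywebsite.conf

# Mount the FTP site; stop here if mounting fails
curlftpfs -o umask=0777,uid=1000,gid=1000,allow_other "ftp://${FTP_USER}:${FTP_PASS}@ftp.mywebsite.org" /mnt/mywebsite || exit 1

# Sync, then unmount
rsync --recursive --times --perms --links --delete /mnt/mywebsite/www /mnt/backups/mywebsite
umount /mnt/mywebsite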

To get it to work, copy the shell script to your /etc/cron.daily directory. Give it a name without a dot (for example backup-mywebsite rather than backup-mywebsite.sh), because run-parts skips files whose names contain one. By default cron.daily runs at 6:25, but I’d rather have it do this work at night, so change /etc/crontab as follows:

25 2   * * *   root    test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.daily )

And then it should work.
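
Changing that line shifts all of cron.daily to 02:25, not just the backup. If you would rather leave the other daily jobs at their default time, an alternative is to give the backup script its own entry in /etc/crontab instead; the path below is just an example:

# Run only the backup script at 02:25, leaving cron.daily untouched
25 2   * * *   root    /usr/local/sbin/backup-mywebsite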
