
rsync with spaces in filenames fun


While I was trying to copy a load of files from one server to another, I hit a problem I've seen time and time again: rsync with filenames that have spaces in them. Normally this can be easily fixed via a number of methods:
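For instance (a quick sketch, with made-up paths), you can either escape the spaces so the remote shell doesn't split the path, or use rsync's -s (--protect-args) option:

# Escape the spaces so the remote shell doesn't word-split the path
rsync -av 'user@server:/home/user/My\ Documents/' /backup/docs/
# Or have rsync pass the path to the remote side untouched
rsync -avs 'user@server:/home/user/My Documents/' /backup/docs/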

Continue reading “rsync with spaces in filenames fun”

Hourly Backup Script for Email


So I was working on a script a while ago to back up all the email on my Linux box. I put together a fairly simple bash script to delve into the home directories, pick out each Maildir and back it up. Great!

DRIVE="/dev/sdb1"
HOMEDIR="/home/"
LIST="/tmp/backlist_$$.txt"
#
mount $DRIVE /backup
# Drop the fields of `date` into $1..$6 ($2=month, $3=day, $6=year)
set $(date)
for DIR in `find $HOMEDIR -name "Maildir" -print`; do
    # Third path component of /home/<user>/Maildir is the username
    NAME=`echo $DIR | awk -F/ '{print $3}'`
    tar vcfz "/backup/Mail/mail_full_${NAME}_$6-$2-$3.tgz" $DIR
done
umount /backup

So it was doing a full backup each day, which quickly became stoopid in size, so I decided to go for a weekly full and daily differentials.

DRIVE="/dev/sdb1"
HOMEDIR="/home/"
LIST="/tmp/backlist_$$.txt"
#
mount $DRIVE /backup
# $1 becomes the day-of-week abbreviation, e.g. "Sat"
set $(date)
#
if test "$1" = "Sat" ; then
    # Weekly: a full backup of all data and config settings

    # Backup of the users' Maildirs
    for DIR in `find $HOMEDIR -name "Maildir" -print`; do
        NAME=`echo $DIR | awk -F/ '{print $3}'`
        tar vcfz "/backup/Mail/mail_full_${NAME}_$6-$2-$3.tgz" $DIR
    done
    # Start the new week with a clean slate of differentials
    rm -f /backup/Mail/mail_diff*
else
    # Daily: differential backup of files changed in the last day
    for DIR in `find $HOMEDIR -name "Maildir" -print`; do
        NAME=`echo $DIR | awk -F/ '{print $3}'`
        find $DIR -depth -type f \( -ctime -1 -o -mtime -1 \) -print > $LIST
        tar vcfzT "/backup/Mail/mail_diff_${NAME}_$6-$2-$3.tgz" "$LIST"
    done
fi
sleep 5
umount /backup

Nice! So now I've got a nice little backup solution. I have a rotation function which I have yet to add; then it will be nearly perfect…
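Something along these lines is what I have in mind (an untested sketch, and the retention periods are just a guess):

# Prune old tarballs: keep roughly four weeks of fulls and a week of diffs
rotate_backups() {
    find /backup/Mail -name 'mail_full_*.tgz' -mtime +28 -delete
    find /backup/Mail -name 'mail_diff_*.tgz' -mtime +7 -delete
}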


Or will it?

So what happens if I receive an email during the day, then accidentally delete it before the daily backup? Err… uh-oh!

OK, so a secondary mail store would now work a treat:

for DIR in `find /home/ -name "Maildir" -print`; do
    NAME=`echo $DIR | awk -F/ '{print $3}'`
    /usr/bin/rsync -a --delete --exclude="dovecot.index*" --exclude="dovecot-uidlist" $DIR/. /backup/Maildirs/$NAME/Maildir/
done

Dropping that into a separate script and adding an hourly cron job (the entry below) would make sure I could go back an hour. But only an hour… hmm
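The cron side is a one-liner in /etc/crontab (the script name here is just a placeholder):

# Sync the secondary mail store at the top of every hour
0 * * * * root /usr/local/bin/mail-hourly.sh >/dev/null 2>&1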

So, utilising a magic feature of *nix filesystems (inodes and hardlinks), I created a section which keeps several generations of the files by copying them as hardlinks, so only the differences between generations are actually stored, thus not eating up precious disk space…

for DIR in `find /home/ -name "Maildir" -print`; do
    NAME=`echo $DIR | awk -F/ '{print $3}'`
    # NAME="accounts"
    echo "Running hourly backup for $NAME"
    # Shuffle the generations along; the oldest (12) is recycled as the new 0
    mv /backup/Maildirs/$NAME/Maildir.{12,tmp}
    mv /backup/Maildirs/$NAME/Maildir.{11,12}
    mv /backup/Maildirs/$NAME/Maildir.{10,11}
    mv /backup/Maildirs/$NAME/Maildir.{9,10}
    mv /backup/Maildirs/$NAME/Maildir.{8,9}
    mv /backup/Maildirs/$NAME/Maildir.{7,8}
    mv /backup/Maildirs/$NAME/Maildir.{6,7}
    mv /backup/Maildirs/$NAME/Maildir.{5,6}
    mv /backup/Maildirs/$NAME/Maildir.{4,5}
    mv /backup/Maildirs/$NAME/Maildir.{3,4}
    mv /backup/Maildirs/$NAME/Maildir.{2,3}
    mv /backup/Maildirs/$NAME/Maildir.{1,2}
    mv /backup/Maildirs/$NAME/Maildir.{0,1}
    mv /backup/Maildirs/$NAME/Maildir.{tmp,0}
    # Hardlink the newest generation's files into the recycled directory,
    # then let rsync replace only what has actually changed
    cp -al /backup/Maildirs/$NAME/Maildir.{1/.,0}
    /usr/bin/rsync -a --delete --exclude="dovecot.index*" --exclude="dovecot-uidlist" $DIR/. /backup/Maildirs/$NAME/Maildir.0/
    echo `date` > /backup/Maildirs/$NAME/Maildir.0/dumptime
done

I probably could have made this a lot more elegant (and probably will put a loop in there), but for now it does the trick. It keeps twelve hourly generations of the backup, yet really only stores the full copy once plus the differences between generations, because unchanged files are hardlinked to the same inodes rather than copied afresh each time.
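For what it's worth, the loop version of that rotation would look something like this (untested sketch, same paths as above):

BASE="/backup/Maildirs/$NAME"
# Recycle the oldest generation, shuffle the rest up one, reuse it as 0
mv "$BASE/Maildir.12" "$BASE/Maildir.tmp"
for GEN in $(seq 12 -1 1); do
    mv "$BASE/Maildir.$((GEN-1))" "$BASE/Maildir.$GEN"
done
mv "$BASE/Maildir.tmp" "$BASE/Maildir.0"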

My next task, to make this even better, is to integrate the two scripts, pretty them up, and use hourly/daily/weekly versions rather than the tarballs.