Discussion:
[BackupPC-users] Backup a Mac OS X Client
Steve Waltner
2003-07-20 10:53:02 UTC
Permalink
I've seen a couple people mention getting BackupPC to backup Mac OS X,
but I've had limited success with that. Could you please outline the
steps you took to get this to run?

I've gotten through the OpenSSH client setup steps, so "ssh 192.168.1.100
-l root pwd" correctly logs in to the PowerBook (running OS X 10.2.6)
and prints "/private/var/root". I created a swaltner-powerbook entry in
the hosts file and then added the following to its per-PC config file:

$Conf{XferMethod} = 'rsync';
$Conf{ClientNameAlias} = '192.168.1.100';

After starting a backup, it connects but fails within a few seconds,
logging the following messages:

2003/7/17 15:14:33 full backup started for directory /
2003/7/17 15:15:46 Got fatal error during xfer (fileListReceive failed)
2003/7/17 15:15:56 Dump aborted (fileListReceive failed)

The instructions say it requires rsync 2.5.5, while OS X ships with
2.5.2, so I compiled and installed 2.5.6 into /usr/local/rsync-2.5.6,
set the RsyncClientPath config option, and retried, with the same
results. I ran lsof on the rsync process running on the PowerBook and
found that it was scanning files/directories before it failed. I set
RsyncShareName to something much smaller than / to see if that worked.
By setting this to a directory with about 100 files, I could get a
backup to actually start. I also noticed during this rsync backup that
the extract process running on the server was using a whole CPU,
whereas the CPU usage for SMB backups is negligible.
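(Roughly, the per-PC config for these rsync attempts would look like the
following; the bin/rsync subpath under the install prefix and the smaller
test share are illustrative guesses, not copied from my actual file:)
-----
$Conf{XferMethod}      = 'rsync';
$Conf{ClientNameAlias} = '192.168.1.100';
# Assumed location of the hand-built rsync on the PowerBook:
$Conf{RsyncClientPath} = '/usr/local/rsync-2.5.6/bin/rsync';
# A much smaller share than / for testing (illustrative path):
$Conf{RsyncShareName}  = '/Users/swaltner/Documents';
-----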

I then tried tar and ran into a different problem. I seemed to always
get the following error in the log:

2003/7/17 16:41:39 full backup started for directory /
2003/7/17 16:41:43 Got fatal error during xfer (Tar exited with error
65280 () status)
2003/7/17 16:41:53 Dump aborted (Tar exited with error 65280 () status)

I tried upgrading to GNU tar 1.13.25, downloaded from alpha.gnu.org,
but it has the same problem as the OS X-supplied version of tar.
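(That 65280, by the way, is a raw wait() status; the real exit code is in
the high byte. A quick Perl check, just to illustrate:)

perl -e 'printf "exit code %d, signal %d\n", 65280 >> 8, 65280 & 127'
# prints: exit code 255, signal 0

Exit code 255 is also what ssh returns when it fails before running the
remote command, so the remote tar path is worth double-checking.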

One of these methods failed when it hit a 2.6 GB file on my hard drive.
I'm assuming this was the rsync method, since I don't remember getting
anything besides the 65280 error for tar. I went through the various
programs (rsync, OpenSSH, and Perl on both client and server), and all
of them reported largefile support.
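(For perl, the Config module makes the largefile check easy to confirm;
a one-liner like this is all I mean by "report":)

perl -MConfig -e 'print "uselargefiles=$Config{uselargefiles} lseeksize=$Config{lseeksize}\n"'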

I've tried quite a few things and can't get this figured out, so if
anyone has suggestions on what to do, I'd appreciate the help. We've
got several PowerBooks and PowerMacs that have no backup solution right
now. In fact, one of our desktop users had a hard drive failure last
week and lost some files, which is why I'm trying to get them added to
the backup rotation.

Steve
Steve Waltner
2003-07-21 15:27:23 UTC
Permalink
On Sunday, July 20, 2003, at 08:52 America/Montreal, Steve Waltner wrote:
Post by Steve Waltner
I've seen a couple people mention getting BackupPC to backup Mac OS
X, but I've had limited success with that. Could you please outline
the steps you took to get this to run?
1. Installed fink (http://fink.sf.net/) on the MacOSX client.
2. Type "sudo apt-get install rsync" in the terminal. This will
trigger fink to download and install rsync 2.5.5.
3. Put backuppc's public SSH key in ~gfk/.ssh/authorized_keys on the
MacOSX client. Check that it works. I don't know about you, but it
always takes me 2-3 times to make these things work...
4. Set up sudo so that it doesn't ask for a password on the MacOSX
client (a sudoers entry for the backup user ending in NOPASSWD: ALL; a
quick non-interactive check is sketched after step 6).
5. Add the following to the per-PC config file:
-----
$Conf{XferMethod} = 'rsync';
# This is the path of fink's version of rsync on the MacOSX client.
$Conf{RsyncClientPath} = '/sw/bin/rsync';
# I log as user gfk and use sudo instead of root to do the backup.
$Conf{RsyncClientCmd} = '$sshPath -q -l gfk $host sudo $rsyncPath $argList';
$Conf{RsyncClientRestoreCmd} = '$sshPath -l gfk $host sudo $rsyncPath $argList';
# Only backup the Users directory
$Conf{RsyncShareName} = '/Users';
-----
6. Add "powerbook 0" to the end of the list in conf/hosts
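A quick way to confirm steps 3 and 4 worked is to run the whole
ssh + sudo + rsync chain non-interactively from the BackupPC server.
Something like this little Perl check (host alias, login and rsync path
are just my setup, adjust as needed):
-----
# If this prompts for a password or fails, BackupPC will fail the same way.
my $rc = system('ssh', '-q', '-l', 'gfk', 'powerbook',
                'sudo', '/sw/bin/rsync', '--version');
print $rc == 0 ? "chain OK\n" : "chain failed ($rc)\n";
-----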
Hope this helps,
GFK's
I only see a couple of differences between our two configs. The first
is that I compiled rsync 2.5.6 from source, where you downloaded rsync
via fink. The second is that I ssh into the root account, where you ssh
into your own account and then sudo to root. I installed fink this
morning and installed the rsync 2.5.5-1 package. I'm still running into
the 2 GB limitation for files with this setup.

Would you create a large file on your drive to test this limitation?
The easiest way to do this would be to launch Disk Copy and select
"New->Blank Image" from the menu bar. Create a DVD-RAM (2.6 GB), Mac OS
Extended, no encryption disk image and place it somewhere under
/Users/gfk/. Try an incremental backup and let me know what happens. I
was getting errors on a few files that were larger than 2 GB. I needed
to add them to my BackupFilesExclude list to get the backup process to
start copying files.
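(In config terms that amounts to an entry like the one below; the path is
only an example of the kind of thing I had to exclude, and with rsync the
paths are relative to the share:)
-----
$Conf{BackupFilesExclude} = ['/swaltner/Documents/Virtual PC List'];
-----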

Steve
Guillaume Filion
2003-07-21 18:20:11 UTC
Permalink
On Monday, July 21, 2003, at 13:11 America/Montreal, Steve Waltner wrote:
Post by Steve Waltner
Would you create a large file on your drive to test this limitation.
The easiest way to do this would be to launch Disk Copy and select
"New->Blank Image" from the menu bar. Create a DVD-RAM (2.6 GB), Mac
OS Extended, no encryption disk image and place it somewhere under
/Users/gfk/. Try an incremental backup and let me know what happens.
Sure, here it is:
[powerbook:~] gfk% ls -l test-backuppc.dmg
-rw-r--r-- 1 gfk staff 2599520256 Jul 21 16:09 test-backuppc.dmg

And here's the error I get when I try to do an incremental backup:
2003/7/21 16:10:49 incr backup started back to 2003-07-18 15:11:42 for
directory /Users
2003/7/21 16:10:57 Got fatal error during xfer (fileListReceive failed)
2003/7/21 16:11:07 Dump aborted (fileListReceive failed)

If I "rm test-backuppc.dmg" and try an incremental backup, I get this:
2003/7/21 16:12:53 incr backup started back to 2003-07-18 15:11:42 for
directory /Users
2003/7/21 16:13:49 incr backup 3 complete, 2744 files, 21648839 bytes,
0 xferErrs (0 bad files, 0 bad shares, 0 other)

So there really is a 2 GB limitation in rsync on MacOSX. You might want
to consider filing a bug report with the rsync project
(http://rsync.samba.org/nobugs.html).

Good luck,
GFK's
--
Guillaume Filion
Logidac Tech., Beaumont, Québec, Canada - http://logidac.com/
PGP Key and more: http://guillaume.filion.org/
c***@users.sourceforge.net
2003-07-22 05:04:16 UTC
Permalink
Post by Steve Waltner
Would you create a large file on your drive to test this limitation.
The easiest way to do this would be to launch Disk Copy and select
"New->Blank Image" from the menu bar. Create a DVD-RAM (2.6 GB), Mac OS
Extended, no encryption disk image and place it somewhere under
/Users/gfk/. Try an incremental backup and let me know what happens. I
was getting errors on a few files that were larger than 2 GB. I needed
to add them to my BackupFilesExclude list to get the backup process to
start copying files.
I just made a change to File-RsyncP to fix a bug in large file support.
Unfortunately I haven't had time to test it on a large file yet. Sorry.

The new version is 0.43 on both SourceForge and CPAN:

http://perlrsync.sourceforge.net/
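(If you pull it from CPAN, the usual module install should pick up 0.43;
the SourceForge tarball builds the same standard Makefile.PL way:)

perl -MCPAN -e 'install File::RsyncP'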

Would you mind trying again with the latest File-RsyncP?

Also, one good test to try is the real rsync on both sides. If that
fails then there is a problem with rsync's configuration. If that
works then the bug is in File-RsyncP or BackupPC.
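(Concretely, something along these lines run on the BackupPC server would
exercise native rsync end to end; the host, login and file name below are
placeholders for whatever large file is failing:)
-----
# Pull one known >2 GB file with native rsync on both ends, bypassing
# File-RsyncP entirely.
system('rsync', '-av', '-e', 'ssh -l root',
       '192.168.1.100:/Users/swaltner/test-backuppc.dmg', '/tmp/') == 0
    or die "native rsync transfer failed: $?\n";
-----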

Craig
Steve Waltner
2003-07-22 14:04:16 UTC
Permalink
Post by c***@users.sourceforge.net
Post by Steve Waltner
Would you create a large file on your drive to test this limitation.
The easiest way to do this would be to launch Disk Copy and select
"New->Blank Image" from the menu bar. Create a DVD-RAM (2.6 GB), Mac
OS
Extended, no encryption disk image and place it somewhere under
/Users/gfk/. Try an incremental backup and let me know what happens. I
was getting errors on a few files that were larger than 2 GB. I needed
to add them to my BackupFilesExclude list to get the backup process to
start copying files.
I just made a change to File-RsyncP to fix a bug in large file support.
Unfortunately I haven't had time to test it on a large file yet.
Sorry.
http://perlrsync.sourceforge.net/
Would you mind trying again with the latest File-RsyncP?
Also, one good test to try is the real rsync on both sides. If that
fails then there is a problem with rsync's configuration. If that
works then the bug is in File-RsyncP or BackupPC.
Craig
I was getting ready to let you know that there is a bug in File-RsyncP
related to largefile support. I was running File-RsyncP 0.41 on my
Solaris system. After Guillaume said he was experiencing the same
trouble, I began investigating further. I upgraded to 0.42 yesterday
afternoon, but that also failed. I verified in the output of "perl -V"
that uselargefiles=define. I also tried an rsync from the Solaris
server to grab the same 2.5 GB file and this worked fine, so the
limitation does appear to be in File-RsyncP rather than the rsync
server on Mac OS X.

I just upgraded to File-RsyncP 0.43 and retried the backup. It still
fails if I don't mask out my > 2 GB files. Is there anything else I
should check on my end?

[dhcp-153-79-9-127:~] swaltner% ls -l Documents/Virtual\ PC\
List/Windows\ 2000.vpc6/Windows\ 2000\ Professional.vhdp/BaseDrive.vhd
-rw-r--r-- 1 swaltner unknown 2940937215 Jul 22 08:26
Documents/Virtual PC List/Windows 2000.vpc6/Windows 2000
Professional.vhdp/BaseDrive.vhd
[dhcp-153-79-9-127:~] swaltner%

Here is an abbreviated output from "BackupPC_dump -f -v
swaltner-powerbook":
started full dump, share=/Users
Running: /bin/ssh -l root dhcp-153-79-9-127 /sw/bin/rsync --server
--sender --numeric-ids --perms --owner --group --devices --links
--times --block-size=2048 --recursive --ignore-times . /Users/
Xfer PIDs are now 28202
xferPids 28202
Got remote protocol 26
overflow: flags=0x61 l1=99 l2=1701273963,
lastname=swaltner/Documents/Virtual PC List/Windows 2000.vpc6/Windows
2000 Professional.vhdp/BaseDrive.vhd
overflow: flags=0x46 l1=0 l2=6646889,
lastname=swaltner/Documents/Virtual PC List/Windows 2000.vpc6/Windows
2000 Professional.vhdp/BaseDrive.vhd
fileListReceive() failed
Done: 0 files, 0 bytes
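Those "overflow" lines are consistent with the size: the file is past
what a signed 32-bit size field can hold, which is presumably where the
file-list decoding goes wrong. Just to spell out the arithmetic:

perl -e 'printf "%.0f bytes vs. 32-bit max %d\n", 2940937215, 2**31 - 1'
# 2,940,937,215 > 2,147,483,647, so a 32-bit size field can't hold it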

Steve
