Discussion:
[BackupPC-users] BackupPC v4.1.1-Problem restoring backups
Ib H. Rasmussen
2017-04-24 08:21:13 UTC
Permalink
I have just installed BackupPC v4.1.1 from GitHub on a new CentOS 7 server.

The backup server is also my file server. Unfortunately I
have lost a number of my data directories, but I still have several
backups from a previous BackupPC v3 installation.

So my priority is to restore the backups of the missing data. I'm using
rsync (which was also used to originally back up the data), and the data
directory is world-writeable to rule out any access-rights problem.

SELinux has been deactivated for the same reason.

When restoring, I get the following error in the BackupPC LOG file:

2017-04-20 11:38:30 User ihr requested restore to ihrsrv31-documentation
(ihrsrv31-documentation)
2017-04-20 11:38:30 Started restore on ihrsrv31-documentation (pid=6542)
2017-04-20 11:38:32 Restore failed on ihrsrv31-documentation (rsync
error: unexplained error (code 255) at io.c(629) [sender=3.0.9.6])

BackupPC::XS and rsync-bpc are both installed from GitHub.

How can I remedy this problem, and get my data back?

Best Regards

***@tdcadsl.dk


Johan Ehnberg
2017-04-24 10:05:35 UTC
Permalink
Hi Ib,

I am assuming you are running a restore job from the web UI.

First, make sure that PoolV3Enabled is on, since you are accessing a V3
pool from a V4 installation. It is found under the server settings'
general parameters.
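
If you prefer to double-check from the shell, a quick grep of config.pl
works too. The path below assumes a source install that keeps its config
under /etc/BackupPC - adjust it to wherever your config.pl actually lives:

grep PoolV3Enabled /etc/BackupPC/config.pl
# should print: $Conf{PoolV3Enabled} = 1;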

As a quick solution just to get access to the data or to do a manual
restore, you can of course browse the web UI for the required files and
download them through the browser. On the command line, you can use
BackupPC_tarCreate for a faster approach directly on the server,
especially when using large files. Do the files show up? Does this work
or are you getting errors?

To get the proper restore functions working, can you please post your
Rsync Paths/Commands/Args? Can you make a successful backup using those?
If the backups are working, comparing the backup and restore args is
essential.

If it is not working, do you still have the V3 equivalents for these
settings at hand to compare against?

Also ensure that SSH keys are installed on the new server, and that SSH
either accepts the client's new host key automatically or that you add it
manually.
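
A minimal sanity check for that part, assuming the BackupPC daemon runs as
the "backuppc" user and the client allows root logins with the installed
key (CLIENTHOST is a placeholder for your client's host name):

sudo -u backuppc ssh -o BatchMode=yes -l root CLIENTHOST whoami
# should print "root" with no password or host-key prompt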

To debug the action itself, you can run BackupPC_restore on the command
line on the server with the -v flag to get verbose output.
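
The restore requests created by the web UI are stored as RestoreInfo.NNN
files in the per-host directory, and the matching RestoreLOG files often say
more than the main LOG. The paths below are assumptions (a Debian-style
TopDir and install dir, with HOSTNAME and NNN as placeholders); adjust them
to your setup, and run BackupPC_restore without arguments first to see the
exact usage for your version:

ls /var/lib/backuppc/pc/HOSTNAME/RestoreInfo.* /var/lib/backuppc/pc/HOSTNAME/RestoreLOG.*
/usr/share/backuppc/bin/BackupPC_zcat /var/lib/backuppc/pc/HOSTNAME/RestoreLOG.NNN.z | tail -50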

Post these details and we should be able to find out what is not working
properly.

Best regards,
Johan Ehnberg
Post by Ib H. Rasmussen
I have just installed BackupPC v4.1.1 from GitHub on a new CentOS 7 server.
The backup server is also my file server. Unfortunately I
have lost a number of my data directories, but I still have several
backups from a previous BackupPC v3 installation.
So my priority is to restore the backups of the missing data. I'm using
rsync (which was also used to originally back up the data), and the data
directory is world-writeable to rule out any access-rights problem.
SELinux has been deactivated for the same reason.
2017-04-20 11:38:30 User ihr requested restore to ihrsrv31-documentation
(ihrsrv31-documentation)
2017-04-20 11:38:30 Started restore on ihrsrv31-documentation (pid=6542)
2017-04-20 11:38:32 Restore failed on ihrsrv31-documentation (rsync
error: unexplained error (code 255) at io.c(629) [sender=3.0.9.6])
BackupPC::XS and rsync-bpc are both installed from GitHub.
How can I remedy this problem, and get my data back?
Best Regards
--
Johan Ehnberg
***@molnix.com
+358503209688

Molnix Oy
molnix.com

Ib H. Rasmussen
2017-04-24 15:50:48 UTC
Permalink
Hi Johan, and thanks for the fast response,

I'm running the restore job from the Web UI, as you assume.

I have checked, and PoolV3Enabled is ON (1)

I can browse and restore single files via the browser, but since it
concerns about 2 TB of data, that is quite a job!

The Rsync Paths/Commands/Args are pretty much default, like:

###########################################################################
# Rsync/Rsyncd Configuration
# (can be overwritten in the per-PC config file)
###########################################################################
#
# Path to rsync executable on the client. If it is set, it is passed
# to rsync_bpc using the --rsync-path option. You can also add sudo,
# for example:
#
# $Conf{RsyncClientPath} = 'sudo /usr/bin/rsync';
#
# For OSX laptop clients, you can use caffeinate to make sure the laptop
# stays awake during the backup, eg:
#
# $Conf{RsyncClientPath} = '/usr/bin/sudo /usr/bin/caffeinate -ism /usr/bin/rsync';
#
# This setting only matters if $Conf{XferMethod} = 'rsync'.
#
$Conf{RsyncClientPath} = '/usr/bin/rsync';

#
# Full path to rsync_bpc on the server. Rsync_bpc is the customized
# version of rsync that is used on the server for rsync and rsyncd
# transfers.
#
$Conf{RsyncBackupPCPath} = "/usr/bin/rsync_bpc";

#
# Ssh arguments for rsync to run ssh to connect to the client.
# Rather than permit root ssh on the client, it is more secure
# to just allow ssh via a low-privileged user, and use sudo
# in $Conf{RsyncClientPath}.
#
# This setting only matters if $Conf{XferMethod} = 'rsync'.
#
$Conf{RsyncSshArgs} = [
'-e', '$sshPath -l root',
];

#
# Share name to backup. For $Conf{XferMethod} = "rsync" this should
# be a file system path, eg '/' or '/home'.
#
# For $Conf{XferMethod} = "rsyncd" this should be the name of the module
# to backup (ie: the name from /etc/rsyncd.conf).
#
# This can also be a list of multiple file system paths or modules.
# For example, by adding --one-file-system to $Conf{RsyncArgs} you
# can backup each file system separately, which makes restoring one
# bad file system easier. In this case you would list all of the mount
# points:
#
# $Conf{RsyncShareName} = ['/', '/var', '/data', '/boot'];
#
$Conf{RsyncShareName} = '/datard1/documentation';

#
# Rsync daemon port on the client, for $Conf{XferMethod} = "rsyncd".
#
$Conf{RsyncdClientPort} = 873;

#
# Rsync daemon username on client, for $Conf{XferMethod} = "rsyncd".
# The username and password are stored on the client in whatever file
# the "secrets file" parameter in rsyncd.conf points to
# (eg: /etc/rsyncd.secrets).
#
$Conf{RsyncdUserName} = '';

#
# Rsync daemon password on client, for $Conf{XferMethod} = "rsyncd".
# The username and password are stored on the client in whatever file
# the "secrets file" parameter in rsyncd.conf points to
# (eg: /etc/rsyncd.secrets).
#
$Conf{RsyncdPasswd} = '';

#
# Additional arguments for a full rsync or rsyncd backup.
#
# The --checksum argument causes the client to send full-file checksum
# for every file (meaning the client reads every file and computes the
# checksum, which is sent with the file list). On the server, rsync_bpc
# will skip any files that have a matching full-file checksum, and size,
# mtime and number of hardlinks. Any file that has different attributes
# will be updated using the block rsync algorithm.
#
# In V3, full backups applied the block rsync algorithm to every file,
# which is a lot slower but a bit more conservative. To get that
# behavior, replace --checksum with --ignore-times.
#
$Conf{RsyncFullArgsExtra} = [
'--checksum',
];

#
# Arguments to rsync for backup. Do not edit the first set unless you
# have a good understanding of rsync options.
#
$Conf{RsyncArgs} = [
'--super',
'--recursive',
'--protect-args',
'--numeric-ids',
'--perms',
'--owner',
'--group',
'-D',
'--times',
'--links',
'--hard-links',
'--delete',
'--partial',
'--log-format=log: %o %i %B %8U,%8G %9l %f%L',
'--stats',
#
# Add additional arguments here, for example --acls or --xattrs
# if all the clients support them.
#
#'--acls',
#'--xattrs',
];

#
# Additional arguments added to RsyncArgs. This can be used in
# combination with $Conf{RsyncArgs} to allow customization of
# the rsync arguments on a per-client basis. The standard
# arguments go in $Conf{RsyncArgs} and $Conf{RsyncArgsExtra}
# can be set on a per-client basis.
#
# Examples of additional arguments that should work are --exclude/--include,
# eg:
#
# $Conf{RsyncArgsExtra} = [
# '--exclude', '/proc',
# '--exclude', '*.tmp',
# '--acls',
# '--xattrs',
# ];
#
# Both $Conf{RsyncArgs} and $Conf{RsyncArgsExtra} are subject
# to the following variable substitutions:
#
# $client client name being backed up
# $host hostname (could be different from client name if
# $Conf{ClientNameAlias} is set)
# $hostIP IP address of host
# $confDir configuration directory path
#
# This allows settings of the form:
#
# $Conf{RsyncArgsExtra} = [
# '--exclude-from=$confDir/pc/$host.exclude',
# ];
#
$Conf{RsyncArgsExtra} = [];

#
# Arguments to rsync for restore. Do not edit the first set unless you
# have a thorough understanding of how File::RsyncP works.
#
# If you want to disable direct restores using rsync (eg: if the module
# is read-only), you should set $Conf{RsyncRestoreArgs} to undef and
# the corresponding CGI restore option will be removed.
#
# $Conf{RsyncRestoreArgs} is subject to the following variable
# substitutions:
#
# $client client name being backed up
# $host hostname (could be different from client name if
# $Conf{ClientNameAlias} is set)
# $hostIP IP address of host
# $confDir configuration directory path
#
# Note: $Conf{RsyncArgsExtra} doesn't apply to $Conf{RsyncRestoreArgs}.
#
$Conf{RsyncRestoreArgs} = [
'--recursive',
'--super',
'--protect-args',
'--numeric-ids',
'--perms',
'--owner',
'--group',
'-D',
'--times',
'--links',
'--hard-links',
'--delete',
'--partial',
'--log-format=log: %o %i %B %8U,%8G %9l %f%L',
'--stats',
#
# Add additional arguments here
#
#'--acls',
#'--xattrs',
];

I have tried an incremental backup, and that leads to the exact same rsync error.

I do not have the equivalent V3 settings at hand, but they could be
found. I do remember the V3 rsync commands were executed without ssh, as
the file server and backup server are the same machine (just using
different disks).

I have generated SSH keys and transferred them into
/root/.ssh/authorized_keys, but if I had made a mistake with that, I would
have expected an error concerning ssh and not rsync? Of course I could be
wrong.
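
As far as I can tell, rsync's "code 255" usually just means that the remote
shell command - ssh in this setup - failed, so perhaps an SSH mistake really
could show up looking like an rsync error. One check I could run as the
backuppc user (assuming that is the daemon user, and that the key was
installed for root on this same machine):

sudo -u backuppc ssh -o BatchMode=yes -l root localhost whoami
# if this prompts for anything, complains about the host key, or fails,
# I suppose rsync_bpc would abort with the same kind of code-255 error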

About the syntax for executing the restore from the command line I'm not
quite sure; maybe you could elaborate a bit on that (if you still think
it's worthwhile).

Best Regards
Ib H. Rasmussen
Post by Johan Ehnberg
Hi Ib,
I am assuming you are running a restore job from the web UI.
First, make sure that PoolV3Enabled is on since you are accessing a V3
pool from a V4 installation. This is found under Server settings general
parameters.
As a quick solution just to get access to the data or to do a manual
restore, you can of course browse the web UI for the required files and
download them through the browser. On the command line, you can use
BackupPC_tarCreate for a faster approach directly on the server,
especially when using large files. Do the files show up? Does this work
or are you getting errors?
To get the proper restore functions working, can you please post your
Rsync Paths/Commands/Args? Can you make a successful backup using those?
If the backups are working, comparing the backup and restore args is
essential.
If it is not working, do you still have the V3 equivalents for these
settings at hand to compare against?
Also ensure that SSH keys are installed on the new server and that SSH
is accepting the new host key (of the client) automatically or add it
manually.
To debug the action itself, you can run BackupPC_restore on the command
line on the server with the -v flag to get verbose output.
Post these details and we should be able to find out what is not working
properly.
Best regards,
Johan Ehnberg
Post by Ib H. Rasmussen
I have just installed BackupPC v4.1.1 from GitHub on a new CentOS 7 server.
The backup server is also my file server. Unfortunately I
have lost a number of my data directories, but I still have several
backups from a previous BackupPC v3 installation.
So my priority is to restore the backups of the missing data. I'm using
rsync (which was also used to originally back up the data), and the data
directory is world-writeable to rule out any access-rights problem.
SELinux has been deactivated for the same reason.
2017-04-20 11:38:30 User ihr requested restore to ihrsrv31-documentation
(ihrsrv31-documentation)
2017-04-20 11:38:30 Started restore on ihrsrv31-documentation (pid=6542)
2017-04-20 11:38:32 Restore failed on ihrsrv31-documentation (rsync
error: unexplained error (code 255) at io.c(629) [sender=3.0.9.6])
BackupPC::XS and rsync-bpc are both installed from GitHub.
How can I remedy this problem, and get my data back?
Best Regards
Johan Ehnberg
2017-04-24 19:14:36 UTC
Permalink
Hi,
Post by Ib H. Rasmussen
I can browse and restore single files via the browser, but as it
concerns about 2TB of data it is some job!!
The tarCreate option can be a workable one-off solution, since the rsync
route may need more time to set up while the web restores already work.
Basically it creates a tar that you can pipe back to your restore location:

BackupPC_tarCreate -h YOURHOSTNAME -n -0 -s / / | tar -x -C /tmp

Or, more elaborately, for a remote host over a slow link, running as a
user other than backuppc, and ensuring all file attributes can be set when
extracting, something like:

sudo sudo -i -u backuppc /usr/share/backuppc/bin/BackupPC_tarCreate -h
YOURHOSTNAME -n -0 -s / / 2> /dev/null |pv -Cq -B 256 |plzip -1 |ssh
SOMEOTHERHOST -- 'plzip -d | sudo tar -x -C /tmp'
Post by Ib H. Rasmussen
About the syntax for executing the restore from the command line I'm not
quite sure; maybe you could elaborate a bit on that (if you still think
it's worthwhile).
The options are actually well documented in the file itself (it is Perl,
not a binary, i.e. you can read the file with 'less
/usr/share/backuppc/bin/BackupPC_restore').

Let us know how this round goes.

Best regards,
Johan
--
Johan Ehnberg
***@molnix.com
+358503209688

Molnix Oy
molnix.com

Ib H. Rasmussen
2017-04-25 07:11:06 UTC
Permalink
Hi Johan,

I tried your suggestion of using tarCreate, but with no luck - I keep
getting the error "bad backup number" for the host.

To keep it simple, I decided to just make a listing of the backup and
leave the tar part out for a start.

It's all run as the backuppc user (backuppc).

I entered: /usr/local/BackupPC/bin/BackupPC_tarCreate -h ihrsrv31 -n -0
-s /datard1/documentation -l *

and got the error "bad backup number -0 for host ihrsrv31".

No matter what I enter as the backup number - a direct number like 1189
(which is a previous full V3 backup) or a relative number like -0 / -1
/ -2 as you suggest - I get the bad number error.

Have I misunderstood something?

Best Regards

Ib H. Rasmussen
Post by Johan Ehnberg
Hi,
Post by Ib H. Rasmussen
I can browse and restore single files via the browser, but as it
concerns about 2TB of data it is some job!!
The tarCreate option can be a doable solution as a one-off since the
rsync route may require more time to set up but web restores work.
BackupPC_tarCreate -h YOURHOSTNAME -n -0 -s / / | tar -x -C /tmp
Or, more elaborately for a remote host over a slow link running as
another user than backuppc and ensuring all file attributes can be set
sudo sudo -i -u backuppc /usr/share/backuppc/bin/BackupPC_tarCreate -h
YOURHOSTNAME -n -0 -s / / 2> /dev/null |pv -Cq -B 256 |plzip -1 |ssh
SOMEOTHERHOST -- 'plzip -d | sudo tar -x -C /tmp'
Post by Ib H. Rasmussen
About the syntax for executing the restore from the command line I'm not
quite sure; maybe you could elaborate a bit on that (if you still think
it's worthwhile).
They are actually documented well in the file itself (it is perl so not
binary, i.e. you can read the file with 'less
/usr/share/backuppc/bin/BackupPC_restore').
Let us know how this round goes.
Best regards,
Johan
Johan Ehnberg
2017-04-25 07:26:48 UTC
Permalink
Hi Ib,

This sounds more like a problem with BackupPC internals, if the
backup number is indeed correct. Maybe someone more familiar with the
code in question can help? Maybe something got corrupted in the pool or
metadata as well (you mentioned data loss)?
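
One quick thing to check is the per-host "backups" file, which is where the
backup numbers come from. A minimal sketch, assuming the pool lives under
/var/lib/BackupPC (adjust to your TopDir) and using the host name exactly
as it appears under pc/:

awk '{print $1, $2}' /var/lib/BackupPC/pc/ihrsrv31/backups
# first column is the backup number, second is full/incr

If that host directory or file is missing or empty, it would likely explain
the "bad backup number" message.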

Specifically, both manual and automatic jobs are failing while the web UI
works; finding where the working and failing paths diverge should help
pinpoint the error.

Meanwhile, if you have the extra space, another route to take is to
convert the pool. I mention extra space because I recommend keeping the
original pool untouched and doing the conversion on a partition copy. On
this mailing list, there are a few notes about the new tool to convert
the pool from V3 to V4 - I have not used it myself, though. My thinking
here is that maybe V4 is better geared for restoring V4 pool files. If
that is the case and you succeed this way, there is room for improvement
in the compatibility layers for V3.
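
If memory serves, the tool is called BackupPC_migrateV3toV4 and ships with
V4. Since I have not used it, treat the line below as a sketch and check
the script's own usage message before running anything against the copied
pool:

sudo -u backuppc /usr/local/BackupPC/bin/BackupPC_migrateV3toV4 -a
# -a is supposed to convert all hosts; there are per-host options as well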

Let's see if more people pitch in here, and I'll be back with further
ideas if I come up with any.

Best regards,
Johan
Post by Ib H. Rasmussen
Hi Johan,
I tried your suggestion of using tarCreate, but with no luck - I keep
getting the error "bad backup number" for host.
In order to keep it simple, I decided to just make a list of the backup,
and leave the tar-part out for a start.
it's all run as the backuppc-user (backuppc)
I entered: /usr/local/BackupPC/bin/BackupPC_tarCreate -h ihrsrv31 -n -0
-s /datard1/documentation -l *
and got the error "bad backup number -0 for host ihrsrv31".
no matter what I enter as backup number - a direct number like 1189
(which is a previous full V3 backup), or a relative number like -0 / -1
/ -2 as you suggest, I get the bad number error.
Have I misunderstood something?
Best Regards
Ib H. Rasmussen
Post by Johan Ehnberg
Hi,
Post by Ib H. Rasmussen
I can browse and restore single files via the browser, but as it
concerns about 2TB of data it is some job!!
The tarCreate option can be a doable solution as a one-off since the
rsync route may require more time to set up but web restores work.
BackupPC_tarCreate -h YOURHOSTNAME -n -0 -s / / | tar -x -C /tmp
Or, more elaborately for a remote host over a slow link running as
another user than backuppc and ensuring all file attributes can be set
sudo sudo -i -u backuppc /usr/share/backuppc/bin/BackupPC_tarCreate -h
YOURHOSTNAME -n -0 -s / / 2> /dev/null |pv -Cq -B 256 |plzip -1 |ssh
SOMEOTHERHOST -- 'plzip -d | sudo tar -x -C /tmp'
Post by Ib H. Rasmussen
About the syntax for executing the restore from the command line I'm not
quite sure; maybe you could elaborate a bit on that (if you still think
it's worthwhile).
They are actually documented well in the file itself (it is perl so not
binary, i.e. you can read the file with 'less
/usr/share/backuppc/bin/BackupPC_restore').
Let us know how this round goes.
Best regards,
Johan
--
Johan Ehnberg
***@molnix.com
+358503209688

Molnix Oy
molnix.com

Ib H. Rasmussen
2017-04-27 06:40:03 UTC
Permalink
Hi Johan,

First, thank you so much for your effort, although it hasn't solved the
problem yet.

Unfortunately I don't have that much extra space to accommodate an extra
copy of the backups (around 6 TB).

I have been thinking: it looks like none of the BackupPC utility programs
(BackupPC_xxxx) are working, so there must be a common denominator.

I wonder how these programs find the backups - for tarCreate, for
instance, you never give a path to the backups - so how does this work?

I'm looking forward to hearing from anybody with a clue how to fix this problem!

Best regards

Ib
Post by Johan Ehnberg
Hi Ib,
This sounds more like a problem with BackupPC internals, if the
backup number is indeed correct. Maybe someone more familiar with the
code in question can help? Maybe something got corrupted in the pool or
metadata as well (you mentioned data loss)?
Specifically, manual and automatic jobs both are failing but the web UI
works; where the working and failing paths diverge should help dissect
and pinpoint the error.
Meanwhile, if you have the extra space, another route to take is to
convert the pool. I mention extra space because I recommend keeping the
original pool available and instead doing a partition copy of it. On
this mailing list, there are a few notes about the new tool to convert
the pool from V3 to V4 - I have not used it myself, though. My thinking
here is that maybe V4 is better geared for restoring V4 pool files. If
that is the case and you succeed this way, there is room for improvement
in the compatibility layers for V3.
Let's see if more people pitch in here, and I'll be back with further
ideas if I come up with any.
Best regards,
Johan
Post by Ib H. Rasmussen
Hi Johan,
I tried your suggestion of using tarCreate, but with no luck - I keep
getting the error "bad backup number" for host.
In order to keep it simple, I decided to just make a list of the backup,
and leave the tar-part out for a start.
it's all run as the backuppc-user (backuppc)
I entered: /usr/local/BackupPC/bin/BackupPC_tarCreate -h ihrsrv31 -n -0
-s /datard1/documentation -l *
and got the error "bad backup number -0 for host ihrsrv31".
no matter what I enter as backup number - a direct number like 1189
(which is a previous full V3 backup), or a relative number like -0 / -1
/ -2 as you suggest, I get the bad number error.
Have I misunderstood something?
Best Regards
Ib H. Rasmussen
Post by Johan Ehnberg
Hi,
Post by Ib H. Rasmussen
I can browse and restore single files via the browser, but as it
concerns about 2TB of data it is some job!!
The tarCreate option can be a doable solution as a one-off since the
rsync route may require more time to set up but web restores work.
BackupPC_tarCreate -h YOURHOSTNAME -n -0 -s / / | tar -x -C /tmp
Or, more elaborately for a remote host over a slow link running as
another user than backuppc and ensuring all file attributes can be set
sudo sudo -i -u backuppc /usr/share/backuppc/bin/BackupPC_tarCreate -h
YOURHOSTNAME -n -0 -s / / 2> /dev/null |pv -Cq -B 256 |plzip -1 |ssh
SOMEOTHERHOST -- 'plzip -d | sudo tar -x -C /tmp'
Post by Ib H. Rasmussen
About the syntax for executing the restore from the command line I'm not
quite sure; maybe you could elaborate a bit on that (if you still think
it's worthwhile).
They are actually documented well in the file itself (it is perl so not
binary, i.e. you can read the file with 'less
/usr/share/backuppc/bin/BackupPC_restore').
Let us know how this round goes.
Best regards,
Johan
Johan Ehnberg
2017-04-27 08:56:39 UTC
Permalink
Hi Ib,

Another approach I came to think of is to try to emulate the environment
of your previous working setup, i.e. install BackupPC 3 on the same OS
that you had before. VMs or containers are a good way to do this.
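
A rough sketch of the container variant - the base image and the
availability of BackupPC 3 packages for it are assumptions you would need
to verify, and the old pool is mounted read-only so nothing can touch the
original data (read-only is enough for BackupPC_tarCreate, although the
daemon itself would want write access):

docker run -it --rm -v /path/to/old/pool:/var/lib/BackupPC:ro centos:6 bash
# inside the container: install BackupPC 3.x (e.g. from EPEL) and point its
# TopDir at the mounted pool
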
Post by Ib H. Rasmussen
I wonder how these programs find the backups - for tarCreate, for
instance, you never give a path to the backups - so how does this work?
All of that comes from config.pl, under /etc/backuppc or /etc/BackupPC.
The upgrade process also does some mangling of that file, so not
everything can be used as-is from a V3 install.
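
Concretely, the path resolution comes from a handful of entries in that
file. A quick way to see where your install thinks the data lives (the
/etc/BackupPC path is my assumption for a source install):

grep -E 'TopDir|ConfDir|LogDir' /etc/BackupPC/config.pl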

Best regards,
Johan
