Possible Duplicate:
Splitting large directory tree into specified-size chunks?
I have a partition of about 14G and I need to back it up to DVD in such a way that each disc can be used on its own to extract its contents if I need to.
Normally, to back up such a partition, one can tar and gzip the folder and its contents, slice the result into 4.4G pieces, and burn each slice to a DVD. E.g., to back up the /mnt/data folder, which contains about 14G of data, we run
tar cvzf - /mnt/data | split -b 4400m - backup.tgz
which would create something like the following (sizes are just an example):
backup.tgzaa 4.4G in size
backup.tgzab 4.4G in size
backup.tgzac 4.4G in size
backup.tgzad 0.8G in size
Once burned, if the situation ever arises for me to restore, I need to put the slices back together using cat, then gunzip and untar. If there is an error on any one of the discs, however, the whole backup becomes unrestorable.
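Concretely, the restore path I'm describing looks like this. (A tiny self-contained round trip with 1K slices in a temp dir, so it runs anywhere; the real slices would be the 4.4G backup.tgz* pieces.)

```shell
#!/bin/sh
set -e
work=$(mktemp -d); cd "$work"
mkdir data restore
echo "hello" > data/file.txt
# Archive and slice (1K slices just for the demo):
tar czf - data | split -b 1k - backup.tgz
# Restore: every slice must be present and intact, in order.
# If even one disc is bad, the concatenated stream is broken.
cat backup.tgz?? | tar xzf - -C restore
cat restore/data/file.txt
```

Note that split names the pieces backup.tgzaa, backup.tgzab, ... so the `backup.tgz??` glob reassembles them in the right order.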
To avoid this, I would like to write the first 4.4G of files to the first DVD, the next 4.4G to the second disc, and so on, with any file that would push a disc past 4.4G carried over to the next disc.
In the event of a bad disc, I would at least still have access to the rest of the data.
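A rough sketch of what I have in mind (my own assumption of how it could work, not a polished tool): first-fit files into per-disc lists, then make one independent archive per list. The demo below uses a temp dir, tiny sample files, and a 1 KiB limit so it runs anywhere; in practice SRC would be /mnt/data and LIMIT about 4.4G.

```shell
#!/bin/sh
set -e
work=$(mktemp -d)
SRC="$work/data"; OUT="$work/discs"
LIMIT=1024                     # 1 KiB per "disc" for the demo
mkdir -p "$SRC" "$OUT"
for i in 1 2 3 4 5; do
    head -c 400 /dev/urandom > "$SRC/file$i"   # 400-byte sample files
done

disc=1; used=0
: > "$OUT/disc$disc.list"
# First-fit: start a new disc when the next file would not fit.
# NOTE: breaks on filenames containing newlines; fine for a sketch.
find "$SRC" -type f | sort | while read -r f; do
    size=$(stat -c %s "$f")
    if [ $((used + size)) -gt "$LIMIT" ] && [ "$used" -gt 0 ]; then
        disc=$((disc + 1)); used=0
        : > "$OUT/disc$disc.list"
    fi
    printf '%s\n' "$f" >> "$OUT/disc$disc.list"
    used=$((used + size))
done

# One self-contained archive per disc; each extracts on its own.
for list in "$OUT"/disc*.list; do
    tar czf "${list%.list}.tgz" -T "$list"
done
ls "$OUT"
```

Each disc$N.tgz could then be burned to its own DVD, and a single bad disc would only lose the files on that disc.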
Is there a way to do this?
Mani