
Let's say I have multiple large files and I have less than 1GB of free space.

I'd like to merge these files into one big file.

Not a solution:

Usually I use cat filename* > mergedfilename (typing the prefix and pressing Esc + * to expand the matching file names).
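For reference, that expands to something like the following (file names here are illustrative). It needs free space for a second full copy of the data before the parts can be removed, which is exactly what is not available here:

```shell
# Demo in a temp dir with tiny stand-in parts; the real parts are 64GB each.
cd "$(mktemp -d)"
printf 'foo' > backup.img.aa
printf 'bar' > backup.img.ab

# Glob expands in lexical order, so the parts are concatenated correctly,
# but merged.img is a full copy: peak disk usage is ~2x the data size.
cat backup.img.* > merged.img && rm backup.img.*

cat merged.img   # -> foobar
```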

Application:

I am working on a 256GB external hard drive. I have a 256GB raw backup image split into 64GB files.

Questions:

  • Is there a way to tell cat/pv/dd to delete the source file continuously as it is copied, rather than only at the end? For example, after every x sectors or x blocks.

  • If that's not possible, how would you achieve this? Using mksquashfs, pigz/gzip/..., dd/pv?

I am aware this is dangerous, but the only constraint here is that there is not enough free space to hold a copy of even a single split file. I want to know how to cope with that lack of free space and which commands to use in this case. Why is there no filesystem feature that removes the EOF from each individual file and stitches them into a new one without copying any data?
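One way to approximate "delete the source continuously" is to copy each part into its final offset in the output file chunk by chunk, starting from the part's tail, and truncate the source after each chunk. Extra disk usage then stays around one chunk rather than one whole part. This is a hedged sketch, not a tested recovery procedure: it assumes GNU dd, truncate, and stat (for the byte-granular iflag/oflag options), uses illustrative file names, and is demonstrated on tiny files. As the question notes, it is dangerous: an interruption leaves both source and destination partially written.

```shell
# Demo in a temp dir with tiny stand-in parts and a tiny chunk size;
# in practice the parts are 64GB and a chunk of e.g. 64MiB is sensible.
cd "$(mktemp -d)"
printf 'hello ' > part-aa
printf 'world'  > part-ab

bs=4              # chunk size in bytes (tiny for the demo)
out=merged.img
offset=0          # where the current part's data belongs in the output

for part in part-aa part-ab; do
  size=$(stat -c %s "$part")
  end=$((offset + size))
  # Copy the part tail-first: write its last chunk to its final output
  # offset, then shrink the source, so space freed ~ space consumed.
  while [ "$size" -gt 0 ]; do
    chunk=$bs
    [ "$size" -lt "$chunk" ] && chunk=$size
    size=$((size - chunk))
    dd if="$part" of="$out" bs=64K \
       iflag=skip_bytes,count_bytes oflag=seek_bytes \
       skip="$size" seek=$((offset + size)) count="$chunk" \
       conv=notrunc status=none
    truncate -s "$size" "$part"   # give the copied chunk back to the FS
  done
  rm -f "$part"
  offset=$end
done

cat merged.img   # -> hello world
```

Writing at a seek offset beyond the current end of merged.img creates a sparse region that most Linux filesystems only allocate as it is filled, which is what keeps total usage roughly constant. The parts must be processed in order so each one's offset is known before its source shrinks.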

Source: What's the best way to join files again after splitting them?

  • How much RAM do you have? You may not have hard disk space, but how much RAM space are you working with? Some operations may be possible if you have RAM to store one or more of those 64GB files. – Centimane Jun 29 '17 at 00:49
  • I'd like to avoid RAM, which is really not enough to hold one file. Yes, it's a duplicate. Sorry for that. Thank you. – None Jun 29 '17 at 01:45

0 Answers