Let's say I have multiple large files and less than 1 GB of free space, and I'd like to merge these files into one big file.
Not a solution:
Usually I type cat filename, press Esc + * to expand the glob, and redirect with > mergedfilename. That needs enough free space for a second full copy of the data, because the source files are only deleted afterwards.
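Spelled out, the usual approach looks like this (tiny hypothetical stand-in files instead of the real parts); note that the space only comes back at the very end:

```shell
# Demo setup: three tiny stand-ins for the real split parts.
printf 'AAAA' > image.part1
printf 'BBBB' > image.part2
printf 'CCCC' > image.part3

# The usual merge: needs free space for a full second copy of the data,
# because the source parts survive until the copy has finished.
cat image.part1 image.part2 image.part3 > merged.img
rm image.part1 image.part2 image.part3   # space is reclaimed only here
```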
Application:
I am working on a 256 GB external hard drive that holds a 256 GB raw backup image split into 64 GB files.
Questions:
Is there a way to tell cat/pv/dd to delete the source file not at the end, but continuously, after every x sectors or blocks?
If that is not possible, how would you achieve this? With mksquashfs, pigz/gzip, dd/pv?
I am aware this is dangerous, but the only constraint here is that there is not enough free space to hold a copy of a single split file. I want to know how to cope with that lack of space and which commands are used in this case. Could a file system simply remove the EOF of each part and stitch them into one file without copying any data?
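As far as I know there is no cat/pv/dd flag for this, but on Linux, fallocate --collapse-range can drop already-copied blocks from the front of the source file in place, which approximates the "delete continuously" idea. A rough sketch, with hypothetical file names and chunk size: it assumes a file system that supports collapse-range (e.g. ext4 or XFS), a chunk size that is a multiple of the file-system block size, and accepts that a crash mid-run leaves the source file partially consumed.

```shell
#!/bin/sh
# Sketch only: append $1 (src) onto the end of $2 (dest) in $3-byte chunks,
# giving src's disk space back to the file system after every chunk.
append_reclaim() {
    src=$1 dest=$2 chunk=$3
    while [ -s "$src" ]; do
        # Copy the first $chunk bytes of $src onto the end of $dest.
        dd if="$src" bs="$chunk" count=1 2>/dev/null >> "$dest"
        if [ "$(stat -c %s "$src")" -le "$chunk" ]; then
            rm "$src"   # last (possibly short) chunk copied; src is done
        else
            # Cut the copied chunk off the front of $src in place; its
            # blocks are returned to the file system immediately.
            fallocate --collapse-range --offset 0 --length "$chunk" "$src"
        fi
    done
}

# Hypothetical usage for the 64 GB parts, with a 64 MiB working chunk:
# append_reclaim image.part2 image.part1 $((64 * 1024 * 1024))
# append_reclaim image.part3 image.part1 $((64 * 1024 * 1024))
```

Peak extra space used is one chunk, not one whole part; the trade-off is that the operation is destructive from the first iteration on, so it should only be run on data that can be re-split or re-downloaded if something goes wrong.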
Source: What's the best way to join files again after splitting them?