4

I have many files in a folder and I want to concatenate them all into a single file, for example cat * > final_file. But this uses extra disk space and also takes time. Is there a way to hard link/soft link all the files to final_file instead, for example ln * final_file?

quartz
  • 143
  • 1
  • 4

3 Answers

5

With links, I'm afraid, this will not be possible. However, you could use a named pipe. Example:

# create some dummy files
echo alpha >a
echo beta  >b
echo gamma >c

# create named pipe
mkfifo allfiles

# concatenate files into pipe
cat a b c >allfiles

The last call will block until some process reads from the pipe, and will then exit. For continuous operation you can use a loop that waits for the next reader and starts over again:

while true; do
  cat a b c >allfiles
done
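
For example, a reader would consume the pipe like any regular file (wc here is just a stand-in for whatever process actually needs the data):

# in another shell: opening the pipe for reading unblocks the writer above
wc -l <allfiles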
Marco
  • 33,548
2

This is not possible.

N files mean N inodes. Hard links, by definition, are simply different names for the same inode. Symlinks are files that point to a single target by name. Either way, soft or hard, a link can only ever refer to one file, not to the concatenation of several.
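
A quick illustration of that point (the file names are made up): ls -li shows the hard link sharing its target's inode, while the symlink gets an inode of its own and merely records the name of its target.

echo data >orig
ln orig hard       # hard link: another name for orig's inode
ln -s orig soft    # symlink: a new inode whose content is the name "orig"
ls -li orig hard soft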

Joseph R.
  • 39,549
2

In a straightforward way, no ... you cannot hard/soft link several files into a single file. A link is nothing more and nothing less than another name pointing at one existing file.

Now if you are worried about space and want to release the space you can do the following:

for i in *
do
    # append each file, then delete it only if the append succeeded
    cat < "$i" >> destination_file &&
      rm -f -- "$i"
done

Basically, it appends each file to destination_file and removes it right afterwards, so the space is freed as the loop progresses. I'm also assuming you don't need the original files.

BitsOfNix
  • 5,117
  • 1
    Why do you parse ls? Just use for i in *. And why the loop in the first place? Just do cat * >> destination. – Marco Jul 31 '13 at 20:10
  • Why not quote the variable ("$i") to allow for spaces in the file name? – Joseph R. Jul 31 '13 at 20:13
  • @JosephR. That doesn't help. If you have special characters it'll break. – Marco Jul 31 '13 at 20:16
  • @Marco But it would at least help with filenames with spaces in them. – Joseph R. Jul 31 '13 at 20:19
  • @Marco I'm not talking about the ls part. I agree with you that it's unnecessary (and incorrect). I meant to use the double quotes as well as for i in *. – Joseph R. Jul 31 '13 at 20:28
  • @JosephR. No, it would not. The quoting is applied too late. The string is already split at the spaces and the individual parts assigned to i. – Marco Jul 31 '13 at 20:29
  • @JosephR. Sorry, I misunderstood you. Totally! cat $i would not work at all, regardless of * or ls. – Marco Jul 31 '13 at 20:31
  • 1
    @Marco In the OP he's worried about space; otherwise cat * >> destination would be more than enough ... hence the loop to cat and remove each file. – BitsOfNix Jul 31 '13 at 20:36
  • 1
    @AlexandreAlves My point is: i) This loop is terrible. ii) It is totally unnecessary. Just use cat * >dest followed by rm !(dest) (bash) or rm ^dest (zsh). – Marco Jul 31 '13 at 20:41
  • @Marco 1st, why is it terrible? 2nd, from my point of view you are assuming that the OP has small files. If you have 10 files of 1G each and want to cat them into a single one, you need an extra 10G; if you only have 7G available, cat * > dest_file will not work due to lack of space. – BitsOfNix Jul 31 '13 at 20:46
  • @AlexandreAlves For starters, you were trying to parse the output of ls, which you shouldn't. – Joseph R. Jul 31 '13 at 20:49