I'm driving a non-interactive bash shell through a pipe, and I'm trying to pass huge amounts of data to a command running in that shell. So far, I cannot get this to work reliably.
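By "operated through a pipe" I mean a setup roughly like this sketch (here simulated with a process substitution and fd 3; in reality another program owns the writing end):

exec 3> >(bash)          # bash reads its commands from the other end of the pipe
echo 'echo hello' >&3    # each command is written into that pipe
exec 3>&-                # closing the pipe eventually ends the shell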
For example, using a here document, it would look like this:
(sed s/X//|base64 -d|lzcat|tar x) << EOF
XXQAAgAD//////////wAzG+wBunDDREwYD51KYXL50sahXmBTOGSine7WC0RATjpIrem5ygsQWKoZ
XwhPmkJAuCyqnO1KQAoFruXjSOsR3KJY+zHvzYFOgpl3ZJa+1+b0cB0w2vYzj53qplKMTjRkchPnr
XZ/nbloA=
EOF
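For reference, the payload above is produced as roughly the inverse of the decoding pipeline (somedir is a placeholder path):

# prefix each line with X; the receiving sed s/X// strips it again
tar c somedir | lzma | base64 | sed 's/^/X/'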
But with huge amounts of data this won't work, since bash reads and buffers the entire here document (in a temporary file or in memory) before passing any of it to the command.
On the other hand, if I feed the data directly, without a here document, it should go straight to the command, but then the shell seems to interpret an unpredictable number of lines as shell commands:
(sed s/X//|base64 -d|lzcat|tar x)
XXQAAgAD//////////wAzG+wBunDDREwYD51KYXL50sahXmBTOGSine7WC0RATjpIrem5ygsQWKoZ
XwhPmkJAuCyqnO1KQAoFruXjSOsR3KJY+zHvzYFOgpl3ZJa+1+b0cB0w2vYzj53qplKMTjRkchPnr
XZ/nbloA=
I guess this has something to do with how the non-interactive shell buffers input.
I do not need to return to the shell after the data has been passed, so a solution like the latter one would work for me, if only it behaved predictably.
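On the sending side, the latter approach amounts to writing the pipeline and then the raw data into the same pipe, roughly like this (a sketch, reusing fd 3 from the setup above; payload.b64 stands for the already-encoded data):

printf '(sed s/X//|base64 -d|lzcat|tar x)\n' >&3   # start the consumer
cat payload.b64 >&3                                # the encoded data follows on the same pipe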