Since bash can't store NUL bytes in its variables anyway, you can always do:
IFS= read -rd '' var < file
which will store the content of the file up to the first NUL byte, or the whole file if it contains no NUL bytes (text files, at least by the POSIX definition, don't contain NUL bytes).
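For example, a quick round trip (file is just an illustrative name; note that read returns non-zero when it hits end-of-file without finding a NUL, even though the variable is filled):

```shell
# Sample file: two lines, trailing newline included.
printf 'line one\nline two\n' > file

# Read up to the first NUL (here: the whole file) into $var.
# read exits non-zero at EOF even though $var is populated,
# so || true keeps the snippet safe under set -e.
IFS= read -rd '' var < file || true

# Unlike $(cat file), the trailing newline is preserved:
printf '%s' "$var"
```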
Another option is to store the content of the file as the array of its lines (including the line delimiter if any):
readarray array < file
You can then join them with:
IFS=; var="${array[*]}"
If the input contains NUL bytes, everything past the first occurrence on each line will be lost.
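For instance, on NUL-free input the split-and-join round trip reproduces the file byte for byte (readarray needs bash 4 or later):

```shell
# Sample file.
printf 'foo\nbar baz\n' > file

# One array element per line, newline delimiters included.
readarray array < file

# With an empty IFS, "${array[*]}" concatenates the elements as-is.
IFS=; var="${array[*]}"

printf '%s' "$var"   # same bytes as the original file
```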
In POSIX sh syntax, you can do:
var=$(cat < file; echo .); var=${var%.}
We add a . which we remove afterwards, to work around the fact that command substitution strips all trailing newline characters.
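To see what the sentinel buys you, compare with the naive command substitution on a file ending in several newlines:

```shell
# File whose content is "data" followed by three newlines (7 bytes).
printf 'data\n\n\n' > file

# Naive version: all trailing newlines are stripped.
plain=$(cat < file)

# Guarded version: the sentinel dot protects the newlines.
var=$(cat < file; echo .); var=${var%.}

echo "${#plain}"   # 4 ("data")
echo "${#var}"     # 7 ("data" plus three newlines)
```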
If the file contains NUL bytes, the behaviour will vary between implementations. zsh is the only shell that preserves them (it's also the only shell that can store NUL bytes in its variables). bash and a few other shells just remove them, while some others choke on them and discard everything past the first NUL occurrence.
You could also store the content of the file in some encoded form like:
var=$(uuencode -m - < file)
And get it back with:
printf '%s\n' "$var" | uudecode
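uuencode comes from sharutils and isn't always installed; as a sketch of the same idea using GNU coreutils' base64 instead (file and restored are illustrative names):

```shell
# A file containing NUL bytes.
printf 'a\0b\0c' > file

# Store an ASCII-only encoded form in the variable.
var=$(base64 < file)

# Decode it back; the NUL bytes are restored intact.
printf '%s\n' "$var" | base64 -d > restored
cmp file restored && echo OK
```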
Or with NULs encoded as \0000 so as to be able to use it in arguments to printf %b in bash (assuming you're not using a locale where the charset is BIG5, GB18030, GBK or Big5-HKSCS):
var=; while true; do
  if IFS= read -rd '' rec; then
    var+=${rec//\\/\\\\}\\0000
  else
    var+=${rec//\\/\\\\}
    break
  fi
done < file
And then:
printf %b "$var"
to get it back.
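Putting the two halves together, here is the round trip on a file containing NULs and a backslash (restored is an illustrative name; the encoding loop is the one shown earlier):

```shell
# Sample file: foo NUL bar\baz NUL (12 bytes).
printf 'foo\0bar\\baz\0' > file

# Encode: double the backslashes, write each NUL as \0000.
var=; while true; do
  if IFS= read -rd '' rec; then
    var+=${rec//\\/\\\\}\\0000
  else
    var+=${rec//\\/\\\\}
    break
  fi
done < file

# Decode with printf %b and check against the original.
printf %b "$var" > restored
cmp file restored && echo OK
```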