You'd do:
unset -v line1 line2
{ IFS= read -r line1 && IFS= read -r line2; } < input.txt
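A minimal self-contained sketch of this (the sample contents of input.txt are an assumption for illustration):

```shell
# Create a sample input.txt (hypothetical contents), then read its
# first two lines into line1 and line2, preserving whitespace.
printf '%s\n' 'first line' 'second line' 'third line' > input.txt
unset -v line1 line2
{ IFS= read -r line1 && IFS= read -r line2; } < input.txt
printf '%s\n' "$line1" "$line2"
```

Setting IFS to empty keeps leading/trailing blanks, and -r stops backslashes from being interpreted as escapes.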
Or:
{ line1=$(line) && line2=$(line); } < input.txt
(less efficient, as line is rarely built in and most shells need to fork to implement command substitution; line is also no longer a standard command).
To use a loop:
unset -v line1 line2 line3
for var in line1 line2 line3; do
IFS= read -r "$var" || break
done < input.txt
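A quick sketch of the loop in action, with a sample two-line input.txt (the file contents are made up for the demo), so the break fires when the file runs out of lines:

```shell
# Read up to three lines into line1..line3; break at end of file.
printf '%s\n' 'first' 'second' > input.txt
unset -v line1 line2 line3
for var in line1 line2 line3; do
  IFS= read -r "$var" || break
done < input.txt
printf '%s\n' "$line1" "$line2"
```

With only two lines of input, the third read fails and the loop stops, leaving line3 empty.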
Or, to automatically name the variables line1, line2, line3... by incrementing n:
n=1; while IFS= read -r "line$n"; do
n=$((n + 1))
done < input.txt
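Here is that loop run against a sample three-line input.txt (the contents are an assumption for the demo); note that n ends up one past the last line read:

```shell
# Read every line of input.txt into line1, line2, line3, ...
printf '%s\n' 'x' 'y' 'z' > input.txt
n=1; while IFS= read -r "line$n"; do
  n=$((n + 1))
done < input.txt
echo "$((n - 1)) lines read"   # prints "3 lines read"
```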
Note that bash supports array variables and a readarray builtin to read lines into an array:
readarray -t line < input.txt
Note, however, that contrary to most other shells, bash array indices start at 0, not 1 (inherited from ksh), so the first line will be in ${line[0]}, not ${line[1]} (though, as @Costas has shown, you can make readarray (aka mapfile) start writing the values at index 1 with -O 1; bash arrays, also contrary to most other shells', are sparse arrays).
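A bash-specific sketch of both behaviours, using a made-up two-line input.txt:

```shell
# Default: readarray fills the array starting at index 0.
printf '%s\n' 'alpha' 'beta' > input.txt
readarray -t line < input.txt
echo "${line[0]}"   # prints "alpha"

# With -O 1, assignment starts at index 1 instead.
unset -v line
readarray -t -O 1 line < input.txt
echo "${line[1]}"   # prints "alpha"
```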
See also: Understand "IFS= read -r line"?