
I have a text file split up like so:

field1,field2,field3 
xield1,xield2,xield3 
dield1,dield2,dield3 
gield1,gield2,gield3

Each of these columns will be a parameter to a program, and I would like the program to be called once for each line.

I was hoping for a loop, something like:

for $i in file
    command $field2 -x $field3 -PN -$field1 >> output
done

What would be the best way to accomplish something like this in bash?

Dean

5 Answers

while IFS=, read -r xx yy zz; do
    echo "$xx" "$yy" "$zz"
done < input_file

This should work if the number of fields is constant. Replace echo with your command.
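
For example, with the option layout from the question (command here is just the placeholder name used in the original post), the loop body would become:

command "$yy" -x "$zz" -PN "$xx" >> output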

Vombat
  • Thanks, I was just trying this but it only seems to work for the first line. As soon as a command succeeds it doesn't try the next one; if it fails, it will try the next one, though... – Dean Dec 06 '13 at 15:09
  • What do you mean by success or fail? What does your command do? – Vombat Dec 06 '13 at 15:12
  • 1
    I would guess that the command he is running is reading standard input before the "read" comand can get at it. – plugwash May 24 '18 at 19:21
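
If that guess is right, a simple fix (a sketch reusing the placeholder names from this answer and the question) is to redirect the inner command's standard input to /dev/null so it cannot consume the rest of the file:

while IFS=, read -r xx yy zz; do
    command "$yy" -x "$zz" -PN "$xx" < /dev/null >> output
done < input_file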

You should use a while loop with the read built-in:

while IFS= read -r line; do
    fields=($(printf "%s" "$line" | cut -d',' --output-delimiter=' ' -f1-))
    command "${fields[1]}" -x "${fields[2]}" ... # ${fields[1]} is field 2
done < your_file_here

How this works

  • The cut command takes the line and splits it on the delimiter specified by -d.
  • The --output-delimiter is the separator cut uses when printing the selected fields; here we choose a space so the result can be word-split into the array fields.
  • Finally, we want all fields (from field 1 to the end), which is what -f1- selects.
  • Now that the different fields are stored in the array variable fields, you can access any particular field with the syntax ${fields[number]}, where number is one less than the field number you want, since array indexing is zero-based in Bash (as the short example below shows).
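
As a short illustration (using the sample line from the question together with the cut-based splitting above):

line='field1,field2,field3'
fields=($(printf "%s" "$line" | cut -d',' --output-delimiter=' ' -f1-))
echo "${fields[0]}"   # field1
echo "${fields[1]}"   # field2, i.e. the second field
echo "${fields[2]}"   # field3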

Note

  • This will fail if any of your fields contains whitespace.
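
For instance, a hypothetical line (not taken from the question's data) whose first field contains a space ends up as four array elements instead of three:

line='first field,second,third'
fields=($(printf "%s" "$line" | cut -d',' --output-delimiter=' ' -f1-))
echo "${#fields[@]}"   # prints 4: the space split the first field in two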

For a constant number of fields

You can instead do something similar to 1_CR's answer:

while IFS= read -r line; do
    IFS=, read -r field1 field2 field3 <<EOI
$line
EOI
    command "$field2" -x "$field3" ...
done < your_file_here

The above, while noisier, should work in any POSIX-compliant shell, not just Bash.

Joseph R.

You can get read to split each line into an array on , by setting IFS appropriately.

while IFS=, read -r -a input; do
    printf "%s\n" "${input[0]}" "${input[1]}"
done < input.txt

So in the example above, you can access each array element using its index, starting at 0.
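
Applied to the question (with command again standing in for the real program), the same loop becomes:

while IFS=, read -r -a input; do
    command "${input[1]}" -x "${input[2]}" -PN "${input[0]}" >> output
done < input.txt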

iruvar

This awk one-liner will do what you want:

awk -F, '{cmd="echo " $2 " -x " $3 " -PN " $1 " >> output"; system(cmd)}' f.txt

Replace echo with your command and f.txt with the file that you wish to iterate through.

Brief explanation: -F, sets , as the field delimiter, cmd builds the command string from the fields of the current line, and system(cmd) runs it through a shell.
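
For the first line of the sample file, the string handed to system() is therefore:

echo field2 -x field3 -PN field1 >> output

The >> output redirection is interpreted by the shell that system() starts, so every invocation appends to the same file.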

Ketan

GNU sed can be used as well.

sed -e 's!^\([^,]*\),\([^,]*\),\([^,]*\)$!command \2 -x \3 -PN \1!e' infile >> output

Notice the use of the e flag to the s command: when the substitution succeeds, GNU sed executes the text now in the pattern space as a shell command and replaces the pattern space with that command's output.
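
To preview what would be executed, the same expression can be run without the e flag, so the assembled commands are merely printed instead of executed:

sed -e 's!^\([^,]*\),\([^,]*\),\([^,]*\)$!command \2 -x \3 -PN \1!' infile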

hildred