Here, you can pass the contents of the $var1 and $var2 shell variables as arguments to your awk script, and access them in the script via the special ARGV array variable:

var=$(awk -- 'BEGIN {printf "%.2f", ARGV[1] / ARGV[2]}' "$var1" "$var2")
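A minimal sketch of that approach, with hypothetical values standing in for your real variables:

```shell
# Hypothetical values standing in for the question's variables.
var1=10 var2=3

# The script has only a BEGIN block, so awk never tries to open the
# trailing operands as input files; they are read purely as ARGV[1]
# and ARGV[2]. The -- stops option parsing in case a value starts
# with a dash.
var=$(awk -- 'BEGIN {printf "%.2f", ARGV[1] / ARGV[2]}' "$var1" "$var2")
echo "$var"   # 3.33
```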
Or you can export the var1 and var2 shell variables to the environment for awk to retrieve via its ENVIRON array variable:
var=$(
export var1 var2
awk 'BEGIN {printf "%.2f", ENVIRON["var1"] / ENVIRON["var2"]}'
)
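The same sketch for the ENVIRON variant (again with hypothetical values); the export happens inside the command substitution, so the calling shell's environment is left alone:

```shell
var1=10 var2=3   # hypothetical values

var=$(
  # Exported only inside this subshell.
  export var1 var2
  awk 'BEGIN {printf "%.2f", ENVIRON["var1"] / ENVIRON["var2"]}'
)
echo "$var"   # 3.33
```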
In the general case, it's often preferable to use ARGV or ENVIRON over -v or var=value arguments to pass arbitrary data, as the latter two mangle backslash characters (not really an issue here for numbers, though).
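To illustrate the mangling: POSIX awk processes C-style escape sequences in -v assignments, while ARGV passes the bytes through untouched. A small demonstration with a hypothetical string:

```shell
s='a\tb'   # a literal backslash followed by "t", not a tab

# -v processes escape sequences, so the \t arrives as a real tab:
awk -v s="$s" 'BEGIN {print s}'

# ARGV hands the string over unchanged:
awk -- 'BEGIN {print ARGV[1]}' "$s"
```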
Embedding the expansion of a shell variable in awk's code argument should be avoided, as it amounts to a code injection vulnerability. The same goes for forgetting to quote parameter expansions in arguments to -v.
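To make the injection risk concrete, here is a sketch with hypothetical attacker-controlled content: splicing it into the code argument would run arbitrary commands, while passing it via ARGV keeps it as inert data.

```shell
# Hypothetical attacker-controlled content:
var1='0; system("echo pwned")'

# UNSAFE (left commented out): the variable becomes part of the awk
# program, so the system() call would execute:
#   awk "BEGIN {print $var1}"

# SAFE: passed as an operand, the string is printed verbatim:
awk -- 'BEGIN {print ARGV[1]}' "$var1"
```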
Note that if the shell is ksh93 or zsh, you don't need awk at all here:
printf -v var %.2f '1.*var1 / var2' # zsh
var=${ printf %.2f '1.*var1 / var2'; } # ksh93
(where the multiplication by 1. forces the evaluation to be done in floating point).
However, note that you'd need to sanitise the contents of those variables; otherwise, if their contents weren't under your control, that would be another arbitrary code execution vulnerability.
is awk really needed? var=$( (echo scale=2; echo $var1 / $var2) | bc ) – Archemar Dec 21 '20 at 12:09

bc, so when i try using the same command that u suggested it says bash: bc: command not found – SomeoneNew2Unix Dec 21 '20 at 12:11