It seems you want to pass a nested JSON object as an argument to some command. The shell won't expand a variable that is inside single quotes.
The solution here is not to switch to using double quotes, though, as the data you are injecting into the JSON document might need encoding for it to be valid JSON (if it contains tabs, quotes, or other characters that would break the format).
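As a quick illustration (the sample line here is made up), naive interpolation breaks as soon as the data contains a double quote, whereas a JSON-aware tool encodes the value for you:

line='he said "hi"'
# Broken: the embedded quotes end up unescaped inside the JSON string.
printf '%s\n' "{ \"id\": { \"S\": \"$line\" } }"
# Safe: jq JSON-encodes the value passed in with --arg.
jq -n --arg data "$line" '{ id: { S: $data } }'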
There are two good tools for creating the JSON for you. The simplest to use is jo:
while IFS= read -r line; do
    utility "$( jo id="$( jo S="$line" )" )"
done <input.txt
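To see what this produces, here is the same nesting with a hypothetical literal value; jo treats a value that is itself a JSON object as JSON, which is what makes the nesting work:

jo id="$( jo S="abc 123" )"
# prints: {"id":{"S":"abc 123"}}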
The other is jq:
while IFS= read -r line; do
    utility "$( jq --arg data "$line" -n '{ id: { S: $data } }' )"
done <input.txt
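As a sanity check with a hypothetical literal value, the jq invocation builds the same document, with the value JSON-encoded (pretty-printed by default):

jq --arg data 'abc 123' -n '{ id: { S: $data } }'
# prints:
# {
#   "id": {
#     "S": "abc 123"
#   }
# }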
Both of these loops would correctly create a nested JSON object from your read data, JSON-encoding it where needed, and call utility with that document as an argument.
With jq, you could even turn the loop inside-out, as it were, to avoid having to call jq in every iteration. Doing this relies on using -c, which makes jq output each element of the resulting set on a single line ("compact output").
jq -c -R '{ id: { S: . } }' input.txt |
while IFS= read -r json; do
    utility "$json"
done
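With a hypothetical input.txt containing the two lines foo and bar, the jq command above emits one compact document per input line, which the loop then hands to utility one at a time:

jq -c -R '{ id: { S: . } }' input.txt
# prints:
# {"id":{"S":"foo"}}
# {"id":{"S":"bar"}}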
... or get it to produce your commands and eval them:
eval "$(
jq -r -R '[ "utility", ({ id: { S: . } } | @json) ] | @sh' input.txt
)"
Since the file you read from is called output.txt in the question, there are presumably some earlier steps in a workflow that generate it. Depending on what you are doing, it may be possible to integrate the call to utility much earlier in that pipeline, or to bypass the intermediate file altogether, especially if the IDs are coming from some JSON document.
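As a sketch of that idea, assuming the IDs live in an array called ids in some hypothetical source.json, you could skip the intermediate text file entirely:

# Hypothetical: .ids[] and source.json stand in for wherever the IDs really come from.
jq -c '.ids[] | { id: { S: . } }' source.json |
while IFS= read -r json; do
    utility "$json"
done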