I have a bash script containing a group of commands in curly braces, `{ ... }`. The group starts with a few `echo` commands and then runs one loop. Each iteration of the loop executes several slow commands (basically `curl` plus some extra parsing). Each iteration is slow because of the network interaction, but it prints exactly one line (of Python code); as far as I can see, there should be no buffering issue coming from the commands themselves, because each one finishes its job and exits.
The whole group of commands is piped to `python -u` (I also tried `tail -f` as a check), yet the whole loop runs to completion before anything is read by `python -u` or `tail -f`.
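
Here is a simplified sketch of the structure (the `https://example.com/snippet/$i` URL and the fixed iteration count are placeholders; the real loop does the `curl` call plus extra parsing):

```bash
#!/bin/bash
# Placeholder URL; the real script fetches and parses more per iteration.
{
    echo 'import sys'
    echo 'print("starting")'
    for i in 1 2 3; do
        # Slow network round trip, but emits exactly one line of Python when done.
        curl -s "https://example.com/snippet/$i"
    done
} | python -u    # also tried: } | tail -f
```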
I know how to unbuffer (when possible) a single command with tools like `stdbuf`, but I don't think that can help here, because the issue seems to come from the command grouping rather than from any particular command.
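
For reference, this is the kind of per-command unbuffering I mean (a sketch with the same placeholder URL; `stdbuf -oL` line-buffers a command's stdout), which I don't expect to change anything, since each command already exits after printing its line:

```bash
# Wrapping the individual slow command, not the group itself.
{
    echo 'print("starting")'
    for i in 1 2 3; do
        stdbuf -oL curl -s "https://example.com/snippet/$i"
    done
} | python -u
```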
Any hint?