Is there a standard tool that can filter text stream based on command execution result?
Consider `grep`, for example. It can filter a text stream based on a regex. But the more general problem is extending the filtering condition. For example, I may want to select all files matching some condition (`find` has plenty of checks available, but they are a fixed subset anyway), or just use another program to filter my data. Consider this pipe:
    produce_data | xargs -L1 bash -c '[ -f "$0" ] && ping -c1 -w1 "$1" && echo "$0" "$@"'
It is completely useless, but it demonstrates the general approach: I can use any bash one-liner to test each line. In this example I want the lines that consist of an existing file and a reachable host. I would like a standard tool that could do it like this:
    produce_data | super_filter -- bash -c '[ -f "$0" ] && ping -c1 -w1 "$1"'
It could easily be used with `find`:

    find here | super_filter -- test -r
Note how this allows you to use universal tools to filter files instead of the specific `find` flags, which I always forget.
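For comparison, `xargs` alone can approximate this today by spawning one shell per input line and letting that shell both run the test and print the line on success (a sketch; the temporary directory and `test -f` check are illustrative stand-ins):

```shell
# Approximate the desired filter with plain xargs: one shell per line.
# Passing the line as "$1" via the extra 'sh' argv[0] word avoids
# injecting it into the -c string.
dir=$(mktemp -d)
touch "$dir/readable.txt"
mkdir "$dir/subdir"

# keep only the regular files among the listed paths
printf '%s\n%s\n' "$dir/readable.txt" "$dir/subdir" |
    xargs -I{} sh -c 'test -f "$1" && printf "%s\n" "$1"' sh {}
```

This works, but the filtering intent is buried inside the `sh -c` string, which is exactly the boilerplate a dedicated tool would remove.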
A more real-life example where such a tool would be helpful is finding object files that contain specific symbols.
So `super_filter` would allow any condition checker to operate in stream mode. The syntax could be like that of `xargs` or `parallel`.
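In the absence of a standard tool, a minimal sketch of `super_filter` as a POSIX shell function (the name and the `--` handling are assumptions mirroring the proposed syntax above):

```shell
# Hypothetical super_filter: print each stdin line for which the given
# command, run with the line's fields appended as arguments, exits 0.
super_filter() {
    [ "$1" = "--" ] && shift   # accept the optional "--" separator
    while IFS= read -r line; do
        # word-split the line into arguments on purpose (unquoted $line)
        if "$@" $line >/dev/null 2>&1; then
            printf '%s\n' "$line"
        fi
    done
}
```

With this, `find here | super_filter -- test -r` works unchanged, at the cost of one process per line, which is what an `xargs -L1` or `parallel` based implementation would pay anyway.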