Say that

```
https://example.nosuchtld
https://example.net
https://example.org
https://example.willfail
```

is the content of urls.txt. I want to run `<command> <url>` for every URL/line of urls.txt, where `<command>` is, let's say, `curl`; so,
```
cat urls.txt | xargs -n1 curl
```

or

```
<urls.txt xargs -n1 curl
```
for instance. I want every URL/line that was unsuccessfully curled (so, the first and last ones) to

- be removed from urls.txt; and
- be appended to another file, let's say nope.txt, to be created if it doesn't already exist,
leaving urls.txt as

```
https://example.net
https://example.org
```

and nope.txt as

```
https://example.nosuchtld
https://example.willfail
```
I know that the exit status of every command run by the shell is made available via the variable `$?`, that 0 represents successful execution of a command, and that all other integers represent failure. I'm unsure, though, how to construct a composite command that incorporates this, deletes lines from the file being read from, and appends them to a different file.
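One minimal sketch of such a composite command, assuming a POSIX shell: rather than editing urls.txt in place, successful lines are collected into a temporary file that then replaces the original. Here `fetch` is a network-free stand-in I've defined so the example is reproducible (in real use you'd call something like `curl -fsS -o /dev/null "$url"` instead), and `urls.txt.ok` is a temporary filename of my choosing:

```shell
#!/bin/sh
# Stand-in for curl: fails for the two bogus TLDs, succeeds otherwise.
fetch() {
    case $1 in
        *.nosuchtld|*.willfail) return 1 ;;  # simulate a failed fetch
        *) return 0 ;;                       # simulate a successful fetch
    esac
}

# Recreate the example urls.txt.
printf '%s\n' https://example.nosuchtld https://example.net \
              https://example.org https://example.willfail >urls.txt

: >urls.txt.ok                            # will collect the successful URLs
while IFS= read -r url; do
    if fetch "$url"; then                 # real use: curl -fsS -o /dev/null "$url"
        printf '%s\n' "$url" >>urls.txt.ok
    else
        printf '%s\n' "$url" >>nope.txt   # created if absent, appended otherwise
    fi
done <urls.txt
mv urls.txt.ok urls.txt                   # successes replace the original file
```

The `if` checks the command's exit status directly, which is equivalent to inspecting `$?` afterwards but avoids the extra step.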
Comments:

While I used `curl` as an example to concretize my question, I'm really looking to understand a solution or solutions that'd be likely to work with any command, including those that lack a `--fail` option. Consider, for instance, the command-line app nb: it has no such flag. What could be done, then, when processing a URL file with it? Thoughts? – seconddayout May 15 '22 at 01:27

… the `&&` logical operator, right? – seconddayout May 15 '22 at 02:10
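On the comment's concern: since the shell only needs each command's exit status, one command-agnostic possibility is to push the status check into the payload that `xargs` runs, so the same pattern works for `curl`, nb, or anything else. A sketch, where `./fetch` is a hypothetical stand-in for whatever command you're running (it only exists so the demo needs no network); note this variant only collects failures, and pruning urls.txt would still need a separate step:

```shell
#!/bin/sh
# Create a stand-in command; in real use this would be curl, nb, etc.
cat >fetch <<'EOF'
#!/bin/sh
case $1 in *.net|*.org) exit 0 ;; *) exit 1 ;; esac
EOF
chmod +x fetch

# Recreate the example urls.txt.
printf '%s\n' https://example.nosuchtld https://example.net \
              https://example.org https://example.willfail >urls.txt

# One sh per URL: run the command, append the URL to nope.txt on failure.
<urls.txt xargs -n1 sh -c './fetch "$1" || printf "%s\n" "$1" >>nope.txt' _
```

The trailing `_` fills `$0` of the inner shell so that each URL lands in `$1`; the `||` fires precisely when the command exits nonzero, with no `--fail`-style flag required of the command itself.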