I have a script on a Linux server which connects to a URL, refines the data, and writes a text file with multiple links; please see below:
Script:
#!/bin/bash
/usr/bin/curl -k -s URL | cut --characters=44-51 | sort --unique | sed -n -E 's|^AP(.+)$|http://ap\1.ztb.icb.commerzbank.com:1025/|p' >> testOutput.txt
Output:
http://APrandomnumber.com:1025/
http://APrandomnumber.com:1025/
http://APrandomnumber.com:1025/
http://APrandomnumber.com:1025/
http://APrandomnumber.com:1025/
http://APrandomnumber.com:1025/
http://APrandomnumber.com:1025/
I now want to curl each of these URLs from the text file generated by the script and return true or false depending on whether it was able to connect to them or not.
man curl (or by just accessing the URL and check the exit code). – nohillside May 13 '20 at 10:14

httping does exactly this, or curl -w %{http_code} -s --output /dev/null url ..., see https://unix.stackexchange.com/questions/26426/how-do-i-get-only-the-http-status-of-a-site-in-a-shell-script – pLumo Jun 02 '20 at 18:47
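Building on the curl-based suggestion in the comments, a minimal sketch of the true/false check could look like the following. It treats "able to connect" as "curl got any response" (so no --fail flag); the --max-time 5 timeout, the nonexistent.invalid sample URL, and the fallback file creation are illustrative assumptions, not part of the original setup.

```shell
#!/bin/bash
# Prints "true <url>" if curl can connect to the URL, "false <url>" otherwise.
# -k mirrors the original script; --max-time 5 is an arbitrary timeout (tune it).
check_url() {
    if curl -k -s --max-time 5 --output /dev/null "$1"; then
        echo "true $1"
    else
        echo "false $1"
    fi
}

# Create a sample input only if the real file is missing, so this sketch runs
# stand-alone; in practice testOutput.txt comes from the first script.
# nonexistent.invalid is a reserved hostname that is guaranteed not to resolve.
[ -f testOutput.txt ] || printf 'http://nonexistent.invalid:1025/\n' > testOutput.txt

# Check every URL in the file generated by the first script.
while IFS= read -r url; do
    check_url "$url"
done < testOutput.txt
```

If you need the HTTP status rather than bare connectivity, swap the curl call for the comment's curl -w %{http_code} -s --output /dev/null form and compare the printed code instead of the exit status.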