
I have a script on a Linux server which connects to a URL, refines the data, and writes a text file with multiple links; please see below:

Script:

#!/bin/bash
# Fetch the page, keep characters 44-51 of each line (the APxxxxxxxx host
# IDs), de-duplicate them, and rewrite each ID as a URL. Note that >>
# appends, so repeated runs accumulate lines in testOutput.txt.
/usr/bin/curl -k -s URL | cut --characters=44-51 | sort --unique | sed -n -E 's|^AP(.+)$|http://ap\1.ztb.icb.commerzbank.com:1025/|p' >> testOutput.txt

Output:

http://APrandomnumber.com:1025/
http://APrandomnumber.com:1025/
http://APrandomnumber.com:1025/
http://APrandomnumber.com:1025/
http://APrandomnumber.com:1025/
http://APrandomnumber.com:1025/
http://APrandomnumber.com:1025/

I now want to curl each of the URLs in the text file generated by the script and return true or false depending on whether it was able to connect to them.
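A minimal sketch of one way to do this, assuming the file is named testOutput.txt as in the script above, and taking "able to connect" to mean that curl exits successfully:

```shell
#!/bin/bash
# Sketch: read each URL from testOutput.txt (the file produced by the
# script above) and print true or false depending on whether curl can
# reach it. --head requests headers only; --max-time caps the wait.
check_url() {
    if curl -k -s --head --max-time 5 --output /dev/null "$1"; then
        echo "true"
    else
        echo "false"
    fi
}

# Loop over the file line by line, printing each URL and its result.
if [ -f testOutput.txt ]; then
    while IFS= read -r url; do
        printf '%s %s\n' "$url" "$(check_url "$url")"
    done < testOutput.txt
fi
```

Note that curl's exit status is 0 whenever it gets any HTTP response, even an error status like 404; if you need "connected AND got a 2xx response", add `--fail` to the curl options.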

Comments:
  • @muru - I am a junior; I do not have enough knowledge to extract answers from the question you suggested and incorporate them into my script. If you understand how to do it, please let me know. Thanks – Nikhil May 12 '20 at 16:56
  • What specific problem are you having with the answers there? – muru May 12 '20 at 17:05
  • There are multiple answers, and there are so many words which I do not understand, such as "hee foo bar baz bat"; I'm mind-boggled haha – Nikhil May 12 '20 at 17:09
  • What is your problem here, "take each URL and run a command with it" OR "use curl to check whether a URL is reachable"? The first part is answered in the linked question; the second part can be answered by reading man curl (or by just accessing the URL and checking the exit code). – nohillside May 13 '20 at 10:14
  • httping does exactly this, or curl -w %{http_code} -s --output /dev/null url ..., see https://unix.stackexchange.com/questions/26426/how-do-i-get-only-the-http-status-of-a-site-in-a-shell-script – pLumo Jun 02 '20 at 18:47
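The `-w %{http_code}` approach from the last comment can be sketched like this, again assuming the testOutput.txt file name from the question; curl prints 000 when no connection could be made at all, so any other value means the host answered:

```shell
#!/bin/bash
# Sketch: print the HTTP status code for each URL instead of true/false.
# curl -w '%{http_code}' prints 000 when the connection itself fails.
status_of() {
    # '|| true' keeps the function's exit status clean when curl fails
    curl -k -s --max-time 5 --output /dev/null -w '%{http_code}' "$1" || true
}

if [ -f testOutput.txt ]; then
    while IFS= read -r url; do
        printf '%s %s\n' "$url" "$(status_of "$url")"
    done < testOutput.txt
fi
```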

0 Answers