I have a list of URLs that I need to check, to see whether they still work or not. I would like to write a bash script that does that for me.
I only need the returned HTTP status code, e.g. 200, 404, 500, and so forth. Nothing more.
EDIT Note that there is a pitfall when a page says "404 Not Found" in its body but the server returns 200 OK. That's a misconfigured web server, but you may have to consider this case.
For more on this, see Check if a URL goes to a page containing the text "404"
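One way to catch that case, sketched below, is to fetch the body as well and search it for the error text. This is a minimal sketch: the phrase being grepped for ("404 not found") and the $url variable are assumptions; substitute whatever your misconfigured server actually puts in the page.

# Sketch for the soft-404 case: the status code alone is not enough,
# so fetch the body and look for the error text in it.
# "404 not found" is an assumed phrase; $url is a hypothetical variable.
if curl --silent "$url" | grep -qi '404 not found'; then
    echo "soft 404: $url"
fi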
Curl has a specific option, --write-out, for this:
$ curl -o /dev/null --silent --head --write-out '%{http_code}\n' <url>
200
-o /dev/null throws away the usual output
--silent throws away the progress meter
--head makes a HEAD HTTP request, instead of GET
--write-out '%{http_code}\n' prints the required status code

To wrap this up in a complete Bash script:
#!/bin/bash
# Read each URL from url-list.txt and print its HTTP status code.
while IFS= read -r LINE; do
  curl -o /dev/null --silent --head --write-out "%{http_code} $LINE\n" "$LINE"
done < url-list.txt
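One caveat with the above: some servers answer HEAD requests differently from GET (returning 405 or 403, for example), and a redirect leaves you with a 301/302 instead of the final status. Below is a sketch of a variant, under the assumption that you want the status of the final destination: it issues a plain GET, discards the body, and follows redirects with --location.

#!/bin/bash
# Variant: use GET instead of HEAD for servers that mishandle HEAD,
# and follow redirects (--location) so the final status is reported.
while IFS= read -r LINE; do
  code=$(curl -o /dev/null --silent --location --write-out '%{http_code}' "$LINE")
  printf '%s %s\n' "$code" "$LINE"
done < url-list.txt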
(Eagle-eyed readers will notice that this uses one curl process per URL, which imposes fork and TCP connection penalties. It would be faster if multiple URLs were combined in a single curl invocation, but there isn't space to write out the monstrous repetition of options that curl requires to do this.)
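If the per-URL cost matters, one workaround, sketched here under the assumption that the URLs contain no whitespace, is to leave the options as they are but fan the list out to several curl processes with xargs -P instead of combining URLs into one invocation:

# Run up to 8 curls in parallel; %{url_effective} echoes the URL back
# so each status code is still paired with its URL. The parallelism
# level (8) is an arbitrary choice.
xargs -n 1 -P 8 curl -o /dev/null --silent --head \
    --write-out '%{http_code} %{url_effective}\n' < url-list.txt

(Newer curl releases also accept multiple URLs in one command together with the --parallel option, which avoids the extra processes entirely.)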