There are many ways to check whether a website or webpage is online. The easiest is perhaps to just click the URL in a browser. But what if you have many websites and pages to check or test? Here are a few command-line options.

Use wget

Try this command with a given URL:

$wget http://www.fibrevillage.com -O /dev/null
--2015-08-17 18:18:55--  http://www.fibrevillage.com/
Resolving www.fibrevillage.com... 23.229.159.161
Connecting to www.fibrevillage.com|23.229.159.161|:80... connected.
HTTP request sent, awaiting response... 301 Moved Permanently
Location: http://fibrevillage.com/ [following]
--2015-08-17 18:18:55--  http://fibrevillage.com/
Resolving fibrevillage.com... 23.229.159.161
Reusing existing connection to www.fibrevillage.com:80.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: “/dev/null”

    [ <=>                                                                            ] 25,304      --.-K/s   in 0.04s   

2015-08-17 18:19:02 (581 KB/s) - “/dev/null” saved [25304]

A response code of "200" is a good sign, indicating the webpage is up.

For scripting purposes, you may want to keep it simple:

wget -O /dev/null http://www.fibrevillage.com 2> /tmp/output; grep "200 OK" /tmp/output
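
In a script you often only need a yes/no answer, so grep's exit status is more useful than its output. Here is a minimal sketch that avoids the temporary file; it assumes wget's default verbose output, which prints the status line ("... 200 OK") to stderr:

# Minimal sketch: decide up/down from grep's exit status.
# wget prints "HTTP request sent, awaiting response... 200 OK" on stderr.
if wget -O /dev/null http://www.fibrevillage.com 2>&1 | grep -q "200 OK"; then
    echo "page is up"
else
    echo "page is down"
fi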

There is a downside to this approach: wget downloads the whole page, which is more than you need just to test whether a webpage is online.
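
If you only care about availability, wget's --spider option checks that a page is there without downloading its body:

$wget --spider http://www.fibrevillage.com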

Use curl to test the page

curl can do the same as wget: just download a page to see if it's OK.

$curl http://fibrevillage.com  -o log
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 25432    0 25432    0     0  12035      0 --:--:--  0:00:02 --:--:-- 12297
$echo $?
0
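
The exit status alone can drive a check, provided curl is told to treat HTTP errors as failures. A minimal sketch, using curl's -f (--fail) option, which makes curl return a non-zero exit status on HTTP error responses such as 404 or 500:

# Minimal sketch: rely on curl's exit status instead of parsing output.
# -f turns HTTP errors into a non-zero exit status, -s silences progress output.
if curl -fs -o /dev/null http://fibrevillage.com; then
    echo "page is up"
else
    echo "page is down"
fi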

However, this still doesn't show the HTTP response code.

To get it, send an HTTP HEAD request:

$curl -I http://fibrevillage.com  2>/dev/null  | grep "200 OK"
HTTP/1.1 200 OK

Or send an HTTP GET request but only print the headers:

$curl -I -X GET http://fibrevillage.com  2>/dev/null  | grep "200 OK"
HTTP/1.1 200 OK

Or use this form, which is handy inside a shell condition:

curl -s --head  --request GET http://fibrevillage.com | grep "200 OK" > /dev/null

Add the -L option to follow redirects in case the URL redirects elsewhere:

$curl -sL -I -X GET http://www.fibrevillage.com 2>/dev/null | grep "200 OK"
HTTP/1.1 200 OK
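
Another option is to ask curl for just the numeric status code with its -w (write-out) option; %{http_code} is one of curl's built-in write-out variables, and with -L it reports the code of the final response after any redirects:

$curl -sL -o /dev/null -w '%{http_code}\n' http://www.fibrevillage.com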

Simple script

url="http://www.fibrevillage.com"if curl -sL -I -X head  $url | grep "200 OK">/dev/null
   echo " $url is online"
else
   echo "$url is offline"
fi

Based on this script, you can easily do more, for example loop over a list of URLs, or add a few lines to capture the access time for each page.
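
For example, here is a minimal sketch that loops over a list of URLs and records the total time per request using curl's %{time_total} write-out variable (the URL list below is just an illustration):

#!/bin/bash
# Minimal sketch: check a list of URLs, report status code and total request time.
# %{http_code} and %{time_total} are curl's built-in write-out variables.
urls="http://www.fibrevillage.com http://example.com"

for url in $urls; do
    result=$(curl -sL -o /dev/null -w '%{http_code} %{time_total}' "$url")
    code=${result% *}
    time=${result#* }
    if [ "$code" = "200" ]; then
        echo "$url is online (code $code, ${time}s)"
    else
        echo "$url is offline or returned code $code"
    fi
done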

If you want an existing, full-featured check script, take a look at the Nagios check_http plugin:

https://nagios-plugins.org/doc/man/check_http.html