There are many ways to check whether a website or webpage is online. The easiest is perhaps to just click the URL in a browser. But what if you have many websites and pages to check or test? Here are a few command-line options.

Use wget

Try this command with a given URL:

$ wget -O /dev/null <url>
--2015-08-17 18:18:55--  <url>
Connecting to <host>:80... connected.
HTTP request sent, awaiting response... 301 Moved Permanently
Location: <redirect url> [following]
--2015-08-17 18:18:55--  <redirect url>
Reusing existing connection to <host>:80.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: “/dev/null”

    [ <=>                                                                            ] 25,304      --.-K/s   in 0.04s

2015-08-17 18:19:02 (581 KB/s) - “/dev/null” saved [25304]

A "200" response code is a good sign that the webpage is up.

For scripting purposes, you may want a simpler form:

wget -O /dev/null <url> 2> /tmp/output; grep "200 OK" /tmp/output

There is a downside to this tool: because wget fetches the whole page from the website, it does more work than is needed just to test whether a webpage is online.
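
If the download overhead matters, wget's --spider option may help: it asks wget to check that the page exists without saving the body. The sketch below wraps this in a small helper; the function name check_url and the use of --spider are my additions, not from the original post.

```shell
# Hypothetical helper: test a URL with wget --spider, which checks
# the page without downloading its content.
check_url() {
    if wget -q --spider "$1"; then
        echo "$1 is online"
    else
        echo "$1 is offline"
    fi
}

# example.com is a documentation placeholder; substitute your own URL.
check_url http://example.com
```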

Use curl to test the page

curl can do the same as wget does: just download the page and see if the request succeeds.

$ curl -o log <url>
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 25432    0 25432    0     0  12035      0 --:--:--  0:00:02 --:--:-- 12297
$ echo $?
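
That exit status can itself drive a script, with no output parsing at all: curl exits 0 when the transfer succeeds and non-zero when it fails. A minimal sketch, where example.com is a placeholder for the page you want to test:

```shell
# Test reachability via curl's exit status alone.
# -s silences the progress meter; -o /dev/null discards the body.
if curl -s -o /dev/null "http://example.com"; then
    echo "reachable"
else
    echo "unreachable"
fi
```

Note that a 404 still counts as a successful transfer here unless you add curl's -f option.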

However, it doesn't show the HTTP response code.

To get it, try an HTTP HEAD request:

$ curl -I -X HEAD <url> 2>/dev/null | grep "200 OK"
HTTP/1.1 200 OK

Or try an HTTP GET request:

$ curl -I -X GET <url> 2>/dev/null | grep "200 OK"
HTTP/1.1 200 OK

Or this format, which discards the output and leaves the result in grep's exit status:

curl -s --head --request GET <url> | grep "200 OK" > /dev/null

Add the -L option in case there is a redirection:

$ curl -sL -I -X GET <url> 2>/dev/null | grep "200 OK"
HTTP/1.1 200 OK
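
Grepping for "200 OK" relies on the server sending that exact reason phrase, which not all servers do. curl can instead print the numeric status code directly with its -w (write-out) option; this variant is my suggestion, not from the original post, and example.com is a placeholder:

```shell
# Print just the final numeric HTTP status code.
# -s: quiet, -L: follow redirects, -o /dev/null: discard the body,
# -w '%{http_code}': write out the response code after the transfer.
code=$(curl -sL -o /dev/null -w '%{http_code}' "http://example.com")
if [ "$code" = "200" ]; then
    echo "online ($code)"
else
    echo "offline or error ($code)"
fi
```

When no response is received at all, curl prints 000 for %{http_code}, so the check still behaves sensibly.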

Simple script

url="<url>"
if curl -sL -I "$url" 2>/dev/null | grep "200 OK" > /dev/null; then
    echo "$url is online"
else
    echo "$url is offline"
fi

Based on this script, you can easily do more: for example, run a loop over a list of URLs, or add a few lines to capture the access time for each page.
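
Both ideas can be sketched in a few lines. The function name check_urls and the example.com/example.org URLs below are placeholders of mine; substitute your own list:

```shell
# Sketch: check a list of URLs, reporting status and elapsed seconds.
check_urls() {
    for url in "$@"; do
        start=$(date +%s)
        if curl -sL -I "$url" 2>/dev/null | grep -q "200 OK"; then
            status="online"
        else
            status="offline"
        fi
        end=$(date +%s)
        echo "$url is $status (took $((end - start))s)"
    done
}

check_urls http://example.com http://example.org
```

For finer-grained timing than whole seconds, curl's -w '%{time_total}' is an alternative worth exploring.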

Want an existing script? Take a look at this: