How to quickly check if a website is up or down

Linux users can easily test a website's availability from the command line by checking the status codes returned by the web server.

The HTTP response status codes below tell you the state of a website. Apart from the first two, these are the codes you will most often see in the browser when you have trouble accessing a site.

  • 200 OK : Standard response for successful HTTP requests.
  • 301 Moved Permanently : The requested page has permanently moved to a new URL, typically because you are being redirected from the non-www domain to www or vice versa.
  • 403 Forbidden : Access is forbidden to the requested page.
  • 404 Not Found : The server can not find the requested page.
  • 500 Internal Server Error : The request was not completed. The server met an unexpected condition.
  • 502 Bad Gateway : The request was not completed. The server received an invalid response from the upstream server.
  • 503 Service Unavailable : The request was not completed. The server is temporarily overloaded or down.
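
As a quick illustration of how these codes map to a site's health, the table above can be turned into a small shell helper. This is just a sketch, and the classify_status function name is my own:

```shell
# classify_status: map an HTTP status code to a rough up/down verdict,
# following the table above (200 and 301 mean the site answered normally).
classify_status() {
    case "$1" in
        200|301) echo "up" ;;
        403|404|500|502|503) echo "down" ;;
        *) echo "unknown" ;;
    esac
}

classify_status 200   # prints "up"
classify_status 503   # prints "down"
```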

Most of us reach for the ping command, which performs a basic test of whether a remote host is reachable, but it does not tell you whether the web server itself is up.

In this guide, we will show you several commands to check from the Linux terminal whether a website is up or down.

We have included options to check this for both a single host and multiple hosts.

If you maintain websites and want to receive real-time alerts when one of them goes down, I recommend using a real-time website monitoring tool. Some are free, but most are paid, so choose one based on your needs. We will cover this topic in an upcoming article.

Method-1: Check a website availability with fping

The fping command is a program similar to ping, which uses Internet Control Message Protocol (ICMP) echo requests to determine whether a target host is responding.

fping differs from ping in that it lets you ping any number of hosts in parallel, as shown below:

# fping 2daygeek.com linuxtechnews.com magesh.co.in

2daygeek.com is alive
linuxtechnews.com is alive
magesh.co.in is alive
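
fping also sets a useful exit status (zero when the target replied), so it can drive a script directly. A minimal sketch, where the is_alive wrapper name is my own and -q suppresses fping's per-probe output:

```shell
# is_alive: succeed (exit 0) when fping gets an ICMP echo reply.
# -q keeps fping quiet; only the exit status is used.
is_alive() {
    fping -q "$1" 2>/dev/null
}

# Typical use over the network (not run here):
#   is_alive 2daygeek.com && echo "2daygeek.com is alive"
```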

Method-2: Testing a website from Linux command line

The httpie tool is a modern command-line HTTP client that makes CLI interaction with web services as easy as possible.

It provides a simple http command that allows for sending arbitrary HTTP requests using a simple and natural syntax, and displays a colorized output.

It is not a cURL replacement, but it is particularly well suited to JSON-based REST APIs.

# http 2daygeek.com

HTTP/1.1 301 Moved Permanently
CF-RAY: 535b66722ab6e5fc-LHR
Cache-Control: max-age=3600
Connection: keep-alive
Date: Thu, 14 Nov 2019 19:30:28 GMT
Expires: Thu, 14 Nov 2019 20:30:28 GMT
Location: https://2daygeek.com/
Server: cloudflare
Transfer-Encoding: chunked
Vary: Accept-Encoding
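
HTTPie can also report failure through its exit status: with the --check-status flag it exits non-zero on 4xx/5xx responses, which makes it easy to script. A sketch under that assumption (the site_up wrapper name is my own; --quiet suppresses normal output):

```shell
# site_up: succeed when the server answers with a non-error status.
# --check-status makes http exit non-zero on 4xx/5xx responses;
# --quiet suppresses normal output.
site_up() {
    http --check-status --quiet "$1" 2>/dev/null
}

# Typical use over the network (not run here):
#   site_up https://2daygeek.com && echo "up" || echo "down"
```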

Method-3: Health check of web page using curl

The curl command is a tool for transferring data to or from a server using one of its supported protocols, such as FTP, FTPS, HTTP, HTTPS, SCP, etc.

The curl command is designed to work without user interaction, which is essential when you are working in scripts or on a console.

It offers proxy support, user authentication, FTP upload, HTTP POST, SSL connections, cookies, file transfer resume, Metalink, and more.

# curl -I https://www.magesh.co.in

HTTP/2 200 
date: Thu, 14 Nov 2019 19:39:47 GMT
content-type: text/html
set-cookie: __cfduid=db16c3aee6a75c46a504c15131ead3e7f1573760386; expires=Fri, 13-Nov-20 19:39:46 GMT; path=/; domain=.magesh.co.in; HttpOnly
vary: Accept-Encoding
last-modified: Sun, 14 Jun 2015 11:52:38 GMT
x-cache: HIT from Backend
cf-cache-status: DYNAMIC
expect-ct: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
server: cloudflare
cf-ray: 535b74123ca4dbf3-LHR

If you only want to see the HTTP status code instead of the full output, use the following curl command:

# curl -sI "www.magesh.co.in" | awk '/HTTP\// {print $2}'
 200
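
The same awk filter can be wrapped into a small reusable helper that reads captured headers from stdin. The get_status name is my own, and the sample input below is canned:

```shell
# get_status: read HTTP response headers on stdin and print the
# status code from the first line that looks like a status line.
get_status() {
    awk '/^HTTP\//{print $2; exit}'
}

# Typical use over the network:
#   curl -sI https://www.magesh.co.in | get_status
# Demonstration with a canned response:
printf 'HTTP/2 200\ndate: Thu, 14 Nov 2019 19:39:47 GMT\n' | get_status   # prints 200
```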

Use the following shell script to display a customized output with the http status code:

# vi curl-url-check.sh

#!/bin/bash
if curl -sI "https://www.magesh.co.in" | grep -w "200\|301" ; then
    echo "magesh.co.in is up"
else
    echo "magesh.co.in is down"
fi

After adding the above script to a file, make it executable and run it:

# chmod +x curl-url-check.sh

# sh curl-url-check.sh

HTTP/2 200 
magesh.co.in is up

Use the following shell script to view the status of multiple websites using curl command:

# vi curl-url-check-1.sh

#!/bin/bash
for site in www.google.com google.co.in www.xyzzz.com
do
if curl -sI "$site" | grep -w "200\|301" ; then
    echo "$site is up"
else
    echo "$site is down"
fi
echo "----------------------------------"
done

Run the script to see the output:

# chmod +x curl-url-check-1.sh

# sh curl-url-check-1.sh

HTTP/1.1 200 OK
www.google.com is up
----------------------------------
HTTP/1.1 301 Moved Permanently
google.co.in is up
----------------------------------
www.xyzzz.com is down
----------------------------------

Method-4: How to check if a website is up or down with wget

The wget command (formerly known as Geturl) is a free, open-source, command-line download tool that retrieves files using HTTP, HTTPS and FTP, the most widely used Internet protocols.

It is a non-interactive command-line tool, and its name is derived from “World Wide Web” and “get”.

wget handles downloads well compared with other tools. Its features include working in the background, recursive download, multiple file downloads, resuming downloads, non-interactive operation, and large file downloads.

# wget -S --spider https://www.magesh.co.in

Spider mode enabled. Check if remote file exists.
--2019-11-15 01:22:00--  https://www.magesh.co.in/
Loaded CA certificate '/etc/ssl/certs/ca-certificates.crt'
Resolving www.magesh.co.in (www.magesh.co.in)… 104.18.35.52, 104.18.34.52, 2606:4700:30::6812:2334, …
Connecting to www.magesh.co.in (www.magesh.co.in)|104.18.35.52|:443… connected.
HTTP request sent, awaiting response… 
  HTTP/1.1 200 OK
  Date: Thu, 14 Nov 2019 19:52:01 GMT
  Content-Type: text/html
  Connection: keep-alive
  Set-Cookie: __cfduid=db73306a2f1c72c1318ad4709ef49a3a01573761121; expires=Fri, 13-Nov-20 19:52:01 GMT; path=/; domain=.magesh.co.in; HttpOnly
  Vary: Accept-Encoding
  Last-Modified: Sun, 14 Jun 2015 11:52:38 GMT
  X-Cache: HIT from Backend
  CF-Cache-Status: DYNAMIC
  Expect-CT: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
  Server: cloudflare
  CF-RAY: 535b85fe381ee684-LHR
Length: unspecified [text/html]
Remote file exists and could contain further links,
but recursion is disabled -- not retrieving.

You may have trouble identifying the website status from the above output, as it contains many lines. Instead, use the following customized command to view only the HTTP status code; note that wget writes the server response to stderr, which is why the 2>&1 redirection is required:

# wget --spider -S "www.magesh.co.in" 2>&1 | awk '/HTTP\// {print $2}'
 200

Use the following bash script to print a more readable up/down message based on the status code:

# vi wget-url-check.sh

#!/bin/bash
if wget --spider -S "https://www.google.com" 2>&1 | grep -w "200\|301" ; then
    echo "Google.com is up"
else
    echo "Google.com is down"
fi

You can see the output as soon as the script is executed as shown below:

# chmod +x wget-url-check.sh

# sh wget-url-check.sh

HTTP/1.1 200 OK
Google.com is up

Use the following shell script to view the status of multiple websites:

# vi wget-url-check-1.sh

#!/bin/bash
for site in www.google.com google.co.in www.xyzzz.com
do
if wget --spider -S "$site" 2>&1 | grep -w "200\|301" ; then
    echo "$site is up"
else
    echo "$site is down"
fi
echo "----------------------------------"
done

You can see the output as soon as the script is executed as shown below:

# sh wget-url-check-1.sh

HTTP/1.1 200 OK
www.google.com is up
----------------------------------
HTTP/1.1 301 Moved Permanently
google.co.in is up
----------------------------------
www.xyzzz.com is down
----------------------------------

Method-5: lynx command to check if website is up or down

lynx is a highly configurable text-based web browser for use on cursor-addressable character cell terminals. It is the oldest web browser still in active development.

# lynx -head -dump http://www.magesh.co.in

HTTP/1.1 200 OK
Date: Fri, 15 Nov 2019 08:14:23 GMT
Content-Type: text/html
Connection: close
Set-Cookie: __cfduid=df3cb624024b81df7362f42ede71300951573805662; expires=Sat, 1
4-Nov-20 08:14:22 GMT; path=/; domain=.magesh.co.in; HttpOnly
Vary: Accept-Encoding
Last-Modified: Sun, 14 Jun 2015 11:52:38 GMT
X-Cache: HIT from Backend
CF-Cache-Status: DYNAMIC
Server: cloudflare
CF-RAY: 535fc5704a43e694-LHR

Use the following lynx command if you want to see only the HTTP status code instead of the entire output:

# lynx -head -dump https://www.magesh.co.in 2>&1 | awk '/HTTP\// {print $2}'
 200

If you want to check if a given website is up or down, use the following Bash script:

# vi lynx-url-check.sh

#!/bin/bash
if lynx -head -dump http://www.magesh.co.in 2>&1 | grep -w "200\|301" ; then
    echo "magesh.co.in is up"
else
    echo "magesh.co.in is down"
fi

Once you have added the above script to a file, run the file to see the output:

# sh lynx-url-check.sh

HTTP/1.1 200 OK
magesh.co.in is up

Use the following shell script if you want to see the status of multiple websites:

# vi lynx-url-check-1.sh

#!/bin/bash
for site in http://www.google.com https://google.co.in http://www.xyzzz.com
do
if lynx -head -dump "$site" 2>&1 | grep -w "200\|301" ; then
    echo "$site is up"
else
    echo "$site is down"
fi
echo "----------------------------------"
done

Once you have added the above script to a file, run the file to see the output:

# sh lynx-url-check-1.sh

HTTP/1.0 200 OK
http://www.google.com is up
----------------------------------
HTTP/1.0 301 Moved Permanently
https://google.co.in is up
----------------------------------
http://www.xyzzz.com is down
----------------------------------

Method-6: ping command to check the URL availability

The ping command (often expanded as “Packet Internet Groper”, although the name actually comes from sonar terminology) is a networking utility used to test the availability of a target host on an Internet Protocol (IP) network.

It verifies a host's availability by sending Internet Control Message Protocol (ICMP) Echo Request packets to the target host and waiting for an ICMP Echo Reply.

# ping -c 5 2daygeek.com

PING 2daygeek.com (104.27.157.177) 56(84) bytes of data.
64 bytes from 104.27.157.177 (104.27.157.177): icmp_seq=1 ttl=58 time=228 ms
64 bytes from 104.27.157.177 (104.27.157.177): icmp_seq=2 ttl=58 time=227 ms
64 bytes from 104.27.157.177 (104.27.157.177): icmp_seq=3 ttl=58 time=250 ms
64 bytes from 104.27.157.177 (104.27.157.177): icmp_seq=4 ttl=58 time=171 ms
64 bytes from 104.27.157.177 (104.27.157.177): icmp_seq=5 ttl=58 time=193 ms

--- 2daygeek.com ping statistics ---
5 packets transmitted, 5 received, 0% packet loss, time 13244ms
rtt min/avg/max/mdev = 170.668/213.824/250.295/28.320 ms
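
To use ping in a script, you can parse the packet-loss figure out of its statistics line. A sketch with a helper name of my own choosing, demonstrated on a canned summary line:

```shell
# loss_pct: read ping's summary on stdin and print the packet-loss
# percentage as a bare number.
loss_pct() {
    grep -o '[0-9]*% packet loss' | cut -d'%' -f1
}

# Typical use over the network:
#   ping -c 5 2daygeek.com | loss_pct
# Demonstration with a canned summary line:
printf '5 packets transmitted, 5 received, 0%% packet loss, time 13244ms\n' | loss_pct   # prints 0
```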

Method-7: telnet command to check whether a website is working

The telnet command is a client for TELNET, an old network protocol used to communicate with another host over a TCP/IP network.

By default it uses port 23 to connect to other devices, such as computers and network equipment.

Telnet is not a secure protocol and is generally not recommended, because the data it sends is not encrypted and can be intercepted by attackers.

# telnet google.com 80

Trying 216.58.194.46…
Connected to google.com.
Escape character is '^]'.
^]
telnet> quit
Connection closed.
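
telnet is awkward to drive from a script. If you use bash, its built-in /dev/tcp pseudo-device offers a non-interactive alternative for the same TCP-level check. A sketch (bash-specific; timeout comes from GNU coreutils; host and port are examples):

```shell
#!/bin/bash
# port_open: succeed when a TCP connection to host ($1) port ($2) is
# accepted, using bash's /dev/tcp pseudo-device (a bash feature,
# not a real file); give up after 5 seconds.
port_open() {
    timeout 5 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null
}

# Example, the same check telnet performed above (not run here):
#   port_open google.com 80 && echo "google.com:80 is open"
```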

Method-8: Shell script to check website status

In simple words, a shell script is a file that contains a series of commands. The shell reads this file and executes the commands one by one, as if they were entered directly on the command line.

To make it more useful, we can add some conditions, which reduces the workload of a Linux admin.

If you want to see the status of multiple websites using the wget command, use the following shell script:

# vi wget-url-check-2.sh

#!/bin/bash
for site in www.google.com google.co.in www.xyzzz.com
do
if wget --spider -S "$site" 2>&1 | grep -w "200\|301" > /dev/null ; then
    echo "$site is up"
else
    echo "$site is down"
fi
done

Once you have added the above script to a file, run the file to see the output:

# chmod +x wget-url-check-2.sh

# sh wget-url-check-2.sh

www.google.com is up
google.co.in is up
www.xyzzz.com is down

If you want to see the status of multiple websites using the curl command, use the following bash script:

# vi curl-url-check-2.sh

#!/bin/bash
for site in www.google.com google.co.in www.xyzzz.com
do
if curl -sI "$site" | grep -w "200\|301" > /dev/null ; then
    echo "$site is up"
else
    echo "$site is down"
fi
done

Once you have added the above script to a file, run the file to see the output:

# chmod +x curl-url-check-2.sh

# sh curl-url-check-2.sh

www.google.com is up
google.co.in is up
www.xyzzz.com is down
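
The two scripts above can be generalized into a single checker that takes its URL list as arguments. A sketch (the check helper name is my own; matching only the first header line avoids false positives from other header fields that happen to contain 200):

```shell
#!/bin/sh
# check: print "up" when the first response line carries a 200 or 301.
# -s silences curl's progress meter; -I asks for headers only.
check() {
    if curl -sI "$1" | head -n 1 | grep -qw '200\|301'; then
        echo "$1 is up"
    else
        echo "$1 is down"
    fi
}

# Usage:
#   ./url-check.sh www.google.com google.co.in
for site in "$@"; do
    check "$site"
done
```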

Conclusion

In this guide, we have shown several commands that can be executed from a Linux terminal to test whether a website is up or down. We have also included small shell scripts to check the status of multiple websites at once.

If you have any questions or feedback, feel free to leave a comment and we will get back to you as soon as possible.

About Magesh Maruthamuthu

Loves to play with all Linux distributions
