Wget (GNU Wget) is a command-line utility for downloading files from the web. With Wget, you can download files using the HTTP, HTTPS, and FTP protocols. Wget provides a number of options that allow you to download multiple files, resume interrupted downloads, limit the bandwidth, perform recursive downloads, download in the background, mirror a website, and much more.
In this tutorial, we will show you how to use the Wget command through practical examples and detailed explanations of the most common Wget options.
Installing the Wget command in Linux
The wget package is pre-installed on most Linux distributions today.
To check whether the Wget package is installed on your system, open up your console, type
wget, and press Enter. If wget is installed, the system will print
wget: missing URL; otherwise, it will print
wget: command not found.
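You can also confirm the installation and see which release you have with the --version flag:

```shell
# Print the installed Wget version; the first line identifies the release.
wget --version | head -n 1
```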
If wget is not installed, you can easily install it using the package manager of your distro.
Installing Wget on Ubuntu and Debian
sudo apt install wget
Installing Wget on CentOS and Fedora
sudo yum install wget
Wget Command Syntax
Before going into how to use the
wget command, let’s start by reviewing the basic syntax.
The wget utility expects the following general form:
wget [options] [url]
How to download a file with Wget command?
In its simplest form, when used without any options, wget will download the resource specified in the [url] to the current directory.
In the following example, we are downloading the Linux kernel tar archive:
When you run the command, Wget starts by resolving the IP address of the domain, then connects to the remote server and starts the transfer.
During the download, Wget shows the progress bar alongside with the file name, file size, download speed, and the estimated time to complete the download. Once the download is complete, you can find the downloaded file in your current working directory.
To turn off Wget’s output, use the -q option.
If a file with the same name already exists, Wget will append
.N (a number) to the end of the file name.
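You can see this numbering locally with a throwaway HTTP server (python3 and its http.server module are assumed to be available here purely so the sketch works without network access):

```shell
# Demonstrate the .N suffix: the served file already exists locally,
# so repeated downloads are saved as index.html.1, index.html.2, and so on.
mkdir -p /tmp/wget-demo && cd /tmp/wget-demo
echo "hello" > index.html
python3 -m http.server 8731 >/dev/null 2>&1 &
SERVER_PID=$!
sleep 1
wget -q http://localhost:8731/index.html   # saved as index.html.1
wget -q http://localhost:8731/index.html   # saved as index.html.2
ls index.html*
kill "$SERVER_PID"
```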
Using the Wget command to save the downloaded file under a different name
To save the downloaded file under a different name, pass the
-O option followed by the chosen name:
wget -O hugo.zip https://github.com/gohugoio/hugo/archive/master.zip
The command above will save the latest Hugo zip archive from GitHub as
hugo.zip instead of its original name, master.zip.
Using Wget command to download a file to a specific directory
By default, Wget will save the downloaded file in the current working directory. To save the file to a specific location, use the -P option:
wget -P /mnt/iso http://mirrors.mit.edu/centos/7/isos/x86_64/CentOS-7-x86_64-Minimal-1804.iso
With the command above, we are telling Wget to save the CentOS 7 iso file to the /mnt/iso directory.
How to limit the download speed with Wget?
To limit the download speed, use the
--limit-rate option. By default, the speed is measured in bytes per second. Append
k for kilobytes,
m for megabytes, and
g for gigabytes.
The following command will download the Go binary and limit the download speed to 1MB per second:
wget --limit-rate=1m https://dl.google.com/go/go1.10.3.linux-amd64.tar.gz
How to resume a download with wget?
You can resume a download using the
-c option. This is useful if your connection drops during the download of a large file; instead of starting the download from scratch, you can continue the previous one.
In the following example, we are resuming the download of the Ubuntu 18.04 iso file:
wget -c http://releases.ubuntu.com/18.04/ubuntu-18.04-live-server-amd64.iso
If the remote server does not support resuming downloads, Wget will start the download from the beginning and overwrite the existing file.
How to download via FTP with wget command?
To download a file from a password-protected FTP server, specify the username and password as shown below:
wget --ftp-user=FTP_USERNAME --ftp-password=FTP_PASSWORD ftp://ftp.example.com/filename.tar.gz
How to download multiple files with wget?
If you want to download multiple files at once, use the
-i option followed by the path to a local or external file containing a
list of the URLs to be downloaded. Each URL needs to be on a separate
line.
In the following example, we are downloading the Arch Linux, Debian, and Fedora iso files with the URLs specified in the linux-distros.txt file:
wget -i linux-distros.txt
linux-distros.txt
http://mirrors.edge.kernel.org/archlinux/iso/2018.06.01/archlinux-2018.06.01-x86_64.iso
https://cdimage.debian.org/debian-cd/current/amd64/iso-cd/debian-9.4.0-amd64-netinst.iso
https://download.fedoraproject.org/pub/fedora/linux/releases/28/Server/x86_64/iso/Fedora-Server-dvd-x86_64-28-1.1.iso
If you specify
- as a filename, URLs will be read from the standard input.
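This makes it easy to pipe URLs produced by another command straight into wget. A self-contained sketch (the local python3 test server is assumed here only so the example does not need real network access; --directory requires Python 3.7+):

```shell
# Read URLs from standard input with `-i -`; the python3 http.server
# below is a stand-in for any real web server.
mkdir -p /tmp/wget-stdin/docroot && cd /tmp/wget-stdin
echo "one" > docroot/a.txt
echo "two" > docroot/b.txt
python3 -m http.server 8732 --directory docroot >/dev/null 2>&1 &
SERVER_PID=$!
sleep 1
printf 'http://localhost:8732/a.txt\nhttp://localhost:8732/b.txt\n' | wget -q -i -
kill "$SERVER_PID"
```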
How to download in the background with wget?
To download in the background, use the
-b option. In the following example, we are downloading the OpenSuse iso file in the background:
wget -b https://download.opensuse.org/tumbleweed/iso/openSUSE-Tumbleweed-DVD-x86_64-Current.iso
By default, the output is redirected to the
wget-log file in the current directory. To watch the status of the download, use the tail command:
tail -f wget-log
How to change the Wget User-Agent?
Sometimes when downloading a file, the remote server may be set to block the Wget User-Agent. In situations like this, to emulate a different browser, pass the --user-agent option:
wget --user-agent="Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/60.0" http://wget-forbidden.com/
The command above will emulate Firefox 60 requesting the page from wget-forbidden.com.
How to create a mirror of a website?
To create a mirror of a website with Wget, use the -m option:
wget -m https://example.com
If you want to use the downloaded website for local browsing, you will need to pass a few extra arguments to the command above.
wget -m -k -p https://example.com
The -k option will cause Wget to convert the links in the downloaded documents to make them suitable for local viewing. The
-p option will tell wget to download all necessary files for displaying the HTML page.
How to download a file if an SSL certificate error occurs?
If you want to download a file over HTTPS from a host that has an invalid SSL certificate, use the --no-check-certificate option:
wget --no-check-certificate https://example.com
How to download to standard output with wget?
In the following example, Wget will quietly (flag
-q) download the latest WordPress version and write it to stdout (flag
-O -), piping it to the
tar utility, which will extract the archive to the /var/www directory:
wget -q -O - "http://wordpress.org/latest.tar.gz" | tar -xzf - -C /var/www
With Wget, you can download multiple files, resume partial downloads, mirror websites, and combine Wget options to suit your needs.