How can I download an entire website on Linux and Windows 10

Sometimes you might want to download an entire website, for example to archive it or to read it offline. This tutorial shows you which application can be used for this on Windows and Linux.

I will use the tool wget here, which is a command-line program that is available for Windows, Linux, and macOS.

Install wget on Windows

First, download and install wget for Windows on your computer. The installer for the Windows version can be found here:

http://gnuwin32.sourceforge.net/packages/wget.htm
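After the installation, you can check from a Command Prompt that wget is reachable. The path below is the typical GnuWin32 default on 64-bit Windows and is only an assumption; adjust it to the directory you chose during installation.

cd "C:\Program Files (x86)\GnuWin32\bin"
wget --version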

Install wget on Linux

The wget command is available in the base repositories of all major Linux distributions and can be installed with the package manager of the OS.

Debian

sudo apt install wget

Ubuntu

sudo apt install wget

CentOS 8 / RHEL

sudo dnf install wget

openSUSE

sudo zypper install wget
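Regardless of the distribution, you can verify that the installation succeeded by printing the installed wget version:

wget --version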

Download a Website with wget

Open a terminal window (a Command Prompt on Windows or a shell on Linux) and change to the directory where you want to store the downloaded website. Then run the following command to download the website recursively:

wget -r --no-parent http://www.example.com

This will download the pages without altering their HTML source code. The -r option makes the download recursive, and --no-parent prevents wget from ascending into directories above the start URL.
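Downloading a whole website can put noticeable load on the remote server. As a sketch of a more considerate invocation, you can tell wget to pause between requests and to cap the download bandwidth; the wait time and rate below are just example values, not recommendations from this tutorial.

wget -r --no-parent --wait=2 --limit-rate=200k http://www.example.com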

If you want the links in the downloaded pages to be rewritten automatically so that they point to the local files, use this command instead:

wget -r --convert-links --no-parent http://www.example.com

If all downloaded HTML files should get a .html extension, add the "--html-extension" option (newer wget versions also accept --adjust-extension as the name for this option):

wget -r --convert-links --html-extension --no-parent http://www.example.com
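If you also want the images, stylesheets, and other files that the pages reference so that they display correctly offline, you can add the --page-requisites option to the command above. One possible combined command, shown here as a sketch, looks like this:

wget -r --convert-links --html-extension --page-requisites --no-parent http://www.example.com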
