Sometimes you might want to download an entire website, e.g. to archive it or read it offline. This tutorial shows you which tool can be used for that on Windows and Linux.

I will use the tool wget here, a command-line program that is available for Windows, Linux, and macOS.

Install wget on Windows

First, download and install wget for Windows on your computer. The installer for the Windows version can be found here:

http://gnuwin32.sourceforge.net/packages/wget.htm
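
Once the installer has finished, you can check that wget is reachable from the command line. This is just a quick sanity check; depending on the installer, you may first have to add the directory that contains wget.exe (e.g. the GnuWin32 bin folder) to your PATH environment variable.

wget --version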

Install wget on Linux

The wget command is available in the base repositories of all major Linux distributions and can be installed with the package manager of the OS.

Debian

apt-get install wget

Ubuntu

sudo apt-get install wget

CentOS / RHEL

yum install wget

openSUSE

zypper install wget
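
On any of these distributions you can verify afterwards that the installation succeeded by printing the installed version:

wget --version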

Download a Website with wget

Open a command prompt (on Windows) or a terminal (on Linux) and change into the directory where you want to store the downloaded website. Then run the following command to download the website recursively:

wget -r --no-parent http://www.example.com

This downloads the pages without altering their HTML source code. The -r option enables recursive retrieval and --no-parent prevents wget from ascending into directories above the start URL.
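
If the website is large or you want to avoid putting unnecessary load on the server, wget can also be told to pause between requests, limit the bandwidth, and send a custom user agent. Here is a minimal sketch with commonly used options; the wait time, rate limit, and user agent string below are only example values:

wget -r --no-parent --wait=2 --limit-rate=200k --user-agent="Mozilla/5.0" http://www.example.com

--wait adds a delay (in seconds) between requests, --limit-rate caps the download speed, and --user-agent sets the identification string that is sent to the server.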

If you want wget to rewrite the links in the downloaded pages automatically so that they point to the local copies, use this command instead:

wget -r --convert-links --no-parent http://www.example.com

If you want all downloaded HTML files to get a .html extension, add the --html-extension option (newer wget versions also accept the equivalent --adjust-extension):

wget -r --convert-links --html-extension --no-parent http://www.example.com
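
For a complete offline copy it is often useful to also fetch the images, stylesheets, and scripts that the pages reference and to store everything in a separate directory. One possible combination of options, assuming ./example-mirror is the directory where you want the files (adjust the path and URL to your needs):

wget --mirror --convert-links --html-extension --page-requisites --no-parent -P ./example-mirror http://www.example.com

--mirror enables recursion with unlimited depth and timestamping, --page-requisites downloads embedded files such as images and CSS, and -P sets the output directory.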