Using Wget to Download a Local Copy of a Website

Why would you want a local copy?

Maybe you want to browse a site offline, archive it before it goes away, or just poke around its HTML and assets on your own machine.
Wget is a free command-line tool for downloading files, and it's the tool we're going to use to download a local copy of a website. It has a ton of features, which you can explore in the GNU Wget manual. It comes pre-installed on most *nix systems and is available on OS X and Windows as well.

If you're on OS X like I am, you can install Wget via Homebrew.

brew install wget
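Once it's installed, a quick sanity check (assuming wget ends up on your PATH) confirms the version you got:

```shell
# Print the first line of Wget's version output, which names the build.
wget --version | head -n 1
```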

The command:

wget \
    --adjust-extension \
    --convert-links \
    --mirror \
    --no-parent \
    --page-requisites \
    https://example.com    # replace with the site you want to mirror

The options:

--adjust-extension Add ".html" to the end of the local filename if the content type is text/html or application/xhtml+xml and the .html extension is not already present.
--convert-links Make links suitable for local viewing. This modifies links within the HTML so that they point to the local copies of pages and assets.
--mirror Recursively download files with infinite depth. Use this option if you want to download the entire website.
--no-parent Don't ascend to the parent directory. This is useful if you only want to download a subset of a website, such as a single section.
--page-requisites Download all assets necessary to view the site offline. This tells Wget to download images, sounds, stylesheets, etc. so that your local copy is complete.
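Each of these long options also has a short-flag equivalent, so the whole command collapses to a one-liner. The URL here is just a placeholder for whichever site you're mirroring:

```shell
# Short flags: -E (--adjust-extension), -k (--convert-links),
# -m (--mirror), -np (--no-parent), -p (--page-requisites).
# example.com is a placeholder; substitute your own target.
wget -E -k -m -np -p https://example.com
```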

The result:

Wget saves everything into a directory named after the site's host. You should be able to open the index.html file inside it in your browser and interact with the local copy of the website.