How to Download an Entire Website with WGET


If you have ever had to copy or move a website, wget is quite handy. Wget is open source, available on Linux, OS X, and Windows, and is very easy to use. A whole website can be downloaded with one simple command.

Wget is a command-line program: the name of the program is typed into a terminal followed by the required arguments. Downloading a single file from a website is quite simple:


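For example, to fetch a single page (the URL below is a placeholder for illustration, not from the original article):

```shell
# Download a single file; the URL is a placeholder
wget http://example.com/index.html
```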
To recursively download every file linked from a page, use -r:

wget -r

If a web server does not allow download managers, the -U option can be used to tell the server you are using a common web browser:

wget -r -U Mozilla

Some web servers may blacklist an IP address if they notice that all pages are being downloaded too quickly. The --wait=15 option inserts a 15-second pause between retrievals to prevent this:

wget --wait=15 -r -U Mozilla

The download rate can also be capped with the --limit-rate=64K option. The rate defaults to bytes per second, so a K suffix must be appended for kilobytes:

wget --wait=15 --limit-rate=64K -r -U Mozilla

To make sure that wget does not ascend into parent directories, --no-parent can be used:

wget --wait=15 --limit-rate=64K --no-parent -r -U Mozilla
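Putting the options together, a complete mirroring command might look like the following. The URL is a placeholder, and -k (convert links for offline browsing) and -p (fetch page requisites such as images and stylesheets) are additions beyond the options discussed above:

```shell
# Mirror a site politely: recursive, rate-limited, 15 seconds between requests.
# http://example.com is a placeholder URL.
wget --wait=15 --limit-rate=64K --no-parent -r -U Mozilla \
     -k -p http://example.com/
```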

About the author:

Bill Payne started working as a paid professional software developer at the age of 12, developing simple games and other applications for pre-packaged computers. Bill has since developed software for many industries, such as direct sales and the stock market. Bill now shares his many years of software development experience through a blog on the MPSHouse website and one-on-one lessons.


  1. Dinos  - 29 September , 2011 - 11:31 am

    Thanks a lot for the article. I do love wget while at the same time I do have one unresolved issue and this is how to handle files with spaces in the filename.
    For example if we have a link to a file “” or “” how do we download the file using wget?

    Thanks in advance …

  2. Bob Pegram  - 30 September , 2011 - 12:18 am

    For those who want a website downloading program with GUI interface, GetLeft works well.

  3. Amer Neely  - 15 January , 2014 - 9:50 am

    I’ve used wget as well several times as a developer to mirror one of my client’s sites, but 1 of them is giving me a problem. It only downloads the files in the top level of an ‘images’ directory. There are 2 sub-directories in it that never get downloaded.

    I’m using ‘wget -m -r -l0 http://xxxxxx…’ on Mac OS X running Lion. Any ideas?

    • bill  - 16 January , 2014 - 5:41 am

      For something like that I use a script and crond. Send the site to the other location using FTP in a script. You can even add mysqldump to the script to send the database.
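      A sketch of such a script (all hostnames, paths, and credentials below are hypothetical placeholders; the FTP upload here uses curl for simplicity):

```shell
#!/bin/sh
# Hypothetical backup script: dump the database, bundle the site, upload via FTP.
# Every path, hostname, and credential is a placeholder.
mysqldump -u dbuser -p'secret' mydb > /tmp/mydb.sql
tar czf /tmp/site-backup.tar.gz /var/www/mysite /tmp/mydb.sql
curl -T /tmp/site-backup.tar.gz ftp://ftp.example.com/backups/ \
     --user ftpuser:ftppass
# Schedule daily via cron, e.g.:  0 3 * * * /usr/local/bin/site-backup.sh
```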
