Using wget to download websites

There are many tools that help you download an entire website for offline browsing, but nothing beats wget. It only has a command-line interface, yet it works like a charm.

$ wget --wait=5 --limit-rate=20K -r -p -U Mozilla --no-parent http://the.site.url

For more information, run man wget. Here is what each option in the command above does:

--wait If you try to download everything back-to-back, some websites might blacklist you. Pausing between fetches keeps the crawl polite; the example above waits 5 seconds, and for stricter sites a wait of up to 20 seconds is often recommended. --wait does exactly that.
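
On top of a fixed wait, wget's --random-wait option multiplies each pause by a random factor, which makes the requests look even less mechanical. Using the same placeholder URL as above:

$ wget --wait=5 --random-wait -r -p -U Mozilla --no-parent http://the.site.url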

--limit-rate Limits the download speed to the given amount, expressed in bytes/sec or, with the K suffix, in kilobytes/sec (so 20K above means 20 KB/s).
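
For instance, these two commands set the same cap, once with the K suffix and once in raw bytes per second (wget treats K as 1024, so 20K is 20480 bytes/sec):

$ wget --limit-rate=20K -r http://the.site.url
$ wget --limit-rate=20480 -r http://the.site.url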

-r Turns on recursive retrieval, so wget follows links instead of fetching just one page.
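
By default, recursion goes five levels deep. If that pulls in too much, the -l option caps the depth; here, two levels from the same placeholder URL:

$ wget -r -l 2 -p --no-parent http://the.site.url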

-p Ensures that wget downloads all the files required to display an HTML page correctly, such as inline images, stylesheets, and scripts.
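
-p is also handy on its own for saving a single page. Combined with -k (--convert-links), which rewrites links so the local copy works offline, something like this does the trick (page.html is just an illustrative path):

$ wget -p -k http://the.site.url/page.html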

-U Some websites refuse to serve pages to anything that does not identify itself as a browser. The -U option lets wget masquerade as one by sending a browser-like User-Agent string (Mozilla in the example above).
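
Sending just Mozilla works for many sites, but you can supply a complete User-Agent string as well; the one below is only an example:

$ wget -r -p -U "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/115.0" --no-parent http://the.site.url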

And most important:
--no-parent Ensures that wget never ascends to the parent directory, so you never download anything above the folder you actually want.
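
Put together, a polite mirror of just one subtree might look like this (the /docs/ path is hypothetical):

$ wget --wait=5 --limit-rate=20K -r -p -U Mozilla --no-parent http://the.site.url/docs/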