Linux, Tools

Grab a static website with the wget Linux command

You can download any static website to your local system with the wget Linux command. Say there is a simple static web application that you want to run locally. One option is to save each page by hand, along with every script and CSS file embedded in it, and then correct the paths of those scripts and stylesheets if they are not relative. Wget can do all of this for you. For example, imagine a very good static web application that creates awesome CSS shapes using simple drag and drop, and you want to run it on your local system because you don't have internet access. Using the following wget command, you can get this application running locally.

wget --recursive --no-clobber --page-requisites --convert-links --no-parent

You can pass several options to the wget command. The options used above, plus a few useful extras, are explained below; a complete example combining them follows the list.

--recursive: download the entire Web site.

--domains: don't follow links outside the given domains (takes a comma-separated list).

--no-parent: don't ascend to directories above the one you started the download from.

--page-requisites: get all the elements that compose the page (images, CSS and so on).

--html-extension: save files with the .html extension (in newer wget versions this is a deprecated alias of --adjust-extension).

--convert-links: convert links so that they work locally, off-line.

--restrict-file-names=windows: modify filenames so that they will work in Windows as well.

--no-clobber: don't overwrite any existing files (useful when an interrupted download is resumed).
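Putting it all together, the full command looks like the sketch below. The URL and domain are just placeholders (example.com); substitute the address of the site you actually want to grab.

wget --recursive \
     --no-clobber \
     --page-requisites \
     --html-extension \
     --convert-links \
     --restrict-file-names=windows \
     --domains example.com \
     --no-parent \
     https://example.com/

Wget mirrors the site into a directory named after the domain (example.com/ here), rewriting the links inside the downloaded pages so they point at your local copies.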


That's all. Now you can open the index.html file in a browser and everything should work. If it doesn't work as expected, put the grabbed website directory in your localhost document root and serve it from there.
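Alternatively, if you don't want to touch your web server's document root and have Python 3 installed, you can serve the grabbed directory with Python's built-in web server (the directory name below is the placeholder from the earlier example):

cd example.com
python3 -m http.server 8000

Then open http://localhost:8000/ in your browser; serving over HTTP avoids the file:// path quirks some browsers have.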