Guides & Facts

This page is a bit scattered as it's sort of a catch-all for things that don't have a place on my other pages.


Saving websites locally - web preservation with wget

The personal web, though beautiful, is also quite unstable. By this I mean sites can disappear at any time, seemingly at random. This kinda sucks. You may wish you could save certain sites locally, so their disappearance will never be an issue. Well, lucky for you, imaginary person I have constructed for the purpose of this scenario - this is a guide on how to do exactly that!

The key to it all - wget

Wget download
- This lovely little thing is the key to our whole operation here. Wget is "a free software package for retrieving files using HTTP, HTTPS, FTP and FTPS, the most widely used Internet protocols" (Wget website).

Downloading Wget: if you're running Linux, awesome - you'll likely have it pre-installed. If you're running Windows, like me, then it can be kinda confusing to get it running. You'll want This link for the wget for Windows installer - download the top link with the description "Complete package, except sources". Most wget download links point to the wget source code, which you would need to compile before you could run it (a hassle). This one gives you a zip file installer instead, which is what you want. That's the tricky bit; from here, look up a "how to run wget on Windows" tutorial and it'll take you the rest of the way.
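
Once it's installed, a quick sanity check from the command line should confirm everything is working - running wget with its version flag will print version info if it's set up correctly:

wget --version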


Downloading sites
- Now we have Wget up and running, head over to the command line and navigate to the folder you want to save into using cd. To download a complete local version of a site, you'll want to use the command "wget -r -k -p [site url]".
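
As a worked example, the whole thing looks something like this (the folder name and URL here are just placeholders - swap in the site you actually want to save):

cd saved-sites
wget -r -k -p https://example.com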

"-r" Means recursive download - this will ensure the download keeps going for all pages on the site rather than just the first page. By default, wget will go 5 levels deep. This should be enough for most sites, but may require fine tuning. If you need to adjust the download depth level, use the command "-l" followed by the level you want to download to ("-l inf" is available for infinite level downloads, but tread very carefully with this. If it gets out of control, Ctrl + c should kill the download (depends on which shell you're using)).

"-k" converts the links on a page to local file urls, so links on the website will work as expected in a local copy.

"-p" downloads all page requisites, so images and css ect is all available offline, creating a full local copy of the site.

And you're done! Enjoy your shiny new preserved site.