Downloading The Contents Of A MediaWiki Install

Here’s a useful wget command: it downloads a static copy of a MediaWiki installation while skipping unimportant pages, such as Special pages and Talk pages. Downloading a large site can take a long time, so the command is wrapped in nohup and backgrounded with a trailing &, which lets it keep running even after the user exits the shell.

nohup wget --recursive --page-requisites --html-extension \
    --convert-links --no-parent -R "*Special*" -R "*action=*" \
    -R "*printable=*" -R "*title=Talk:*" \
    "http://example.com/example/wiki/path/" &

You can use this command to archive old wiki installations, or to keep a local copy for quick offline reference.
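If the goal is long-term archiving rather than browsing, one approach is to compress the mirrored tree once the download finishes. A minimal sketch, again assuming the default host-named output directory:

    # Bundle the mirror into a dated, compressed archive
    tar -czf wiki-archive-$(date +%Y-%m-%d).tar.gz example.com/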