NoTube uses a MediaWiki installation, kindly provided by sti2.org, for its internal documentation. As the project draws to a close (we finish officially on January 31st 2012, with our final review in late March), we wanted to make sure we had a copy of everything we had done over the last few years. Much of this is, and will remain, private to the partners, but there are some interesting ideas and use cases we wrote down early on that we don't want to lose track of.

I hadn't realised that MediaWiki has an API by default, but once I did, it was pretty simple to download all the pages. I've put the Ruby script on GitHub in case it's useful to anyone else. The only fiddly bit is the cookies: you do, of course, need a username and password for the wiki you want to download, but once you're logged in there's an API call you can make repeatedly to get a list of all the pages, and then download them individually.
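The approach can be sketched roughly as follows. This is not the actual script from GitHub, just a minimal illustration of the same idea; the endpoint URL is hypothetical, and the login flow shown (posting `action=login` twice, the second time with the token MediaWiki returns) matches the older pre-1.27 API that a 2012-era wiki would expose. Cookie handling is simplified to a single `Set-Cookie` header.

```ruby
require 'net/http'
require 'uri'
require 'json'

WIKI_API = URI('https://wiki.example.org/api.php') # hypothetical endpoint

# Log in and capture the session cookie (the fiddly bit). The older
# MediaWiki login API first answers "NeedToken", so the request is
# sent twice: once to get a token, once with the token attached.
def login(user, pass)
  res = Net::HTTP.post_form(WIKI_API, 'action' => 'login', 'format' => 'json',
                            'lgname' => user, 'lgpassword' => pass)
  cookie = res['Set-Cookie']
  token  = JSON.parse(res.body).dig('login', 'token')
  req = Net::HTTP::Post.new(WIKI_API)
  req['Cookie'] = cookie
  req.set_form_data('action' => 'login', 'format' => 'json',
                    'lgname' => user, 'lgpassword' => pass, 'lgtoken' => token)
  res2 = Net::HTTP.start(WIKI_API.host, WIKI_API.port, use_ssl: true) { |h| h.request(req) }
  res2['Set-Cookie'] || cookie
end

# Pure helper: pull the page titles and the continuation token out of
# one list=allpages response, so the pagination logic is testable offline.
def parse_allpages(json_text)
  data = JSON.parse(json_text)
  titles = (data.dig('query', 'allpages') || []).map { |p| p['title'] }
  [titles, data.dig('continue', 'apcontinue')]
end

# Walk the full page list, re-issuing the allpages query with the
# "apcontinue" token until the wiki stops returning one.
def all_titles(cookie)
  titles, cont = [], nil
  loop do
    params = { 'action' => 'query', 'list' => 'allpages',
               'aplimit' => '500', 'format' => 'json' }
    params['apcontinue'] = cont if cont
    uri = WIKI_API.dup
    uri.query = URI.encode_www_form(params)
    req = Net::HTTP::Get.new(uri)
    req['Cookie'] = cookie
    res = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |h| h.request(req) }
    batch, cont = parse_allpages(res.body)
    titles.concat(batch)
    break unless cont
  end
  titles
end
```

Each title can then be fetched individually, e.g. with `action=query&prop=revisions&rvprop=content`, and written to disk.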
One thought on “Archiving a Mediawiki Installation”
Why not just grab the XML dump? For the API there are some good Perl tools.