I'm searching for a way to dump the complete wiki (all wiki pages).
I've made a page that links to all pages and tried to download it recursively (to depth 1) with wget, supplying the user name and password on the command line. This is what I get:
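Roughly, the invocation looked like this (the URL, user name, and index page name are placeholders, not the real values):

```shell
# Recursive fetch of the "links to all pages" page, one level deep,
# passing credentials on the command line as HTTP auth.
wget --recursive --level=1 \
     --http-user=MyUser --http-password=MyPassword \
     'http://wiki.example.org/AllPagesIndex'
```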
HTTP request sent, awaiting response... 401 Access denied. You cannot view this page. [translated from German] Unknown authentication scheme. Username/Password Authentication Failed.
So, any ideas?