
Recommended Posts

Posted

Perhaps I am overlooking it, but is there not a way to download the MaNGOS wiki as a single file for import/parsing as a backup for personal use?

Most wikis I've seen have this feature. I think the wiki dump is saved as an XML or CSV file.

Posted

I was hoping there would be a more general way of obtaining the articles. Is there a command or wildcard to be used in the Export Page feature that would allow downloading all the articles? MediaWiki's documentation does not say whether that is supported.

Posted

No, there is no command to do this.

The only useful advice I found is that you can simply copy the list off Special:AllPages and then paste it into an advanced text editor with an extended search system, so you can use \n (newline) and \t (tab).

I could only test this solution on Windows; I used Notepad++ for it.

Just replace all \t with \n and you have a list with some gaps in between, but all articles are on separate lines, ready to export.
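
If anyone prefers to script that cleanup, here is a minimal Python sketch of the same tab-to-newline replacement. The file names allpages.txt and titles.txt are just placeholders for wherever you pasted the copied list:

```python
# Minimal sketch of the Notepad++ replace: turn the tab-separated
# Special:AllPages listing into one article title per line.
# "allpages.txt" and "titles.txt" are hypothetical file names.
with open("allpages.txt", encoding="utf-8") as f:
    text = f.read()

# Replace every tab with a newline, then drop the empty "gap" lines
# left over from the table layout.
titles = [t.strip() for t in text.replace("\t", "\n").splitlines() if t.strip()]

with open("titles.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(titles))
```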

Posted

Thank you most kindly for your help, DK. :)

I'll give your suggestion a try. Notepad++ is also my favorite text editor for Windows. It works great with all the C++, diff, and SQL files we use around here.

I hope a downloadable wiki dump will be a feature made available in the near future. Most wikis link to their dumps on the Special:Statistics or Special:Files pages. I believe it's the dumpwiki command that is used to generate the XML file. You can then import it into another wiki or use certain utilities to parse it into other readable forms.

Posted
I hope the downloadable wiki dump will be a feature made available in the near future.

But is this export page not enough?

A little work with the AllPages list and you have a complete dump as of the moment you create it. I don't think they will implement such a feature; maybe in another form, but as a wiki dump, this export page already is that feature.

But I must admit, a better (and maybe shorter) way to create a complete dump would also be nice.
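
For what it's worth, a shorter way than hand-copying might be to ask the MediaWiki API for the page list directly. This is only a sketch, assuming the wiki exposes a standard api.php endpoint; the URL below is a placeholder:

```python
# Hedged sketch: fetch every article title via the standard MediaWiki
# API (action=query&list=allpages) instead of copying Special:AllPages.
# The API URL is an assumption; point it at the wiki's real api.php.
import requests

API = "https://example.org/w/api.php"  # placeholder endpoint

def all_titles():
    params = {"action": "query", "list": "allpages",
              "aplimit": "500", "format": "json"}
    while True:
        data = requests.get(API, params=params).json()
        for page in data["query"]["allpages"]:
            yield page["title"]
        if "continue" not in data:
            break
        params.update(data["continue"])  # standard API continuation

if __name__ == "__main__":
    for title in all_titles():
        print(title)
```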

Posted

Just for personal use, MCP. It's easier for me to have the documentation on my e-reader or old laptop for offline study. I use tools like BZReader or WikiParser to convert the XML dump into readable form.

If there is some security concern over making a dump publicly available, I would not wish to compromise that.

@DarkKnight

The Export Page feature will get the job done. I'm not sure whether links and page ordering are preserved, but it's better than copying by hand.
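
For anyone else reading along, Special:Export can also be driven without the browser: it accepts a newline-separated list of titles via POST, though some wikis disable this. A rough sketch, assuming the wiki URL and the titles.txt list from earlier (both placeholders):

```python
# Rough sketch: POST the title list to Special:Export and save the
# resulting XML dump. The wiki URL is a placeholder, and "titles.txt"
# is the one-title-per-line list produced earlier.
import requests

EXPORT = "https://example.org/wiki/Special:Export"  # placeholder URL

with open("titles.txt", encoding="utf-8") as f:
    pages = f.read()

# curonly=1 asks for only the latest revision of each page.
resp = requests.post(EXPORT, data={"pages": pages, "curonly": "1"})
resp.raise_for_status()

with open("dump.xml", "wb") as f:
    f.write(resp.content)
```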

@Sleighyah

Thanks! :)

Posted

The export feature preserves everything except binary files. Those, sadly, always have to be copied manually. I am using export/import myself to move wiki data around. No issues yet.
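
On the import side, a standard MediaWiki install ships an importDump.php maintenance script that loads such an XML file into another wiki. A sketch of invoking it from Python; the paths are placeholders:

```python
# Sketch: run MediaWiki's importDump.php maintenance script to load the
# XML dump into another wiki. Both paths below are placeholders.
import subprocess

subprocess.run(
    ["php", "maintenance/importDump.php", "dump.xml"],
    cwd="/var/www/wiki",  # placeholder MediaWiki install directory
    check=True,
)
```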

Posted

Thanks for that info, TheLuda. I don't really need to preserve binaries so much as the actual text and the links that tie it all together for navigation.

I appreciate all the advice and help you've all offered. :)

Looks like I'll be spending some time parsing pages for export.
