== Backing up the wiki ==
=== Mediawiki Dump Generator ===
You can generate a database dump and a file dump using MediaWiki Client Tools' [[github:mediawiki-client-tools/mediawiki-dump-generator|MediaWiki Dump Generator]] <code>dumpgenerator</code> script (Python 3; full instructions are at that link).
Any confirmed user on this wiki can create an XML, JSON or image backup with [[Special:DataDump]].<!-- Depending on settings in Special:ManageWiki. -->


The result includes an XML dump with full page history, a dump of all images and files along with their associated descriptions, and a siteinfo.json file containing basic site information, such as the installed extensions and skins. The XML dump does not contain user accounts.
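As a sketch of what can be read back out of that file, the snippet below lists the installed extensions and skins. It assumes siteinfo.json stores a standard MediaWiki API siteinfo response (a <code>query</code> object with <code>extensions</code> and <code>skins</code> lists); the exact layout may vary between dump tool versions.

```python
import json

# Tiny synthetic siteinfo.json (real dumps are much larger); the layout
# assumed here mirrors a MediaWiki API siteinfo response.
sample = {"query": {"extensions": [{"name": "ParserFunctions"}],
                    "skins": [{"*": "Vector"}]}}
with open("siteinfo.json", "w", encoding="utf-8") as f:
    json.dump(sample, f)

def list_features(path):
    """Return (extension names, skin names) from a siteinfo.json file."""
    with open(path, encoding="utf-8") as f:
        query = json.load(f).get("query", {})
    extensions = [e.get("name", "?") for e in query.get("extensions", [])]
    skins = [s.get("*", s.get("name", "?")) for s in query.get("skins", [])]
    return extensions, skins

print(list_features("siteinfo.json"))  # (['ParserFunctions'], ['Vector'])
```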


To dump a private wiki you will have to use a login that has at least read permission on the wiki.
=== WikiTeam's dumpgenerator.py ===
Another method is to use the [https://github.com/WikiTeam/wikiteam WikiTeam] [https://raw.githubusercontent.com/WikiTeam/wikiteam/master/dumpgenerator.py dumpgenerator.py] script (Python 2 only) from the command line.

Example usage, which will produce a JSON file, an XML dump with page histories and a folder of files:<br /><code><nowiki>python2 dumpgenerator.py --xmlrevisions --xml --images --api=https://sdiy.info/w/api.php</nowiki></code>

However, large wikis may fail to export, leaving an incomplete XML dump. The presence of a siteinfo.json file probably indicates a successful XML dump.

Full instructions are at the WikiTeam [https://github.com/WikiTeam/wikiteam/wiki/Tutorial tutorial].
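A quick way to sanity-check a finished dump can be sketched as below. The heuristics are assumptions, not guarantees: siteinfo.json should be present, and a complete XML export should end with the closing <code></mediawiki></code> root tag, whereas a truncated one usually stops mid-page.

```python
import os

def dump_looks_complete(dump_dir, xml_name):
    """Heuristic completeness check of a dump directory."""
    xml_path = os.path.join(dump_dir, xml_name)
    if not os.path.exists(os.path.join(dump_dir, "siteinfo.json")):
        return False
    # Read only the tail of the (possibly huge) XML file.
    with open(xml_path, "rb") as f:
        f.seek(max(0, os.path.getsize(xml_path) - 64))
        tail = f.read().decode("utf-8", errors="replace")
    return "</mediawiki>" in tail

# Demo on a synthetic dump directory:
os.makedirs("demo_dump", exist_ok=True)
with open("demo_dump/siteinfo.json", "w") as f:
    f.write("{}")
with open("demo_dump/history.xml", "w") as f:
    f.write("<mediawiki>\n  <page>...</page>\n</mediawiki>\n")
print(dump_looks_complete("demo_dump", "history.xml"))  # True
```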


== Restoring from backup ==<!-- untested -->
See MediaWiki.org for more detailed instructions (specifically [https://www.mediawiki.org/wiki/Manual:Importing_XML_dumps Manual:Importing XML dumps] and [https://www.mediawiki.org/wiki/Manual:ImportImages.php Manual:importImages.php]).

After installing MediaWiki and extensions, use <code>importDump.php</code> in the shell to import the XML; this can take a long time. For example, from the MediaWiki folder:

<code><nowiki>php maintenance/importDump.php --conf LocalSettings.php --dry-run < your_dumpfile.xml</nowiki></code>

If that works, repeat without <code>--dry-run</code>. It does not matter whether the XML dump file is compressed (.gz or .bz2).

Due to bug [https://phabricator.wikimedia.org/T206683 T206683] it may be necessary to also include <code>--user-prefix=""</code> in the command.

Afterwards use <code>importImages.php</code> to import the images:

<code><nowiki>php maintenance/importImages.php your_files/</nowiki></code>


Afterwards run <code>php maintenance/rebuildrecentchanges.php</code> to update the content of [[Special:RecentChanges]].
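<code>importDump.php</code> accepts .gz and .bz2 compressed dumps directly, so no decompression step is needed before restoring. If you want to inspect a dump with your own tooling first, opening it transparently regardless of extension can be sketched like this (a hypothetical convenience helper, not part of MediaWiki):

```python
import bz2
import gzip

def open_dump(path):
    """Open an XML dump for text reading, whether plain, .gz or .bz2."""
    if path.endswith(".gz"):
        return gzip.open(path, "rt", encoding="utf-8")
    if path.endswith(".bz2"):
        return bz2.open(path, "rt", encoding="utf-8")
    return open(path, encoding="utf-8")

# Demo: the same content round-trips through all three forms.
text = "<mediawiki>...</mediawiki>\n"
with open("dump.xml", "w", encoding="utf-8") as f:
    f.write(text)
with gzip.open("dump.xml.gz", "wt", encoding="utf-8") as f:
    f.write(text)
with bz2.open("dump.xml.bz2", "wt", encoding="utf-8") as f:
    f.write(text)
for name in ("dump.xml", "dump.xml.gz", "dump.xml.bz2"):
    with open_dump(name) as f:
        assert f.read() == text
print("all readable")  # all readable
```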

== External links ==
* [https://meta.miraheze.org/wiki/Backups Backups], Miraheze Meta
* [https://www.mediawiki.org/wiki/Manual:Backing_up_a_wiki Manual:Backing up a wiki], MediaWiki


[[Category:Meta]]

Revision as of 15:31, 8 September 2023
