Backing up the wiki

== WikiTeam3 ==
You can easily generate a database dump and file dump using Save the Web Project's [https://github.com/saveweb/wikiteam3/blob/v4-main/README.md WikiTeam3] Python script (full instructions are at that link).


{{note|Windows is unsupported. Use [https://learn.microsoft.com/en-us/windows/wsl/install WSL] to run WikiTeam3 on Windows.}}

e.g. <code>wikiteam3dumpgenerator <nowiki>https://WIKINAME.miraheze.org</nowiki> --xml --images --bypass-cdn-image-compression --force</code>


To dump a private wiki, you will have to use a login that has at least read permission on the wiki.


e.g. <code>wikiteam3dumpgenerator <nowiki>https://WIKINAME.miraheze.org</nowiki> --xml --images --bypass-cdn-image-compression --force --user USER --pass PASSWORD</code>

* --xml exports an XML dump via Special:Export
* --images generates an image dump
* --bypass-cdn-image-compression requests the original files, bypassing image compression applied by a content delivery network
* --force generates a dump even if one already exists on the Internet Archive
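
Once a dump finishes, it is worth packaging and checksumming it before uploading it anywhere. A minimal sketch, assuming a completed dump sits in a hypothetical <code>wikidump</code> directory (the real directory name is derived from the wiki's domain and the dump date):

```shell
# "wikidump" is a hypothetical stand-in for the directory that
# wikiteam3dumpgenerator leaves behind after a successful run.
DUMPDIR="wikidump"
mkdir -p "$DUMPDIR"                                     # stand-in for a completed dump

# Package the whole dump into one archive for safekeeping.
tar -czf "$DUMPDIR.tar.gz" "$DUMPDIR"

# Record a checksum so the archive can be verified later.
sha256sum "$DUMPDIR.tar.gz" > "$DUMPDIR.tar.gz.sha256"
sha256sum -c "$DUMPDIR.tar.gz.sha256"
```

Verifying the checksum before and after any transfer catches truncated or corrupted uploads early, which matters for archives that may not be re-dumpable later.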

If you encounter any problems running the script, please [https://github.com/saveweb/wikiteam3/issues raise a new issue] at the Save the Web Project's [https://github.com/saveweb/wikiteam3/blob/v4-main/README.md WikiTeam3] GitHub repository.


== Restoring from backup ==
See [https://www.mediawiki.org/wiki/MediaWiki MediaWiki.org], specifically [https://www.mediawiki.org/wiki/Manual:Importing_XML_dumps Manual:Importing XML dumps] and [https://www.mediawiki.org/wiki/Manual:ImportImages.php Manual:importImages.php].
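
The import itself is done with MediaWiki's maintenance scripts on the destination server. A sketch, assuming a standard MediaWiki installation at the hypothetical path <code>/path/to/mediawiki</code> and dump files produced as above (adjust all paths to your setup):

```shell
cd /path/to/mediawiki

# Import the XML dump (page text with full history).
php maintenance/importDump.php /path/to/dump.xml

# Import the downloaded files; their descriptions are in the XML dump.
php maintenance/importImages.php /path/to/dump/images

# Recommended after a large import, per the MediaWiki manual:
# refresh recent changes and site statistics.
php maintenance/rebuildrecentchanges.php
php maintenance/initSiteStats.php
```

importDump.php can be slow on wikis with long histories; the manual pages linked above document options and faster alternatives for very large dumps.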


== External links ==

Revision as of 15:04, 23 June 2024