Backing up the wiki

=== WikiTeam3 ===
You can easily generate a database dump and file dump using Save the Web Project's [https://github.com/saveweb/wikiteam3/ WikiTeam3] Python script (full instructions are at that link).


{{note|Windows is unsupported. Use [https://learn.microsoft.com/en-us/windows/wsl/install WSL] to run WikiTeam3 on Windows.}}
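
If the script is not installed yet, it can typically be installed from PyPI with pip (the package name <code>wikiteam3</code> here is an assumption based on the project's README; see that README for the authoritative installation steps):

e.g. <code><nowiki>pip install wikiteam3 --upgrade</nowiki></code>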


e.g. <code><nowiki>wikiteam3dumpgenerator https://WIKINAME.miraheze.org --xml --xmlrevisions --images --bypass-cdn-image-compression --force</nowiki></code>


To dump a private wiki, you will have to use a login that has at least read permission on the wiki.


e.g. <code><nowiki>wikiteam3dumpgenerator https://WIKINAME.miraheze.org --xml --xmlrevisions --images --bypass-cdn-image-compression --force --user USER --pass PASSWORD</nowiki></code>


* --xml exports an XML dump; it uses Special:Export by default when no other XML dump method is specified.
* --xmlrevisions uses the API:Allrevisions XML dump method instead. Recommended, as it is quicker and puts almost no pressure on the MediaWiki backend compared to Special:Export.
* --images generates an image dump.
* --bypass-cdn-image-compression appends random parameters to image URLs when downloading, so the content delivery network does not serve compressed copies.
* --force generates a dump even if there is already one at the Internet Archive.


If you encounter any problems running the script, please [https://github.com/saveweb/wikiteam3/issues raise a new issue] at the Save the Web Project's [https://github.com/saveweb/wikiteam3/ saveweb/WikiTeam3] GitHub repository.


== Restoring from backup ==

See MediaWiki.org, specifically Manual:Importing XML dumps and Manual:importImages.php.
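
On a self-hosted MediaWiki installation, a WikiTeam3 dump can usually be restored with the standard maintenance scripts; a minimal sketch, assuming an XML dump file and an images directory produced by the commands above (adjust the paths to your own wiki):

e.g. <code><nowiki>php maintenance/importDump.php /path/to/dump.xml</nowiki></code>

e.g. <code><nowiki>php maintenance/importImages.php /path/to/images</nowiki></code>

After a bulk import, running <code><nowiki>php maintenance/rebuildrecentchanges.php</nowiki></code> refreshes the recent changes list, as recommended in Manual:Importing XML dumps.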

== External links ==