Note: Most of the actions on this page require administrative privileges. They won't work unless you have set an admin username and password in the PhpWiki config file.
A Wiki SandBox is very easy to clean. Here you can restore it to pristine condition by loading the default from pgsrc:
These links lead to ZIP files, generated on the fly, which contain all the pages in your Wiki. The ZIP file will be downloaded to your local computer.
This ZIP Snapshot contains only the latest version of each page, while this ZIP Dump contains all archived versions.
If the PhpWiki is configured to allow it, anyone can download a ZIP file.
If your PHP has zlib support, the files in the archive will be compressed; otherwise they will just be stored.
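A quick way to check whether your PHP build can compress the archive entries is to test for the zlib extension. The sketch below uses only standard PHP functions and is not part of PhpWiki itself:

    <?php
    // Report whether the zlib extension is loaded; with it, entries in the
    // generated ZIP archive can be compressed, otherwise they are only stored.
    if (extension_loaded('zlib')) {
        echo "zlib is available: archive entries can be compressed.\n";
    } else {
        echo "zlib is missing: archive entries will only be stored.\n";
    }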
Here you can dump pages of your Wiki into a directory of your choice.
The most recent version of each page will be written out to the directory, one page per file. Your server must have write permission for that directory!
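To check ahead of time that the web server can write into the chosen directory, a small standalone script such as the one below can help (run it as the same user the web server runs as, for example via a web request). The path is only an example, and the script is not part of PhpWiki:

    <?php
    // Hypothetical target directory for the dump; adjust to your setup.
    $dumpDir = '/tmp/wikidump';

    // Create the directory if it does not exist yet, then confirm that
    // the current user is allowed to write into it.
    if (!is_dir($dumpDir) && !mkdir($dumpDir, 0775, true)) {
        die("Could not create $dumpDir\n");
    }
    echo is_writable($dumpDir)
        ? "$dumpDir is writable; the dump should succeed.\n"
        : "$dumpDir is not writable; fix ownership or permissions first.\n";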
If you have dumped a set of pages from PhpWiki, you can reload them here. Note that pages in your database will be overwritten: if you dumped your HomePage, for example, loading it from this form will overwrite the HomePage currently in your database. If you want to be selective, just delete the pages you don't want to load from the directory (or ZIP file).
Here you can upload ZIP archives or individual files from your (client) machine.
Here you can load ZIP archives, individual files, or entire directories. The file or directory must be local to the HTTP server. You can also use this form to load from an http: or ftp: URL.
Currently the pages are stored, one per file, as MIME (RFC:2045) e-mail (RFC:822) messages. The content-type application/x-phpwiki is used, and page meta-data is encoded in the content-type parameters. (If the file contains several versions of a page, it will have type multipart/mixed, and contain several sub-parts, each with type application/x-phpwiki.) The message body contains the page text.
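As a rough illustration of that layout, the sketch below reads a single-version dump file and splits the RFC 822-style header block (which carries the content-type and its meta-data parameters) from the body holding the page text. It uses only standard PHP functions, the file name and parameter names are merely examples, and it is not the loader PhpWiki itself uses:

    <?php
    // Hypothetical dump file holding one page as a single-part MIME message.
    $file = 'HomePage';

    // An RFC 822 message is a header block, a blank line, then the body.
    $raw = str_replace("\r\n", "\n", file_get_contents($file));
    list($head, $body) = explode("\n\n", $raw, 2);

    // Unfold continuation lines (a line starting with whitespace continues
    // the previous header), then collect the headers into an array.
    $head = preg_replace("/\n[ \t]+/", ' ', $head);
    $headers = array();
    foreach (explode("\n", $head) as $line) {
        if (strpos($line, ':') === false) continue;
        list($name, $value) = explode(':', $line, 2);
        $headers[strtolower($name)] = trim($value);
    }

    // The content-type carries the page meta-data as its parameters, e.g.
    // "application/x-phpwiki; pagename=HomePage" (parameter names illustrative).
    echo "Content-Type: ", $headers['content-type'], "\n";
    echo "Page text: ", strlen($body), " bytes\n";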
Serialized Files
The dump-to-directory command used to dump the pages as strings produced by PHP's serialize() function. This made the files very hard for humans to read and nearly impossible to edit.
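For comparison, the old behaviour amounted to something like the sketch below, where a page record is flattened into a single serialize() string; the array keys are purely illustrative, not necessarily the fields PhpWiki used:

    <?php
    // Illustrative page record; the real field names PhpWiki used may differ.
    $page = array(
        'pagename'     => 'HomePage',
        'version'      => 3,
        'author'       => 'TheAdmin',
        'lastmodified' => 1012116100,
        'content'      => "Welcome to the wiki!\n",
    );

    // serialize() packs everything onto one dense, machine-oriented line ...
    $blob = serialize($page);
    echo $blob, "\n";

    // ... which unserialize() restores faithfully, but which a human can
    // barely read, let alone edit by hand.
    $restored = unserialize($blob);
    echo $restored['pagename'], "\n";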
Plain Files
Before that, the page text was simply dumped to a file, so all page meta-data was lost. Note that when loading plain files, the page name is deduced from the file name.
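The page-name deduction for plain files can be pictured roughly like this; the directory path is an example, and PhpWiki's own logic may differ (for instance in how it decodes special characters in file names):

    <?php
    // Hypothetical dump directory containing one plain-text file per page.
    $dumpDir = '/tmp/wikidump';

    foreach (glob($dumpDir . '/*') as $path) {
        // With plain files the file name is the only meta-data left, so the
        // page name is taken from it (decoding any URL-style escapes).
        $pagename = rawurldecode(basename($path));
        $text     = file_get_contents($path);
        echo "$pagename: ", strlen($text), " bytes of page text\n";
    }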
The upload and load functions will automatically recognize each of these three types of files, and handle them accordingly.
This will generate a directory of static pages, suitable for distribution on disk for use where no web server is available.