contributed code changes including a new
+ database library file 'dbalib.php' that uses the new interface
+ library for DBM files. The changes in config.php still default to
+ the dbmlib.php library for now, and the user must set 'dba' in
+ config.php if they are using PHP 4.0.4 or later. Tested this (just
+ barely) on a newly built PHP 4.0.4p1 on my RH6.2 box.
+
+2001-01-15 07:32 ahollosi
+
+ * lib/stdlib.php: ExtractWikiPageLinks now recognizes references of
+ the form [\d+]
+
+2001-01-11 04:54 ahollosi
+
+ * locale/README: initial commit - text by Jan Nieuwenhuizen and me
+
+2001-01-09 14:02 wainstead
+
+ * lib/dbmlib.php: Reverted to version 1.4; there is a bug in Jan
+ Hidders' patch for dbmlib. It causes a stack overflow in
+ stdlib.php line 318.
+
+2001-01-09 13:22 ahollosi
+
+ * lib/stdlib.php: added description to GeneratePage()
+
+2001-01-06 14:30 wainstead
+
+ * INSTALL.flatfile: Installation directions for flat file Wikis.
+
+2001-01-04 13:37 ahollosi
+
+ * lib/mysql.php: yet another E_NOTICE fixed and some comments
+ added.
+
+2001-01-04 13:34 ahollosi
+
+ * lib/transform.php: ZERO/SINGLE_DEPTH renamed into
+ ZERO/NESTED_LEVEL; empty lines are now treated as tag '' (i.e. no
+ tag) instead of '<p>'; normal text is now treated as '<p>' instead
+ of tag ''; added and corrected some comments, some code cleanup
+
+2001-01-04 13:32 ahollosi
+
+ * lib/config.php: ZERO/SINGLE_DEPTH renamed into ZERO/NESTED_LEVEL
+
+2001-01-04 13:32 ahollosi
+
+ * lib/stdlib.php: moved UpdateRecentChanges() to savepage.php
+ ZERO/SINGLE_DEPTH renamed into ZERO/NESTED_LEVEL added and
+ corrected some comments, some code cleanup
+
+2001-01-04 13:30 ahollosi
+
+ * lib/savepage.php: moved UpdateRecentChanges() to savepage.php
+
+2001-01-01 19:10 wainstead
+
+ * lib/search.php: There were two concatenation operators in a row,
+ on lines 8 and 9. This caused a syntax error.
+
+2001-01-01 18:34 ahollosi
+
+ * lib/: mysql.php, stdlib.php: squashed some E_NOTICE messages
+ about unset variables
+
+2001-01-01 18:18 ahollosi
+
+ * lib/db_filesystem.php: changed two more calls to is_dir() and
+ is_file() to file_exists in order to avoid warnings
+
+2001-01-01 18:13 ahollosi
+
+ * lib/: db_filesystem.php, editpage.php, ziplib.php: cleaned up
+ some warnings reported by David LeBlanc
+
+2000-12-30 16:48 ahollosi
+
+ * lib/stdlib.php: ParseAndLink() didn't handle unnamed phpwiki:
+ links correctly - fixed.
+
+2000-12-30 16:42 ahollosi
+
+ * lib/stdlib.php: cleaned up ParseAndLink() function, consequences:
+ - link['type']='unknown' no longer exists (never happened anyway)
+ - link['type']='wiki' is now link['type']='wiki-simple'
+ - "phpwiki:" may appear in unnamed links as well
+ - LinkURL() function now takes a second argument
+
+2000-12-30 16:09 ahollosi
+
+ * lib/: config.php, display.php, fullsearch.php, search.php: some
+ code cleanup -- mostly cosmetic changes
+
+2000-12-22 16:10 ahollosi
+
+ * CREDITS: added Antti Kaihola and Jan Hidders
+
+2000-12-22 15:18 ahollosi
+
+ * locale/: es/LC_MESSAGES/phpwiki.mo, nl/LC_MESSAGES/phpwiki.mo,
+ nl/LC_MESSAGES/phpwiki.php, po/de.po, po/es.po, po/nl.po,
+ po/phpwiki.pot: ran translate.sh to update .mo & .php files
+
+2000-12-14 22:11 wainstead
+
+ * lib/dbmlib.php: Initial commit of Jan Hidders' changes to
+ dbmlib.php. He added the code needed for the incoming/outgoing/most
+ popular nearby features. The only change I made was to revert the
+ signature for InsertPage, which he added a fourth parameter to. In
+ fact it should have a (shudder) global variable $WikiPageStore,
+ which makes it consistent with SaveCopyToArchive.
+
+2000-12-12 17:06 wainstead
+
+ * lib/dbmlib.php: Brought the function list in the lead comment box
+ up to date.
+
+2000-12-12 16:53 wainstead
+
+ * DBLIB.txt: Updated against the mysql.php file. This should be
+ accurate and finished for 1.2.
+
+2000-12-12 16:24 wainstead
+
+ * README: Added a paragraph about the feature differences between
+ the different database implementations.
+
+2000-12-12 16:09 wainstead
+
+ * images/wikibase.png: New 50x50 logo in reverse color (white on
+ dark gray).
+
+2000-12-12 12:11 wainstead
+
+ * README: Brought the manifest up to date. Some files were removed,
+ others added to the lists of files in the distribution.
+
+2000-12-11 09:00 ahollosi
+
+ * locale/po/nl.po: update to Dutch translation by Jan Nieuwenhuizen
+
+2000-12-06 18:12 ahollosi
+
+ * lib/stdlib.php: fixed bug in _iftoken() -
+ variable-in-string-brackets were used in a way that causes problems
+ for PHP3. Fix by Jan Hidders.
+
+2000-12-06 05:59 ahollosi
+
+ * lib/stdlib.php: fixed another bug in ExtractWikiPageLinks():
+ wiki_unknown-named was not recognized and named wiki links had the
+ wrong linktext inserted into the wikilinks table
+
+2000-11-24 17:07 wainstead
+
+ * lib/mysql.php:
+ Added missing semicolon on line 153; its absence caused the file to
+ not compile correctly.
+
+2000-11-22 17:17 ahollosi
+
+ * lib/stdlib.php: fix newline bug in UpdateRecentChanges / fix
+ for-loop boundary
+
+2000-11-18 08:50 ahollosi
+
+ * pgsrc/FindPage, lib/fullsearch.php, lib/mysql.php: more
+ sophisticated search: matches individual words; excluding words is
+ also possible
+
+2000-11-16 22:04 wainstead
+
+ * CREDITS: Added Scott to the credits for patches.
+
+2000-11-16 22:01 wainstead
+
+ * lib/config.php: Patch from "Scott R. Anderson"
+ which allows the use of $PHP_AUTH_USER if it's set; i.e. if a user
+ is logged in their name will appear in RecentChanges instead of
+ REMOTE_HOST or REMOTE_ADDR.
+
+2000-11-16 03:52 ahollosi
+
+ * lib/stdlib.php: fix for ExtractWikiPageLinks(): didn't care for
+ '[[' escapes and thus detected invalid links
+
+2000-11-13 16:52 ahollosi
+
+ * locale/de/templates/browse.html: corrected title link
+
+2000-11-13 09:54 ahollosi
+
+ * lib/: config.php, mysql.php: added config options for all mysql
+ table names
+
+2000-11-13 06:22 ahollosi
+
+ * locale/: translate.sh, de/LC_MESSAGES/phpwiki.mo,
+ de/LC_MESSAGES/phpwiki.php, de/templates/browse.html,
+ de/templates/editpage.html, es/LC_MESSAGES/phpwiki.mo,
+ es/LC_MESSAGES/phpwiki.php, es/templates/browse.html,
+ es/templates/editpage.html, nl/LC_MESSAGES/phpwiki.mo,
+ nl/LC_MESSAGES/phpwiki.php, nl/templates/browse.html,
+ nl/templates/editpage.html, po/de.po, po/es.po, po/nl.po,
+ po/phpwiki.pot: new run of translate.sh / updated templates
+
+2000-11-13 06:01 ahollosi
+
+ * locale/translate.sh: fixed bug which omitted last string in
+ */LC_MESSAGE/phpwiki.php
+
+2000-11-13 05:59 ahollosi
+
+ * admin.php: localized remove function
+
+2000-11-13 05:37 ahollosi
+
+ * locale/po/de.po: fixed errors pointed out by Markus Guske
+
+2000-11-11 07:15 ahollosi
+
+ * lib/stdlib.php: bugfix in ExtractWikiPageLinks for
+ [text|WikiPage]
+
+2000-11-09 11:29 ahollosi
+
+ * admin.php, templates/browse.html: Added safety step for 'remove
+ page' feature
+
+2000-11-08 21:57 wainstead
+
+ * lib/mysql.php: Updated the list of functions in the header
+ comment.
+
+2000-11-08 12:07 ahollosi
+
+ * pgsrc/: TestPage, TextFormattingRules: update for new syntax:
+ !http and [named internal link|WikiPage]
+
+2000-11-08 11:52 ahollosi
+
+ * lib/stdlib.php: added named internal links e.g. [wow|FrontPage]
+ -- patch idea from Antti Kaihola
+
+2000-11-08 11:48 ahollosi
+
+ * admin.php: fix for RemovePage when get_magic_quotes_gpc()==1
+
+2000-11-08 11:19 ahollosi
+
+ * templates/README: explained new ###IF### syntax
+
+2000-11-08 10:54 ahollosi
+
+ * pgsrc/PhpWikiAdministration: added admin page -- should be locked
+ for normal users
+
+2000-11-08 10:50 ahollosi
+
+ * templates/editpage.html: updated to new ###IF### syntax
+
+2000-11-08 10:49 ahollosi
+
+ * templates/browse.html: included admin section
+
+2000-11-08 10:43 ahollosi
+
+ * admin.php: removed default user/pwd
+
+2000-11-08 10:40 ahollosi
+
+ * lib/mysql.php: added function RemovePage()
+
+2000-11-08 10:40 ahollosi
+
+ * lib/: config.php, editpage.php, savepage.php, stdlib.php,
+ transform.php: updates due to new admin structure
+
+2000-11-08 10:34 ahollosi
+
+ * index.php: adapted, so that index.php can be included by
+ admin.php
+
+2000-11-08 10:32 ahollosi
+
+ * admin.php: initial commit of admin.php
+
+2000-11-08 10:30 ahollosi
+
+ * admin/: dumpserial.php, loadserial.php, lockpage.php, zip.php:
+ renaming files to scheme used in lib/: dropping wiki_ prefix
+
+2000-11-07 00:13 wainstead
+
+ * lib/msql.php: Added GetWikiPageLinks, SetWikiPageLinks,
+ eliminated error messages, though the functionality is still not
+ there.
+
+2000-11-06 12:31 ahollosi
+
+ * locale/de/pgsrc/: EditiereText, FrischeSeiten, GaesteBuch,
+ MeistBesucht, SandKiste, StartSeite, WieManWikiBenutzt: translated
+ into German
+
+2000-11-06 11:31 ahollosi
+
+ * CREDITS: wikified CREDITS
+
+2000-11-06 11:14 ahollosi
+
+ * pgsrc/: AddingPages, ConvertSpacesToTabs, FindPage, FrontPage,
+ HowToUseWiki, MoreAboutMechanics, RecentChanges,
+ TextFormattingRules, WabiSabi, WikiWikiWeb: removed tab syntax,
+ bold ''' -> bold __, plus other minor corrections
+
+2000-11-05 22:19 wainstead
+
+ * HISTORY: Updates for 1.1.9.
+
+2000-11-05 21:51 wainstead
+
+ * pgsrc/ReleaseNotes: Added a few words about 1.1.9.
+
+2000-11-05 21:46 wainstead
+
+ * pgsrc/AddingPages: Changed the example to point to the current
+ PhpWiki homepage.
+
+2000-11-05 21:46 wainstead
+
+ * pgsrc/TextFormattingRules: Removed last remaining references to
+ using tabs, and put in an example of the new term-definition
+ syntax.
+
+2000-11-03 00:50 wainstead
+
+ * schemas/schema.minisql: Added/tested wikiscore table.
+
+2000-11-03 00:27 wainstead
+
+ * lib/msql.php: Fixed all instances that caused E_NOTICE messages.
+
+2000-11-02 23:51 wainstead
+
+ * CREDITS: Credited Sandino for the Spanish language translations.
+
+2000-11-02 16:15 ahollosi
+
+ * locale/: translate.sh, de/LC_MESSAGES/phpwiki.mo,
+ de/LC_MESSAGES/phpwiki.php, de/pgsrc/EditiereText,
+ de/pgsrc/FrischeSeiten, de/pgsrc/GaesteBuch, de/pgsrc/GuterStil,
+ de/pgsrc/KonvertiereLeerzeichenZuTabs, de/pgsrc/MeistBesucht,
+ de/pgsrc/PhpWiki, de/pgsrc/SandKiste, de/pgsrc/SeiteFinden,
+ de/pgsrc/SeitenErzeugen, de/pgsrc/StartSeite,
+ de/pgsrc/TextFormatierungsRegeln, de/pgsrc/WabiSabi,
+ de/pgsrc/WieManWikiBenutzt, de/pgsrc/WikiTechnik,
+ de/pgsrc/WikiWikiWeb, de/templates/browse.html,
+ de/templates/editlinks.html, de/templates/editpage.html,
+ de/templates/message.html, es/LC_MESSAGES/phpwiki.mo,
+ es/LC_MESSAGES/phpwiki.php, es/pgsrc/AgregarPaginas,
+ es/pgsrc/BuenEstilo, es/pgsrc/BuscarPagina, es/pgsrc/CajaDeArena,
+ es/pgsrc/CambiosRecientes, es/pgsrc/ComoUsarWiki,
+ es/pgsrc/ConvierteEspaciosEnTabs, es/pgsrc/EditarElTexto,
+ es/pgsrc/KBrown, es/pgsrc/MasAcercadeLaMecanica,
+ es/pgsrc/MasPopulares, es/pgsrc/PhpWiki,
+ es/pgsrc/ReglasDeFormatoDeTexto, es/pgsrc/VisitantesRecientes,
+ es/pgsrc/WabiSabi, es/pgsrc/WikiWikiWeb, es/templates/browse.html,
+ es/templates/editlinks.html, es/templates/editpage.html,
+ es/templates/message.html, nl/LC_MESSAGES/phpwiki.mo,
+ nl/LC_MESSAGES/phpwiki.php, po/de.po, po/es.po, po/nl.po,
+ po/phpwiki.pot: Spanish pages by Sandino Araico Sánchez;
+ initial set of German pages by me (Arno)
+
+2000-11-01 23:34 wainstead
+
+ * schemas/schema.psql: Added "drop table wikiscore;"
+
+2000-11-01 23:23 wainstead
+
+ * lib/pgsql.php: Incoming, outgoing, and most popular top 5's
+ appear to work now.
+
+2000-11-01 23:06 wainstead
+
+ * schemas/schema.psql: Added the wikiscore table; stripped out the
+ grant statements that gave me access to all postgresql databases
+ worldwide ;-)
+
+2000-11-01 23:03 wainstead
+
+ * pgsrc/TextFormattingRules: Corrected the URL to the example
+ image.
+
+2000-11-01 22:05 wainstead
+
+ * lib/pgsql.php: Fixed typo: "port" was spelled "pport". Probably
+ from the last patch.
+
+2000-11-01 06:31 ahollosi
+
+ * index.php, lib/dbmlib.php, lib/diff.php, lib/display.php,
+ lib/editpage.php, lib/pageinfo.php, lib/savepage.php,
+ lib/stdlib.php: fixed E_NOTICE warnings
+
+2000-11-01 05:24 ahollosi
+
+ * lib/config.php: use $SCRIPT_NAME instead of $REQUEST_URI;
+ removed $ServerAddress - not used anywhere (simplifies
+ $ScriptUrl="" case); added ALT tag to logo image
+
+2000-10-31 15:24 ahollosi
+
+ * lib/config.php: restructured config.php completely -- hope this
+ is more readable / usable
+
+2000-10-31 14:24 ahollosi
+
+ * lib/transform.php: added "!" syntax for URLs as well, i.e.
+ !http://some.site/ does NOT create a hyperlink
+
+2000-10-31 14:23 ahollosi
+
+ * lib/stdlib.php: "phpwiki:" protocol patch -- omitted stdlib.php
+ in previous commit - *sigh*
+
+2000-10-31 12:07 ahollosi
+
+ * lib/config.php: added "phpwiki:" protocol within named links
+ [name|uri] UpdateRecentChanges() uses "phpwiki:" instead of raw url
+ now
+
+2000-10-31 11:18 ahollosi
+
+ * lib/config.php: Changed $WikiNameRegexp: doesn't use "\b"
+ (word-boundary) anymore. Necessary because '_' is interpreted as
+ word-character too and thus e.g. "__WikiName__" is not recognized
+ as link. Note that "previously_not_a_WikiName" now renders the
+ "WikiName" part as a link too.
+
+2000-10-30 02:41 ahollosi
+
+ * lib/config.php: fix for bug #117729 (fake author)
+
+2000-10-28 13:44 ahollosi
+
+ * lib/: config.php, pgsql.php: pgsql patch (version7) from
+ kbrown@sandino.net
+
+2000-10-26 11:47 ahollosi
+
+ * lib/savepage.php: fix: savepage didn't check FLAG_PAGE_LOCKED
+
+2000-10-26 11:38 ahollosi
+
+ * lib/editpage.php, lib/savepage.php, locale/translate.sh,
+ locale/nl/LC_MESSAGES/phpwiki.mo,
+ locale/nl/LC_MESSAGES/phpwiki.php,
+ locale/nl/templates/editpage.html, locale/po/nl.po,
+ locale/po/phpwiki.pot: another gettext() patch from Jan (fix plus
+ translation of savepage)
+
+2000-10-26 07:34 ahollosi
+
+ * lib/stdlib.php: fix: added "global $WikiNameRegexp" in
+ ExtractWikiPageLinks()
+
+2000-10-25 10:48 ahollosi
+
+ * lib/stdlib.php: adapted to changes in transform.php; inline
+ images now have an ALT tag; also, [name|http:image] uses name as
+ ALT tag now; RenderQuickSearch() and RenderFullSearch() create a
+ submit button
+
+2000-10-25 10:45 ahollosi
+
+ * lib/transform.php: Heavy modification based on Neil Brown's
+ tokenize() patch. Much cleaner structure now -- why didn't we see
+ the obvious? Images of reference links [\d+] are inlined too now
+
+2000-10-25 10:41 ahollosi
+
+ * lib/config.php: added $InlineImages and $WikiNameRegexp
+
+2000-10-25 05:58 ahollosi
+
+ * index.php: set_magic_quotes_runtime(0) added (bug reported by
+ Hawk Newton)
+
+2000-10-25 01:06 wainstead
+
+ * HISTORY: Updated for 1.1.8
+
+2000-10-25 00:56 wainstead
+
+ * CREDITS: CREDITS is reborn!
+
+2000-10-25 00:19 wainstead
+
+ * pgsrc/RecentChanges: Updated the list of pages to include
+ PhpWiki.
+
+2000-10-25 00:13 wainstead
+
+ * pgsrc/PhpWiki: Added a page to define the WikiWord "PhpWiki."
+
+2000-10-24 06:32 ahollosi
+
+ * lib/transform.php, pgsrc/TestPage: killed the
+ !WikiName,!WikiName,!WikiNameSameStem bug for good. -- added
+ examples to TestPage
+
+2000-10-24 05:55 ahollosi
+
+ * lib/pageinfo.php, locale/nl/LC_MESSAGES/phpwiki.mo,
+ locale/nl/LC_MESSAGES/phpwiki.php, locale/po/nl.po,
+ locale/po/phpwiki.pot: Jan added some gettext() for pageinfo.php
+
+2000-10-23 16:55 ahollosi
+
+ * pgsrc/ReleaseNotes: corrected h1,h2,h3 markup example
+
+2000-10-23 12:52 ahollosi
+
+ * lib/transform.php, pgsrc/TestPage: New: tabless definition lists
+ (even nested): ;Term:definition. Also: ul,ol list types can be
+ mixed - we only look at the last character. Changes e.g. from
+ "**#*" to "###*" go unnoticed, and wouldn't make a difference to
+ the HTML layout anyway.
+
+2000-10-23 08:53 wainstead
+
+ * pgsrc/TextFormattingRules: Updated rules for the 1.1.8 release.
+
+2000-10-22 20:58 wainstead
+
+ * CREDITS: No longer maintained. TODO will happen on Sourceforge.
+
+2000-10-22 15:52 ahollosi
+
+ * locale/: translate.sh, nl/pgsrc/GoedeStijl,
+ nl/pgsrc/HoeWikiTeGebruiken, nl/pgsrc/JanNieuwenhuizen,
+ nl/pgsrc/MeerOverTechnieken, nl/pgsrc/PaginasToevoegen,
+ nl/pgsrc/RecenteBezoekers, nl/pgsrc/RecenteVeranderingen,
+ nl/pgsrc/TekstFormatteringsRegels, nl/pgsrc/UitgaveNoten,
+ nl/pgsrc/WabiSabi, nl/pgsrc/WikiWikiWeb, nl/pgsrc/ZoekPagina,
+ nl/templates/editlinks.html: commit of latest update to Dutch pages
+ by Jan
+
+2000-10-22 15:33 ahollosi
+
+ * lib/: config.php, setupwiki.php: Some generic pages are included
+ in English, ignoring the language setting (avoids unnecessary
+ duplication) - these pages are: TestPage, SteveWainstead,
+ ReleaseNotes. Removed "nl" counterparts.
+
+2000-10-22 09:30 ahollosi
+
+ * templates/editlinks.html, templates/editpage.html,
+ templates/message.html, locale/nl/templates/editlinks.html,
+ locale/nl/templates/editpage.html,
+ locale/nl/templates/message.html: adjusted layout (smaller logo, no
+ use of table anymore)
+
+2000-10-21 00:14 wainstead
+
+ * DBLIB.txt: Added 3 functions. No descriptions yet.
+
+2000-10-20 07:49 ahollosi
+
+ * templates/browse.html: get rid of headline table, logo now
+ smaller
+
+2000-10-20 07:48 ahollosi
+
+ * images/wikibase.png: logo now sports a black border
+
+2000-10-20 07:42 ahollosi
+
+ * lib/diff.php, lib/editpage.php, lib/fullsearch.php, lib/msql.php,
+ lib/mysql.php, lib/setupwiki.php, lib/stdlib.php,
+ locale/translate.sh, locale/nl/LC_MESSAGES/phpwiki.mo,
+ locale/nl/LC_MESSAGES/phpwiki.php,
+ locale/nl/pgsrc/HoeWikiTeGebruiken,
+ locale/nl/pgsrc/JanNieuwenhuizen, locale/nl/pgsrc/MeestBezocht,
+ locale/nl/pgsrc/PaginasToevoegen, locale/nl/pgsrc/RecenteBezoekers,
+ locale/nl/pgsrc/TekstFormatteringsRegels,
+ locale/nl/pgsrc/VeranderTekst,
+ locale/nl/pgsrc/VertaalSpatiesNaarTabs, locale/nl/pgsrc/WabiSabi,
+ locale/nl/pgsrc/WikiWikiWeb, pgsrc/TestPage,
+ locale/nl/templates/browse.html, locale/nl/templates/editpage.html,
+ locale/po/nl.po, locale/po/phpwiki.pot: second int. patch from Jan
+ (slightly modified)
+
+2000-10-19 18:25 ahollosi
+
+ * lib/: db_filesystem.php, dbmlib.php, editpage.php, msql.php,
+ mysql.php, stdlib.php: ExitWiki() function replaces simple calls to
+ exit()
+
+2000-10-19 17:49 ahollosi
+
+ * locale/nl/LC_MESSAGES/: phpwiki.mo, phpwiki.php: forgot to run
+ translate.sh
+
+2000-10-19 17:36 ahollosi
+
+ * lib/config.php, lib/diff.php, lib/display.php, lib/editpage.php,
+ lib/pageinfo.php, lib/savepage.php, lib/stdlib.php,
+ locale/translate.sh, locale/nl/LC_MESSAGES/phpwiki.mo,
+ locale/nl/LC_MESSAGES/phpwiki.php, locale/nl/pgsrc/GoedeStijl,
+ locale/nl/pgsrc/HoeWikiTeGebruiken,
+ locale/nl/pgsrc/JanNieuwenhuizen,
+ locale/nl/pgsrc/MeerOverTechnieken, locale/nl/pgsrc/MeestBezocht,
+ locale/nl/pgsrc/PaginasToevoegen, locale/nl/pgsrc/RecenteBezoekers,
+ locale/nl/pgsrc/RecenteVeranderingen,
+ locale/nl/pgsrc/TekstFormatteringsRegels,
+ locale/nl/pgsrc/UitgaveNoten, locale/nl/pgsrc/VeranderTekst,
+ locale/nl/pgsrc/VertaalSpatiesNaarTabs, locale/nl/pgsrc/VoorPagina,
+ locale/nl/pgsrc/WabiSabi, locale/nl/pgsrc/WikiWikiWeb,
+ locale/nl/pgsrc/ZandBak, locale/nl/pgsrc/ZoekPagina,
+ locale/po/nl.po, locale/po/phpwiki.pot,
+ locale/nl/templates/browse.html,
+ locale/nl/templates/editlinks.html,
+ locale/nl/templates/editpage.html,
+ locale/nl/templates/message.html: internationalization patch (based
+ on Jan Nieuwenhuizen's original patch)
+
+2000-10-18 18:31 wainstead
+
+ * pgsrc/SteveWainstead: Changed the address for bug reporting.
+
+2000-10-11 10:08 ahollosi
+
+ * lib/transform.php: added Neil Brown's nested-DefinitionLists
+ patch -- slightly modified
+
+2000-10-11 09:57 ahollosi
+
+ * index.php: added Neil Brown's search-button patch -- slightly
+ modified
+
+2000-10-09 22:59 wainstead
+
+ * lib/config.php: Changed back to dynamically setting the hostname;
+ using 'http:' is commented out and left as an option.
+
+2000-10-08 16:05 wainstead
+
+ * lib/config.php: Updated the paths to image files, which now live
+ in the images/ subdirectory.
+
+2000-10-08 15:59 wainstead
+
+ * HISTORY: Moved these into the images/ directory.
+
+2000-10-08 15:58 wainstead
+
+ * images/: png.png, signature.png, wikibase.png: Moved these out of
+ the root directory to improve the project structure.
+
+2000-10-08 15:46 wainstead
+
+ * lib/config.php: $ServerAddress is now set to "" by default, which
+ should work in most cases; the comments and if/else block remain
+ (commented out).
+
+2000-10-08 15:19 wainstead
+
+ * INSTALL.pgsql: Updated and improved.
+
+2000-10-08 15:08 wainstead
+
+ * INSTALL: Reread and updated.
+
+2000-10-08 14:27 wainstead
+
+ * INSTALL: Missed an occurrence of the extension php3.
+
+2000-10-08 14:12 wainstead
+
+ * HISTORY, INSTALL, INSTALL.mSQL, INSTALL.mysql, INSTALL.pgsql,
+ README, lib/config.php, lib/display.php, lib/pageinfo.php,
+ pgsrc/MoreAboutMechanics, pgsrc/TestPage, templates/README: Changed
+ occurrences of *php3 to *php.
+
+2000-10-08 13:48 wainstead
+
+ * index.php: Renamed from index.php3
+
+2000-10-08 13:43 wainstead
+
+ * HISTORY, templates/editpage.html: These have been moved to
+ lib/*.php.
+
+2000-10-08 13:33 wainstead
+
+ * lib/: config.php, db_filesystem.php, dbmlib.php, diff.php,
+ display.php, editlinks.php, editpage.php, fullsearch.php, msql.php,
+ mysql.php, pageinfo.php, pgsql.php, savepage.php, search.php,
+ setupwiki.php, stdlib.php, transform.php, ziplib.php: All
+ wiki_*.php3 files are now renamed to lib/*.php. The files I
+ commited a while back, lib/*.inc, were dropped because you can
+ still see the source of the pages that way, in a default Apache
+ setup.
+
+2000-10-08 13:30 wainstead
+
+ * pgsrc/TestPage: Added [[not linked to] and !NotLinkedTo.
+
+2000-09-23 11:06 ahollosi
+
+ * pgsrc/RecentChanges: added SandBox and MostPopular
+
+2000-09-23 10:56 ahollosi
+
+ * pgsrc/FrontPage: added links to ReleaseNotes and SandBox
+
+2000-09-23 10:56 ahollosi
+
+ * pgsrc/SandBox: added SandBox file for experimentation
+
+2000-09-23 10:32 ahollosi
+
+ * templates/browse.html: added scored related pages
+
+2000-09-23 10:31 ahollosi
+
+ * schemas/schema.mysql: added wikiscore table
+
+2000-09-21 15:44 ahollosi
+
+ * INSTALL, INSTALL.mysql: bringing documentation up to date
+
+2000-09-03 21:38 wainstead
+
+ * pgsrc/RecentChanges: Moved the search to the bottom, since it's
+ not what people come to the page to see.
+
+2000-08-28 22:42 aredridel
+
+ * admin/: wiki_dumpHTML.php, wiki_port1_0.php,
+ wiki_rebuilddbms.php: Changed short tags to long, and added an ALT
+ tag for the logo so that phpwiki becomes fully HTML compliant
+
+2000-06-29 21:57 wainstead
+
+ * templates/: browse.html, editlinks.html, editpage.html,
+ message.html: Applied a patch that adds ###LOGO### to the
+ templating system. The value of $logo is now used instead of the
+ logo being hard coded.
+
+2000-06-29 20:44 wainstead
+
+ * DBLIB.txt: Documented the four new functions IncreaseHitCount,
+ GetHitCount, InitMostPopular, and MostPopularNextMatch.
+
+2000-06-29 12:19 ahollosi
+
+ * templates/browse.html: added diff link
+
+2000-06-29 10:43 ahollosi
+
+ * schemas/schema.mysql: version is no longer part of primary key of
+ archive; otherwise multiple versions got stored - and a random
+ version retrieved.
+
+2000-06-29 00:30 wainstead
+
+ * schemas/schema.minisql: Updated the column size of "line" to 128
+ in both WIKIPAGES and ARCHIVEPAGES tables.
+
+2000-06-28 23:08 wainstead
+
+ * DBLIB.txt: Minor updates to the text.
+
+2000-06-28 12:40 wainstead
+
+ * HISTORY: Updates. Nothing big.
+
+2000-06-26 17:26 ahollosi
+
+ * pgsrc/MostPopular: Added support for hitcount and MostPopular
+ (%%Mostpopular%% markup); quick hack - works only with mySQL so far
+
+2000-06-26 17:23 ahollosi
+
+ * pgsrc/: FindPage, RecentChanges: changed markup for search boxes
+ to %%Search%%, %%Fullsearch%%
+
+2000-06-26 16:05 ahollosi
+
+ * pgsrc/RecentChanges: fixed date check in UpdateRecentChanges
+
+2000-06-25 23:56 wainstead
+
+ * schemas/schema.minisql: This is still in transition. Do not use
+ in a production setting.
+
+2000-06-25 15:40 wainstead
+
+ * schemas/schema.minisql: Reworked the schema to get around mSQL's
+ limitation where you cannot use TEXT in a LIKE clause. Lines of a
+ page are now stored in a separate table.
+
+2000-06-25 03:34 wainstead
+
+ * schemas/schema.minisql: Upped the size of the "searchterms"
+ field.
+
+2000-06-24 02:41 wainstead
+
+ * schemas/schema.minisql: First cut at a schema for a mSQL-based
+ PhpWiki.
+
+2000-06-22 23:20 wainstead
+
+ * pgsrc/ReleaseNotes: Fixed a syntax error with the h1, h2, h3
+ tags.
+
+2000-06-22 23:15 wainstead
+
+ * pgsrc/: ReleaseNotes, SteveWainstead: updated the information
+
+2000-06-22 22:57 wainstead
+
+ * README: Added wiki_pageinfo.php3 and its description
+
+2000-06-22 22:54 wainstead
+
+ * INSTALL.pgsql: Went through all the steps myself of installing a
+ Postgresql based Wiki. No problems, mostly.
+
+2000-06-22 22:36 wainstead
+
+ * pgsrc/RecentChanges: changed the date from April 22 to a generic
+ thing
+
+2000-06-21 18:59 ahollosi
+
+ * INSTALL.mysql, schemas/schema.mysql: changed to new db schema
+
+2000-06-21 15:33 ahollosi
+
+ * templates/browse.html: added support for wiki_pageinfo.php3
+
+2000-06-20 00:39 wainstead
+
+ * schemas/schema.psql: Table 'archive' is now identical to table
+ 'wiki' and I added all the GRANT statements needed for user
+ 'nobody'. This might change again in the future.
+
+2000-06-19 23:41 wainstead
+
+ * DBLIB.txt: Corrected the entry for 'flags' which is an integer,
+ not a string
+
+2000-06-19 22:29 wainstead
+
+ * DBLIB.txt: Added a description of the primary data structure,
+ $pagehash. Each field of the hash and the data type it contains is
+ described.
+
+2000-06-19 16:18 ahollosi
+
+ * README: added info for mySQL and PostgreSQL
+
+2000-06-18 12:05 ahollosi
+
+ * templates/README: initial commit: README describes template
+ placeholders
+
+2000-06-18 11:12 ahollosi
+
+ * templates/: browse.html, editlinks.html, editpage.html,
+ message.html: added support for HTML templates
+
+2000-06-18 01:25 wainstead
+
+ * INSTALL.pgsql: Added a few more SQL commands to give user
+ 'nobody' permission to change tables.
+
+2000-06-12 00:19 wainstead
+
+ * schemas/schema.psql: Added the ID.
+
+2000-06-12 00:14 wainstead
+
+ * INSTALL.pgsql: Initial commit of the INSTALL file for Postgresql.
+
+2000-06-11 14:30 wainstead
+
+ * INSTALL: Added a note about requiring PHP 3.0.9 or greater.
+
+2000-06-09 23:49 wainstead
+
+ * schemas/schema.psql: Initial version of the schema for
+ Postgresql.
+
+2000-06-09 06:17 ahollosi
+
+ * DBLIB.txt: corrected description of InitTitleSearch and
+ InitFullSearch plus minor additions throughout file
+
+2000-06-08 18:11 ahollosi
+
+ * HISTORY: beautified 1.1.5 entry
+
+2000-06-08 18:11 ahollosi
+
+ * INSTALL.mysql: changed mySQL schema: column 'data' renamed 'hash'
+ because 'data' is a reserved word.
+
+2000-06-08 17:49 wainstead
+
+ * HISTORY: Newest version, 1.1.5
+
+2000-06-05 16:54 wainstead
+
+ * DBLIB.txt: This is the first draft of the contract for the
+ database interface. When this document is finalized, it will allow
+ porters to write a PHP file that interfaces with any data store
+ (like DBM files, RDBMS's, etc) without needing to change any other
+ files in the Wiki source tree.
+
+2000-06-03 07:20 ahollosi
+
+ * pgsrc/RecentChanges: fix: UpdateRecentChanges wouldn't delete old
+ entries
+
+2000-06-03 07:19 ahollosi
+
+ * pgsrc/ConvertSpacesToTabs: adjusted to new rules: [ ] -> [[o]
+
+2000-06-02 19:37 ahollosi
+
+ * pgsrc/TextFormattingRules: Added headings (!,!!,!!!), suppression
+ of wiki linking (!WikiName), and linebreaks (%%%)
+
+2000-06-02 12:05 ahollosi
+
+ * INSTALL.mysql: initial commit for 1.1.4
+
+2000-06-02 11:59 ahollosi
+
+ * HISTORY, README, pgsrc/AddingPages, pgsrc/ConvertSpacesToTabs,
+ pgsrc/EditText, pgsrc/FindPage, pgsrc/FrontPage, pgsrc/GoodStyle,
+ pgsrc/HowToUseWiki, pgsrc/MoreAboutMechanics, pgsrc/RecentChanges,
+ pgsrc/RecentVisitors, pgsrc/ReleaseNotes, pgsrc/SteveWainstead,
+ pgsrc/TextFormattingRules, pgsrc/WabiSabi: initial commit for 1.1.4
+
+2000-06-02 11:46 ahollosi
+
+ * LICENSE: initial import into cvs
+
+2000-06-02 11:46 ahollosi
+
+ * HISTORY, README, INSTALL, pgsrc/AddingPages,
+ pgsrc/ConvertSpacesToTabs, pgsrc/EditText, pgsrc/FindPage,
+ pgsrc/FrontPage, pgsrc/GoodStyle, pgsrc/HowToUseWiki,
+ pgsrc/MoreAboutMechanics, pgsrc/RecentChanges,
+ pgsrc/RecentVisitors, pgsrc/ReleaseNotes, pgsrc/SteveWainstead,
+ pgsrc/TextFormattingRules, pgsrc/WabiSabi: Initial revision
+
diff --git a/docroot/phpwiki/DBLIB.txt b/docroot/phpwiki/DBLIB.txt
new file mode 100755
index 0000000..e02f6bd
--- /dev/null
+++ b/docroot/phpwiki/DBLIB.txt
@@ -0,0 +1,177 @@
+This is a description of the database interface for PhpWiki. Regardless
+of what kind of data store is used (RDBMS, DBM files, flat text files)
+you should be able to write a library that supports that data store.
+
+A few notes:
+
+* While most functions specify a "db reference" as the first value
+ passed in, this can be any kind of data type that your functions
+ know about. For example, in the DBM implementation this is a hash of
+ integers that refer to open database files, but in the MySQL
+ version it's an associative array that contains the DB information.
+
+* Functions that return the page data must return a hash (associative
+ array) of all the data, where 'content' == the text of the page in Wiki
+ markup, 'version' is an integer representing the version, 'author'
+ the IP address or host name of the previous author and so on. See
+ the next paragraph for a precise description.
+
+* The data structure. This is commonly named $pagehash in the source
+ code; it's an associative array with values that are integers,
+ strings and arrays (i.e. a heterogeneous data structure). Here's a
+ current description:
+
+ $pagehash = {
+ author => string,
+ content => array (where each element is a line of the page),
+ created => integer (a number in Unix time since the Epoch),
+ flags => integer,
+ lastmodified => integer (also Unix time),
+ pagename => string,
+ version => integer,
+ refs => array (where each element is a reference)
+ };
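+
+  As a concrete sketch in PHP (the values here are hypothetical, not
+  taken from the source), a brand-new page might be built like this:
+
+      $pagehash = array(
+          'author'       => '127.0.0.1',        // IP or host of author
+          'content'      => array('Describe the page here.'),
+          'created'      => time(),             // Unix time
+          'flags'        => 0,
+          'lastmodified' => time(),
+          'pagename'     => 'TestPage',
+          'version'      => 1,
+          'refs'         => array()
+      );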
+
+The functions are:
+
+ OpenDataBase($dbname)
+ takes: a string, the name of the database
+ returns: a reference to the database (a handle)
+
+
+ CloseDataBase($dbi)
+ takes: a reference to the database (handle)
+ returns: the value of the close. For databases with persistent
+ connections, this doesn't return anything.
+
+
+ MakeDBHash($pagename, $pagehash)
+ takes: page name, page array
+ returns: an encoded version of the $pagehash suitable for
+ insertion into the data store. This is an internal helper
+ function used mainly for the RDBMSs.
+
+ MakePageHash($dbhash)
+ takes: an array that came from the database
+ returns: the $pagehash data structure used by the
+ application. This function undoes what MakeDBHash does.
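+
+    For illustration, the two functions are inverses of each other
+    (a sketch):
+
+      $dbhash   = MakeDBHash($pagename, $pagehash); // encode for storage
+      $restored = MakePageHash($dbhash);            // back to $pagehash form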
+
+ RetrievePage($dbi, $pagename, $pagestore)
+ takes: db reference, string which is the name of a page, and a
+ string indicating which store to fetch the page from (live or archive).
+ returns: a PHP associative array containing the page data
+ (text, version, author, etc)
+
+
+ InsertPage($dbi, $pagename, $pagehash)
+ takes: db reference, page name (string), associative array
+ of all page data
+ returns: nothing (hmm. It should probably return true/false)
+
+ SaveCopyToArchive($dbi, $pagename, $pagehash)
+ Similar to InsertPage but for handling the archive store. The
+ goal here was to separate the two (live db and archive db) in
+ case there were different storage formats (for example, the
+ archive might only store diffs of the pages). However this is
+ not the case in the implementations.
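+
+    A sketch of a typical save path built from these functions,
+    assuming the page already exists ($newcontent is a hypothetical
+    array of lines; $WikiPageStore is the global store name mentioned
+    above; error handling is omitted):
+
+      global $WikiPageStore;
+      $pagehash = RetrievePage($dbi, $pagename, $WikiPageStore);
+      SaveCopyToArchive($dbi, $pagename, $pagehash); // keep the old copy
+      $pagehash['version']++;
+      $pagehash['lastmodified'] = time();
+      $pagehash['content'] = $newcontent;
+      InsertPage($dbi, $pagename, $pagehash);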
+
+ IsWikiPage($dbi, $pagename)
+ takes: db reference, string containing page name
+ returns: true or false, if the page already exists in the live db.
+
+ IsInArchive($dbi, $pagename)
+ takes: db reference, string containing page name
+ returns: true or false, if the page already exists in the archive.
+
+ InitTitleSearch($dbi, $search)
+ takes: db reference, search string
+ returns: a handle to identify the query and the current position
+ within the result set.
+
+ RemovePage($dbi, $pagename)
+ takes: db reference, name of the page
+ returns: nothing
+ This deletes a page from both the live and archive page stores.
+
+ TitleSearchNextMatch($dbi, &$pos)
+ takes: db reference, reference to a hash created by
+ InitTitleSearch
+ returns: the next page name that contains a match to the search term
+ (advances $pos to next result field as well)
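+
+    Callers typically drive the search with a loop along these lines
+    (a sketch; it assumes TitleSearchNextMatch returns a false value
+    once the results are exhausted):
+
+      $pos = InitTitleSearch($dbi, $search);
+      while ($pagename = TitleSearchNextMatch($dbi, $pos)) {
+          echo $pagename . "\n";
+      }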
+
+ MakeSQLSearchClause($search, $column)
+ takes: a search string, column name
+ returns: a SQL query string suitable for a database query
+
+ InitFullSearch($dbi, $search)
+ takes: db reference, string containing search term
+ returns: similar to InitTitleSearch: a handle to identify the
+ query and the current position within the result set.
+
+
+ FullSearchNextMatch($dbi, &$pos)
+ takes: db reference, reference to a hash created by
+ InitFullSearch
+ returns: an associative array, where:
+ 'name' -- contains the page name
+ 'hash' -- contains the hash of the page data
+ (advances $pos to next result field as well)
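+
+    Usage mirrors the title search (a sketch, with the same
+    end-of-results assumption):
+
+      $pos = InitFullSearch($dbi, $search);
+      while ($match = FullSearchNextMatch($dbi, $pos)) {
+          echo $match['name'] . "\n"; // $match['hash'] holds the page data
+      }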
+
+
+ MakeBackLinkSearchRegexp($pagename)
+ takes: a page name
+ returns: A PCRE suitable for searching for a link to the given page
+ within page (wiki-markup) text.
+
+ InitBackLinkSearch($dbi, $pagename)
+ takes: db reference, page name
+ returns: a handle to identify the query and the current position
+ within the result set.
+
+ BackLinkSearchNextMatch($dbi, &$pos)
+ takes: db reference, reference to a hash created by
+ InitBackLinkSearch
+ returns: the next page name that contains a link to the specified page.
+ (advances $pos to next result field as well)
+
+
+ IncreaseHitCount($dbi, $pagename)
+ takes: db reference, string (name of a page)
+ returns: nothing (MySQL implementation returns the last result
+ set but it is not used by the caller)
+
+
+ GetHitCount($dbi, $pagename)
+ takes: db reference, string (page name)
+ returns: an integer, the number of hits the page has received
+
+
+ InitMostPopular($dbi, $limit)
+ takes: a db reference and an integer, which is the limit of the
+ number of pages you want returned.
+ returns: the result set from the query
+
+
+ MostPopularNextMatch($dbi, $res)
+ takes: db reference, the result set returned by InitMostPopular
+ returns: the next row from the result set, as a PHP array type
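+
+    Sketch of use (the field names of each row depend on the backend
+    schema, so treat them as hypothetical):
+
+      $res = InitMostPopular($dbi, 10);  // the ten most-viewed pages
+      while ($row = MostPopularNextMatch($dbi, $res)) {
+          // e.g. $row['pagename'] and $row['hitcount']
+      }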
+
+ GetAllWikiPageNames($dbi)
+ takes: db reference
+ returns: an array containing all page names
+
+ GetWikiPageLinks($dbi, $pagename)
+ takes: db reference, page name
+ returns: a two-dimensional array containing outbound links
+ ordered by score desc ('out'); inbound links ordered by score
+ desc ('in'); inbound or outbound links ordered by number of page
+ views ('popular').
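+
+    For example (a sketch; the format of the individual elements is
+    left to the implementation):
+
+      $links = GetWikiPageLinks($dbi, $pagename);
+      $outbound = $links['out'];     // ordered by score, descending
+      $inbound  = $links['in'];      // ordered by score, descending
+      $popular  = $links['popular']; // ordered by page views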
+
+ SetWikiPageLinks($dbi, $pagename, $linklist)
+ takes: db reference, page name, list of pages linking to this
+ one
+ This deletes the existing list of linking pages and inserts all
+ the page names in $linklist.
+
+$Id$
+
diff --git a/docroot/phpwiki/HISTORY b/docroot/phpwiki/HISTORY
new file mode 100755
index 0000000..b4d8f50
--- /dev/null
+++ b/docroot/phpwiki/HISTORY
@@ -0,0 +1,253 @@
+* Don't show signature image if $SignatureImg (in config.php) is left unset
+* Much improved german pgsrc/
+* Spelling fixes in english pgsrc/*
+* Minor security fixes:
+ + pagename must now be url-encoded
+ + added permission checks in admin/{load,dump}serial.php
+* Bugs fixed
+ + hang on full zip dump
+ + hang on diff
+ + unzip failed on some old zip-dumps
+ + check for DB files in /tmp was broken
+ + encode HTML entities within edit textarea
+ + ensure page linkification in "Describe foo here."
+ + fixes to transform code to improve HTML validity
+
+1.2.0 02/01/01
+* Support for PHP 4.0.4 (using the dba_* interface for DBM files),
+ thanks to Joel Uckelman
+* Swedish translation added, thanks to Jon Åslund
+* dbmlib.php has all functions in mysql/postgresql, thanks to Jan Hidders
+* German version updated
+* Dutch translation updated
+* Spanish version updated
+* More robust support for flat file based Wiki (INSTALL.flatfile
+ included)
+* "named internal links," i.e. [wow | FrontPage]
+* New IF syntax added to templates
+* New PhpWikiAdministration page added
+* New term/definition syntax (for <DL> tags)
+* Plenty of bug fixes
+
+1.1.9 11/05/00:
+* Spanish language support added, thanks to Sandino Araico
+  Sánchez
+* German language support thanks to Arno Hollosi
+* Postgresql version brought up to date (plus fixes from Sandino)
+* Neil Brown contributed a patch, which Arno worked in, to heavily
+  modify lib/transform.php; much cleaner structure now
+* Various page updates to English pages
+* Schema update for mSQL
+* Assorted E_NOTICE warnings fixed throughout (though still not done)
+* URL no longer stored in page source of RecentChanges
+* various bugs squashed
+
+1.1.8 10/25/00:
+* Internationalization, with support for Dutch, and an architecture
+ to add more languages easily
+* Term/definition tags updated to nest and use tabless markup
+* MostPopular works for all implementations, except flat files
+* Flat file database support; it's not yet complete but the basic Wiki
+ functionality is there, thanks to Ari
+* New zip file format for page dumps follows standard email format
+* Removed tabs from all default pages
+* Added whitespace padding to pages after they are serialized and
+ written to the DBM file; this goes a long way towards fixing the
+ memory leak problems for DBM-based Wikis.
+* Numerous bug fixes, as always
+* Some refactoring of the database interface
+
+1.1.7 07/15/00: A lot was added since the 1.1.6b release. Diffs are
+ the handiwork of Jeff Dairiki, though Arno wrote the second
+ revision. Features and changes include:
+
+* Page diffs, with color
+* "MostPopular" page added which dynamically tracks most viewed pages
+ (MySQL only so far)
+* Admin functions: page dumps, page loads, Zip dumps, page locking
+* MySQL, DBM, mSQL and Postgresql support all functional and appear stable
+* Full HTML compliance in page output
+* Tabless markup language introduced for <UL> and <DL>
+* Fixed raw HTML exploit in [info] page
+* Perl script included to reduce the size of a DBM file
+* documentation updates
+* page source updates
+* gazillion feature enhancements and bug fixes, no doubt necessitating
+ another gazillion feature enhancements and bug fixes ;-)
+
+1.1.6b 06/27/00: The wrong wiki_config.php3 file was included in 1.1.6,
+and this release corrects that; also in my hurry, I included all the CVS
+directories and files, and a test file. That stuff was left out.
+
+1.1.6 06/26/00: Added templates, Postgresql support, mSQL support, new
+database schema, new date storage format, an "info" link on all pages,
+and introduced some new bugs (RecentChanges is broken ;-)
+
+1.1.5 06/08/00: Here are the comments from the CVS logs:
+
+fixed magic_quotes_gpc=1 bug for $pagename
+fixed raw-HTML exploit for $pagename
+fixed javascript: links exploit
+Concurrent editing of pages is detected now - fixes LostUpdateProblem
+(note: EditLinks is *not* treated this way yet)
+search term is now preg_quote()'ed instead of chars removed
+bugfix: UpdateRecentChanges didn't link names of new-style pages.
+Fixed FindPage and search boxes
+Added headings (!,!!,!!!), suppression of wiki linking (!WikiName), and linebreaks (%%%)
+changed mySQL schema: column 'data' renamed 'hash' because 'data' is a
+reserved word. (update your tables!)
+
+This release should work fine with the new linking scheme, but then
+again, hey, it's beta!
+
+1.1.4 05/11/00: I added the new linking
+scheme, which largely follows the scheme of Wikic
+(http://wiki.cs.uiuc.edu/RefactoringBrowser/Wiki+Syntax). Both "classic
+Wiki" linking and the new linking are supported; you can now also link
+things by using square brackets, like this:
+
+[this is a page link]
+[this is an external link | http://wcsb.org/]
+[ftp://ftp.redhat.com/]
+
+Reference links are still supported.
+
+1.1.3 04/22/00: I rewrote UpdateRecentChanges completely; it's more
+efficient now because it only loops over the lines once, and entries are
+now newest-first instead of oldest-first.
+
+1.1.2 04/20/00: I finally solved the problem once and for all (I hope!)
+with loading pages into a brand new wiki. Vim allows you to change the
+file formats so I wrote a two line ex script to convert all the pages to
+dos format. (This gives them the CR/NL, unlike Un*x).
+
+1.1.1 04/15/00: I changed the way Wiki markup is converted and
+displayed. Before pages were rendered line by line; now it accumulates
+all the HTML in a variable and does one "echo" to display the page.
+While this might be a bit slower (it will use a little bit more memory)
+this means PhpWiki can be modified so the HTML can be written to a file.
+A whole PhpWiki site might either be served as static HTML, or
+periodically exported to disk after a period of activity. This is the
+second beta (more or less) of the 1.1 release.
+
+1.1.0 04/10/00: Support for MySQL added. Thanks to Arno Hollosi for
+his excellent work! He also provided patches to clean up the wiki_setup
+procedure and fix rendering bugs with italics and bold text, amongst
+others. Alister provided patches for arbitrary
+numbers of reference links, fixing a rotten logic error on my part.
+Removed "static" declarations to help the PHP4 porters.
+
+1.03 03/21/00: Refactored index.php3 and wiki_display.php3, which
+had dbm function calls in them. Thanks to Christian Lindig
+ for pointing this out. This should make it
+a little easier to port to a different database.
+
+1.02 02/02/00: Disabled embedded HTML, due to security holes
+described in this CERT advisory: http://www.cert.org/advisories/CA-2000-02.html
+You can re-enable it by removing the comment tags in wiki_display.php3.
+Please be certain of what you are doing if you allow this!
+
+1.01 02/01/11: Fixed bug where header rules (<HR>) were inserted
+whenever four or more dashes occurred; this only works if it starts the
+line now. Thanks to Gerry Barksdale.
+
+1.00 01/25/00: Changed the names of all files except index.php3; I
+prefaced them all with "wiki_" to avoid collisions with other files
+that might be in the include path. Thanks to Grant Morgan for the
+suggestion. A few corrections to the default pages; I think the
+small rendering problems are due to Unix's lack of a carriage
+return.
+
+0.99 01/20/00: Added a logic change suggested by Clifford Adams,
+where a copy is saved to the archive if the previous author was
+different. A different person that is. Fixed a rendering bug. This was
+breaking: http://c2.com/cgi-bin/wiki followed by
+http://c2.com/cgi-bin/wiki?PhpWiki on the same line. Because PHP only
+can do *global* search and replace, the second URL was incompletely
+swapped and linked. Using rsort() on the array of matches worked.
+Added a patch from Grant Morgan for servers with magic_quotes_gpc set.
+
+0.98 01/18/00: Added code to build the $ServerAddress dynamically. Now,
+PhpWiki will work as soon as it's untarred. No configuration should be
+necessary.
+
+0.97 01/16/00: Added a feature suggested by Clifford Adams. It stores
+the author's remote address and disables the EditCopy the next time they
+edit a page. Added support and debugged it. A new Wiki will
+load a set of new pages, so the Wiki is ready to go out of the box.
+
+0.96 01/15/00: Added EditCopy. This uses a second DBM file and could use
+some more pounding. I also found a bug when two URL's appear on the same
+line, like:
+http://foo.com/ http://foo.com/WikiWikiWeb
+In this case the second URL will not be linked correctly due to PHP's
+replace-all-or-nothing regular expression functions.
+
+0.95 01/04/00: Severe reworking of the list code (UL and OL tags). I
+added a stack class and "implemented recursion," which seemed the
+logical way to accomplish nested tags. There are a couple of minor bugs
+to work out, and I have to get DL tags working. I changed some constants
+to define()'s instead. There are magic numbers in stdlib.php3 that
+probably should be defined. I also used while() loops while doing
+push/pop operations which also gives me the willies, but I put in bounds
+checking.
+
+0.94: 12/22/99 Mostly code cleanups; added code for waiting on the dbm
+file if not available; added more comments.
+
+0.93: 12/21/99 Added full text search. Moved configuration code to a new
+file, config.php3. Fixed another bug in RecentChanges. Page titles now
+link to full search, which is a bit more useful. Added code to create a
+new RecentChanges if none existed.
+
+0.92: 12/20/99 Added REMOTE_HOST to RecentChanges edits; fixed a bug
+where I typed in PUT instead of POST in editpage.php3; patched
+RecentChanges so hopefully the lines won't get screwed up anymore.
+
+0.91: 12/20/99 Fixed bug that occurred when creating new pages.
+
+0.90: 12/19/99 Added user-defined links (i.e. [1], [2] etc) and embedded
+images. This is the first beta release. Only an issue with second-level
+lists remains, and a couple of other minor things.
+
+0.81: 12/19/99 Fixed another wiki word linking issue. It arose from
+having similar patterns in words in the same line.
+
+0.80: 12/18/99 I'm bumping up the version to .8 because it's that close.
+I finally solved the crucial linking problem, by reinventing how
+classic Wiki does it ;-) URL's are first replaced with tokens, then Wiki
+words are linked, then linked URL's are put back. I improved the code a
+great deal for the different "modes" of display text; I have a function
+maintain the mode, and whether end tags are needed or not.
+
+0.07: 12/18/99 Fixed minor bug in preformatted/bulleted code output; fixed
+reverse linking of page titles
+
+0.06: 12/15/99: Added ChangeSpacesToTabs
+
+0.05: 12/14/99: Added title searches, RecentChanges, fixed numerous bugs like
+only trying to render valid page names, the last-edited-date, removed
+the navigation toolbars, linked the new logo to FrontPage, and a half
+dozen other odd things. It's almost ready.
+
+0.04: 12/12/99: Several additions to markup:
+ italics
+ bold
+ preformatted text
+ unordered lists (one level only, two levels breaks)
+
+In addition, wiki links have been further debugged, but I still
+haven't solved the problem with wiki-links inside URLs.
+
+0.03: 12/4/99: Pages are now serialized arrays instead of text blobs. Some
+markup rules have been implemented. HTML is escaped; Wiki linking works
+but undefined pages do not have the question mark after them (they look
+like a regular link to a Wiki page.) URL's hyperlink and should accept
+most legal URL's.
+
+0.02: 12/3/99: Basic page editing/saving/displaying is now working.
+
+Prior to 0.02, there were no numbered releases, and in fact there
+was only one release, so I guess that would be 0.01 :-)
diff --git a/docroot/phpwiki/INSTALL b/docroot/phpwiki/INSTALL
new file mode 100755
index 0000000..3e3a9fc
--- /dev/null
+++ b/docroot/phpwiki/INSTALL
@@ -0,0 +1,98 @@
+0. INSTALLATION
+
+PhpWiki requires PHP version 3.0.9 or greater, since it uses the
+preg_*() family of functions.
+
+Untar/gzip this file into the directory where you want it to live.
+That's it.
+
+bash$ gzip -d phpwiki-X.XX.tar.gz
+bash$ tar -xvf phpwiki-X.XX.tar
+
+To improve efficiency, edit lib/config.php and set the $ServerAddress
+by hand; this will save a regexp call on every invocation.
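+
+For instance, something along these lines (a sketch; the comments in
+lib/config.php show the exact format expected):
+
+  $ServerAddress = "www.foo.com";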
+
+Example:
+Let's say you own the web server http://www.foo.com/. You untar in the
+server's root directory; then you should be able to just go to your new
+Wiki:
+
+http://www.foo.com/phpwiki/index.php
+
+If you configure your server to recognize index.php as the index of a
+directory, you can just do:
+
+http://www.foo.com/phpwiki/
+
+If you get a blank page, PhpWiki tried to open the wrong DBM file
+type, most likely. Edit the file lib/config.php and set DBM_FILE_TYPE
+to the correct type for your system. 'gdbm' or 'db2' usually work.
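+
+For example, if your system uses GDBM (a sketch; match however your
+lib/config.php declares the setting):
+
+  define("DBM_FILE_TYPE", "gdbm");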
+
+1. CONFIGURATION
+
+The first time you run this Wiki it will load a set of basic pages from
+the pgsrc/ directory. These should be enough to get your Wiki started.
+
+PhpWiki will create some DBM files in /tmp. They contain the pages of the
+live site, archived pages, and some additional information.
+
+If you don't want the DBM files to live in /tmp you must make sure the web
+server can read/write to your chosen location. It's probably a bad idea
+to leave it in /tmp. (Again, edit lib/config.php).
+
+For example, you create a subdirectory called "pages" in the wiki
+directory made when you untarred PhpWiki. Move the DBM files there.
+The files are called: wikipagesdb, wikiarchivedb, wikilinksdb,
+wikihottopicsdb, and wikihitcountdb. The files should already have proper
+rights and owners, as they were created by the web server. Otherwise
+change them accordingly so your web server can read/write the DBM
+files. (Note you must be root to move files created by the web server).
+
+Then you must ensure that the web server can access the "pages" directory
+and can create new files in it. These can be achieved e.g. by doing
+
+bash$ chown nobody:youraccount pages
+bash$ chmod 755 pages
+
+if your web server runs as user 'nobody'. This is necessary so that
+the server can also create/set the lock file (PHP has a built in
+locking mechanism for DBM file access). Or if you're really lazy and
+don't worry much about security:
+
+bash$ chmod 777 pages
+
+Note: this is insecure. The proper way is to let the directory be owned
+by the web server's GID and give it read and write access.
+
+
+
+2. ALLOWING EMBEDDED HTML
+
+PhpWiki ships with this feature disabled by default. According to CERT
+(http://www.cert.org/advisories/CA-2000-02.html) malicious users can embed
+HTML in your pages that allows pure evil to happen. You can uncomment the
+"elseif" in lib/transform.php to allow embedded HTML; but you should NEVER
+do this if your Wiki is publicly accessible.
+
+
+3. ETC
+
+Installing PHP is beyond the scope of this document :-)
+You should visit http://www.php.net/ if you don't have PHP.
+Note that you should have the web server configured to allow index.php
+as the root document of a directory.
+
+This web application was written under PHP version 3.0.12 and
+the latest build of PHP4. It's tested under the following systems:
+
+MySQL + Debian
+mSQL + Red Hat 4.1
+DBM or Postgresql on Red Hat 6.2
+
+It reportedly works on Windows with Apache+PHP, which amazes me.
+
+That should be all. Send patches, bugs etc. to:
+
+phpwiki-talk@lists.sourceforge.net
+
+FIN
diff --git a/docroot/phpwiki/INSTALL.flatfile b/docroot/phpwiki/INSTALL.flatfile
new file mode 100755
index 0000000..92164e6
--- /dev/null
+++ b/docroot/phpwiki/INSTALL.flatfile
@@ -0,0 +1,132 @@
+If you cannot run PhpWiki on top of a relational database like
+MySQL or Postgresql, and your system does not support DBM files
+or (worse) has a broken implementation like NDBM on Solaris, then
+a flat file Wiki should work for you. Note that as of 1.2 most of the
+Wiki functionality is there... the MostPopular is not implemented yet
+so you will want to delete that link from the FrontPage (or better
+yet, write it and mail us a patch ;-)
+
+Installation is similar to using a DBM file for storing the pages.
+You should read the main INSTALL file before this one (it's not long
+and complicated so go ahead and we'll wait for you right here).
+
+First, edit lib/config.php and set the database to "file":
+
+ $WhichDatabase = 'file'; // use one of "dbm", "mysql", "pgsql", "msql",
+ // or "file"
+
+
+Now, the key thing is you need a directory that the web server can
+read and write to. This is where it will store current and archived
+pages.
+
+If you have root access the next section applies to you. If you don't
+have root access, skip down to the section "I DON'T HAVE ROOT ACCESS"
+to see what options you have.
+
+Choose where you want to have the pages stored; on my system I put
+them in a directory under the PhpWiki root directory. That is, I
+installed my PhpWiki in /home/swain/public_html/flatfiletest/phpwiki.
+I created a directory called "pages" like this:
+
+[root@localhost phpwiki]# mkdir pages
+
+This creates a new directory:
+
+[swain@localhost phpwiki]$ ls -l
+total 65
+-rw-r--r-- 1 swain swain 1776 Dec 22 16:10 CREDITS
+-rw-r--r-- 1 swain swain 6323 Dec 12 16:53 DBLIB.txt
+-rw-r--r-- 1 swain swain 10373 Nov 5 22:19 HISTORY
+-rw-r--r-- 1 swain swain 3241 Oct 8 15:08 INSTALL
+-rw-r--r-- 1 swain swain 1241 Oct 8 14:12 INSTALL.mSQL
+-rw-r--r-- 1 swain swain 1584 Oct 8 14:12 INSTALL.mysql
+-rw-r--r-- 1 swain swain 2001 Oct 8 15:19 INSTALL.pgsql
+-rw-r--r-- 1 swain swain 18106 Jun 2 2000 LICENSE
+-rw-r--r-- 1 swain swain 2873 Dec 12 16:24 README
+drwxrwxr-x 2 swain swain 1024 Jan 1 18:46 admin
+-rw-r--r-- 1 swain swain 2366 Nov 13 05:59 admin.php
+drwxrwxr-x 2 swain swain 1024 Jan 1 18:46 images
+-rw-r--r-- 1 swain swain 1305 Nov 8 10:34 index.php
+drwxrwxr-x 2 swain swain 1024 Jan 3 22:44 lib
+drwxrwxr-x 6 swain swain 1024 Jan 1 18:46 locale
+drwxrwxr-x 4 swain swain 1024 Jan 1 18:50 pages
+drwxrwxr-x 2 swain swain 1024 Jan 1 18:46 pgsrc
+drwxrwxr-x 2 swain swain 1024 Jan 1 18:46 schemas
+drwxrwxr-x 2 swain swain 1024 Jan 1 18:46 templates
+
+Next, I'm going to change the owner of the directory. Your web server
+probably runs as user "nobody," so I log in as root and run the chown
+command:
+
+[swain@localhost phpwiki]$ su
+Password:
+[root@localhost phpwiki]# chown nobody:nobody pages
+
+Now the directory is read/writable by "nobody" and should work
+fine. If your web server runs as a different user substitute the
+appropriate name.
+
+
+I DON'T HAVE ROOT ACCESS...
+
+If you do not have root access to your machine you are in a tougher
+situation. What you can do is give the directory read/write permission
+to anybody, but for security reasons this is a bad idea.
+
+The second thing you can do is have your systems administrator install
+PhpWiki for you, or at least follow the steps above to create a
+directory owned by the web server.
+
+Another solution is to let the web server create the directory for
+you. The drawback to this approach is that you won't be able to edit
+the files or copy them from the command line, but most people can live
+with this limitation. (This is how you would do it on SourceForge, by
+the way; they have a cron job that sweeps the filesystem every few
+hours looking for things that are set world writable and change the
+permission.) This will require you to TEMPORARILY make the phpwiki/
+directory world writable:
+
+cd ..
+chmod o+wr phpwiki
+cd phpwiki/
+
+and create a PHP file like this:
+
+<html>
+<head>
+<title>Make a directory</title>
+</head>
+<body>
+
+<?php
+  /*
+    I created this to set up server-writable files
+    for the Wiki. You shouldn't have world writable files.
+  */
+
+  $int = mkdir("pages", 0775);
+  if ($int) { echo "mkdir returned $int (success)\n"; }
+?>
+
+</body>
+</html>
+
+
+Put the file in the phpwiki/ directory and call it through a web
+browser. This should create a directory owned by the web server in the
+phpwiki/ directory.
+
+IMPORTANT
+Now you need to restore the permissions of the phpwiki directory
+itself:
+
+cd ..
+chmod 755 phpwiki
+
+If you have problems after all of this, try contacting the
+phpwiki-talk list at phpwiki-talk@lists.sourceforge.net.
+
+Steve Wainstead
+swain@panix.com
+
+$Id$
\ No newline at end of file
diff --git a/docroot/phpwiki/INSTALL.mSQL b/docroot/phpwiki/INSTALL.mSQL
new file mode 100755
index 0000000..5c32c56
--- /dev/null
+++ b/docroot/phpwiki/INSTALL.mSQL
@@ -0,0 +1,53 @@
+mSQL support is fairly stable. However, due to the limitations of
+mSQL's SQL syntax, certain features found in the MySQL and Postgresql
+versions are not available. This is not to say they can't be done, but
+it will require a lot more code in msql.php to compensate for the lack
+of advanced SQL syntax. Simplicity is one of mSQL's virtues however.
+
+Setting up mSQL is beyond the scope of this document. See
+http://www.hughes.com.au/ for information on downloading and
+instructions.
+
+Create the database. You might need system privileges to do this:
+
+[root@localhost phpwiki]# msqladmin create wiki
+Database "wiki" created.
+[root@localhost phpwiki]#
+
+Load the database schema (here I'm in the phpwiki/ directory created
+after untarring the application):
+
+[swain@localhost phpwiki]$ msql wiki < schemas/schema.minisql
+
+You will see a few error messages like this:
+
+mSQL > ->
+
+ERROR : Unknown table "wiki"
+
+This is normal because the schema file drops the table and then
+creates it... dropping a nonexistent table is a nonfatal error and you
+don't have to worry about it. You should see a lot of these:
+
+mSQL > -> ->
+Query OK. 1 row(s) modified or retrieved.
+
+
+
+
+mSQL > ->
+Bye!
+
+
+
+Now the database is created; edit lib/config.php and comment out the
+DBM file settings. Uncomment the mSQL settings, making sure the values
+are correct.
+
+That should be all! Try accessing your Wiki now. Read INSTALL and
+README for more information, plus the comments in lib/config.php.
+
+--Steve Wainstead
+swain@panix.com
+
+$Id$
\ No newline at end of file
diff --git a/docroot/phpwiki/INSTALL.mssql b/docroot/phpwiki/INSTALL.mssql
new file mode 100755
index 0000000..b9dac16
--- /dev/null
+++ b/docroot/phpwiki/INSTALL.mssql
@@ -0,0 +1,85 @@
+Note: this is the email I got when the files were contributed. I cannot test
+the code, since I don't have access to mssql, so ymmv. The files referenced
+here are at:
+
+lib/mssql.php
+admin/translate_mysql.pl (you probably won't need this)
+
+The code for lib/config.php has already been added.
+
+~swain
+
+
+From: Andrew.Pearson@barclayscapital.com
+To: swain@panix.com
+Cc: phpwiki-talk@lists.sourceforge.net
+Subject: PHPWiki with Microsoft SQL-Server
+Date: Tue, 1 May 2001 16:26:50 +0100
+
+
+
+My colleague John Clayton set up PHPWiki for our development team using
+Apache and MySQL on Windows NT 4. He then left, and I was asked to port
+this to IIS and SQLServer. Please note this is not a reflection on the
+Apache and MySQL products, which were performing the task admirably; it
+had more to do with consistency of our environment. Since PHP does work with
+SQL-Server, the whole migration took about a day. Here are the steps I
+carried out:
+
+1. Wrote a sql-server library called mssql.php to reside in wiki\lib.
+ <>
+2. Added the following clause to wiki\lib\config.php
+ // MS SQLServer settings
+ } elseif ($WhichDatabase == 'mssql') {
+ $WikiPageStore = "wiki";
+ $ArchivePageStore = "archive";
+ $WikiLinksStore = "wikilinks";
+ $WikiScoreStore = "wikiscore";
+ $HitCountStore = "hitcount";
+ $mssql_server = 'servername';
+ $mssql_user = 'wikiweb';
+ $mssql_pwd = 'wikiweb';
+ $mssql_db = 'wiki';
+ include "lib/mssql.php";
+}
+
+3. Set $WhichDatabase='mssql' in config.php
+
+4. Dumped out the mysql wiki database (mysqldump --user=john
+--host=localhost wiki) and wrote the following perl script to convert to
+sql-server compatible sql
+ <>
+
+5. Loaded the translated db script into SQL-Server and granted relevant
+permissions/logins etc.
+
+6. Set "magic_quotes_sybase=On" in php.ini to handle embedded quote
+characters in strings. This is because SQL-Server, like Sybase, uses ''
+instead of \' within strings to cope with embedded quotes.
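+
+A purely illustrative PHP fragment (the table and column names here are
+made up) showing the two quoting styles side by side:
+
+<?php
+ // Sybase/SQL-Server style: embedded quotes are doubled
+ $sybase_style = "INSERT INTO wiki (pagename) VALUES ('John''s page')";
+ // MySQL style: embedded quotes are backslash-escaped
+ $mysql_style  = "INSERT INTO wiki (pagename) VALUES ('John\'s page')";
+?>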
+
+We had some problems initially with the PHP extension dll for sql-server,
+but I installed a newer version from http://www.mm4.de. In fact I unpacked
+their whole php4.0.5-rc1 distribution.
+
+I make no claims about all this working 100%, but our existing site seems to
+work okay in its new IIS/SQL-Server home :-)
+
+Andrew Pearson
+Barclays Capital, UK
+
+
+
+
+--------------------------------------------------------------------------------------
+For more information about Barclays Capital, please
+visit our web site at http://www.barcap.com.
+
+
+Internet communications are not secure and therefore the Barclays Group
+does not accept legal responsibility for the contents of this message.
+Any views or opinions presented are solely those of the author and do
+not necessarily represent those of the Barclays Group unless otherwise
+specifically stated.
+
+--------------------------------------------------------------------------------------
+
diff --git a/docroot/phpwiki/INSTALL.mysql b/docroot/phpwiki/INSTALL.mysql
new file mode 100755
index 0000000..ec24258
--- /dev/null
+++ b/docroot/phpwiki/INSTALL.mysql
@@ -0,0 +1,61 @@
+
+Installing phpwiki with mySQL
+-----------------------------
+
+This assumes that you have a working mySQL server and client setup.
+Installing mySQL is beyond the scope of this document.
+For more information on mySQL go to http://www.mysql.org/
+
+1. If you do not have a suitable database already, create one (using
+ the root or other privileged account you set up when mySQL was
+ installed.)
+
+ mysqladmin -uuser -ppassword create phpwiki
+
+2. If necessary create a user for that database which has the rights
+ to select, insert, update, delete (again using the root
+ administration account).
+
+ mysql -uuser -ppassword phpwiki
+
+ A mySQL grant statement for this user would look like this:
+
+ GRANT select, insert, update, delete
+ ON phpwiki.*
+ TO wikiuser@localhost
+ IDENTIFIED BY 'password';
+
+3. Create tables inside your database (still using the root account).
+
+ mysql -uuser -ppassword phpwiki < schemas/schema.mysql
+ AddType application/x-httpd-php3 .php3
+ AddType application/x-httpd-php3 .php
+ AddType application/x-httpd-php3-source .phps
+
+
+(This is from a stock Red Hat 6.2 distro, which ships with an rpm of
+PHP 3.0.12, but should give you an idea. I had to add the line for
+.php.)
+
+Also note that Postgresql by default has a hard limit of 8K per
+row. This is a Really Bad Thing. You can change that when you compile
+Postgresql to allow 32K per row, but supposedly performance
+suffers. The 7.x release of Postgresql is supposed to fix this.
+
+It's probably a good idea to install PhpWiki as-is first, running it
+off the DBM file. This way you can test most of the functionality of
+the Wiki.
+
+Once that's done and you have completed the basic steps listed in
+INSTALL, it's time to move to Postgresql.
+
+Edit lib/config.php and set $WhichDatabase for Postgresql. The lines
+are clearly commented and you should have no problem with this.
+
+Next you need to create a database called "wiki".
+
+bash$ createdb wiki
+
+Now run the script schemas/schema.psql
+
+bash$ psql wiki -f schemas/schema.psql
+
+For some reason I had to stop/start the database so that these changes took
+effect. After that, just open up the Wiki in your browser and you should
+have a brand-new PhpWiki running!
+
+If you find something I missed, please let me know.
+Steve Wainstead
+swain@wcsb.org
+
+Report bugs to phpwiki-talk@lists.sourceforge.net
+
+$Id$
\ No newline at end of file
diff --git a/docroot/phpwiki/LICENSE b/docroot/phpwiki/LICENSE
new file mode 100755
index 0000000..befaf30
--- /dev/null
+++ b/docroot/phpwiki/LICENSE
@@ -0,0 +1,367 @@
+ GNU GENERAL PUBLIC LICENSE
+ Version 2, June 1991
+
+ Copyright (C) 1989, 1991 Free Software Foundation, Inc.
+ 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+ Everyone is permitted to copy and distribute verbatim copies
+ of this license document, but changing it is not allowed.
+
+ Preamble
+
+ The licenses for most software are designed to take away your
+freedom to share and change it. By contrast, the GNU General Public
+License is intended to guarantee your freedom to share and change free
+software--to make sure the software is free for all its users. This
+General Public License applies to most of the Free Software
+Foundation's software and to any other program whose authors commit to
+using it. (Some other Free Software Foundation software is covered by
+the GNU Library General Public License instead.) You can apply it to
+your programs, too.
+
+ When we speak of free software, we are referring to freedom, not
+price. Our General Public Licenses are designed to make sure that you
+have the freedom to distribute copies of free software (and charge for
+this service if you wish), that you receive source code or can get it
+if you want it, that you can change the software or use pieces of it
+in new free programs; and that you know you can do these things.
+
+ To protect your rights, we need to make restrictions that forbid
+anyone to deny you these rights or to ask you to surrender the rights.
+These restrictions translate to certain responsibilities for you if you
+distribute copies of the software, or if you modify it.
+
+ For example, if you distribute copies of such a program, whether
+gratis or for a fee, you must give the recipients all the rights that
+you have. You must make sure that they, too, receive or can get the
+source code. And you must show them these terms so they know their
+rights.
+
+ We protect your rights with two steps: (1) copyright the software, and
+(2) offer you this license which gives you legal permission to copy,
+distribute and/or modify the software.
+
+ Also, for each author's protection and ours, we want to make certain
+that everyone understands that there is no warranty for this free
+software. If the software is modified by someone else and passed on, we
+want its recipients to know that what they have is not the original, so
+that any problems introduced by others will not reflect on the original
+authors' reputations.
+
+ Finally, any free program is threatened constantly by software
+patents. We wish to avoid the danger that redistributors of a free
+program will individually obtain patent licenses, in effect making the
+program proprietary. To prevent this, we have made it clear that any
+patent must be licensed for everyone's free use or not licensed at all.
+
+ The precise terms and conditions for copying, distribution and
+modification follow.
+
+ GNU GENERAL PUBLIC LICENSE
+ TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
+
+ 0. This License applies to any program or other work which contains
+a notice placed by the copyright holder saying it may be distributed
+under the terms of this General Public License. The "Program", below,
+refers to any such program or work, and a "work based on the Program"
+means either the Program or any derivative work under copyright law:
+that is to say, a work containing the Program or a portion of it,
+either verbatim or with modifications and/or translated into another
+language. (Hereinafter, translation is included without limitation in
+the term "modification".) Each licensee is addressed as "you".
+
+Activities other than copying, distribution and modification are not
+covered by this License; they are outside its scope. The act of
+running the Program is not restricted, and the output from the Program
+is covered only if its contents constitute a work based on the
+Program (independent of having been made by running the Program).
+Whether that is true depends on what the Program does.
+
+ 1. You may copy and distribute verbatim copies of the Program's
+source code as you receive it, in any medium, provided that you
+conspicuously and appropriately publish on each copy an appropriate
+copyright notice and disclaimer of warranty; keep intact all the
+notices that refer to this License and to the absence of any warranty;
+and give any other recipients of the Program a copy of this License
+along with the Program.
+
+You may charge a fee for the physical act of transferring a copy, and
+you may at your option offer warranty protection in exchange for a fee.
+
+ 2. You may modify your copy or copies of the Program or any portion
+of it, thus forming a work based on the Program, and copy and
+distribute such modifications or work under the terms of Section 1
+above, provided that you also meet all of these conditions:
+
+ a) You must cause the modified files to carry prominent notices
+ stating that you changed the files and the date of any change.
+
+ b) You must cause any work that you distribute or publish, that in
+ whole or in part contains or is derived from the Program or any
+ part thereof, to be licensed as a whole at no charge to all third
+ parties under the terms of this License.
+
+ c) If the modified program normally reads commands interactively
+ when run, you must cause it, when started running for such
+ interactive use in the most ordinary way, to print or display an
+ announcement including an appropriate copyright notice and a
+ notice that there is no warranty (or else, saying that you provide
+ a warranty) and that users may redistribute the program under
+ these conditions, and telling the user how to view a copy of this
+ License. (Exception: if the Program itself is interactive but
+ does not normally print such an announcement, your work based on
+ the Program is not required to print an announcement.)
+
+These requirements apply to the modified work as a whole. If
+identifiable sections of that work are not derived from the Program,
+and can be reasonably considered independent and separate works in
+themselves, then this License, and its terms, do not apply to those
+sections when you distribute them as separate works. But when you
+distribute the same sections as part of a whole which is a work based
+on the Program, the distribution of the whole must be on the terms of
+this License, whose permissions for other licensees extend to the
+entire whole, and thus to each and every part regardless of who wrote
+it.
+
+Thus, it is not the intent of this section to claim rights or contest
+your rights to work written entirely by you; rather, the intent is to
+exercise the right to control the distribution of derivative or
+collective works based on the Program.
+
+In addition, mere aggregation of another work not based on the Program
+with the Program (or with a work based on the Program) on a volume of
+a storage or distribution medium does not bring the other work under
+the scope of this License.
+
+ 3. You may copy and distribute the Program (or a work based on it,
+under Section 2) in object code or executable form under the terms of
+Sections 1 and 2 above provided that you also do one of the following:
+
+ a) Accompany it with the complete corresponding machine-readable
+ source code, which must be distributed under the terms of Sections
+ 1 and 2 above on a medium customarily used for software interchange;
+or,
+
+ b) Accompany it with a written offer, valid for at least three
+ years, to give any third party, for a charge no more than your
+ cost of physically performing source distribution, a complete
+ machine-readable copy of the corresponding source code, to be
+ distributed under the terms of Sections 1 and 2 above on a medium
+ customarily used for software interchange; or,
+
+ c) Accompany it with the information you received as to the offer
+ to distribute corresponding source code. (This alternative is
+ allowed only for noncommercial distribution and only if you
+ received the program in object code or executable form with such
+ an offer, in accord with Subsection b above.)
+
+The source code for a work means the preferred form of the work for
+making modifications to it. For an executable work, complete source
+code means all the source code for all modules it contains, plus any
+associated interface definition files, plus the scripts used to
+control compilation and installation of the executable. However, as a
+special exception, the source code distributed need not include
+anything that is normally distributed (in either source or binary
+form) with the major components (compiler, kernel, and so on) of the
+operating system on which the executable runs, unless that component
+itself accompanies the executable.
+
+If distribution of executable or object code is made by offering
+access to copy from a designated place, then offering equivalent
+access to copy the source code from the same place counts as
+distribution of the source code, even though third parties are not
+compelled to copy the source along with the object code.
+
+ 4. You may not copy, modify, sublicense, or distribute the Program
+except as expressly provided under this License. Any attempt
+otherwise to copy, modify, sublicense or distribute the Program is
+void, and will automatically terminate your rights under this License.
+However, parties who have received copies, or rights, from you under
+this License will not have their licenses terminated so long as such
+parties remain in full compliance.
+
+ 5. You are not required to accept this License, since you have not
+signed it. However, nothing else grants you permission to modify or
+distribute the Program or its derivative works. These actions are
+prohibited by law if you do not accept this License. Therefore, by
+modifying or distributing the Program (or any work based on the
+Program), you indicate your acceptance of this License to do so, and
+all its terms and conditions for copying, distributing or modifying
+the Program or works based on it.
+
+ 6. Each time you redistribute the Program (or any work based on the
+Program), the recipient automatically receives a license from the
+original licensor to copy, distribute or modify the Program subject to
+these terms and conditions. You may not impose any further
+restrictions on the recipients' exercise of the rights granted herein.
+You are not responsible for enforcing compliance by third parties to
+this License.
+
+ 7. If, as a consequence of a court judgment or allegation of patent
+infringement or for any other reason (not limited to patent issues),
+conditions are imposed on you (whether by court order, agreement or
+otherwise) that contradict the conditions of this License, they do not
+excuse you from the conditions of this License. If you cannot
+distribute so as to satisfy simultaneously your obligations under this
+License and any other pertinent obligations, then as a consequence you
+may not distribute the Program at all. For example, if a patent
+license would not permit royalty-free redistribution of the Program by
+all those who receive copies directly or indirectly through you, then
+the only way you could satisfy both it and this License would be to
+refrain entirely from distribution of the Program.
+
+If any portion of this section is held invalid or unenforceable under
+any particular circumstance, the balance of the section is intended to
+apply and the section as a whole is intended to apply in other
+circumstances.
+
+It is not the purpose of this section to induce you to infringe any
+patents or other property right claims or to contest validity of any
+such claims; this section has the sole purpose of protecting the
+integrity of the free software distribution system, which is
+implemented by public license practices. Many people have made
+generous contributions to the wide range of software distributed
+through that system in reliance on consistent application of that
+system; it is up to the author/donor to decide if he or she is willing
+to distribute software through any other system and a licensee cannot
+impose that choice.
+
+This section is intended to make thoroughly clear what is believed to
+be a consequence of the rest of this License.
+
+ 8. If the distribution and/or use of the Program is restricted in
+certain countries either by patents or by copyrighted interfaces, the
+original copyright holder who places the Program under this License
+may add an explicit geographical distribution limitation excluding
+those countries, so that distribution is permitted only in or among
+countries not thus excluded. In such case, this License incorporates
+the limitation as if written in the body of this License.
+
+ 9. The Free Software Foundation may publish revised and/or new versions
+of the General Public License from time to time. Such new versions will
+be similar in spirit to the present version, but may differ in detail to
+address new problems or concerns.
+
+Each version is given a distinguishing version number. If the Program
+specifies a version number of this License which applies to it and "any
+later version", you have the option of following the terms and conditions
+either of that version or of any later version published by the Free
+Software Foundation. If the Program does not specify a version number of
+this License, you may choose any version ever published by the Free
+Software Foundation.
+
+ 10. If you wish to incorporate parts of the Program into other free
+programs whose distribution conditions are different, write to the author
+to ask for permission. For software which is copyrighted by the Free
+Software Foundation, write to the Free Software Foundation; we sometimes
+make exceptions for this. Our decision will be guided by the two goals
+of preserving the free status of all derivatives of our free software and
+of promoting the sharing and reuse of software generally.
+
+ NO WARRANTY
+
+ 11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
+FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
+OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
+PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
+OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
+MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
+TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
+PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
+REPAIR OR CORRECTION.
+
+ 12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
+WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
+REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
+INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
+OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
+TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
+YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
+PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
+POSSIBILITY OF SUCH DAMAGES.
+
+ END OF TERMS AND CONDITIONS
+
+ How to Apply These Terms to Your New Programs
+
+ If you develop a new program, and you want it to be of the greatest
+possible use to the public, the best way to achieve this is to make it
+free software which everyone can redistribute and change under these
+terms.
+
+ To do so, attach the following notices to the program. It is safest
+to attach them to the start of each source file to most effectively
+convey the exclusion of warranty; and each file should have at least
+the "copyright" line and a pointer to where the full notice is found.
+
+
+ <one line to give the program's name and a brief idea of what it does.>
+ Copyright (C) 19yy <name of author>
+
+ This program is free software; you can redistribute it and/or modify
+ it under the terms of the GNU General Public License as published by
+ the Free Software Foundation; either version 2 of the License, or
+ (at your option) any later version.
+
+ This program is distributed in the hope that it will be useful,
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+ GNU General Public License for more details.
+
+ You should have received a copy of the GNU General Public License
+ along with this program; if not, write to the Free Software
+ Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
+
+
+Also add information on how to contact you by electronic and paper mail.
+
+If the program is interactive, make it output a short notice like this
+when it starts in an interactive mode:
+
+ Gnomovision version 69, Copyright (C) 19yy name of author
+ Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
+ This is free software, and you are welcome to redistribute it
+ under certain conditions; type `show c' for details.
+
+The hypothetical commands `show w' and `show c' should show the appropriate
+parts of the General Public License. Of course, the commands you use may
+be called something other than `show w' and `show c'; they could even be
+mouse-clicks or menu items--whatever suits your program.
+
+You should also get your employer (if you work as a programmer) or your
+school, if any, to sign a "copyright disclaimer" for the program, if
+necessary. Here is a sample; alter the names:
+
+ Yoyodyne, Inc., hereby disclaims all copyright interest in the program
+ `Gnomovision' (which makes passes at compilers) written by James Hacker.
+
+ <signature of Ty Coon>, 1 April 1989
+ Ty Coon, President of Vice
+
+This General Public License does not permit incorporating your program into
+proprietary programs. If your program is a subroutine library, you may
+consider it more useful to permit linking proprietary applications with the
+library. If this is what you want to do, use the GNU Library General
+Public License instead of this License.
+
diff --git a/docroot/phpwiki/README b/docroot/phpwiki/README
new file mode 100755
index 0000000..5c2809f
--- /dev/null
+++ b/docroot/phpwiki/README
@@ -0,0 +1,70 @@
+This web application is licensed under the GNU General Public License,
+which
+should be included in the same directory as this README. A copy
+can be found at http://www.gnu.org/copyleft/gpl.txt.
+
+See INSTALL for installation notes.
+See INSTALL.mysql for using PhpWiki with MySQL.
+See INSTALL.pgsql for using PhpWiki with PostgreSQL.
+See INSTALL.mSQL for using PhpWiki with mSQL.
+
+For a list of current bugs see:
+https://sourceforge.net/bugs/?group_id=6121
+
+The out-of-the-box version uses a dbm file in the /tmp directory; you may
+want a more permanent place for yours, but make sure it's read/writable
+by your web server!
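+
+For example, the stock DBM setting in lib/config.php is:
+
+    $DBMdir = "/tmp";
+
+Point $DBMdir at any directory the web server can write to instead.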
+
+NOTE: Not all database versions are equal. The MySQL and Postgresql
+implementations have the full set of features; DBM and mSQL are
+missing only a few, and the flat file implementation is solid
+and waiting for your improvement. All are suitable for production.
+
+NOTE 2: Not all the admin functions are implemented, but the page
+locking sure is nice.
+
+MANIFEST:
+
+index.php: the "main page", really a set of branching instructions
+admin.php: entry page for doing wiki administration
+
+lib/config.php: configuration options, constants, global variables
+lib/db_filesystem.php: support for flat file Wiki
+lib/dbmlib.php: database access functions for dbm files
+lib/display.php: display a page (this calls "lib/transform.php")
+lib/editlinks.php: edit the embedded links of a page
+lib/editpage.php: edit a page
+lib/fullsearch.php: full page text search
+lib/mysql.php: database access functions for mySQL
+lib/pageinfo.php: gives detailed low-level info on the page structure
+lib/pgsql.php: database access functions for PostgreSQL
+lib/savepage.php: save a page to db, thank user
+lib/search.php: page title search
+lib/setupwiki.php: load a set of pages from ./pgsrc/ directory
+lib/stdlib.php: standard library of functions (non-db related)
+lib/transform.php: convert wiki markup into HTML
+lib/ziplib.php: support for zip/unzip, used for page dumps
+
+admin/:
+admin/dumpserial.php: dump the Wiki out as serialize() pages
+admin/loadserial.php: load Wiki pages that were dumped with dumpserial
+admin/lockpage.php: lock a page so it cannot be edited
+admin/shrinkdbm.pl: Perl script to reduce size of DBM files
+admin/wiki_dumpHTML.php: dump the Wiki out as HTML pages
+admin/wiki_port1_0.php: import a 1.0 PhpWiki database
+admin/wiki_rebuilddbms.php: rebuild DBM files to reclaim disk space
+admin/zip.php: create a Zip archive of all Wiki pages
+
+templates/:
+browse.html: for rendering most pages
+editlinks.html: template for editing references
+editpage.html: template for form for editing pages
+message.html: error/system message template
+
+schemas/: SQL schemas for the RDBMSs
+
+
+Steve Wainstead
+swain@wcsb.org
+http://wcsb.org/~swain/
+
+$Id$
diff --git a/docroot/phpwiki/UPGRADING.readme b/docroot/phpwiki/UPGRADING.readme
new file mode 100755
index 0000000..6ce0600
--- /dev/null
+++ b/docroot/phpwiki/UPGRADING.readme
@@ -0,0 +1,27 @@
+MySQL
+
+ The MySQL schema has changed since PhpWiki 1.2.0.
+
+ If you're upgrading from PhpWiki 1.2.0 and you use the MySQL back end,
+ you need to update the schema.
+
+ As long as you use the stock table names you can just do something like:
+
+ mysql -uuser -ppassword wiki < schemas/update.mysql.1.2.0-1.2.1
+
+ If you don't use the stock table names, look at the script in
+ schemas/update.mysql.1.2.0-1.2.1 and use it as a guide.
+
+
+Flat File
+
+ We are now (since 1.2.0) urlencoding the characters '%', '/', '\\',
+ and ':' when forming filenames from page names. (This is to fix a bug
+ having to do with page names containing slashes.) If you currently have
+ any page names with any of those special characters, they will not be
+ visible to PhpWiki after you upgrade. (Any pages without those funny
+ characters in their names will be unaffected.)
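+
+ A minimal sketch of the idea (the real encoding lives in the PhpWiki
+ database libraries; this function name is made up for illustration):
+
+ <?php
+ function SketchEncodePagename($pagename) {
+     // only '%', '/', '\' and ':' are troublesome in filenames;
+     // each is replaced by its urlencoded form
+     return strtr($pagename, array('%' => '%25', '/' => '%2F',
+                                   '\\' => '%5C', ':' => '%3A'));
+ }
+ // e.g. the page "OS/2 Notes" is stored as the file "OS%2F2 Notes"
+ ?>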
+
+ If you do have pages with slashes, colons or percent signs in their names,
+ you should probably make a backup dump of your database before upgrading
+ and re-load the database after upgrading.
diff --git a/docroot/phpwiki/admin.php b/docroot/phpwiki/admin.php
new file mode 100755
index 0000000..ef7118f
--- /dev/null
+++ b/docroot/phpwiki/admin.php
@@ -0,0 +1,70 @@
+";
+ $url = rawurlencode($remove);
+ $html .= sprintf(gettext ("Click %shere%s to remove the page now."),
+ "", " ");
+ $html .= "\n";
+ $html .= gettext ("Otherwise press the \"Back\" button of your browser.");
+ } else {
+ $html = gettext ("Function not yet implemented.");
+ }
+ GeneratePage('MESSAGE', $html, gettext ("Remove page"), 0);
+ ExitWiki('');
+ } elseif (isset($removeok)) {
+ if (get_magic_quotes_gpc())
+ $removeok = stripslashes($removeok);
+ RemovePage($dbi, $removeok);
+ $html = sprintf(gettext ("Removed page '%s' successfully."),
+ htmlspecialchars($removeok));
+ GeneratePage('MESSAGE', $html, gettext ("Remove page"), 0);
+ ExitWiki('');
+ }
+
+ include('index.php');
+?>
diff --git a/docroot/phpwiki/admin/dumpserial.php b/docroot/phpwiki/admin/dumpserial.php
new file mode 100755
index 0000000..c35f0eb
--- /dev/null
+++ b/docroot/phpwiki/admin/dumpserial.php
@@ -0,0 +1,44 @@
+
+
+\n");
+ else
+ $html = "Created directory '$directory' for the page dump... \n";
+ } else {
+ $html = "Using directory '$directory' \n";
+ }
+
+ $numpages = count($pages);
+ for ($x = 0; $x < $numpages; $x++) {
+ $pagename = htmlspecialchars($pages[$x]);
+ $filename = preg_replace('/^\./', '%2e', rawurlencode($pages[$x]));
+ $html .= " $pagename ... ";
+ if($pagename != $filename)
+ $html .= "saved as $filename ... ";
+
+ $data = serialize(RetrievePage($dbi, $pages[$x], $WikiPageStore));
+ if ($fd = fopen("$directory/$filename", "w")) {
+ $num = fwrite($fd, $data, strlen($data));
+ $html .= "$num bytes written \n";
+ } else {
+ ExitWiki("couldn't open file '$directory/$filename' for writing \n");
+ }
+ }
+
+ $html .= "
Dump complete. ";
+ GeneratePage('MESSAGE', $html, 'Dump serialized pages', 0);
+ ExitWiki('');
+?>
diff --git a/docroot/phpwiki/admin/loadserial.php b/docroot/phpwiki/admin/loadserial.php
new file mode 100755
index 0000000..5c7df3f
--- /dev/null
+++ b/docroot/phpwiki/admin/loadserial.php
@@ -0,0 +1,42 @@
+
+\n";
+
+ if (! file_exists($directory)) {
+ echo "No such directory '$directory'. \n";
+ exit;
+ }
+
+ $handle = opendir($directory);
+
+ while ($file = readdir($handle)) {
+
+ if ($file[0] == ".")
+ continue;
+
+ $html .= "Reading '$file'... \n";
+
+ $data = implode("", file("$directory/$file"));
+ $pagehash = unserialize($data);
+
+ // at this point there needs to be some form of verification
+ // that we are about to insert a page.
+
+ $pagename = rawurldecode($file);
+ $html .= "inserting file '".htmlspecialchars($pagename)."' into the database... \n";
+ InsertPage($dbi, $pagename, $pagehash);
+ }
+ closedir($handle);
+
+ $html .= "
Load complete. ";
+ GeneratePage('MESSAGE', $html, 'Load serialized pages', 0);
+ ExitWiki('');
+?>
diff --git a/docroot/phpwiki/admin/lockpage.php b/docroot/phpwiki/admin/lockpage.php
new file mode 100755
index 0000000..0190bf8
--- /dev/null
+++ b/docroot/phpwiki/admin/lockpage.php
@@ -0,0 +1,22 @@
+
+
diff --git a/docroot/phpwiki/admin/shrinkdbm.pl b/docroot/phpwiki/admin/shrinkdbm.pl
new file mode 100755
index 0000000..3be82b0
--- /dev/null
+++ b/docroot/phpwiki/admin/shrinkdbm.pl
@@ -0,0 +1,55 @@
+#!/usr/bin/perl -w
+
+# $Id$
+
+# shrink a DBM file
+# Steve Wainstead, July 2000
+# this script is public domain and has no warranty at all.
+
+use strict;
+use Fcntl;
+use GDBM_File;
+use Getopt::Std;
+use vars ('$opt_o', '$opt_i');
+my (%old_db, %new_db);
+
+# $opt_i == input file
+# $opt_o == output file
+getopts('i:o:');
+
+# less confusing names
+my $input_db_file = $opt_i;
+my $output_db_file = $opt_o;
+
+
+die <<"USAGE" unless ($input_db_file and $output_db_file);
+Usage: $0 -i <infile> -o <outfile>
+ where: infile is a GDBM file and,
+ outfile is the name of the new file to write to.
+
+The idea is to copy the old DB file to a new one and thereby
+save space.
+
+USAGE
+
+# open old file
+tie (%old_db, "GDBM_File", $input_db_file, O_RDWR, 0666)
+ or die "Can't tie $input_db_file: $!\n";
+
+print "There are ", scalar(keys %old_db), " keys in $input_db_file\n";
+
+# open new file, deleting it first if it's already there
+if (-e $output_db_file) { unlink $opt_o; }
+tie (%new_db, "GDBM_File", $output_db_file, O_RDWR|O_CREAT, 0666)
+ or die "Can't tie $output_db_file: $!\n";
+
+# copy the files
+while (my($key, $value) = each(%old_db)) {
+ $new_db{$key} = $value;
+}
+
+print "There are now ", scalar(keys %old_db), " keys in $input_db_file\n";
+print "There are ", scalar(keys %new_db), " keys in $output_db_file\n";
+untie(%old_db);
+untie(%new_db);
+
diff --git a/docroot/phpwiki/admin/translate_mysql.pl b/docroot/phpwiki/admin/translate_mysql.pl
new file mode 100755
index 0000000..0c1c2f5
--- /dev/null
+++ b/docroot/phpwiki/admin/translate_mysql.pl
@@ -0,0 +1,26 @@
+
+# Convert MySQL wiki database dump to a Microsoft SQL-Server compatible SQL script
+# NB This is not a general-purpose MySQL->SQL-Server conversion script
+
+# Author: Andrew K. Pearson
+# Date: 01 May 2001
+
+# Example usage: perl translate_mysql.pl dump.sql > dump2.sql
+
+# NB I did not use sed because the version I have is limited to input lines of <1K in size
+
+while (<>)
+{
+ $newvalue = $_;
+
+ $newvalue =~ s/\\\"/\'\'/g;
+ $newvalue =~ s/\\\'/\'\'/g;
+ $newvalue =~ s/\\n/\'+char(10)+\'/g;
+ $newvalue =~ s/TYPE=MyISAM;//g;
+ $newvalue =~ s/int\(.+\)/int/g;
+ $newvalue =~ s/mediumtext/text/g;
+ $newvalue =~ s/^#/--/g;
+
+ print $newvalue;
+}
+
diff --git a/docroot/phpwiki/admin/wiki_dumpHTML.php b/docroot/phpwiki/admin/wiki_dumpHTML.php
new file mode 100755
index 0000000..958884c
--- /dev/null
+++ b/docroot/phpwiki/admin/wiki_dumpHTML.php
@@ -0,0 +1,7 @@
+
+\n";
+ echo "Got: $dumpHTML $directory \n";
+
+?>
diff --git a/docroot/phpwiki/admin/wiki_port1_0.php b/docroot/phpwiki/admin/wiki_port1_0.php
new file mode 100755
index 0000000..36ae371
--- /dev/null
+++ b/docroot/phpwiki/admin/wiki_port1_0.php
@@ -0,0 +1,69 @@
+
+
+
+
+Importing phpwiki 1.0.x dbm files
+
+
+
+\n";
+
+ $newhash['version'] = isset($pagehash['version']) ?
+ $pagehash['version'] : 1;
+ $newhash['author'] = isset($pagehash['author']) ?
+ $pagehash['author'] : '1.0 wiki setup page';
+ $newhash['created'] = time();
+ $newhash['lastmodified'] = time();
+ $newhash['flags'] = 0;
+ $newhash['pagename'] = $pagename;
+ $newhash['refs'] = array();
+ for ($i=1; $i <= 4; $i++) {
+ if (isset($pagehash["r$i"]))
+ $newhash['refs'][$i] = $pagehash["r$i"];
+ }
+ $content = implode("\n", $pagehash['text']);
+ $content = str_replace("[", "[[", $content);
+ $newhash['content'] = explode("\n", $content);
+
+ InsertPage($dbi, $pagename, $newhash);
+ }
+
+
+ echo "opening dbm file: $portdbmfile ... \n";
+
+ if (! file_exists($portdbmfile)) {
+ echo "File '$portdbmfile' does not exist. \n";
+ exit;
+ }
+
+ if (! ($dbmh = dbmopen($portdbmfile, "r"))) {
+ echo "Cannot open '$portdbmfile' \n";
+ exit;
+ }
+
+ echo " ok ($dbmh)\n";
+
+ $namelist = array();
+ $ctr = 0;
+
+ $namelist[$ctr] = $key = dbmfirstkey($dbmh);
+ port1_0renderhash($dbi, $dbmh, $key);
+ while ($key = dbmnextkey($dbmh, $key)) {
+ $ctr++;
+ $namelist[$ctr] = $key;
+ port1_0renderhash($dbi, $dbmh, $key);
+ }
+
+ dbmclose($dbmh);
+?>
+
+
Done.
+
+
diff --git a/docroot/phpwiki/admin/wiki_rebuilddbms.php b/docroot/phpwiki/admin/wiki_rebuilddbms.php
new file mode 100755
index 0000000..342e184
--- /dev/null
+++ b/docroot/phpwiki/admin/wiki_rebuilddbms.php
@@ -0,0 +1,6 @@
+
+\n";
+
+?>
diff --git a/docroot/phpwiki/admin/zip.php b/docroot/phpwiki/admin/zip.php
new file mode 100755
index 0000000..0a17767
--- /dev/null
+++ b/docroot/phpwiki/admin/zip.php
@@ -0,0 +1,84 @@
+ $attrib = array('mtime' => $pagehash['lastmodified'],
+ 'is_ascii' => 1);
+ if (($pagehash['flags'] & FLAG_PAGE_LOCKED) != 0)
+ $attrib['write_protected'] = 1;
+
+ $content = MailifyPage($pagehash, $oldpagehash);
+
+ $zip->addRegularFile( encode_pagename_for_wikizip($pagehash['pagename']),
+ $content, $attrib);
+ }
+ $zip->finish();
+}
+
+
+if(defined('WIKI_ADMIN'))
+ MakeWikiZip(($zip == 'all'));
+
+CloseDataBase($dbi);
+exit;
+?>
diff --git a/docroot/phpwiki/bullet.gif b/docroot/phpwiki/bullet.gif
new file mode 100755
index 0000000..81ac440
--- /dev/null
+++ b/docroot/phpwiki/bullet.gif
Binary files differ
diff --git a/docroot/phpwiki/fdimage.gif b/docroot/phpwiki/fdimage.gif
new file mode 100755
index 0000000..d827063
--- /dev/null
+++ b/docroot/phpwiki/fdimage.gif
Binary files differ
diff --git a/docroot/phpwiki/folder.gif b/docroot/phpwiki/folder.gif
new file mode 100755
index 0000000..4826460
--- /dev/null
+++ b/docroot/phpwiki/folder.gif
Binary files differ
diff --git a/docroot/phpwiki/images/logo.gif b/docroot/phpwiki/images/logo.gif
new file mode 100755
index 0000000..dc1f23a
--- /dev/null
+++ b/docroot/phpwiki/images/logo.gif
Binary files differ
diff --git a/docroot/phpwiki/images/png.png b/docroot/phpwiki/images/png.png
new file mode 100755
index 0000000..841dae6
--- /dev/null
+++ b/docroot/phpwiki/images/png.png
Binary files differ
diff --git a/docroot/phpwiki/images/signature.png b/docroot/phpwiki/images/signature.png
new file mode 100755
index 0000000..8a1bd28
--- /dev/null
+++ b/docroot/phpwiki/images/signature.png
Binary files differ
diff --git a/docroot/phpwiki/images/ubix.gif b/docroot/phpwiki/images/ubix.gif
new file mode 100755
index 0000000..6b4a00e
--- /dev/null
+++ b/docroot/phpwiki/images/ubix.gif
Binary files differ
diff --git a/docroot/phpwiki/images/wikibase.png b/docroot/phpwiki/images/wikibase.png
new file mode 100755
index 0000000..0ca1f92
--- /dev/null
+++ b/docroot/phpwiki/images/wikibase.png
Binary files differ
diff --git a/docroot/phpwiki/index.php b/docroot/phpwiki/index.php
new file mode 100755
index 0000000..2ef8ba8
--- /dev/null
+++ b/docroot/phpwiki/index.php
@@ -0,0 +1,53 @@
+
+
+
diff --git a/docroot/phpwiki/lib/backlinks.php b/docroot/phpwiki/lib/backlinks.php
new file mode 100755
index 0000000..356a648
--- /dev/null
+++ b/docroot/phpwiki/lib/backlinks.php
@@ -0,0 +1,37 @@
+"
+ . sprintf(gettext("Pages which link to %s") . " .....",
+ $pagelink)
+ . "
\n\n" );
+
+ // search matching pages
+ $query = InitBackLinkSearch($dbi, $pagename);
+ $found = 0;
+ while ($page = BackLinkSearchNextMatch($dbi, $query)) {
+ $found++;
+ $html .= "" . LinkExistingWikiWord($page) . " \n";
+ }
+
+ $html .= " \n \n"
+ . sprintf(gettext ("%d pages link to %s."),
+ $found, $pagelink)
+ . "\n";
+
+ GeneratePage('MESSAGE', $html, $title, 0);
+?>
diff --git a/docroot/phpwiki/lib/config.php b/docroot/phpwiki/lib/config.php
new file mode 100755
index 0000000..e1ca4f7
--- /dev/null
+++ b/docroot/phpwiki/lib/config.php
@@ -0,0 +1,310 @@
+<?php
+ // essential internal stuff
+ if (!function_exists('rcs_id')) {
+ function rcs_id($id) { echo "<!-- $id -->\n"; };
+ }
+ rcs_id('$Id$');
+ // end essential internal stuff
+
+
+ /////////////////////////////////////////////////////////////////////
+ // Part One:
+ // Constants and settings. Edit the values below for your site.
+ /////////////////////////////////////////////////////////////////////
+
+
+ // URL of index.php e.g. http://yoursite.com/phpwiki/index.php
+ // you can leave this empty - it will be calculated automatically
+ $ScriptUrl = "http://www.ubixos.com/phpwiki/index.php";
+ // URL of admin.php e.g. http://yoursite.com/phpwiki/admin.php
+ // you can leave this empty - it will be calculated automatically
+ // if you fill in $ScriptUrl you *MUST* fill in $AdminUrl as well!
+ $AdminUrl = "http://www.ubixos.com/phpwiki/admin.php";
+
+ // Select your language - default language "C": English
+ // other languages available: Dutch "nl", Spanish "es", German "de",
+ // and Swedish "sv"
+ $LANG="C";
+
+ /////////////////////////////////////////////////////////////////////
+ // Part Two:
+ // Database section
+ // set your database here and edit the according section below.
+ // For PHP 4.0.4 and later you must use "dba" if you are using
+ // DBM files for storage. "dbm" uses the older deprecated interface.
+ // The option 'default' will choose either dbm or dba, depending on
+ // the version of PHP you are running.
+ /////////////////////////////////////////////////////////////////////
+
+ $WhichDatabase = 'mysql'; // use one of "dbm", "dba", "mysql",
+ // "pgsql", "msql", "mssql", or "file"
+
+ // DBM and DBA settings (default)
+ if ($WhichDatabase == 'dbm' or $WhichDatabase == 'dba' or
+ $WhichDatabase == 'default') {
+ $DBMdir = "/tmp";
+ $WikiPageStore = "wiki";
+ $ArchivePageStore = "archive";
+ $WikiDB['wiki'] = "$DBMdir/wikipagesdb";
+ $WikiDB['archive'] = "$DBMdir/wikiarchivedb";
+ $WikiDB['wikilinks'] = "$DBMdir/wikilinksdb";
+ $WikiDB['hottopics'] = "$DBMdir/wikihottopicsdb";
+ $WikiDB['hitcount'] = "$DBMdir/wikihitcountdb";
+
+ // this is the type of DBM file on your system. For most Linuxen
+ // 'gdbm' is fine; 'db2' is another common type. 'ndbm' appears
+ // on Solaris but won't work because it won't store pages larger
+ // than 1000 bytes.
+ define("DBM_FILE_TYPE", 'gdbm');
+
+ // try this many times if the dbm is unavailable
+ define("MAX_DBM_ATTEMPTS", 20);
+
+ // for PHP3 use dbmlib, else use dbalib for PHP4
+ if ($WhichDatabase == 'default') {
+ if ( floor(phpversion()) == 3) {
+ $WhichDatabase = 'dbm';
+ } else {
+ $WhichDatabase = 'dba';
+ }
+ }
+
+ if ($WhichDatabase == 'dbm') {
+ include "lib/dbmlib.php";
+ } else {
+ include "lib/dbalib.php";
+ }
+
+ // MySQL settings -- see INSTALL.mysql for details on using MySQL
+ } elseif ($WhichDatabase == 'mysql') {
+ // MySQL server host:
+ $mysql_server = 'localhost';
+
+ // username as used in step 2 of INSTALL.mysql:
+ $mysql_user = 'ubixos';
+
+ // password of above user (or leave blank if none):
+ $mysql_pwd = 'osubix';
+
+ // name of the mysql database
+ // (this used to default to 'wiki' prior to phpwiki-1.2.2)
+ $mysql_db = 'ubixos_wiki';
+
+ // Names of the tables.
+ // You probably don't need to change these. If you do change
+ // them you will also have to make corresponding changes in
+ // schemas/schema.mysql before you initialize the database.
+ $WikiPageStore = "wiki";
+ $ArchivePageStore = "archive";
+ $WikiLinksStore = "wikilinks";
+ $WikiScoreStore = "wikiscore";
+ $HitCountStore = "hitcount";
+
+ include "lib/mysql.php";
+
+ // PostgreSQL settings -- see INSTALL.pgsql for more details
+ } elseif ($WhichDatabase == 'pgsql') {
+ $pg_dbhost = "localhost";
+ $pg_dbport = "5432";
+ $WikiDataBase = "wiki"; // name of the database in Postgresql
+ $WikiPageStore = "wiki";
+ $ArchivePageStore = "archive";
+ $WikiLinksPageStore = "wikilinks";
+ $HotTopicsPageStore = "hottopics";
+ $HitCountPageStore = "hitcount";
+ include "lib/pgsql.php";
+
+ // MiniSQL (mSQL) settings -- see INSTALL.msql for details on using mSQL
+ } elseif ($WhichDatabase == 'msql') {
+ $msql_db = "wiki";
+ $WikiPageStore = array();
+ $ArchivePageStore = array();
+ $WikiPageStore['table'] = "wiki";
+ $WikiPageStore['page_table'] = "wikipages";
+ $ArchivePageStore['table'] = "archive";
+ $ArchivePageStore['page_table'] = "archivepages";
+ // should be the same as wikipages.line
+ define("MSQL_MAX_LINE_LENGTH", 128);
+ include "lib/msql.php";
+
+ // Filesystem DB settings
+ } elseif ($WhichDatabase == 'file') {
+ $DBdir = "/tmp/wiki";
+ $WikiPageStore = "wiki";
+ $ArchivePageStore = "archive";
+ $WikiDB['wiki'] = "$DBdir/pages";
+ $WikiDB['archive'] = "$DBdir/archive";
+ $WikiDB['wikilinks'] = "$DBdir/links";
+ $WikiDB['hottopics'] = "$DBdir/hottopics";
+ $WikiDB['hitcount'] = "$DBdir/hitcount";
+ include "lib/db_filesystem.php";
+
+ // MS SQLServer settings
+ } elseif ($WhichDatabase == 'mssql') {
+ $WikiPageStore = "wiki";
+ $ArchivePageStore = "archive";
+ $WikiLinksStore = "wikilinks";
+ $WikiScoreStore = "wikiscore";
+ $HitCountStore = "hitcount";
+ $mssql_server = 'servername';
+ $mssql_user = '';
+ $mssql_pwd = '';
+ $mssql_db = '';
+ include "lib/mssql.php";
+
+ } else die("Invalid '\$WhichDatabase' in lib/config.php");
+
+
+ /////////////////////////////////////////////////////////////////////
+ // Part Three:
+ // Miscellaneous
+ /////////////////////////////////////////////////////////////////////
+
+ // logo image (path relative to index.php)
+ $logo = "images/wikibase.png";
+
+ // Signature image which is shown after saving an edited page
+ // If this is left blank (or unset), the signature will be omitted.
+ $SignatureImg = "images/signature.png";
+
+ // date & time formats used to display modification times, etc.
+ // formats are given as format strings to PHP date() function
+ $datetimeformat = "F j, Y"; // may contain time of day
+ $dateformat = "F j, Y"; // must not contain time
+
+ // this defines how many page names to list when displaying
+ // the MostPopular pages; the default is to show the 20 most popular pages
+ define("MOST_POPULAR_LIST_LENGTH", 20);
+
+ // this defines how many page names to list when displaying related pages
+ define("NUM_RELATED_PAGES", 5);
+
+ // number of user-defined external references, i.e. "[1]"
+ define("NUM_LINKS", 12);
+
+ // allowed protocols for links - be careful not to allow "javascript:"
+ // within a named link [name|uri] one more protocol is defined: phpwiki
+ $AllowedProtocols = "http|https|mailto|ftp|news|gopher";
+
+ // URLs ending with the following extension should be inlined as images
+ $InlineImages = "png|jpg|gif";
+
+ // Perl regexp for WikiNames
+ $WikiNameRegexp = "(?<![A-Za-z0-9])([A-Z][a-z]+){2,}(?![A-Za-z0-9])";
+
+ // Template files for the standard pages
+ $templates = array(
+ "BROWSE" => gettext("templates/browse.html"),
+ "EDITPAGE" => gettext("templates/editpage.html"),
+ "EDITLINKS" => gettext("templates/editlinks.html"),
+ "MESSAGE" => gettext("templates/message.html")
+ );
+
+ /* WIKI_PGSRC -- specifies the source for the initial page contents
+ * of the Wiki. The setting of WIKI_PGSRC only has effect when
+ * the wiki is accessed for the first time (or after clearing the
+ * database.) WIKI_PGSRC can either name a directory or a zip file.
+ * In either case WIKI_PGSRC is scanned for files --- one file per page.
+ *
+ * If the files appear to be MIME formatted messages, they are
+ * scanned for application/x-phpwiki content-types. Any suitable
+ * content is added to the wiki.
+ * The files can also be plain text files, in which case the page name
+ * is taken from the file name.
+ */
+
+ define('WIKI_PGSRC', gettext("./pgsrc")); // Default (old) behavior.
+ //define('WIKI_PGSRC', './wiki.zip'); // New style.
+
+ // DEFAULT_WIKI_PGSRC is only used when the language is *not*
+ // the default (English) and when reading from a directory:
+ // in that case some English pages are inserted into the wiki as well
+ // DEFAULT_WIKI_PGSRC defines where the English pages reside
+ define('DEFAULT_WIKI_PGSRC', "./pgsrc");
+
+
+
+ //////////////////////////////////////////////////////////////////////
+ // you shouldn't have to edit anything below this line
+ function compute_default_scripturl() {
+ global $SERVER_PORT, $SERVER_NAME, $SCRIPT_NAME, $HTTPS;
+ if (!empty($HTTPS) && $HTTPS != 'off') {
+ $proto = 'https';
+ $dflt_port = 443;
+ }
+ else {
+ $proto = 'http';
+ $dflt_port = 80;
+ }
+ $port = ($SERVER_PORT == $dflt_port) ? '' : ":$SERVER_PORT";
+ return $proto . '://' . $SERVER_NAME . $port . $SCRIPT_NAME;
+ }
+
+ if (empty($ScriptUrl)) {
+ $ScriptUrl = compute_default_scripturl();
+ }
+ if (defined('WIKI_ADMIN') && !empty($AdminUrl))
+ $ScriptUrl = $AdminUrl;
+
+ $FieldSeparator = "\263";
+
+ if (isset($PHP_AUTH_USER)) {
+ $remoteuser = $PHP_AUTH_USER;
+ } else {
+
+ // Apache won't show REMOTE_HOST unless the admin configured it
+ // properly. We'll be nice and see if it's there.
+
+ getenv('REMOTE_HOST') ? ($remoteuser = getenv('REMOTE_HOST'))
+ : ($remoteuser = getenv('REMOTE_ADDR'));
+ }
+
+ // constants used for HTML output. HTML tags may allow nesting
+ // other tags always start at level 0
+ define("ZERO_LEVEL", 0);
+ define("NESTED_LEVEL", 1);
+
+ // constants for flags in $pagehash
+ define("FLAG_PAGE_LOCKED", 1);
+?>
diff --git a/docroot/phpwiki/lib/db_filesystem.php b/docroot/phpwiki/lib/db_filesystem.php
new file mode 100755
index 0000000..271422e
--- /dev/null
+++ b/docroot/phpwiki/lib/db_filesystem.php
@@ -0,0 +1,326 @@
+\n"), htmlspecialchars($page));
+ continue;
+ }
+
+ while (list($i, $line) = each($pagedata['content'])) {
+ if (preg_match($pos['search'], $line))
+ return $page;
+ }
+ }
+ return 0;
+ }
+
+ function IncreaseHitCount($dbi, $pagename) {
+ return; // hit counting is disabled in this (filesystem) back end
+ // kluge: we ignore the $dbi for hit counting
+ global $WikiDB;
+
+ $hcdb = OpenDataBase($WikiDB['hitcount']);
+
+ if (dbmexists($hcdb['active'], $pagename)) {
+ // increase the hit count
+ $count = dbmfetch($hcdb['active'], $pagename);
+ $count++;
+ dbmreplace($hcdb['active'], $pagename, $count);
+ } else {
+ // add it, set the hit count to one
+ $count = 1;
+ dbminsert($hcdb['active'], $pagename, $count);
+ }
+
+ CloseDataBase($hcdb);
+ }
+
+ function GetHitCount($dbi, $pagename) {
+ return 0; // hit counting is disabled in this back end
+ // kluge: we ignore the $dbi for hit counting
+ global $WikiDB;
+
+ $hcdb = OpenDataBase($WikiDB['hitcount']);
+ if (dbmexists($hcdb['active'], $pagename)) {
+ // increase the hit count
+ $count = dbmfetch($hcdb['active'], $pagename);
+ return $count;
+ } else {
+ return 0;
+ }
+
+ CloseDataBase($hcdb);
+ }
+
+
+ function InitMostPopular($dbi, $limit) {
+ return;
+ $pagename = dbmfirstkey($dbi['hitcount']);
+ $res[$pagename] = dbmfetch($dbi['hitcount'], $pagename);
+ while ($pagename = dbmnextkey($dbi['hitcount'], $pagename)) {
+ $res[$pagename] = dbmfetch($dbi['hitcount'], $pagename);
+ echo "got $pagename with value " . $res[$pagename] . " \n";
+ }
+
+ rsort($res);
+ reset($res);
+ return($res);
+ }
+
+ function MostPopularNextMatch($dbi, $res) {
+ return;
+ // the return result is a two element array with 'hits'
+ // and 'pagename' as the keys
+
+ if (list($index1, $index2, $pagename, $hits) = each($res)) {
+ echo "most popular next match called \n";
+ echo "got $pagename, $hits back \n";
+ $nextpage = array(
+ "hits" => $hits,
+ "pagename" => $pagename
+ );
+ return $nextpage;
+ } else {
+ return 0;
+ }
+ }
+
+ function GetAllWikiPagenames($dbi) {
+ $namelist = array();
+ $d = opendir($dbi);
+ while($entry = readdir($d)) {
+ if ($entry == '.' || $entry == '..')
+ continue;
+ $pagename = rawurldecode($entry);
+ if ($entry != EncodePagename($pagename)) {
+ printf(gettext("%s: Bad filename in database \n"),
+ htmlspecialchars("$dbi/$entry"));
+ continue;
+ }
+ $namelist[] = $pagename;
+ }
+
+ return $namelist;
+ }
+?>
diff --git a/docroot/phpwiki/lib/dbalib.php b/docroot/phpwiki/lib/dbalib.php
new file mode 100755
index 0000000..cd246c1
--- /dev/null
+++ b/docroot/phpwiki/lib/dbalib.php
@@ -0,0 +1,310 @@
+ MAX_DBM_ATTEMPTS) {
+ ExitWiki("Cannot open database '$key' : '$file', giving up.");
+ }
+ sleep(1);
+ }
+ }
+ return $dbi;
+ }
+
+
+ function CloseDataBase($dbi) {
+ reset($dbi);
+ while (list($dbmfile, $dbihandle) = each($dbi)) {
+ dba_close($dbihandle);
+ }
+ return;
+ }
+
+
+ // take a serialized hash, return same padded out to
+ // the next largest number bytes divisible by 500. This
+ // is to save disk space in the long run, since DBM files
+ // leak memory.
+ function PadSerializedData($data) {
+ // calculate the next largest number divisible by 500
+ $nextincr = 500 * ceil(strlen($data) / 500);
+ // pad with spaces
+ $data = sprintf("%-${nextincr}s", $data);
+ return $data;
+ }
+
+ // strip trailing whitespace from the serialized data
+ // structure.
+ function UnPadSerializedData($data) {
+ return chop($data);
+ }
+
+
+
+ // Return hash of page + attributes or default
+ function RetrievePage($dbi, $pagename, $pagestore) {
+ if ($data = dba_fetch($pagename, $dbi[$pagestore])) {
+ // unserialize $data into a hash
+ $pagehash = unserialize(UnPadSerializedData($data));
+ $pagehash['pagename'] = $pagename;
+ return $pagehash;
+ } else {
+ return -1;
+ }
+ }
+
+
+ // Either insert or replace a key/value (a page)
+ function InsertPage($dbi, $pagename, $pagehash) {
+ $pagedata = PadSerializedData(serialize($pagehash));
+
+ if (!dba_insert($pagename, $pagedata, $dbi['wiki'])) {
+ if (!dba_replace($pagename, $pagedata, $dbi['wiki'])) {
+ ExitWiki("Error inserting page '$pagename'");
+ }
+ }
+ }
+
+
+ // for archiving pages to a separate dbm
+ function SaveCopyToArchive($dbi, $pagename, $pagehash) {
+ global $ArchivePageStore;
+
+ $pagedata = PadSerializedData(serialize($pagehash));
+
+ if (!dba_insert($pagename, $pagedata, $dbi[$ArchivePageStore])) {
+ if (!dba_replace($pagename, $pagedata, $dbi['archive'])) {
+ ExitWiki("Error storing '$pagename' into archive");
+ }
+ }
+ }
+
+
+ function IsWikiPage($dbi, $pagename) {
+ return dba_exists($pagename, $dbi['wiki']);
+ }
+
+
+ function IsInArchive($dbi, $pagename) {
+ return dba_exists($pagename, $dbi['archive']);
+ }
+
+
+ // setup for title-search
+ function InitTitleSearch($dbi, $search) {
+ $pos['search'] = '=' . preg_quote($search) . '=i';
+ $pos['key'] = dba_firstkey($dbi['wiki']);
+
+ return $pos;
+ }
+
+ // iterating through database
+ function TitleSearchNextMatch($dbi, &$pos) {
+ while ($pos['key']) {
+ $page = $pos['key'];
+ $pos['key'] = dba_nextkey($dbi['wiki']);
+
+ if (preg_match($pos['search'], $page)) {
+ return $page;
+ }
+ }
+ return 0;
+ }
+
+ // setup for full-text search
+ function InitFullSearch($dbi, $search) {
+ return InitTitleSearch($dbi, $search);
+ }
+
+ //iterating through database
+ function FullSearchNextMatch($dbi, &$pos) {
+ while ($pos['key']) {
+ $key = $pos['key'];
+ $pos['key'] = dba_nextkey($dbi['wiki']);
+
+ $pagedata = dba_fetch($key, $dbi['wiki']);
+ // test the serialized data
+ if (preg_match($pos['search'], $pagedata)) {
+ $page['pagename'] = $key;
+ $pagedata = unserialize(UnPadSerializedData($pagedata));
+ $page['content'] = $pagedata['content'];
+ return $page;
+ }
+ }
+ return 0;
+ }
+
+ ////////////////////////
+ // new database features
+
+ // Compute PCRE suitable for searching for links to the given page.
+ function MakeBackLinkSearchRegexp($pagename) {
+ global $WikiNameRegexp;
+
+ $quoted_pagename = preg_quote($pagename, '/');
+ if (preg_match("/^$WikiNameRegexp\$/", $pagename)) {
+ // FIXME: This may need modification for non-standard (non-english) $WikiNameRegexp.
+ return "/(?\n";
+ $count = dba_fetch($pagename, $dbi['hitcount']);
+ $count++;
+ dba_replace($pagename, $count, $dbi['hitcount']);
+ } else {
+ // add it, set the hit count to one
+ // echo "adding $pagename to hitcount... \n";
+ $count = 1;
+ dba_insert($pagename, $count, $dbi['hitcount']);
+ }
+ }
+
+ function GetHitCount($dbi, $pagename) {
+
+ if (dba_exists($pagename, $dbi['hitcount'])) {
+ // increase the hit count
+ $count = dba_fetch($pagename, $dbi['hitcount']);
+ return $count;
+ } else {
+ return 0;
+ }
+ }
+
+ function InitMostPopular($dbi, $limit) {
+ // iterate through the whole dbm file for hit counts
+ // sort the results highest to lowest, and return
+ // n..$limit results
+
+ $pagename = dba_firstkey($dbi['hitcount']);
+ $res[$pagename] = dba_fetch($pagename, $dbi['hitcount']);
+
+ while ($pagename = dba_nextkey($dbi['hitcount'])) {
+ $res[$pagename] = dba_fetch($pagename, $dbi['hitcount']);
+ //echo "got $pagename with value " . $res[$pagename] . " \n";
+ }
+
+ arsort($res);
+ return($res);
+ }
+
+ function MostPopularNextMatch($dbi, &$res) {
+
+ // the return result is a two element array with 'hits'
+ // and 'pagename' as the keys
+
+ if (count($res) == 0)
+ return 0;
+
+ if (list($pagename, $hits) = each($res)) {
+ //echo "most popular next match called \n";
+ //echo "got $pagename, $hits back \n";
+ $nextpage = array(
+ "hits" => $hits,
+ "pagename" => $pagename
+ );
+ // $dbm_mostpopular_cntr++;
+ return $nextpage;
+ } else {
+ return 0;
+ }
+ }
+
+ function GetAllWikiPagenames($dbi) {
+ $namelist = array();
+ $ctr = 0;
+
+ $namelist[$ctr] = $key = dba_firstkey($dbi);
+
+ while ($key = dba_nextkey($dbi)) {
+ $ctr++;
+ $namelist[$ctr] = $key;
+ }
+
+ return $namelist;
+ }
+
+?>
diff --git a/docroot/phpwiki/lib/dbmlib.php b/docroot/phpwiki/lib/dbmlib.php
new file mode 100755
index 0000000..e1c3b49
--- /dev/null
+++ b/docroot/phpwiki/lib/dbmlib.php
@@ -0,0 +1,541 @@
+ MAX_DBM_ATTEMPTS) {
+ ExitWiki("Cannot open database '$key' : '$file', giving up.");
+ }
+ sleep(1);
+ }
+ }
+ return $dbi;
+ }
+
+
+ function CloseDataBase($dbi) {
+ reset($dbi);
+ while (list($dbmfile, $dbihandle) = each($dbi)) {
+ dbmclose($dbihandle);
+ }
+ return;
+ }
+
+
+ // take a serialized hash, return same padded out to
+ // the next largest number bytes divisible by 500. This
+ // is to save disk space in the long run, since DBM files
+ // leak memory.
+ function PadSerializedData($data) {
+ // calculate the next largest number divisible by 500
+ $nextincr = 500 * ceil(strlen($data) / 500);
+ // pad with spaces
+ $data = sprintf("%-${nextincr}s", $data);
+ return $data;
+ }
+
+ // strip trailing whitespace from the serialized data
+ // structure.
+ function UnPadSerializedData($data) {
+ return chop($data);
+ }
+
+
+
+ // Return hash of page + attributes or default
+ function RetrievePage($dbi, $pagename, $pagestore) {
+ if ($data = dbmfetch($dbi[$pagestore], $pagename)) {
+ // unserialize $data into a hash
+ $pagehash = unserialize(UnPadSerializedData($data));
+ $pagehash['pagename'] = $pagename;
+ return $pagehash;
+ } else {
+ return -1;
+ }
+ }
+
+
+ // Either insert or replace a key/value (a page)
+ function InsertPage($dbi, $pagename, $pagehash, $pagestore='wiki') {
+
+ if ($pagestore == 'wiki') { // a bit of a hack
+ $linklist = ExtractWikiPageLinks($pagehash['content']);
+ SetWikiPageLinks($dbi, $pagename, $linklist);
+ }
+
+ $pagedata = PadSerializedData(serialize($pagehash));
+
+ if (dbminsert($dbi[$pagestore], $pagename, $pagedata)) {
+ if (dbmreplace($dbi[$pagestore], $pagename, $pagedata)) {
+ ExitWiki("Error inserting page '$pagename'");
+ }
+ }
+ }
+
+
+ // for archiving pages to a separate dbm
+ function SaveCopyToArchive($dbi, $pagename, $pagehash) {
+ global $ArchivePageStore;
+
+ $pagedata = PadSerializedData(serialize($pagehash));
+
+ if (dbminsert($dbi[$ArchivePageStore], $pagename, $pagedata)) {
+ if (dbmreplace($dbi['archive'], $pagename, $pagedata)) {
+ ExitWiki("Error storing '$pagename' into archive");
+ }
+ }
+ }
+
+
+ function IsWikiPage($dbi, $pagename) {
+ return dbmexists($dbi['wiki'], $pagename);
+ }
+
+
+ function IsInArchive($dbi, $pagename) {
+ return dbmexists($dbi['archive'], $pagename);
+ }
+
+
+ function RemovePage($dbi, $pagename) {
+
+ dbmdelete($dbi['wiki'], $pagename); // report error if this fails?
+ dbmdelete($dbi['archive'], $pagename); // no error if this fails
+ dbmdelete($dbi['hitcount'], $pagename); // no error if this fails
+
+ $linkinfo = RetrievePage($dbi, $pagename, 'wikilinks');
+
+ // remove page from fromlinks of pages it had links to
+ if (is_array($linkinfo)) { // page exists?
+ $tolinks = $linkinfo['tolinks'];
+ reset($tolinks);
+ while (list($tolink, $dummy) = each($tolinks)) {
+ $tolinkinfo = RetrievePage($dbi, $tolink, 'wikilinks');
+ if (is_array($tolinkinfo)) { // page found?
+ $oldFromlinks = $tolinkinfo['fromlinks'];
+ $tolinkinfo['fromlinks'] = array(); // erase fromlinks
+ reset($oldFromlinks);
+ while (list($fromlink, $dummy) = each($oldFromlinks)) {
+ if ($fromlink != $pagename) // not to be erased?
+ $tolinkinfo['fromlinks'][$fromlink] = 1; // put link back
+ } // put link info back in DBM file
+ InsertPage($dbi, $tolink, $tolinkinfo, 'wikilinks');
+ }
+ }
+
+ // remove page itself
+ dbmdelete($dbi['wikilinks'], $pagename);
+ }
+ }
+
+
+ // setup for title-search
+ function InitTitleSearch($dbi, $search) {
+ $pos['search'] = '=' . preg_quote($search) . '=i';
+ $pos['key'] = dbmfirstkey($dbi['wiki']);
+
+ return $pos;
+ }
+
+
+ // iterating through database
+ function TitleSearchNextMatch($dbi, &$pos) {
+ while ($pos['key']) {
+ $page = $pos['key'];
+ $pos['key'] = dbmnextkey($dbi['wiki'], $pos['key']);
+
+ if (preg_match($pos['search'], $page)) {
+ return $page;
+ }
+ }
+ return 0;
+ }
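+
+ // Iteration sketch: Init* returns a cursor, *NextMatch advances it
+ // until the keys run out, e.g.
+ // $pos = InitTitleSearch($dbi, $searchterm);
+ // while ($page = TitleSearchNextMatch($dbi, $pos))
+ // echo LinkExistingWikiWord($page), "<br>\n";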
+
+
+ // setup for full-text search
+ function InitFullSearch($dbi, $search) {
+ return InitTitleSearch($dbi, $search);
+ }
+
+
+ //iterating through database
+ function FullSearchNextMatch($dbi, &$pos) {
+ while ($pos['key']) {
+ $key = $pos['key'];
+ $pos['key'] = dbmnextkey($dbi['wiki'], $pos['key']);
+
+ $pagedata = dbmfetch($dbi['wiki'], $key);
+ // test the serialized data
+ if (preg_match($pos['search'], $pagedata)) {
+ $page['pagename'] = $key;
+ $pagedata = unserialize(UnPadSerializedData($pagedata));
+ $page['content'] = $pagedata['content'];
+ return $page;
+ }
+ }
+ return 0;
+ }
+
+
+ ////////////////////////
+ // new database features
+
+ // Compute PCRE suitable for searching for links to the given page.
+ function MakeBackLinkSearchRegexp($pagename) {
+ global $WikiNameRegexp;
+
+ // Note that in (at least some) PHP 3.x's, preg_quote only takes
+ // (at most) one argument. Also it doesn't quote '/'s.
+ // It does quote '='s, so we'll use that for the delimiter.
+ $quoted_pagename = preg_quote($pagename);
+ if (preg_match("/^$WikiNameRegexp\$/", $pagename)) {
+ # FIXME: This may need modification for non-standard (non-english) $WikiNameRegexp.
+ return "=(?\n";
+ $count = dbmfetch($dbi['hitcount'], $pagename);
+ $count++;
+ dbmreplace($dbi['hitcount'], $pagename, $count);
+ } else {
+ // add it, set the hit count to one
+ $count = 1;
+ dbminsert($dbi['hitcount'], $pagename, $count);
+ }
+ }
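+
+ // For illustration, the two shapes MakeBackLinkSearchRegexp()
+ // above produces (sketch): a WikiName like 'FrontPage' yields a
+ // pattern matching the bare word, while a free-form name like
+ // 'my page' yields one matching '[my page]' or '[label|my page]'.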
+
+
+ function GetHitCount($dbi, $pagename) {
+
+ if (dbmexists($dbi['hitcount'], $pagename)) {
+ // increase the hit count
+ $count = dbmfetch($dbi['hitcount'], $pagename);
+ return $count;
+ } else {
+ return 0;
+ }
+ }
+
+
+ function InitMostPopular($dbi, $limit) {
+ // iterate through the whole dbm file for hit counts
+ // sort the results highest to lowest, and return
+ // n..$limit results
+
+ // Because sorting all the pages may be a lot of work
+ // we only get the top $limit. A page is only added if it's score is
+ // higher than the lowest score in the list. If the list is full then
+ // one of the pages with the lowest scores is removed.
+
+ $pagename = dbmfirstkey($dbi['hitcount']);
+ $score = dbmfetch($dbi['hitcount'], $pagename);
+ $res = array($pagename => (int) $score);
+ $lowest = $score;
+
+ while ($pagename = dbmnextkey($dbi['hitcount'], $pagename)) {
+ $score = dbmfetch($dbi['hitcount'], $pagename);
+ if (count($res) < $limit) { // room left in $res?
+ if ($score < $lowest)
+ $lowest = $score;
+ $res[$pagename] = (int) $score; // add page to $res
+ } elseif ($score > $lowest) {
+ $oldres = $res; // save old result
+ $res = array();
+ $removed = 0; // nothing removed yet
+ $newlowest = $score; // new lowest score
+ $res[$pagename] = (int) $score; // add page to $res
+ reset($oldres);
+ while(list($pname, $pscore) = each($oldres)) {
+ if (!$removed and ($pscore == $lowest))
+ $removed = 1; // don't copy this entry
+ else {
+ $res[$pname] = (int) $pscore;
+ if ($pscore < $newlowest)
+ $newlowest = $pscore;
+ }
+ }
+ $lowest = $newlowest;
+ }
+ }
+
+ arsort($res); // sort
+ reset($res);
+
+ return($res);
+ }
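+
+ // Worked example (sketch): with $limit = 2 and hit counts
+ // A => 5, B => 1, C => 7, D => 3
+ // the loop never keeps more than two entries, evicting one
+ // lowest-scoring page whenever a better one arrives, and the
+ // final arsort() returns array('C' => 7, 'A' => 5).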
+
+
+ function MostPopularNextMatch($dbi, &$res) {
+
+ // the return result is a two element array with 'hits'
+ // and 'pagename' as the keys
+
+ if (list($pagename, $hits) = each($res)) {
+ $nextpage = array(
+ "hits" => $hits,
+ "pagename" => $pagename
+ );
+ return $nextpage;
+ } else {
+ return 0;
+ }
+ }
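+
+ // Typical use (sketch):
+ // $res = InitMostPopular($dbi, 20);
+ // while ($row = MostPopularNextMatch($dbi, $res))
+ // printf("%5d %s\n", $row['hits'], $row['pagename']);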
+
+
+ function GetAllWikiPagenames($dbi) {
+ $namelist = array();
+ $ctr = 0;
+
+ $namelist[$ctr] = $key = dbmfirstkey($dbi);
+
+ while ($key = dbmnextkey($dbi, $key)) {
+ $ctr++;
+ $namelist[$ctr] = $key;
+ }
+
+ return $namelist;
+ }
+
+
+ ////////////////////////////////////////////
+ // functionality for the wikilinks DBM file
+
+ // format of the 'wikilinks' DBM file :
+ // pagename =>
+ // { tolinks => { pagename => 1 }, fromlinks => { pagename => 1 } }
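+ //
+ // e.g. (sketch) after FrontPage gains its first link to SandBox:
+ // 'FrontPage' => array('tolinks' => array('SandBox' => 1),
+ // 'fromlinks' => array())
+ // 'SandBox' => array('tolinks' => array(),
+ // 'fromlinks' => array('FrontPage' => 1))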
+
+ // takes a page name, returns array of scored incoming and outgoing links
+ function GetWikiPageLinks($dbi, $pagename) {
+
+ $linkinfo = RetrievePage($dbi, $pagename, 'wikilinks');
+ if (is_array($linkinfo)) { // page exists?
+ $tolinks = $linkinfo['tolinks']; // outgoing links
+ $fromlinks = $linkinfo['fromlinks']; // incoming links
+ } else { // new page, but pages may already point to it
+ // create info for page
+ $tolinks = array();
+ $fromlinks = array();
+ // look up pages that link to $pagename
+ $pname = dbmfirstkey($dbi['wikilinks']);
+ while ($pname) {
+ $linkinfo = RetrievePage($dbi, $pname, 'wikilinks');
+ if ($linkinfo['tolinks'][$pagename]) // $pname links to $pagename?
+ $fromlinks[$pname] = 1;
+ $pname = dbmnextkey($dbi['wikilinks'], $pname);
+ }
+ }
+
+ // get and sort the outgoing links
+ $outlinks = array();
+ reset($tolinks); // look up scores for tolinks
+ while(list($tolink, $dummy) = each($tolinks)) {
+ $toPage = RetrievePage($dbi, $tolink, 'wikilinks');
+ if (is_array($toPage)) // link to internal page?
+ $outlinks[$tolink] = count($toPage['fromlinks']);
+ }
+ arsort($outlinks); // sort on score
+ $links['out'] = array();
+ reset($outlinks); // convert to right format
+ while(list($link, $score) = each($outlinks))
+ $links['out'][] = array($link, $score);
+
+ // get and sort the incoming links
+ $inlinks = array();
+ reset($fromlinks); // look up scores for fromlinks
+ while(list($fromlink, $dummy) = each($fromlinks)) {
+ $fromPage = RetrievePage($dbi, $fromlink, 'wikilinks');
+ $inlinks[$fromlink] = count($fromPage['fromlinks']);
+ }
+ arsort($inlinks); // sort on score
+ $links['in'] = array();
+ reset($inlinks); // convert to right format
+ while(list($link, $score) = each($inlinks))
+ $links['in'][] = array($link, $score);
+
+ // sort all the incoming and outgoing links
+ $allLinks = $outlinks; // copy the outlinks
+ reset($inlinks); // add the inlinks
+ while(list($key, $value) = each($inlinks))
+ $allLinks[$key] = $value;
+ reset($allLinks); // lookup hits
+ while(list($key, $value) = each($allLinks))
+ $allLinks[$key] = (int) dbmfetch($dbi['hitcount'], $key);
+ arsort($allLinks); // sort on hits
+ $links['popular'] = array();
+ reset($allLinks); // convert to right format
+ while(list($link, $hits) = each($allLinks))
+ $links['popular'][] = array($link, $hits);
+
+ return $links;
+ }
+
+
+ // takes page name, list of links it contains
+ // the $linklist is an array where the keys are the page names
+ function SetWikiPageLinks($dbi, $pagename, $linklist) {
+
+ $cache = array();
+
+ // Phase 1: fetch the relevant pairs from 'wikilinks' into $cache
+ // ---------------------------------------------------------------
+
+ // first the info for $pagename
+ $linkinfo = RetrievePage($dbi, $pagename, 'wikilinks');
+ if (is_array($linkinfo)) // page exists?
+ $cache[$pagename] = $linkinfo;
+ else {
+ // create info for page
+ $cache[$pagename] = array( 'fromlinks' => array(),
+ 'tolinks' => array()
+ );
+ // look up pages that link to $pagename
+ $pname = dbmfirstkey($dbi['wikilinks']);
+ while ($pname) {
+ $linkinfo = RetrievePage($dbi, $pname, 'wikilinks');
+ if ($linkinfo['tolinks'][$pagename])
+ $cache[$pagename]['fromlinks'][$pname] = 1;
+ $pname = dbmnextkey($dbi['wikilinks'], $pname);
+ }
+ }
+
+ // then the info for the pages that $pagename used to point to
+ $oldTolinks = $cache[$pagename]['tolinks'];
+ reset($oldTolinks);
+ while (list($link, $dummy) = each($oldTolinks)) {
+ $linkinfo = RetrievePage($dbi, $link, 'wikilinks');
+ if (is_array($linkinfo))
+ $cache[$link] = $linkinfo;
+ }
+
+ // finally the info for the pages that $pagename will point to
+ reset($linklist);
+ while (list($link, $dummy) = each($linklist)) {
+ $linkinfo = RetrievePage($dbi, $link, 'wikilinks');
+ if (is_array($linkinfo))
+ $cache[$link] = $linkinfo;
+ }
+
+ // Phase 2: delete the old links
+ // ---------------------------------------------------------------
+
+ // delete the old tolinks for $pagename
+ // $cache[$pagename]['tolinks'] = array();
+ // (overwritten anyway in Phase 3)
+
+ // remove $pagename from the fromlinks of pages in $oldTolinks
+
+ reset($oldTolinks);
+ while (list($oldTolink, $dummy) = each($oldTolinks)) {
+ if ($cache[$oldTolink]) { // links to existing page?
+ $oldFromlinks = $cache[$oldTolink]['fromlinks'];
+ $cache[$oldTolink]['fromlinks'] = array(); // erase fromlinks
+ reset($oldFromlinks); // comp. new fr.links
+ while (list($fromlink, $dummy) = each($oldFromlinks)) {
+ if ($fromlink != $pagename)
+ $cache[$oldTolink]['fromlinks'][$fromlink] = 1;
+ }
+ }
+ }
+
+ // Phase 3: add the new links
+ // ---------------------------------------------------------------
+
+ // set the new tolinks for $pagename
+ $cache[$pagename]['tolinks'] = $linklist;
+
+ // add $pagename to the fromlinks of pages in $linklist
+ reset($linklist);
+ while (list($link, $dummy) = each($linklist)) {
+ if ($cache[$link]) // existing page?
+ $cache[$link]['fromlinks'][$pagename] = 1;
+ }
+
+ // Phase 4: write $cache back to 'wikilinks'
+ // ---------------------------------------------------------------
+
+ reset($cache);
+ while (list($link,$fromAndTolinks) = each($cache))
+ InsertPage($dbi, $link, $fromAndTolinks, 'wikilinks');
+
+ }
+
+?>
diff --git a/docroot/phpwiki/lib/diff.php b/docroot/phpwiki/lib/diff.php
new file mode 100755
index 0000000..f1024f6
--- /dev/null
+++ b/docroot/phpwiki/lib/diff.php
@@ -0,0 +1,1077 @@
+<?php
+// diff.php: compute and display the diff between the current and
+// archived versions of a page.
+//
+// You may copy this code freely under the conditions of the GPL.
+//
+
+// FIXME: possibly remove assert()'s for production version?
+
+// PHP3 does not have assert()
+define('USE_ASSERTS', function_exists('assert'));
+
+
+/**
+ * Class used internally by WikiDiff to actually compute the diffs.
+ *
+ * The algorithm used here is mostly lifted from the perl module
+ * Algorithm::Diff (version 1.06) by Ned Konz, which is available at:
+ * http://www.perl.com/CPAN/authors/id/N/NE/NEDKONZ/Algorithm-Diff-1.06.zip
+ *
+ * More ideas are taken from:
+ * http://www.ics.uci.edu/~eppstein/161/960229.html
+ *
+ * Some ideas (and a bit of code) are from analyze.c, from GNU
+ * diffutils-2.7, which can be found at:
+ * ftp://gnudist.gnu.org/pub/gnu/diffutils/diffutils-2.7.tar.gz
+ *
+ * Finally, some ideas (subdivision by NCHUNKS > 2, and some optimizations)
+ * are my own.
+ */
+class _WikiDiffEngine
+{
+ var $edits; // List of editing operation to convert XV to YV.
+ var $xv = array(), $yv = array();
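+
+ // Encoding of the edit list (used throughout this file): a
+ // positive integer n means "copy n lines", a negative integer -n
+ // means "delete n lines", and an array of strings means "add
+ // these lines". E.g. diffing ('a','b','c') against ('a','x','c')
+ // yields array(1, -1, array('x'), 1).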
+
+ function _WikiDiffEngine ($from_lines, $to_lines)
+ {
+ $n_from = sizeof($from_lines);
+ $n_to = sizeof($to_lines);
+ $endskip = 0;
+
+ // Ignore trailing and leading matching lines.
+ while ($n_from > 0 && $n_to > 0)
+ {
+ if ($from_lines[$n_from - 1] != $to_lines[$n_to - 1])
+ break;
+ $n_from--;
+ $n_to--;
+ $endskip++;
+ }
+ for ( $skip = 0; $skip < min($n_from, $n_to); $skip++)
+ if ($from_lines[$skip] != $to_lines[$skip])
+ break;
+ $n_from -= $skip;
+ $n_to -= $skip;
+
+ $xlines = array();
+ $ylines = array();
+
+ // Ignore lines which do not exist in both files.
+ for ($x = 0; $x < $n_from; $x++)
+ $xhash[$from_lines[$x + $skip]] = 1;
+ for ($y = 0; $y < $n_to; $y++)
+ {
+ $line = $to_lines[$y + $skip];
+ $ylines[] = $line;
+ if ( ($this->ychanged[$y] = empty($xhash[$line])) )
+ continue;
+ $yhash[$line] = 1;
+ $this->yv[] = $line;
+ $this->yind[] = $y;
+ }
+ for ($x = 0; $x < $n_from; $x++)
+ {
+ $line = $from_lines[$x + $skip];
+ $xlines[] = $line;
+ if ( ($this->xchanged[$x] = empty($yhash[$line])) )
+ continue;
+ $this->xv[] = $line;
+ $this->xind[] = $x;
+ }
+
+ // Find the LCS.
+ $this->_compareseq(0, sizeof($this->xv), 0, sizeof($this->yv));
+
+ // Merge edits when possible
+ $this->_shift_boundaries($xlines, $this->xchanged, $this->ychanged);
+ $this->_shift_boundaries($ylines, $this->ychanged, $this->xchanged);
+
+ // Compute the edit operations.
+ $this->edits = array();
+ if ($skip)
+ $this->edits[] = $skip;
+
+ $x = 0;
+ $y = 0;
+ while ($x < $n_from || $y < $n_to)
+ {
+ USE_ASSERTS && assert($y < $n_to || $this->xchanged[$x]);
+ USE_ASSERTS && assert($x < $n_from || $this->ychanged[$y]);
+
+ // Skip matching "snake".
+ $x0 = $x;
+ $ncopy = 0;
+
+ while ( $x < $n_from && $y < $n_to
+ && !$this->xchanged[$x] && !$this->ychanged[$y])
+ {
+ ++$x;
+ ++$y;
+ ++$ncopy;
+ }
+ if ($x > $x0)
+ $this->edits[] = $x - $x0;
+
+ // Find deletes.
+ $x0 = $x;
+ $ndelete = 0;
+ while ($x < $n_from && $this->xchanged[$x])
+ {
+ ++$x;
+ ++$ndelete;
+ }
+ if ($x > $x0)
+ $this->edits[] = -($x - $x0);
+
+ // Find adds.
+ if ($this->ychanged[$y])
+ {
+ $adds = array();
+ while ($y < $n_to && $this->ychanged[$y])
+ $adds[] = "" . $ylines[$y++];
+ $this->edits[] = $adds;
+ }
+ }
+ if ($endskip != 0)
+ $this->edits[] = $endskip;
+ }
+
+ /* Divide the Largest Common Subsequence (LCS) of the sequences
+ * [XOFF, XLIM) and [YOFF, YLIM) into NCHUNKS approximately equally
+ * sized segments.
+ *
+ * Returns (LCS, PTS). LCS is the length of the LCS. PTS is an
+ * array of NCHUNKS+1 (X, Y) indexes giving the dividing points between
+ * sub-sequences. The first sub-sequence is contained in [X0, X1),
+ * [Y0, Y1), the second in [X1, X2), [Y1, Y2) and so on. Note
+ * that (X0, Y0) == (XOFF, YOFF) and
+ * (X[NCHUNKS], Y[NCHUNKS]) == (XLIM, YLIM).
+ *
+ * This function assumes that the first lines of the specified portions
+ * of the two files do not match, and likewise that the last lines do not
+ * match. The caller must trim matching lines from the beginning and end
+ * of the portions it is going to specify.
+ */
+ function _diag ($xoff, $xlim, $yoff, $ylim, $nchunks)
+ {
+ $flip = false;
+
+ if ($xlim - $xoff > $ylim - $yoff)
+ {
+ // Things seem faster (I'm not sure I understand why)
+ // when the shortest sequence is in X.
+ $flip = true;
+ list ($xoff, $xlim, $yoff, $ylim)
+ = array( $yoff, $ylim, $xoff, $xlim);
+ }
+
+ if ($flip)
+ for ($i = $ylim - 1; $i >= $yoff; $i--)
+ $ymatches[$this->xv[$i]][] = $i;
+ else
+ for ($i = $ylim - 1; $i >= $yoff; $i--)
+ $ymatches[$this->yv[$i]][] = $i;
+
+ $this->lcs = 0;
+ $this->seq[0]= $yoff - 1;
+ $this->in_seq = array();
+ $ymids[0] = array();
+
+ $numer = $xlim - $xoff + $nchunks - 1;
+ $x = $xoff;
+ for ($chunk = 0; $chunk < $nchunks; $chunk++)
+ {
+ if ($chunk > 0)
+ for ($i = 0; $i <= $this->lcs; $i++)
+ $ymids[$i][$chunk-1] = $this->seq[$i];
+
+ $x1 = $xoff + (int)(($numer + ($xlim-$xoff)*$chunk) / $nchunks);
+ for ( ; $x < $x1; $x++)
+ {
+ $matches = $ymatches[$flip ? $this->yv[$x] : $this->xv[$x]];
+ if (!$matches)
+ continue;
+ reset($matches);
+ while (list ($junk, $y) = each($matches))
+ if (empty($this->in_seq[$y]))
+ {
+ $k = $this->_lcs_pos($y);
+ USE_ASSERTS && assert($k > 0);
+ $ymids[$k] = $ymids[$k-1];
+ break;
+ }
+ while (list ($junk, $y) = each($matches))
+ {
+ if ($y > $this->seq[$k-1])
+ {
+ USE_ASSERTS && assert($y < $this->seq[$k]);
+ // Optimization: this is a common case:
+ // next match is just replacing previous match.
+ $this->in_seq[$this->seq[$k]] = false;
+ $this->seq[$k] = $y;
+ $this->in_seq[$y] = 1;
+ }
+ else if (empty($this->in_seq[$y]))
+ {
+ $k = $this->_lcs_pos($y);
+ USE_ASSERTS && assert($k > 0);
+ $ymids[$k] = $ymids[$k-1];
+ }
+ }
+ }
+ }
+
+ $seps[] = $flip ? array($yoff, $xoff) : array($xoff, $yoff);
+ $ymid = $ymids[$this->lcs];
+ for ($n = 0; $n < $nchunks - 1; $n++)
+ {
+ $x1 = $xoff + (int)(($numer + ($xlim - $xoff) * $n) / $nchunks);
+ $y1 = $ymid[$n] + 1;
+ $seps[] = $flip ? array($y1, $x1) : array($x1, $y1);
+ }
+ $seps[] = $flip ? array($ylim, $xlim) : array($xlim, $ylim);
+
+ return array($this->lcs, $seps);
+ }
+
+ function _lcs_pos ($ypos)
+ {
+ $end = $this->lcs;
+ if ($end == 0 || $ypos > $this->seq[$end])
+ {
+ $this->seq[++$this->lcs] = $ypos;
+ $this->in_seq[$ypos] = 1;
+ return $this->lcs;
+ }
+
+ $beg = 1;
+ while ($beg < $end)
+ {
+ $mid = (int)(($beg + $end) / 2);
+ if ( $ypos > $this->seq[$mid] )
+ $beg = $mid + 1;
+ else
+ $end = $mid;
+ }
+
+ USE_ASSERTS && assert($ypos != $this->seq[$end]);
+
+ $this->in_seq[$this->seq[$end]] = false;
+ $this->seq[$end] = $ypos;
+ $this->in_seq[$ypos] = 1;
+ return $end;
+ }
+
+ /* Find LCS of two sequences.
+ *
+ * The results are recorded in the vectors $this->{x,y}changed[], by
+ * storing a 1 in the element for each line that is an insertion
+ * or deletion (ie. is not in the LCS).
+ *
+ * The subsequence of file 0 is [XOFF, XLIM) and likewise for file 1.
+ *
+ * Note that XLIM, YLIM are exclusive bounds.
+ * All line numbers are origin-0 and discarded lines are not counted.
+ */
+ function _compareseq ($xoff, $xlim, $yoff, $ylim)
+ {
+ // Slide down the bottom initial diagonal.
+ while ($xoff < $xlim && $yoff < $ylim
+ && $this->xv[$xoff] == $this->yv[$yoff])
+ {
+ ++$xoff;
+ ++$yoff;
+ }
+
+ // Slide up the top initial diagonal.
+ while ($xlim > $xoff && $ylim > $yoff
+ && $this->xv[$xlim - 1] == $this->yv[$ylim - 1])
+ {
+ --$xlim;
+ --$ylim;
+ }
+
+ if ($xoff == $xlim || $yoff == $ylim)
+ $lcs = 0;
+ else
+ {
+ // This is ad hoc but seems to work well.
+ //$nchunks = sqrt(min($xlim - $xoff, $ylim - $yoff) / 2.5);
+ //$nchunks = max(2,min(8,(int)$nchunks));
+ $nchunks = min(7, $xlim - $xoff, $ylim - $yoff) + 1;
+ list ($lcs, $seps)
+ = $this->_diag($xoff,$xlim,$yoff, $ylim,$nchunks);
+ }
+
+ if ($lcs == 0)
+ {
+ // X and Y sequences have no common subsequence:
+ // mark all changed.
+ while ($yoff < $ylim)
+ $this->ychanged[$this->yind[$yoff++]] = 1;
+ while ($xoff < $xlim)
+ $this->xchanged[$this->xind[$xoff++]] = 1;
+ }
+ else
+ {
+ // Use the partitions to split this problem into subproblems.
+ reset($seps);
+ $pt1 = $seps[0];
+ while ($pt2 = next($seps))
+ {
+ $this->_compareseq ($pt1[0], $pt2[0], $pt1[1], $pt2[1]);
+ $pt1 = $pt2;
+ }
+ }
+ }
+
+ /* Adjust inserts/deletes of identical lines to join changes
+ * as much as possible.
+ *
+ * We do something when a run of changed lines includes a
+ * line at one end and has an excluded, identical line at the other.
+ * We are free to choose which identical line is included.
+ * `compareseq' usually chooses the one at the beginning,
+ * but usually it is cleaner to consider the following identical line
+ * to be the "change".
+ *
+ * This is extracted verbatim from analyze.c (GNU diffutils-2.7).
+ */
+ function _shift_boundaries ($lines, &$changed, $other_changed)
+ {
+ $i = 0;
+ $j = 0;
+
+ USE_ASSERTS && assert('sizeof($lines) == sizeof($changed)');
+ $len = sizeof($lines);
+ $other_len = sizeof($other_changed);
+
+ while (1)
+ {
+ /*
+ * Scan forwards to find beginning of another run of changes.
+ * Also keep track of the corresponding point in the other file.
+ *
+ * Throughout this code, $i and $j are adjusted together so that
+ * the first $i elements of $changed and the first $j elements
+ * of $other_changed both contain the same number of zeros
+ * (unchanged lines).
+ * Furthermore, $j is always kept so that $j == $other_len or
+ * $other_changed[$j] == false.
+ */
+ while ($j < $other_len && $other_changed[$j])
+ $j++;
+
+ while ($i < $len && ! $changed[$i])
+ {
+ USE_ASSERTS && assert('$j < $other_len && ! $other_changed[$j]');
+ $i++; $j++;
+ while ($j < $other_len && $other_changed[$j])
+ $j++;
+ }
+
+ if ($i == $len)
+ break;
+
+ $start = $i;
+
+ // Find the end of this run of changes.
+ while (++$i < $len && $changed[$i])
+ continue;
+
+ do
+ {
+ /*
+ * Record the length of this run of changes, so that
+ * we can later determine whether the run has grown.
+ */
+ $runlength = $i - $start;
+
+ /*
+ * Move the changed region back, so long as the
+ * previous unchanged line matches the last changed one.
+ * This merges with previous changed regions.
+ */
+ while ($start > 0 && $lines[$start - 1] == $lines[$i - 1])
+ {
+ $changed[--$start] = 1;
+ $changed[--$i] = false;
+ while ($start > 0 && $changed[$start - 1])
+ $start--;
+ USE_ASSERTS && assert('$j > 0');
+ while ($other_changed[--$j])
+ continue;
+ USE_ASSERTS && assert('$j >= 0 && !$other_changed[$j]');
+ }
+
+ /*
+ * Set CORRESPONDING to the end of the changed run, at the last
+ * point where it corresponds to a changed run in the other file.
+ * CORRESPONDING == LEN means no such point has been found.
+ */
+ $corresponding = $j < $other_len ? $i : $len;
+
+ /*
+ * Move the changed region forward, so long as the
+ * first changed line matches the following unchanged one.
+ * This merges with following changed regions.
+ * Do this second, so that if there are no merges,
+ * the changed region is moved forward as far as possible.
+ */
+ while ($i < $len && $lines[$start] == $lines[$i])
+ {
+ $changed[$start++] = false;
+ $changed[$i++] = 1;
+ while ($i < $len && $changed[$i])
+ $i++;
+
+ USE_ASSERTS && assert('$j < $other_len && ! $other_changed[$j]');
+ $j++;
+ if ($j < $other_len && $other_changed[$j])
+ {
+ $corresponding = $i;
+ while ($j < $other_len && $other_changed[$j])
+ $j++;
+ }
+ }
+ }
+ while ($runlength != $i - $start);
+
+ /*
+ * If possible, move the fully-merged run of changes
+ * back to a corresponding run in the other file.
+ */
+ while ($corresponding < $i)
+ {
+ $changed[--$start] = 1;
+ $changed[--$i] = 0;
+ USE_ASSERTS && assert('$j > 0');
+ while ($other_changed[--$j])
+ continue;
+ USE_ASSERTS && assert('$j >= 0 && !$other_changed[$j]');
+ }
+ }
+ }
+}
+
+/**
+ * Class representing a diff between two files.
+ */
+class WikiDiff
+{
+ var $edits;
+
+ /**
+ * Compute diff between files (or deserialize serialized WikiDiff.)
+ */
+ function WikiDiff($from_lines = false, $to_lines = false)
+ {
+ if ($from_lines && $to_lines)
+ {
+ $compute = new _WikiDiffEngine($from_lines, $to_lines);
+ $this->edits = $compute->edits;
+ }
+ else if ($from_lines)
+ {
+ // $from_lines is not really from_lines, but rather
+ // a serialized WikiDiff.
+ $this->edits = unserialize($from_lines);
+ }
+ else
+ {
+ $this->edits = array();
+ }
+ }
+
+ /**
+ * Compute reversed WikiDiff.
+ *
+ * SYNOPSIS:
+ *
+ * $diff = new WikiDiff($lines1, $lines2);
+ * $rev = $diff->reverse($lines1);
+ *
+ * // reconstruct $lines1 from $lines2:
+ * $out = $rev->apply($lines2);
+ */
+ function reverse ($from_lines)
+ {
+ $x = 0;
+ $rev = new WikiDiff;
+
+ for ( reset($this->edits);
+ $edit = current($this->edits);
+ next($this->edits) )
+ {
+ if (is_array($edit))
+ { // Was an add, turn it into a delete.
+ $nadd = sizeof($edit);
+ USE_ASSERTS && assert ($nadd > 0);
+ $edit = -$nadd;
+ }
+ else if ($edit > 0)
+ {
+ // Was a copy --- just pass it through.
+ $x += $edit;
+ }
+ else if ($edit < 0)
+ { // Was a delete, turn it into an add.
+ $ndelete = -$edit;
+ $edit = array();
+ while ($ndelete-- > 0)
+ $edit[] = "" . $from_lines[$x++];
+ }
+ else die("assertion error");
+
+ $rev->edits[] = $edit;
+ }
+
+ return $rev;
+ }
+
+ /**
+ * Compose (concatenate) WikiDiffs.
+ *
+ * SYNOPSIS:
+ *
+ * $diff1 = new WikiDiff($lines1, $lines2);
+ * $diff2 = new WikiDiff($lines2, $lines3);
+ * $comp = $diff1->compose($diff2);
+ *
+ * // reconstruct $lines3 from $lines1:
+ * $out = $comp->apply($lines1);
+ */
+ function compose ($that)
+ {
+ reset($this->edits);
+ reset($that->edits);
+
+ $comp = new WikiDiff;
+ $left = current($this->edits);
+ $right = current($that->edits);
+ $op = false; // no pending output op yet
+
+ while ($left || $right)
+ {
+ if (!is_array($left) && $left < 0)
+ { // Left op is a delete.
+ $newop = $left;
+ $left = next($this->edits);
+ }
+ else if (is_array($right))
+ { // Right op is an add.
+ $newop = $right;
+ $right = next($that->edits);
+ }
+ else if (!$left || !$right)
+ die ("assertion error");
+ else if (!is_array($left) && $left > 0)
+ { // Left op is a copy.
+ if ($left <= abs($right))
+ {
+ $newop = $right > 0 ? $left : -$left;
+ $right -= $newop;
+ if ($right == 0)
+ $right = next($that->edits);
+ $left = next($this->edits);
+ }
+ else
+ {
+ $newop = $right;
+ $left -= abs($right);
+ $right = next($that->edits);
+ }
+ }
+ else
+ { // Left op is an add.
+ if (!is_array($left)) die('assertion error');
+ $nleft = sizeof($left);
+ if ($nleft <= abs($right))
+ {
+ if ($right > 0)
+ { // Right op is copy
+ $newop = $left;
+ $right -= $nleft;
+ }
+ else // Right op is delete
+ {
+ $newop = false;
+ $right += $nleft;
+ }
+ if ($right == 0)
+ $right = next($that->edits);
+ $left = next($this->edits);
+ }
+ else
+ {
+ unset($newop);
+ if ($right > 0)
+ for ($i = 0; $i < $right; $i++)
+ $newop[] = $left[$i];
+
+ $tmp = array();
+ for ($i = abs($right); $i < $nleft; $i++)
+ $tmp[] = $left[$i];
+ $left = $tmp;
+ $right = next($that->edits);
+ }
+ }
+ if (!$op)
+ {
+ $op = $newop;
+ continue;
+ }
+ if (! $newop)
+ continue;
+
+ if (is_array($op) && is_array($newop))
+ {
+ // Both $op and $newop are adds.
+ for ($i = 0; $i < sizeof($newop); $i++)
+ $op[] = $newop[$i];
+ }
+ else if (($op > 0 && $newop > 0) || ($op < 0 && $newop < 0))
+ { // $op and $newop are both either deletes or copies.
+ $op += $newop;
+ }
+ else
+ {
+ $comp->edits[] = $op;
+ $op = $newop;
+ }
+ }
+ if ($op)
+ $comp->edits[] = $op;
+
+ return $comp;
+ }
+
+ /* Debugging only:
+ function _dump ()
+ {
+ echo "";
+ for (reset($this->edits);
+ $edit = current($this->edits);
+ next($this->edits))
+ {
+ echo "";
+ if ($edit > 0)
+ echo "Copy $edit";
+ else if ($edit < 0)
+ echo "Delete " . -$edit;
+ else if (is_array($edit))
+ {
+ echo "Add " . sizeof($edit) . "";
+ for ($i = 0; $i < sizeof($edit); $i++)
+ echo "" . htmlspecialchars($edit[$i]);
+ echo " ";
+ }
+ else
+ die("assertion error");
+ }
+ echo " ";
+ }
+ */
+
+ /**
+ * Apply a WikiDiff to a set of lines.
+ *
+ * SYNOPSIS:
+ *
+ * $diff = new WikiDiff($lines1, $lines2);
+ *
+ * // reconstruct $lines2 from $lines1:
+ * $out = $diff->apply($lines1);
+ */
+ function apply ($from_lines)
+ {
+ $x = 0;
+ $xlim = sizeof($from_lines);
+
+ for ( reset($this->edits);
+ $edit = current($this->edits);
+ next($this->edits) )
+ {
+ if (is_array($edit))
+ {
+ reset($edit);
+ while (list ($junk, $line) = each($edit))
+ $output[] = $line;
+ }
+ else if ($edit > 0)
+ while ($edit--)
+ $output[] = $from_lines[$x++];
+ else
+ $x += -$edit;
+ }
+ if ($x != $xlim)
+ ExitWiki(sprintf(gettext ("WikiDiff::apply: line count mismatch: %s != %s"), $x, $xlim));
+ return $output;
+ }
+
+ /**
+ * Serialize a WikiDiff.
+ *
+ * SYNOPSIS:
+ *
+ * $diff = new WikiDiff($lines1, $lines2);
+ * $string = $diff->serialize;
+ *
+ * // recover WikiDiff from serialized version:
+ * $diff2 = new WikiDiff($string);
+ */
+ function serialize ()
+ {
+ return serialize($this->edits);
+ }
+
+ /**
+ * Return true if two files were equal.
+ */
+ function isEmpty ()
+ {
+ if (sizeof($this->edits) > 1)
+ return false;
+ if (sizeof($this->edits) == 0)
+ return true;
+ // Test for: only edit is a copy.
+ return !is_array($this->edits[0]) && $this->edits[0] > 0;
+ }
+
+ /**
+ * Compute the length of the Longest Common Subsequence (LCS).
+ *
+ * This is mostly for diagnostic purposes.
+ */
+ function lcs ()
+ {
+ $lcs = 0;
+ for (reset($this->edits);
+ $edit = current($this->edits);
+ next($this->edits))
+ {
+ if (!is_array($edit) && $edit > 0)
+ $lcs += $edit;
+ }
+ return $lcs;
+ }
+
+ /**
+ * Check a WikiDiff for validity.
+ *
+ * This is here only for debugging purposes.
+ */
+ function _check ($from_lines, $to_lines)
+ {
+ $test = $this->apply($from_lines);
+ if (serialize($test) != serialize($to_lines))
+ ExitWiki(gettext ("WikiDiff::_check: failed"));
+
+ reset($this->edits);
+ $prev = current($this->edits);
+ $prevtype = is_array($prev) ? 'a' : ($prev > 0 ? 'c' : 'd');
+
+ while ($edit = next($this->edits))
+ {
+ $type = is_array($edit) ? 'a' : ($edit > 0 ? 'c' : 'd');
+ if ( $prevtype == $type )
+ ExitWiki(gettext ("WikiDiff::_check: edit sequence is non-optimal"));
+ $prevtype = $type;
+ }
+ $lcs = $this->lcs();
+ printf ("" . gettext ("WikiDiff Okay: LCS = %s") . " \n", $lcs);
+ }
+}
+
+
+/**
+ * A class to format a WikiDiff as HTML.
+ *
+ * Usage:
+ *
+ * $diff = new WikiDiff($lines1, $lines2); // compute diff.
+ *
+ * $fmt = new WikiDiffFormatter;
+ * echo $fmt->format($diff, $lines1); // Output HTMLified standard diff.
+ *
+ * or to output a reverse diff (one that would take $lines2 back to $lines1):
+ *
+ * $fmt = new WikiDiffFormatter('reversed');
+ * echo $fmt->format($diff, $lines1);
+ */
+class WikiDiffFormatter
+{
+ var $context_lines;
+ var $do_reverse_diff;
+ var $context_prefix, $deletes_prefix, $adds_prefix;
+
+ function WikiDiffFormatter ($reverse = false)
+ {
+ $this->do_reverse_diff = $reverse;
+ $this->context_lines = 0;
+ $this->context_prefix = ' ';
+ $this->deletes_prefix = '< ';
+ $this->adds_prefix = '> ';
+ }
+
+ function format ($diff, $from_lines)
+ {
+ $html = '\n";
+ $html .= $this->_format($diff->edits, $from_lines);
+ $html .= "
\n";
+
+ return $html;
+ }
+
+ function _format ($edits, $from_lines)
+ {
+ $html = '';
+ $x = 0; $y = 0;
+ $xlim = sizeof($from_lines);
+
+ reset($edits);
+ while ($edit = current($edits))
+ {
+ if (!is_array($edit) && $edit >= 0)
+ { // Edit op is a copy.
+ $ncopy = $edit;
+ }
+ else
+ {
+ $ncopy = 0;
+ if (empty($hunk))
+ {
+ // Start of an output hunk.
+ $xoff = max(0, $x - $this->context_lines);
+ $yoff = $xoff + $y - $x;
+ if ($xoff < $x)
+ {
+ // Get leading context.
+ $context = array();
+ for ($i = $xoff; $i < $x; $i++)
+ $context[] = $from_lines[$i];
+ $hunk['c'] = $context;
+ }
+ }
+ if (is_array($edit))
+ { // Edit op is an add.
+ $y += sizeof($edit);
+ $hunk[$this->do_reverse_diff ? 'd' : 'a'] = $edit;
+ }
+ else
+ { // Edit op is a delete
+ $deletes = array();
+ while ($edit++ < 0)
+ $deletes[] = $from_lines[$x++];
+ $hunk[$this->do_reverse_diff ? 'a' : 'd'] = $deletes;
+ }
+ }
+
+ $next = next($edits);
+ if (!empty($hunk))
+ {
+ if ( !$next || $ncopy > 2 * $this->context_lines)
+ {
+ // End of an output hunk.
+ $hunks[] = $hunk;
+ unset($hunk);
+
+ $xend = min($x + $this->context_lines, $xlim);
+ if ($x < $xend)
+ {
+ // Get trailing context.
+ $context = array();
+ for ($i = $x; $i < $xend; $i++)
+ $context[] = $from_lines[$i];
+ $hunks[] = array('c' => $context);
+ }
+
+ $xlen = $xend - $xoff;
+ $ylen = $xend + $y - $x - $yoff;
+ $xbeg = $xlen ? $xoff + 1 : $xoff;
+ $ybeg = $ylen ? $yoff + 1 : $yoff;
+
+ if ($this->do_reverse_diff)
+ list ($xbeg, $xlen, $ybeg, $ylen)
+ = array($ybeg, $ylen, $xbeg, $xlen);
+
+ $html .= $this->_emit_diff($xbeg,$xlen,$ybeg,$ylen,
+ $hunks);
+ unset($hunks);
+ }
+ else if ($ncopy)
+ {
+ $hunks[] = $hunk;
+
+ // Copy context.
+ $context = array();
+ for ($i = $x; $i < $x + $ncopy; $i++)
+ $context[] = $from_lines[$i];
+ $hunk = array('c' => $context);
+ }
+ }
+
+ $x += $ncopy;
+ $y += $ncopy;
+ }
+ return $html;
+ }
+
+ function _emit_lines($lines, $prefix, $color)
+ {
+ $html = '';
+ reset($lines);
+ while (list ($junk, $line) = each($lines))
+ {
+ $html .= "$prefix ";
+ $html .= "" . htmlspecialchars($line) . " \n";
+ }
+ return $html;
+ }
+
+ function _emit_diff ($xbeg,$xlen,$ybeg,$ylen,$hunks)
+ {
+ $html = '\n"
+ . ''
+ . $this->_diff_header($xbeg, $xlen, $ybeg, $ylen)
+ . " \n\n"
+ . "\n";
+
+ $prefix = array('c' => $this->context_prefix,
+ 'a' => $this->adds_prefix,
+ 'd' => $this->deletes_prefix);
+ $color = array('c' => '#ffffff',
+ 'a' => '#ffcccc',
+ 'd' => '#ccffcc');
+
+ for (reset($hunks); $hunk = current($hunks); next($hunks))
+ {
+ if (!empty($hunk['c']))
+ $html .= $this->_emit_lines($hunk['c'],
+ $this->context_prefix, '#ffffff');
+ if (!empty($hunk['d']))
+ $html .= $this->_emit_lines($hunk['d'],
+ $this->deletes_prefix, '#ccffcc');
+ if (!empty($hunk['a']))
+ $html .= $this->_emit_lines($hunk['a'],
+ $this->adds_prefix, '#ffcccc');
+ }
+
+ $html .= "
\n";
+ return $html;
+ }
+
+ function _diff_header ($xbeg,$xlen,$ybeg,$ylen)
+ {
+ $what = $xlen ? ($ylen ? 'c' : 'd') : 'a';
+ $xlen = $xlen > 1 ? "," . ($xbeg + $xlen - 1) : '';
+ $ylen = $ylen > 1 ? "," . ($ybeg + $ylen - 1) : '';
+
+ return "$xbeg$xlen$what$ybeg$ylen";
+ }
+}
+
+/**
+ * A class to format a WikiDiff as a pretty HTML unified diff.
+ *
+ * Usage:
+ *
+ * $diff = new WikiDiff($lines1, $lines2); // compute diff.
+ *
+ * $fmt = new WikiUnifiedDiffFormatter;
+ * echo $fmt->format($diff, $lines1); // Output HTMLified unified diff.
+ */
+class WikiUnifiedDiffFormatter extends WikiDiffFormatter
+{
+ function WikiUnifiedDiffFormatter ($reverse = false, $context_lines = 3)
+ {
+ $this->do_reverse_diff = $reverse;
+ $this->context_lines = $context_lines;
+ $this->context_prefix = ' ';
+ $this->deletes_prefix = '-';
+ $this->adds_prefix = '+';
+ }
+
+ function _diff_header ($xbeg,$xlen,$ybeg,$ylen)
+ {
+ $xlen = $xlen == 1 ? '' : ",$xlen";
+ $ylen = $ylen == 1 ? '' : ",$ylen";
+
+ return "@@ -$xbeg$xlen +$ybeg$ylen @@";
+ }
+}
+
+
+
+/////////////////////////////////////////////////////////////////
+
+if ($diff)
+{
+ if (get_magic_quotes_gpc()) {
+ $diff = stripslashes($diff);
+ }
+
+ $pagename = $diff;
+
+ $wiki = RetrievePage($dbi, $pagename, $WikiPageStore);
+// $dba = OpenDataBase($ArchivePageStore);
+ $archive= RetrievePage($dbi, $pagename, $ArchivePageStore);
+
+ $html = '';
+ $html .= gettext ("Current page:");
+ $html .= ' ';
+ if (is_array($wiki)) {
+ $html .= "";
+ $html .= sprintf(gettext ("version %s"), $wiki['version']);
+ $html .= " ";
+ $html .= sprintf(gettext ("last modified on %s"),
+ date($datetimeformat, $wiki['lastmodified']));
+ $html .= " ";
+ $html .= sprintf (gettext ("by %s"), $wiki['author']);
+ $html .= " ";
+ } else {
+ $html .= "";
+ $html .= gettext ("None");
+ $html .= " ";
+ }
+ $html .= " \n";
+ $html .= '';
+ $html .= gettext ("Archived page:");
+ $html .= ' ';
+ if (is_array($archive)) {
+ $html .= "";
+ $html .= sprintf(gettext ("version %s"), $archive['version']);
+ $html .= " ";
+ $html .= sprintf(gettext ("last modified on %s"),
+ date($datetimeformat, $archive['lastmodified']));
+ $html .= " ";
+ $html .= sprintf(gettext ("by %s"), $archive['author']);
+ $html .= " ";
+ } else {
+ $html .= "";
+ $html .= gettext ("None");
+ $html .= " ";
+ }
+ $html .= "
\n";
+
+ if (is_array($wiki) && is_array($archive))
+ {
+ $diff = new WikiDiff($archive['content'], $wiki['content']);
+ if ($diff->isEmpty()) {
+ $html .= '<p>[' . gettext ("Versions are identical") . ']';
+ } else {
+ //$fmt = new WikiDiffFormatter;
+ $fmt = new WikiUnifiedDiffFormatter;
+ $html .= $fmt->format($diff, $archive['content']);
+ }
+ }
+
+ GeneratePage('MESSAGE', $html, sprintf(gettext ("Diff of %s."),
+ htmlspecialchars($pagename)), 0);
+}
+?>
diff --git a/docroot/phpwiki/lib/display.php b/docroot/phpwiki/lib/display.php
new file mode 100755
index 0000000..b72f0d3
--- /dev/null
+++ b/docroot/phpwiki/lib/display.php
@@ -0,0 +1,40 @@
+?");
+ }
+
+ GeneratePage('BROWSE', $html, $pagename, $pagehash);
+ flush();
+
+ IncreaseHitCount($dbi, $pagename);
+?>
diff --git a/docroot/phpwiki/lib/editlinks.php b/docroot/phpwiki/lib/editlinks.php
new file mode 100755
index 0000000..4442867
--- /dev/null
+++ b/docroot/phpwiki/lib/editlinks.php
@@ -0,0 +1,14 @@
+
+<?php
+ // Thanks to ... for this code.
+ // This allows an arbitrary number of reference links.
+
+ $pagename = rawurldecode($links);
+ if (get_magic_quotes_gpc()) {
+ $pagename = stripslashes($pagename);
+ }
+ $pagehash = RetrievePage($dbi, $pagename, $WikiPageStore);
+ settype ($pagehash, 'array');
+
+ GeneratePage('EDITLINKS', "", $pagename, $pagehash);
+?>
diff --git a/docroot/phpwiki/lib/editpage.php b/docroot/phpwiki/lib/editpage.php
new file mode 100755
index 0000000..3537512
--- /dev/null
+++ b/docroot/phpwiki/lib/editpage.php
@@ -0,0 +1,65 @@
+<?php
+ // editpage.php: show the edit form for an existing or new page
+ $pagename = rawurldecode($edit);
+ if (get_magic_quotes_gpc())
+ $pagename = stripslashes($pagename);
+
+ $pagehash = RetrievePage($dbi, $pagename, $WikiPageStore);
+
+ if (is_array($pagehash)) {
+ if (($pagehash['flags'] & FLAG_PAGE_LOCKED) && !defined('WIKI_ADMIN')) {
+ $html = "<p>";
+ $html .= gettext ("This page has been locked by the administrator and cannot be edited.");
+ $html .= "\n";
+ $html .= gettext ("Sorry for the inconvenience.");
+ $html .= "\n";
+ GeneratePage('MESSAGE', $html, sprintf (gettext ("Problem while editing %s"), $pagename), 0);
+ ExitWiki ("");
+ }
+
+ $textarea = htmlspecialchars(implode("\n", $pagehash["content"]));
+ if (isset($copy)) {
+ // $cdbi = OpenDataBase($WikiPageStore);
+ $currentpage = RetrievePage($dbi, $pagename, $WikiPageStore);
+ $pagehash["version"] = $currentpage["version"];
+ }
+ elseif ($pagehash["version"] > 1) {
+ if(IsInArchive($dbi, $pagename))
+ $pagehash["copy"] = 1;
+ }
+ } else {
+ if (preg_match("/^${WikiNameRegexp}\$/", $pagename))
+ $newpage = $pagename;
+ else
+ $newpage = "[$pagename]";
+
+ $textarea = htmlspecialchars(
+ sprintf(gettext ("Describe %s here."), $newpage));
+
+ unset($pagehash);
+ $pagehash["version"] = 0;
+ $pagehash["lastmodified"] = time();
+ $pagehash["author"] = '';
+ }
+
+ GeneratePage('EDITPAGE', $textarea, $pagename, $pagehash);
+?>
diff --git a/docroot/phpwiki/lib/fullsearch.php b/docroot/phpwiki/lib/fullsearch.php
new file mode 100755
index 0000000..0674cf2
--- /dev/null
+++ b/docroot/phpwiki/lib/fullsearch.php
@@ -0,0 +1,52 @@
+"
+ . sprintf(gettext ("Searching for \"%s\" ....."),
+ htmlspecialchars($full))
+ . "
\n\n";
+ $found = 0;
+ $count = 0;
+
+ if (strlen($full)) {
+ // search matching pages
+ $query = InitFullSearch($dbi, $full);
+
+ // quote regexp chars (space are treated as "or" operator)
+ $full = preg_replace("/\s+/", "|", preg_quote($full));
+
+ while ($pagehash = FullSearchNextMatch($dbi, $query)) {
+ $html .= "" . LinkExistingWikiWord($pagehash["pagename"]) . " \n";
+ $count++;
+
+ // print out all matching lines, highlighting the match
+ for ($j = 0; $j < (count($pagehash["content"])); $j++) {
+ if ($hits = preg_match_all(":$full:i", $pagehash["content"][$j], $dummy)) {
+ $matched = preg_replace(":$full:i",
+ "${FieldSeparator}OT\\0${FieldSeparator}CT",
+ $pagehash["content"][$j]);
+ $matched = htmlspecialchars($matched);
+ $matched = str_replace("${FieldSeparator}OT", '', $matched);
+ $matched = str_replace("${FieldSeparator}CT", ' ', $matched);
+ $html .= "$matched \n";
+ $found += $hits;
+ }
+ }
+ }
+ }
+ else {
+ $html .= "" . gettext("(You entered an empty search string)") . " \n";
+ }
+
+ $html .= " \n "
+ . sprintf (gettext ("%d matches found in %d pages."),
+ $found, $count)
+ . "\n";
+
+ GeneratePage('MESSAGE', $html, gettext ("Full Text Search Results"), 0);
+?>
diff --git a/docroot/phpwiki/lib/msql.php b/docroot/phpwiki/lib/msql.php
new file mode 100755
index 0000000..5dbea3c
--- /dev/null
+++ b/docroot/phpwiki/lib/msql.php
@@ -0,0 +1,615 @@
+";
+ $msg .= sprintf(gettext ("Error message: %s"), msql_error());
+ ExitWiki($msg);
+ }
+ if (!msql_select_db($msql_db, $dbc)) {
+ $msg = gettext ("Cannot open database %s, giving up.");
+ $msg .= " ";
+ $msg .= sprintf(gettext ("Error message: %s"), msql_error());
+ ExitWiki($msg);
+ }
+
+ $dbi['dbc'] = $dbc;
+ $dbi['table'] = $dbinfo['table']; // page metadata
+ $dbi['page_table'] = $dbinfo['page_table']; // page content
+ return $dbi;
+ }
+
+
+ function CloseDataBase($dbi) {
+ // I found msql_pconnect unstable so we go the slow route.
+ return msql_close($dbi['dbc']);
+ }
+
+
+ // This should receive the full text of the page in one string
+ // It will break the page text into an array of strings
+ // of length MSQL_MAX_LINE_LENGTH which should match the length
+ // of the columns wikipages.LINE, archivepages.LINE in schema.minisql
+
+ function msqlDecomposeString($string) {
+ $ret_arr = array();
+
+ // initialize the array to satisfy E_NOTICE
+ for ($i = 0; $i < MSQL_MAX_LINE_LENGTH; $i++) {
+ $ret_arr[$i] = "";
+ }
+ $el = 0;
+
+ // zero, one, infinity
+ // account for the small case
+ if (strlen($string) < MSQL_MAX_LINE_LENGTH) {
+ $ret_arr[$el] = $string;
+ return $ret_arr;
+ }
+
+ $words = array();
+ $line = $string2 = "";
+
+ // split on single spaces
+ $words = preg_split("/ /", $string);
+ $num_words = count($words);
+
+ reset($words);
+ $ret_arr[0] = $words[0];
+ $line = " $words[1]";
+
+ // for all words, build up lines < MSQL_MAX_LINE_LENGTH in $ret_arr
+ for ($x = 2; $x < $num_words; $x++) {
+ $length = strlen($line) + strlen($words[$x])
+ + strlen($ret_arr[$el]) + 1;
+
+ if ($length < MSQL_MAX_LINE_LENGTH) {
+ $line .= " " . $words[$x];
+ } else {
+ // put this line in the return array, reset, continue
+ $ret_arr[$el++] .= $line;
+ $line = " $words[$x]"; // reset
+ }
+ }
+ $ret_arr[$el] = $line;
+ return $ret_arr;
+ }
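+
+ // For illustration (assuming MSQL_MAX_LINE_LENGTH were 128): a
+ // 300-character page body comes back as three or so chunks, split
+ // at spaces so each fits the wikipages.LINE column:
+ // $chunks = msqlDecomposeString($body);
+ // // strlen($chunks[0]) < 128, strlen($chunks[1]) < 128, ...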
+
+
+ // Take form data and prepare it for the db
+ function MakeDBHash($pagename, $pagehash)
+ {
+ $pagehash["pagename"] = addslashes($pagename);
+ if (!isset($pagehash["flags"]))
+ $pagehash["flags"] = 0;
+ if (!isset($pagehash["content"])) {
+ $pagehash["content"] = array();
+ } else {
+ $pagehash["content"] = implode("\n", $pagehash["content"]);
+ $pagehash["content"] = msqlDecomposeString($pagehash["content"]);
+ }
+ $pagehash["author"] = addslashes($pagehash["author"]);
+ if (empty($pagehash["refs"])) {
+ $pagehash["refs"] = "";
+ } else {
+ $pagehash["refs"] = serialize($pagehash["refs"]);
+ }
+
+ return $pagehash;
+ }
+
+
+ // Take db data and prepare it for display
+ function MakePageHash($dbhash)
+ {
+ // unserialize/explode content
+ $dbhash['refs'] = unserialize($dbhash['refs']);
+ return $dbhash;
+ }
+
+
+ // Return hash of page + attributes or default
+ function RetrievePage($dbi, $pagename, $pagestore) {
+ $pagename = addslashes($pagename);
+ $table = $pagestore['table'];
+ $pagetable = $pagestore['page_table'];
+
+ $query = "select * from $table where pagename='$pagename'";
+ // echo "query: $query
";
+ $res = msql_query($query, $dbi['dbc']);
+ if (msql_num_rows($res)) {
+ $dbhash = msql_fetch_array($res);
+
+ $query = "select lineno,line from $pagetable " .
+ "where pagename='$pagename' " .
+ "order by lineno";
+
+ $msql_content = "";
+ if ($res = msql_query($query, $dbi['dbc'])) {
+ $dbhash["content"] = array();
+ while ($row = msql_fetch_array($res)) {
+ $msql_content .= $row["line"];
+ }
+ $dbhash["content"] = explode("\n", $msql_content);
+ }
+
+ return MakePageHash($dbhash);
+ }
+ return -1;
+ }
+
+
+ // Either insert or replace a key/value (a page)
+ function InsertPage($dbi, $pagename, $pagehash) {
+
+ $pagehash = MakeDBHash($pagename, $pagehash);
+ // $pagehash["content"] is now an array of strings
+ // of MSQL_MAX_LINE_LENGTH
+
+ // record the time of modification
+ $pagehash["lastmodified"] = time();
+
+ if (IsWikiPage($dbi, $pagename)) {
+
+ $PAIRS = "author='$pagehash[author]'," .
+ "created=$pagehash[created]," .
+ "flags=$pagehash[flags]," .
+ "lastmodified=$pagehash[lastmodified]," .
+ "pagename='$pagehash[pagename]'," .
+ "refs='$pagehash[refs]'," .
+ "version=$pagehash[version]";
+
+ $query = "UPDATE $dbi[table] SET $PAIRS WHERE pagename='$pagename'";
+
+ } else {
+ // do an insert
+ // build up the column names and values for the query
+
+ $COLUMNS = "author, created, flags, lastmodified, " .
+ "pagename, refs, version";
+
+ $VALUES = "'$pagehash[author]', " .
+ "$pagehash[created], $pagehash[flags], " .
+ "$pagehash[lastmodified], '$pagehash[pagename]', " .
+ "'$pagehash[refs]', $pagehash[version]";
+
+
+ $query = "INSERT INTO $dbi[table] ($COLUMNS) VALUES($VALUES)";
+ }
+
+ // echo "
Query: $query
\n";
+
+ // first, insert the metadata
+ $retval = msql_query($query, $dbi['dbc']);
+ if ($retval == false) {
+ printf(gettext ("Insert/update to table 'wiki' failed: %s"), msql_error());
+ print " \n";
+ }
+
+
+ // second, insert the page data
+ // remove old data from page_table
+ $query = "delete from $dbi[page_table] where pagename='$pagename'";
+ // echo "Delete query: $query \n";
+ $retval = msql_query($query, $dbi['dbc']);
+ if ($retval == false) {
+ printf(gettext ("Delete on %s failed: %s"), $dbi[page_table],
+ msql_error());
+ print " \n";
+ }
+
+ // insert the new lines
+ reset($pagehash["content"]);
+
+ for ($x = 0; $x < count($pagehash["content"]); $x++) {
+ $line = addslashes($pagehash["content"][$x]);
+ if ($line == '') continue; // why do we always have 127 lines?
+ $esc_pagename = addslashes($pagename);
+ $query = "INSERT INTO $dbi[page_table] " .
+ "(pagename, lineno, line) " .
+ "VALUES('$esc_pagename', $x, '$line')";
+ //echo "Page line insert query: $query \n";
+ $retval = msql_query($query, $dbi['dbc']);
+ if ($retval == false) {
+ printf(gettext ("Insert into %s failed: %s"), $dbi[page_table],
+ msql_error());
+ print " \n";
+ }
+ }
+ }
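+
+ // Net effect (sketch): one metadata row in $dbi[table] plus one
+ // row per text chunk in $dbi[page_table], roughly
+ // ('HomePage', 0, 'first chunk ...'),
+ // ('HomePage', 1, 'second chunk ...'), ...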
+
+
+ // for archiving pages to a separate table
+ function SaveCopyToArchive($dbi, $pagename, $pagehash) {
+ global $ArchivePageStore;
+
+ $pagehash = MakeDBHash($pagename, $pagehash);
+ // $pagehash["content"] is now an array of strings
+ // of MSQL_MAX_LINE_LENGTH
+
+ if (IsInArchive($dbi, $pagename)) {
+
+ $PAIRS = "author='$pagehash[author]'," .
+ "created=$pagehash[created]," .
+ "flags=$pagehash[flags]," .
+ "lastmodified=$pagehash[lastmodified]," .
+ "pagename='$pagehash[pagename]'," .
+ "refs='$pagehash[refs]'," .
+ "version=$pagehash[version]";
+
+ $query = "UPDATE $ArchivePageStore[table] SET $PAIRS WHERE pagename='$pagename'";
+
+ } else {
+ // do an insert
+ // build up the column names and values for the query
+
+ $COLUMNS = "author, created, flags, lastmodified, " .
+ "pagename, refs, version";
+
+ $VALUES = "'$pagehash[author]', " .
+ "$pagehash[created], $pagehash[flags], " .
+ "$pagehash[lastmodified], '$pagehash[pagename]', " .
+ "'$pagehash[refs]', $pagehash[version]";
+
+
+ $query = "INSERT INTO archive ($COLUMNS) VALUES($VALUES)";
+ }
+
+ // echo "
Query: $query
\n";
+
+ // first, insert the metadata
+ $retval = msql_query($query, $dbi['dbc']);
+ if ($retval == false) {
+ printf(gettext ("Insert/update into table 'archive' failed: %s"), msql_error());
+ print " \n";
+ }
+
+ // second, insert the page data
+ // remove old data from page_table
+ $query = "delete from $ArchivePageStore[page_table] where pagename='$pagename'";
+ // echo "Delete query: $query \n";
+ $retval = msql_query($query, $dbi['dbc']);
+ if ($retval == false) {
+ printf(gettext ("Delete on %s failed: %s"),
+ $ArchivePageStore[page_table], msql_error());
+ print " \n";
+ }
+
+ // insert the new lines
+ reset($pagehash["content"]);
+
+ for ($x = 0; $x < count($pagehash["content"]); $x++) {
+ $line = addslashes($pagehash["content"][$x]);
+ $query = "INSERT INTO $ArchivePageStore[page_table] " .
+ "(pagename, lineno, line) " .
+ "VALUES('$pagename', $x, '$line')";
+ // echo "Page line insert query: $query \n";
+ $retval = msql_query($query, $dbi['dbc']);
+ if ($retval == false) {
+ printf(gettext ("Insert into %s failed: %s"),
+ $ArchivePageStore[page_table], msql_error());
+ print " \n";
+ }
+ }
+
+
+ }
+
+
+ function IsWikiPage($dbi, $pagename) {
+ $pagename = addslashes($pagename);
+ $query = "select pagename from wiki where pagename='$pagename'";
+ // echo "Query: $query \n";
+ if ($res = msql_query($query, $dbi['dbc'])) {
+ return(msql_affected_rows($res));
+ }
+ }
+
+
+ function IsInArchive($dbi, $pagename) {
+ $pagename = addslashes($pagename);
+ $query = "select pagename from archive where pagename='$pagename'";
+ // echo "Query: $query \n";
+ if ($res = msql_query($query, $dbi['dbc'])) {
+ return(msql_affected_rows($res));
+ }
+ }
+
+
+
+ // setup for title-search
+ function InitTitleSearch($dbi, $search) {
+ $search = preg_replace('/(?=[%_\\\\])/', "\\", $search);
+ $search = addslashes($search);
+ $query = "select pagename from $dbi[table] " .
+ "where pagename clike '%$search%' order by pagename";
+ $res = msql_query($query, $dbi['dbc']);
+
+ return $res;
+ }
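+
+ // e.g. a search for "50%_off" is escaped to "50\%\_off" so that
+ // the mSQL wildcards '%' and '_' match literally, giving
+ // ... where pagename clike '%50\%\_off%' ...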
+
+
+ // iterating through database
+ function TitleSearchNextMatch($dbi, $res) {
+ if($o = msql_fetch_object($res)) {
+ return $o->pagename;
+ }
+ else {
+ return 0;
+ }
+ }
+
+
+ // setup for full-text search
+ function InitFullSearch($dbi, $search) {
+ // select unique page names from wikipages, and then
+ // retrieve all pages that come back.
+ $search = preg_replace('/(?=[%_\\\\])/', "\\", $search);
+ $search = addslashes($search);
+ $query = "select distinct pagename from $dbi[page_table] " .
+ "where line clike '%$search%' " .
+ "order by pagename";
+ $res = msql_query($query, $dbi['dbc']);
+
+ return $res;
+ }
+
+ // iterating through database
+ function FullSearchNextMatch($dbi, $res) {
+ global $WikiPageStore;
+ if ($row = msql_fetch_row($res)) {
+ return RetrievePage($dbi, $row[0], $WikiPageStore);
+ } else {
+ return 0;
+ }
+ }
+
+ ////////////////////////
+ // new database features
+
+ // Compute PCRE suitable for searching for links to the given page.
+ function MakeBackLinkSearchRegexp($pagename) {
+ global $WikiNameRegexp;
+
+ // Note that in (at least some) PHP 3.x's, preg_quote only takes
+ // (at most) one argument. Also it doesn't quote '/'s.
+ // It does quote '='s, so we'll use that for the delimiter.
+ $quoted_pagename = preg_quote($pagename);
+ if (preg_match("/^$WikiNameRegexp\$/", $pagename)) {
+ # FIXME: This may need modification for non-standard (non-english) $WikiNameRegexp.
+ return "=(?$query
\n";
+ $res['res'] = msql_query($query, $dbi["dbc"]);
+
+ $count = 0;
+ $arr = array();
+
+ // build an array of the results.
+ while ($hash = msql_fetch_array($res[res]) ) {
+ if ($arr[$count -1 ] == $hash[pagename])
+ continue;
+ $arr[$count] = $hash[pagename];
+ $count++;
+ }
+
+ $res[count] = 0;
+ reset($arr);
+ $res[arr] = $arr;
+
+ return $res;
+ }
+
+
+// iterating through database
+function BackLinkSearchNextMatch($dbi, &$res) {
+
+ if ($res[count] >= count($res[arr]))
+ return 0;
+
+ $retval = $res[arr][$res[count]];
+ $res[count]++;
+
+ return $retval;
+}
+
+/*
+ if ( ($o = msql_fetch_object($res['res'])) == FALSE ) {
+ echo "returning zero
\n";
+ echo "it's '$o'
\n";
+ return 0;
+ }
+ if ( $res['lastpage'] == $o->pagename )
+ continue;
+ if ( ! preg_match($res['regexp'], $a->line) )
+ continue;
+ $res['lastpage'] = $o->pagename;
+ return $o->pagename;
+ }
+ */
+
+
+ function IncreaseHitCount($dbi, $pagename) {
+
+ $qpagename = addslashes($pagename);
+ $query = "select hits from hitcount where pagename='$qpagename'";
+ $res = msql_query($query, $dbi['dbc']);
+ if (msql_num_rows($res)) {
+ $hits = msql_result($res, 0, 'hits');
+ $hits++;
+ $query = "update hitcount set hits=$hits where pagename='$qpagename'";
+ $res = msql_query($query, $dbi['dbc']);
+
+ } else {
+ $query = "insert into hitcount (pagename, hits) " .
+ "values ('$qpagename', 1)";
+ $res = msql_query($query, $dbi['dbc']);
+ }
+
+ return $res;
+ }
+
+ function GetHitCount($dbi, $pagename) {
+
+ $qpagename = addslashes($pagename);
+ $query = "select hits from hitcount where pagename='$qpagename'";
+ $res = msql_query($query, $dbi['dbc']);
+ if (msql_num_rows($res)) {
+ $hits = msql_result($res, 0, 'hits');
+ } else {
+ $hits = "0";
+ }
+
+ return $hits;
+ }
+
+
+
+ function InitMostPopular($dbi, $limit) {
+
+ $query = "select * from hitcount " .
+ "order by hits desc, pagename limit $limit";
+
+ $res = msql_query($query, $dbi['dbc']);
+
+ return $res;
+ }
+
+ function MostPopularNextMatch($dbi, $res) {
+
+ if ($hits = msql_fetch_array($res)) {
+ return $hits;
+ } else {
+ return 0;
+ }
+ }
+
+ function GetAllWikiPageNames($dbi) {
+ $res = msql_query("select pagename from wiki", $dbi['dbc']);
+ $rows = msql_num_rows($res);
+ for ($i = 0; $i < $rows; $i++) {
+ $pages[$i] = msql_result($res, $i, 'pagename');
+ }
+ return $pages;
+ }
+
+ ////////////////////////////////////////
+ // functionality for the wikilinks table
+
+ // takes a page name, returns array of scored incoming and outgoing links
+
+/* Not implemented yet. The code below was copied from mysql.php...
+
+ function GetWikiPageLinks($dbi, $pagename) {
+ $links = array();
+ $pagename = addslashes($pagename);
+ $res = msql_query("select wikilinks.topage, wikiscore.score from wikilinks, wikiscore where wikilinks.topage=wikiscore.pagename and wikilinks.frompage='$pagename' order by score desc, topage", $dbi['dbc']);
+
+ $rows = msql_num_rows($res);
+ for ($i = 0; $i < $rows; $i++) {
+ $out = msql_fetch_array($res);
+ $links['out'][] = array($out['topage'], $out['score']);
+ }
+
+ $res = msql_query("select wikilinks.frompage, wikiscore.score from wikilinks, wikiscore where wikilinks.frompage=wikiscore.pagename and wikilinks.topage='$pagename' order by score desc, frompage", $dbi['dbc']);
+ $rows = msql_num_rows($res);
+ for ($i = 0; $i < $rows; $i++) {
+ $out = msql_fetch_array($res);
+ $links['in'][] = array($out['frompage'], $out['score']);
+ }
+
+ $res = msql_query("select distinct hitcount.pagename, hitcount.hits from wikilinks, hitcount where (wikilinks.frompage=hitcounts.pagename and wikilinks.topage='$pagename') or (wikilinks.topage=pagename and wikilinks.frompage='$pagename') order by hitcount.hits desc, wikilinks.pagename", $dbi['dbc']);
+ $rows = msql_num_rows($res);
+ for ($i = 0; $i < $rows; $i++) {
+ $out = msql_fetch_array($res);
+ $links['popular'][] = array($out['pagename'], $out['hits']);
+ }
+
+ return $links;
+ }
+
+
+ // takes page name, list of links it contains
+ // the $linklist is an array where the keys are the page names
+ function SetWikiPageLinks($dbi, $pagename, $linklist) {
+ $frompage = addslashes($pagename);
+
+ // first delete the old list of links
+ msql_query("delete from wikilinks where frompage='$frompage'",
+ $dbi["dbc"]);
+
+ // the page may not have links, return if not
+ if (! count($linklist))
+ return;
+ // now insert the new list of links
+ while (list($topage, $count) = each($linklist)) {
+ $topage = addslashes($topage);
+ if($topage != $frompage) {
+ msql_query("insert into wikilinks (frompage, topage) " .
+ "values ('$frompage', '$topage')", $dbi["dbc"]);
+ }
+ }
+
+ msql_query("delete from wikiscore", $dbi["dbc"]);
+ msql_query("insert into wikiscore select w1.topage, count(*) from wikilinks as w1, wikilinks as w2 where w2.topage=w1.frompage group by w1.topage", $dbi["dbc"]);
+ }
+*/
+
+?>
diff --git a/docroot/phpwiki/lib/mssql.php b/docroot/phpwiki/lib/mssql.php
new file mode 100755
index 0000000..2f4dff6
--- /dev/null
+++ b/docroot/phpwiki/lib/mssql.php
@@ -0,0 +1,399 @@
+";
+ $msg .= sprintf(gettext ("MSSQL error: %s"), mssql_get_last_message());
+ ExitWiki($msg);
+ }
+ // flush message
+ mssql_get_last_message();
+
+ if (!mssql_select_db($mssql_db, $dbc)) {
+ $msg = sprintf(gettext ("Cannot open database %s, giving up."), $mssql_db);
+ $msg .= " ";
+ $msg .= sprintf(gettext ("MSSQL error: %s"), mssql_get_last_message());
+ ExitWiki($msg);
+ }
+ // flush message
+ mssql_get_last_message();
+
+ $dbi['dbc'] = $dbc;
+ $dbi['table'] = $dbname;
+ return $dbi;
+ }
+
+
+ function CloseDataBase($dbi) {
+ // NOP function
+ // mssql connections are established as persistent;
+ // they cannot be closed through mssql_close()
+ }
+
+
+ // prepare $pagehash for storing in mssql
+ function MakeDBHash($pagename, $pagehash)
+ {
+ $pagehash["pagename"] = addslashes($pagename);
+ if (!isset($pagehash["flags"]))
+ $pagehash["flags"] = 0;
+ $pagehash["author"] = addslashes($pagehash["author"]);
+ $pagehash["content"] = implode("\n", $pagehash["content"]);
+ $pagehash["content"] = addslashes($pagehash["content"]);
+ if (!isset($pagehash["refs"]))
+ $pagehash["refs"] = array();
+ $pagehash["refs"] = serialize($pagehash["refs"]);
+
+ return $pagehash;
+ }
+
+
+ // convert mssql result $dbhash to $pagehash
+ function MakePageHash($dbhash)
+ {
+ // unserialize/explode content
+ $dbhash['refs'] = unserialize($dbhash['refs']);
+ $dbhash['content'] = explode("\n", $dbhash['content']);
+ return $dbhash;
+ }
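+
+ // MakePageHash() is the inverse of MakeDBHash(): content comes back as
+ // an array of lines and refs as a real array. The addslashes() quoting
+ // added above is consumed by the SQL parser, so nothing is stripped here.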
+
+
+ // Return hash of page + attributes or default
+ function RetrievePage($dbi, $pagename, $pagestore) {
+ $pagename = addslashes($pagename);
+ if ($res = mssql_query("select * from $pagestore where pagename='$pagename'", $dbi['dbc'])) {
+ if ($dbhash = mssql_fetch_array($res)) {
+ return MakePageHash($dbhash);
+ }
+ }
+ return -1;
+ }
+
+
+ // Either insert or replace a key/value (a page)
+ function InsertPage($dbi, $pagename, $pagehash) {
+
+ global $WikiPageStore; // ugly hack
+ if ($dbi['table'] == $WikiPageStore)
+ { // HACK
+ $linklist = ExtractWikiPageLinks($pagehash['content']);
+ SetWikiPageLinks($dbi, $pagename, $linklist);
+ }
+
+ $pagehash = MakeDBHash($pagename, $pagehash);
+
+ // record the time of modification
+ $pagehash["lastmodified"] = time();
+
+ if (IsWikiPage($dbi, $pagename)) {
+
+ $PAIRS = "author='$pagehash[author]'," .
+ "content='$pagehash[content]'," .
+ "created=$pagehash[created]," .
+ "flags=$pagehash[flags]," .
+ "lastmodified=$pagehash[lastmodified]," .
+ "pagename='$pagehash[pagename]'," .
+ "refs='$pagehash[refs]'," .
+ "version=$pagehash[version]";
+
+ $query = "UPDATE $dbi[table] SET $PAIRS WHERE pagename='$pagename'";
+
+ } else {
+ // do an insert
+ // build up the column names and values for the query
+
+ $COLUMNS = "author, content, created, flags, lastmodified, " .
+ "pagename, refs, version";
+
+ $VALUES = "'$pagehash[author]', '$pagehash[content]', " .
+ "$pagehash[created], $pagehash[flags], " .
+ "$pagehash[lastmodified], '$pagehash[pagename]', " .
+ "'$pagehash[refs]', $pagehash[version]";
+
+
+ $query = "INSERT INTO $dbi[table] ($COLUMNS) VALUES($VALUES)";
+ }
+
+ //echo "
Insert/Update Query: $query
\n";
+
+ $retval = mssql_query($query, $dbi['dbc']);
+ if ($retval == false) {
+ printf(gettext ("Insert/Update failed: %s<br>\n"), mssql_get_last_message());
+ }
+ }
+
+
+ // for archiving pages to the separate archive table
+ function SaveCopyToArchive($dbi, $pagename, $pagehash) {
+ global $ArchivePageStore;
+ $adbi = OpenDataBase($ArchivePageStore);
+ InsertPage($adbi, $pagename, $pagehash);
+ }
+
+
+ function IsWikiPage($dbi, $pagename) {
+ $pagename = addslashes($pagename);
+ if ($res = mssql_query("select count(*) from $dbi[table] where pagename='$pagename'", $dbi['dbc'])) {
+ return(mssql_result($res, 0, 0));
+ }
+ return 0;
+ }
+
+ function IsInArchive($dbi, $pagename) {
+ global $ArchivePageStore;
+
+ $pagename = addslashes($pagename);
+ if ($res = mssql_query("select count(*) from $ArchivePageStore where pagename='$pagename'", $dbi['dbc'])) {
+ return(mssql_result($res, 0, 0));
+ }
+ return 0;
+ }
+
+
+ function RemovePage($dbi, $pagename) {
+ global $WikiPageStore, $ArchivePageStore;
+ global $WikiLinksStore, $HitCountStore, $WikiScoreStore;
+
+ $pagename = addslashes($pagename);
+ $msg = gettext ("Cannot delete '%s' from table '%s'");
+ $msg .= " \n";
+ $msg .= gettext ("MSSQL error: %s");
+
+ if (!mssql_query("delete from $WikiPageStore where pagename='$pagename'", $dbi['dbc']))
+ ExitWiki(sprintf($msg, $pagename, $WikiPageStore, mssql_get_last_message()));
+
+ if (!mssql_query("delete from $ArchivePageStore where pagename='$pagename'", $dbi['dbc']))
+ ExitWiki(sprintf($msg, $pagename, $ArchivePageStore, mssql_get_last_message()));
+
+ if (!mssql_query("delete from $WikiLinksStore where frompage='$pagename'", $dbi['dbc']))
+ ExitWiki(sprintf($msg, $pagename, $WikiLinksStore, mssql_get_last_message()));
+
+ if (!mssql_query("delete from $HitCountStore where pagename='$pagename'", $dbi['dbc']))
+ ExitWiki(sprintf($msg, $pagename, $HitCountStore, mssql_get_last_message()));
+
+ if (!mssql_query("delete from $WikiScoreStore where pagename='$pagename'", $dbi['dbc']))
+ ExitWiki(sprintf($msg, $pagename, $WikiScoreStore, mssql_get_last_message()));
+ }
+
+
+ function IncreaseHitCount($dbi, $pagename)
+ {
+ global $HitCountStore;
+
+ $qpagename = addslashes($pagename);
+ $rowexists = 0;
+ if ($res = mssql_query("select count(*) from $dbi[table] where pagename='$qpagename'", $dbi['dbc'])) {
+ $rowexists = (mssql_result($res, 0, 0));
+ }
+
+ if ($rowexists)
+ $res = mssql_query("update $HitCountStore set hits=hits+1 where pagename='$qpagename'", $dbi['dbc']);
+ else
+ $res = mssql_query("insert into $HitCountStore (pagename, hits) values ('$qpagename', 1)", $dbi['dbc']);
+
+ return $res;
+ }
+
+ function GetHitCount($dbi, $pagename)
+ {
+ global $HitCountStore;
+
+ $qpagename = addslashes($pagename);
+ $res = mssql_query("select hits from $HitCountStore where pagename='$qpagename'", $dbi['dbc']);
+ if (mssql_num_rows($res))
+ $hits = mssql_result($res, 0, 0);
+ else
+ $hits = "0";
+
+ return $hits;
+ }
+
+ function MakeSQLSearchClause($search, $column)
+ {
+ $search = preg_replace("/\s+/", " ", trim($search));
+ $search = preg_replace('/(?=[%_\\\\])/', "\\", $search);
+ $search = addslashes($search);
+
+ $term = strtok($search, ' ');
+ $clause = '';
+ while($term) {
+ $word = "$term";
+ if ($word[0] == '-') {
+ $word = substr($word, 1);
+ $clause .= "not ($column like '%$word%') ";
+ } else {
+ $clause .= "($column like '%$word%') ";
+ }
+ if ($term = strtok(' '))
+ $clause .= 'and ';
+ }
+ return $clause;
+ }
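+
+ // Illustrative example: MakeSQLSearchClause("wiki -sandbox", "pagename")
+ // returns "(pagename like '%wiki%') and not (pagename like '%sandbox%') ":
+ // terms are AND-ed together, and a leading '-' negates a term.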
+
+ // setup for title-search
+ function InitTitleSearch($dbi, $search) {
+ $clause = MakeSQLSearchClause($search, 'pagename');
+ $res = mssql_query("select pagename from $dbi[table] where $clause order by pagename", $dbi["dbc"]);
+
+ return $res;
+ }
+
+
+ // iterating through database
+ function TitleSearchNextMatch($dbi, $res) {
+ if($o = mssql_fetch_object($res)) {
+ return $o->pagename;
+ }
+ else {
+ return 0;
+ }
+ }
+
+
+ // setup for full-text search
+ function InitFullSearch($dbi, $search) {
+ $clause = MakeSQLSearchClause($search, 'content');
+ $res = mssql_query("select * from $dbi[table] where $clause", $dbi["dbc"]);
+
+ return $res;
+ }
+
+ // iterating through database
+ function FullSearchNextMatch($dbi, $res) {
+ if($hash = mssql_fetch_array($res)) {
+ return MakePageHash($hash);
+ }
+ else {
+ return 0;
+ }
+ }
+
+ function InitMostPopular($dbi, $limit) {
+ global $HitCountStore;
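+ // T-SQL has no LIMIT clause; "select top $limit" caps the rows returned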
+ $res = mssql_query("select top $limit * from $HitCountStore order by hits desc, pagename", $dbi["dbc"]);
+
+ return $res;
+ }
+
+ function MostPopularNextMatch($dbi, $res) {
+ if ($hits = mssql_fetch_array($res))
+ return $hits;
+ else
+ return 0;
+ }
+
+ function GetAllWikiPageNames($dbi) {
+ global $WikiPageStore;
+ $res = mssql_query("select pagename from $WikiPageStore", $dbi["dbc"]);
+ $rows = mssql_num_rows($res);
+ for ($i = 0; $i < $rows; $i++) {
+ $pages[$i] = mssql_result($res, $i, 0);
+ }
+ return $pages;
+ }
+
+
+ ////////////////////////////////////////
+ // functionality for the wikilinks table
+
+ // takes a page name, returns array of scored incoming and outgoing links
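+ // the result holds three lists: $links['out'] and $links['in'] are
+ // [pagename, score] pairs, $links['popular'] is [pagename, hits] pairs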
+ function GetWikiPageLinks($dbi, $pagename) {
+ global $WikiLinksStore, $WikiScoreStore, $HitCountStore;
+
+ $pagename = addslashes($pagename);
+ $res = mssql_query("select topage, score from $WikiLinksStore, $WikiScoreStore where topage=pagename and frompage='$pagename' order by score desc, topage");
+ $rows = mssql_num_rows($res);
+ for ($i = 0; $i < $rows; $i++) {
+ $out = mssql_fetch_array($res);
+ $links['out'][] = array($out['topage'], $out['score']);
+ }
+
+ $res = mssql_query("select frompage, score from $WikiLinksStore, $WikiScoreStore where frompage=pagename and topage='$pagename' order by score desc, frompage");
+ $rows = mssql_num_rows($res);
+ for ($i = 0; $i < $rows; $i++) {
+ $out = mssql_fetch_array($res);
+ $links['in'][] = array($out['frompage'], $out['score']);
+ }
+
+ $res = mssql_query("select distinct pagename, hits from $WikiLinksStore, $HitCountStore where (frompage=pagename and topage='$pagename') or (topage=pagename and frompage='$pagename') order by hits desc, pagename");
+ $rows = mssql_num_rows($res);
+ for ($i = 0; $i < $rows; $i++) {
+ $out = mssql_fetch_array($res);
+ $links['popular'][] = array($out['pagename'], $out['hits']);
+ }
+
+ return $links;
+ }
+
+
+ // takes page name, list of links it contains
+ // the $linklist is an array where the keys are the page names
+ function SetWikiPageLinks($dbi, $pagename, $linklist) {
+ global $WikiLinksStore, $WikiScoreStore;
+
+ $frompage = addslashes($pagename);
+
+ // first delete the old list of links
+ mssql_query("delete from $WikiLinksStore where frompage='$frompage'",
+ $dbi["dbc"]);
+
+ // the page may not have links, return if not
+ if (! count($linklist))
+ return;
+ // now insert the new list of links
+ while (list($topage, $count) = each($linklist)) {
+ $topage = addslashes($topage);
+ if($topage != $frompage) {
+ mssql_query("insert into $WikiLinksStore (frompage, topage) " .
+ "values ('$frompage', '$topage')", $dbi["dbc"]);
+ }
+ }
+
+ // update pagescore
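+ // a page's score counts two-link chains ending at it: every link w2
+ // into w1.frompage adds one to the score of w1.topage, so pages reached
+ // from well-linked pages score higher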
+ mssql_query("delete from $WikiScoreStore", $dbi["dbc"]);
+ mssql_query("insert into $WikiScoreStore select w1.topage, count(*) from $WikiLinksStore as w1, $WikiLinksStore as w2 where w2.topage=w1.frompage group by w1.topage", $dbi["dbc"]);
+ }
+
+/* more mssql queries:
+
+orphans:
+select pagename from wiki left join wikilinks on pagename=topage where topage is NULL;
+*/
+?>
diff --git a/docroot/phpwiki/lib/mysql.php b/docroot/phpwiki/lib/mysql.php
new file mode 100755
index 0000000..34cc65f
--- /dev/null
+++ b/docroot/phpwiki/lib/mysql.php
@@ -0,0 +1,394 @@
+";
+ $msg .= sprintf(gettext ("MySQL error: %s"), mysql_error());
+ ExitWiki($msg);
+ }
+ if (!mysql_select_db($mysql_db, $dbc)) {
+ $msg = sprintf(gettext ("Cannot open database %s, giving up."), $mysql_db);
+ $msg .= " ";
+ $msg .= sprintf(gettext ("MySQL error: %s"), mysql_error());
+ ExitWiki($msg);
+ }
+ $dbi['dbc'] = $dbc;
+ $dbi['table'] = $dbname;
+ return $dbi;
+ }
+
+
+ function CloseDataBase($dbi) {
+ // NOP function
+ // mysql connections are established as persistent
+ // they cannot be closed through mysql_close()
+ }
+
+
+ // prepare $pagehash for storing in mysql
+ function MakeDBHash($pagename, $pagehash)
+ {
+ $pagehash["pagename"] = addslashes($pagename);
+ if (!isset($pagehash["flags"]))
+ $pagehash["flags"] = 0;
+ $pagehash["author"] = addslashes($pagehash["author"]);
+ $pagehash["content"] = implode("\n", $pagehash["content"]);
+ $pagehash["content"] = addslashes($pagehash["content"]);
+ if (!isset($pagehash["refs"]))
+ $pagehash["refs"] = array();
+ $pagehash["refs"] = serialize($pagehash["refs"]);
+
+ return $pagehash;
+ }
+
+
+ // convert mysql result $dbhash to $pagehash
+ function MakePageHash($dbhash)
+ {
+ // unserialize/explode content
+ $dbhash['refs'] = unserialize($dbhash['refs']);
+ $dbhash['content'] = explode("\n", $dbhash['content']);
+ return $dbhash;
+ }
+
+
+ // Return hash of page + attributes or default
+ function RetrievePage($dbi, $pagename, $pagestore) {
+ $pagename = addslashes($pagename);
+ if ($res = mysql_query("select * from $pagestore where pagename='$pagename'", $dbi['dbc'])) {
+ if ($dbhash = mysql_fetch_array($res)) {
+ return MakePageHash($dbhash);
+ }
+ }
+ return -1;
+ }
+
+
+ // Either insert or replace a key/value (a page)
+ function InsertPage($dbi, $pagename, $pagehash)
+ {
+ global $WikiPageStore; // ugly hack
+
+ if ($dbi['table'] == $WikiPageStore) { // HACK
+ $linklist = ExtractWikiPageLinks($pagehash['content']);
+ SetWikiPageLinks($dbi, $pagename, $linklist);
+ }
+
+ $pagehash = MakeDBHash($pagename, $pagehash);
+
+ $COLUMNS = "author, content, created, flags, " .
+ "lastmodified, pagename, refs, version";
+
+ $VALUES = "'$pagehash[author]', '$pagehash[content]', " .
+ "$pagehash[created], $pagehash[flags], " .
+ "$pagehash[lastmodified], '$pagehash[pagename]', " .
+ "'$pagehash[refs]', $pagehash[version]";
+
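+ // REPLACE INTO is MySQL-specific: it deletes any existing row with the
+ // same unique key before inserting, covering both new pages and updates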
+ if (!mysql_query("replace into $dbi[table] ($COLUMNS) values ($VALUES)",
+ $dbi['dbc'])) {
+ $msg = sprintf(gettext ("Error writing page '%s'"), $pagename);
+ $msg .= " ";
+ $msg .= sprintf(gettext ("MySQL error: %s"), mysql_error());
+ ExitWiki($msg);
+ }
+ }
+
+
+ // for archiving pages to the separate archive table
+ function SaveCopyToArchive($dbi, $pagename, $pagehash) {
+ global $ArchivePageStore;
+ $adbi = OpenDataBase($ArchivePageStore);
+ InsertPage($adbi, $pagename, $pagehash);
+ }
+
+
+ function IsWikiPage($dbi, $pagename) {
+ $pagename = addslashes($pagename);
+ if ($res = mysql_query("select count(*) from $dbi[table] where pagename='$pagename'", $dbi['dbc'])) {
+ return(mysql_result($res, 0));
+ }
+ return 0;
+ }
+
+ function IsInArchive($dbi, $pagename) {
+ global $ArchivePageStore;
+
+ $pagename = addslashes($pagename);
+ if ($res = mysql_query("select count(*) from $ArchivePageStore where pagename='$pagename'", $dbi['dbc'])) {
+ return(mysql_result($res, 0));
+ }
+ return 0;
+ }
+
+
+ function RemovePage($dbi, $pagename) {
+ global $WikiPageStore, $ArchivePageStore;
+ global $WikiLinksStore, $HitCountStore, $WikiScoreStore;
+
+ $pagename = addslashes($pagename);
+ $msg = gettext ("Cannot delete '%s' from table '%s'");
+ $msg .= " \n";
+ $msg .= gettext ("MySQL error: %s");
+
+ if (!mysql_query("delete from $WikiPageStore where pagename='$pagename'", $dbi['dbc']))
+ ExitWiki(sprintf($msg, $pagename, $WikiPageStore, mysql_error()));
+
+ if (!mysql_query("delete from $ArchivePageStore where pagename='$pagename'", $dbi['dbc']))
+ ExitWiki(sprintf($msg, $pagename, $ArchivePageStore, mysql_error()));
+
+ if (!mysql_query("delete from $WikiLinksStore where frompage='$pagename'", $dbi['dbc']))
+ ExitWiki(sprintf($msg, $pagename, $WikiLinksStore, mysql_error()));
+
+ if (!mysql_query("delete from $HitCountStore where pagename='$pagename'", $dbi['dbc']))
+ ExitWiki(sprintf($msg, $pagename, $HitCountStore, mysql_error()));
+
+ if (!mysql_query("delete from $WikiScoreStore where pagename='$pagename'", $dbi['dbc']))
+ ExitWiki(sprintf($msg, $pagename, $WikiScoreStore, mysql_error()));
+ }
+
+
+ function IncreaseHitCount($dbi, $pagename)
+ {
+ global $HitCountStore;
+
+ $qpagename = addslashes($pagename);
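+ // try the update first; if it touches no row (checked below with
+ // mysql_affected_rows) the page has no hit-count row yet, so insert one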
+ $res = mysql_query("update $HitCountStore set hits=hits+1"
+ . " where pagename='$qpagename'",
+ $dbi['dbc']);
+
+ if (!mysql_affected_rows($dbi['dbc'])) {
+ $res = mysql_query("insert into $HitCountStore (pagename, hits)"
+ . " values ('$qpagename', 1)",
+ $dbi['dbc']);
+ }
+
+ return $res;
+ }
+
+ function GetHitCount($dbi, $pagename)
+ {
+ global $HitCountStore;
+
+ $qpagename = addslashes($pagename);
+ $res = mysql_query("select hits from $HitCountStore"
+ . " where pagename='$qpagename'",
+ $dbi['dbc']);
+ if (mysql_num_rows($res))
+ $hits = mysql_result($res, 0);
+ else
+ $hits = "0";
+
+ return $hits;
+ }
+
+ function MakeSQLSearchClause($search, $column)
+ {
+ $search = preg_replace("/\s+/", " ", trim($search));
+ $search = preg_replace('/(?=[%_\\\\])/', "\\", $search);
+ $search = addslashes($search);
+
+ $term = strtok($search, ' ');
+ $clause = '';
+ while($term) {
+ $word = strtolower("$term");
+ if ($word[0] == '-') {
+ $word = substr($word, 1);
+ $clause .= "not (LCASE($column) like '%$word%') ";
+ } else {
+ $clause .= "(LCASE($column) like '%$word%') ";
+ }
+ if ($term = strtok(' '))
+ $clause .= 'AND ';
+ }
+
+ return $clause;
+ }
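+
+ // unlike the mssql version above, this clause lowercases both the search
+ // terms (strtolower) and the column (LCASE) for case-insensitive matching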
+
+ // setup for title-search
+ function InitTitleSearch($dbi, $search) {
+ $clause = MakeSQLSearchClause($search, 'pagename');
+ $res = mysql_query("select pagename from $dbi[table] where $clause order by pagename", $dbi["dbc"]);
+
+ return $res;
+ }
+
+
+ // iterating through database
+ function TitleSearchNextMatch($dbi, $res) {
+ if($o = mysql_fetch_object($res)) {
+ return $o->pagename;
+ }
+ else {
+ return 0;
+ }
+ }
+
+
+ // setup for full-text search
+ function InitFullSearch($dbi, $search) {
+ $clause = MakeSQLSearchClause($search, 'content');
+ $res = mysql_query("select * from $dbi[table] where $clause", $dbi["dbc"]);
+
+ return $res;
+ }
+
+ // iterating through database
+ function FullSearchNextMatch($dbi, $res) {
+ if($hash = mysql_fetch_array($res)) {
+ return MakePageHash($hash);
+ }
+ else {
+ return 0;
+ }
+ }
+
+ // setup for back-link search
+ function InitBackLinkSearch($dbi, $pagename) {
+ global $WikiLinksStore;
+
+ $topage = addslashes($pagename);
+ $res = mysql_query( "SELECT DISTINCT frompage FROM $WikiLinksStore"
+ . " WHERE topage='$topage'"
+ . " ORDER BY frompage",
+ $dbi["dbc"]);
+ return $res;
+ }
+
+
+ // iterating through database
+ function BackLinkSearchNextMatch($dbi, $res) {
+ if($a = mysql_fetch_row($res)) {
+ return $a[0];
+ }
+ else {
+ return 0;
+ }
+ }
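+
+ // typical use, matching the other Init*/NextMatch pairs in this file:
+ // $handle = InitBackLinkSearch($dbi, $pagename);
+ // while ($from = BackLinkSearchNextMatch($dbi, $handle)) { ... }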
+
+
+ function InitMostPopular($dbi, $limit) {
+ global $HitCountStore;
+ $res = mysql_query("select * from $HitCountStore order by hits desc, pagename limit $limit", $dbi["dbc"]);
+
+ return $res;
+ }
+
+ function MostPopularNextMatch($dbi, $res) {
+ if ($hits = mysql_fetch_array($res))
+ return $hits;
+ else
+ return 0;
+ }
+
+ function GetAllWikiPageNames($dbi) {
+ global $WikiPageStore;
+ $res = mysql_query("select pagename from $WikiPageStore", $dbi["dbc"]);
+ $rows = mysql_num_rows($res);
+ for ($i = 0; $i < $rows; $i++) {
+ $pages[$i] = mysql_result($res, $i);
+ }
+ return $pages;
+ }
+
+
+ ////////////////////////////////////////
+ // functionality for the wikilinks table
+
+ // takes a page name, returns array of scored incoming and outgoing links
+ function GetWikiPageLinks($dbi, $pagename) {
+ global $WikiLinksStore, $WikiScoreStore, $HitCountStore;
+
+ $pagename = addslashes($pagename);
+ $res = mysql_query("select topage, score from $WikiLinksStore, $WikiScoreStore where topage=pagename and frompage='$pagename' order by score desc, topage");
+ $rows = mysql_num_rows($res);
+ for ($i = 0; $i < $rows; $i++) {
+ $out = mysql_fetch_array($res);
+ $links['out'][] = array($out['topage'], $out['score']);
+ }
+
+ $res = mysql_query("select frompage, score from $WikiLinksStore, $WikiScoreStore where frompage=pagename and topage='$pagename' order by score desc, frompage");
+ $rows = mysql_num_rows($res);
+ for ($i = 0; $i < $rows; $i++) {
+ $out = mysql_fetch_array($res);
+ $links['in'][] = array($out['frompage'], $out['score']);
+ }
+
+ $res = mysql_query("select distinct pagename, hits from $WikiLinksStore, $HitCountStore where (frompage=pagename and topage='$pagename') or (topage=pagename and frompage='$pagename') order by hits desc, pagename");
+ $rows = mysql_num_rows($res);
+ for ($i = 0; $i < $rows; $i++) {
+ $out = mysql_fetch_array($res);
+ $links['popular'][] = array($out['pagename'], $out['hits']);
+ }
+
+ return $links;
+ }
+
+
+ // takes page name, list of links it contains
+ // the $linklist is an array where the keys are the page names
+ function SetWikiPageLinks($dbi, $pagename, $linklist) {
+ global $WikiLinksStore, $WikiScoreStore;
+
+ $frompage = addslashes($pagename);
+
+ // first delete the old list of links
+ mysql_query("delete from $WikiLinksStore where frompage='$frompage'",
+ $dbi["dbc"]);
+
+ // the page may not have links, return if not
+ if (! count($linklist))
+ return;
+ // now insert the new list of links
+ while (list($topage, $count) = each($linklist)) {
+ $topage = addslashes($topage);
+ if($topage != $frompage) {
+ mysql_query("insert into $WikiLinksStore (frompage, topage) " .
+ "values ('$frompage', '$topage')", $dbi["dbc"]);
+ }
+ }
+
+ // update pagescore
+ mysql_query("delete from $WikiScoreStore", $dbi["dbc"]);
+ mysql_query("insert into $WikiScoreStore select w1.topage, count(*) from $WikiLinksStore as w1, $WikiLinksStore as w2 where w2.topage=w1.frompage group by w1.topage", $dbi["dbc"]);
+ }
+
+/* more mysql queries:
+
+orphans:
+select pagename from wiki left join wikilinks on pagename=topage where topage is NULL;
+*/
+?>
diff --git a/docroot/phpwiki/lib/pageinfo.php b/docroot/phpwiki/lib/pageinfo.php
new file mode 100755
index 0000000..add912f
--- /dev/null
+++ b/docroot/phpwiki/lib/pageinfo.php
@@ -0,0 +1,76 @@
+
+
+\n" .
+ " " .
+ " $enter\n" .
+ " \n" .
+ " \n";
+
+ while (list($key, $val) = each($pagehash)) {
+ if ($key > 0 || !$key) #key is an array index
+ continue;
+ if ((gettype($val) == "array") && ($showpagesource == "on")) {
+ $val = implode($val, "$FieldSeparator#BR#$FieldSeparator\n");
+ $val = htmlspecialchars($val);
+ $val = str_replace("$FieldSeparator#BR#$FieldSeparator", " ", $val);
+ }
+ elseif (($key == 'lastmodified') || ($key == 'created'))
+ $val = date($datetimeformat, $val);
+ else
+ $val = htmlspecialchars($val);
+
+ $table .= "
$key $val \n";
+ }
+
+ $table .= "";
+ }
+ return $table;
+ }
+
+ $html .= "";
+ $html .= gettext ("Current version");
+ $html .= "
";
+ // $dbi = OpenDataBase($WikiPageStore); --- done by index.php
+ $html .= ViewPageProps($info, $WikiPageStore);
+
+ $html .= "";
+ $html .= gettext ("Archived version");
+ $html .= "
";
+ // $dbi = OpenDataBase($ArchivePageStore);
+ $html .= ViewPageProps($info, $ArchivePageStore);
+
+ GeneratePage('MESSAGE', $html, gettext("PageInfo").": '$info'", 0);
+?>
diff --git a/docroot/phpwiki/lib/pgsql.php b/docroot/phpwiki/lib/pgsql.php
new file mode 100755
index 0000000..1b40e77
--- /dev/null
+++ b/docroot/phpwiki/lib/pgsql.php
@@ -0,0 +1,449 @@
+dbi after open: '$dbi' '$dbi[table]' '$dbi[dbc]'\n";
+ return $dbi;
+ }
+
+
+ function CloseDataBase($dbi) {
+ // NOOP: we use persistent database connections
+ }
+
+
+ // Return hash of page + attributes or default
+ function RetrievePage($dbi, $pagename, $pagestore) {
+ $pagename = addslashes($pagename);
+ $query = "select * from $pagestore where pagename='$pagename'";
+ // echo "
$query
";
+ $res = pg_exec($dbi['dbc'], $query);
+
+ if (pg_numrows($res)) {
+ if ($array = pg_fetch_array($res, 0)) {
+ while (list($key, $val) = each($array)) {
+ // pg_fetch_array gives us all the values twice,
+ // so we have to manually edit out the indices
+ if (gettype($key) == "integer") {
+ continue;
+ }
+ $pagehash[$key] = $val;
+ }
+
+ // unserialize/explode content
+ $pagehash['refs'] = unserialize($pagehash['refs']);
+ $pagehash['content'] = explode("\n", $pagehash['content']);
+
+ return $pagehash;
+ }
+ }
+
+ // if we reach this the query failed
+ return -1;
+ }
+
+
+ // Either insert or replace a key/value (a page)
+ function InsertPage($dbi, $pagename, $pagehash) {
+ // update the wikilinks table
+ $linklist = ExtractWikiPageLinks($pagehash['content']);
+ SetWikiPageLinks($dbi, $pagename, $linklist);
+
+
+ // prepare the content for storage
+ if (!isset($pagehash["pagename"]))
+ $pagehash["pagename"] = $pagename;
+ if (!isset($pagehash["flags"]))
+ $pagehash["flags"] = 0;
+ $pagehash["author"] = addslashes($pagehash["author"]);
+ $pagehash["content"] = implode("\n", $pagehash["content"]);
+ $pagehash["content"] = addslashes($pagehash["content"]);
+ $pagehash["pagename"] = addslashes($pagehash["pagename"]);
+ $pagehash["refs"] = serialize($pagehash["refs"]);
+
+ // Check for empty variables which can cause a sql error
+ if(empty($pagehash["created"]))
+ $pagehash["created"] = time();
+ if(empty($pagehash["version"]))
+ $pagehash["version"] = 1;
+
+ // record the time of modification
+ $pagehash["lastmodified"] = time();
+
+
+ if (IsWikiPage($dbi, $pagename)) {
+
+ $PAIRS = "author='$pagehash[author]'," .
+ "content='$pagehash[content]'," .
+ "created=$pagehash[created]," .
+ "flags=$pagehash[flags]," .
+ "lastmodified=$pagehash[lastmodified]," .
+ "pagename='$pagehash[pagename]'," .
+ "refs='$pagehash[refs]'," .
+ "version=$pagehash[version]";
+
+ $query = "UPDATE $dbi[table] SET $PAIRS WHERE pagename='$pagename'";
+
+ } else {
+ // do an insert
+ // build up the column names and values for the query
+
+ $COLUMNS = "author, content, created, flags, " .
+ "lastmodified, pagename, refs, version";
+
+ $VALUES = "'$pagehash[author]', '$pagehash[content]', " .
+ "$pagehash[created], $pagehash[flags], " .
+ "$pagehash[lastmodified], '$pagehash[pagename]', " .
+ "'$pagehash[refs]', $pagehash[version]";
+
+
+ $query = "INSERT INTO $dbi[table] ($COLUMNS) VALUES($VALUES)";
+ }
+
+ // echo "
Query: $query
\n";
+ $retval = pg_exec($dbi['dbc'], $query);
+ if ($retval == false)
+ echo "Insert/update failed: " . pg_errormessage($dbi['dbc']);
+
+ }
+
+
+ function SaveCopyToArchive($dbi, $pagename, $pagehash) {
+ global $ArchivePageStore;
+ // echo "
save copy called
";
+
+ // echo "
dbi in SaveCopyToArchive: '$dbi' '$ArchivePageStore' '$dbi[dbc]'
";
+
+ // prepare the content for storage
+ if (!isset($pagehash["pagename"]))
+ $pagehash["pagename"] = $pagename;
+ if (!isset($pagehash["flags"]))
+ $pagehash["flags"] = 0;
+ $pagehash["author"] = addslashes($pagehash["author"]);
+ $pagehash["content"] = implode("\n", $pagehash["content"]);
+ $pagehash["content"] = addslashes($pagehash["content"]);
+ $pagehash["pagename"] = addslashes($pagehash["pagename"]);
+ $pagehash["refs"] = serialize($pagehash["refs"]);
+
+ if (IsInArchive($dbi, $pagename)) {
+
+ $PAIRS = "author='$pagehash[author]'," .
+ "content='$pagehash[content]'," .
+ "created=$pagehash[created]," .
+ "flags=$pagehash[flags]," .
+ "lastmodified=$pagehash[lastmodified]," .
+ "pagename='$pagehash[pagename]'," .
+ "refs='$pagehash[refs]'," .
+ "version=$pagehash[version]";
+
+ $query = "UPDATE $ArchivePageStore SET $PAIRS WHERE pagename='$pagehash[pagename]'";
+
+ } else {
+ // do an insert
+ // build up the column names and values for the query
+
+ $COLUMNS = "author, content, created, flags, " .
+ "lastmodified, pagename, refs, version";
+
+ $VALUES = "'$pagehash[author]', '$pagehash[content]', " .
+ "$pagehash[created], $pagehash[flags], " .
+ "$pagehash[lastmodified], '$pagehash[pagename]', " .
+ "'$pagehash[refs]', $pagehash[version]";
+
+
+ $query = "INSERT INTO $ArchivePageStore ($COLUMNS) VALUES($VALUES)";
+ }
+
+ // echo "
Query: $query
\n";
+ $retval = pg_exec($dbi['dbc'], $query);
+ if ($retval == false)
+ echo "Insert/update failed: " . pg_errormessage($dbi['dbc']);
+
+
+ }
+
+
+ function IsWikiPage($dbi, $pagename) {
+ global $WikiPageStore;
+ $pagename = addslashes($pagename);
+ $query = "select count(*) from $WikiPageStore " .
+ "where pagename='$pagename'";
+ $res = pg_exec($dbi['dbc'], $query);
+ $array = pg_fetch_array($res, 0);
+ return $array[0];
+ }
+
+
+ function IsInArchive($dbi, $pagename) {
+ global $ArchivePageStore;
+ $pagename = addslashes($pagename);
+ $query = "select count(*) from $ArchivePageStore " .
+ "where pagename='$pagename'";
+ $res = pg_exec($dbi['dbc'], $query);
+ $array = pg_fetch_array($res, 0);
+ return $array[0];
+ }
+
+
+ // setup for title-search
+ function InitTitleSearch($dbi, $search) {
+
+ global $search_counter;
+ $search_counter = 0;
+
+ $search = strtolower($search);
+ $search = preg_replace('/(?=[%_\\\\])/', "\\", $search);
+ $search = addslashes($search);
+ $query = "select pagename from $dbi[table] where lower(pagename) " .
+ "like '%$search%' order by pagename";
+ //echo "search query: $query \n";
+ $res = pg_exec($dbi["dbc"], $query);
+
+ return $res;
+ }
+
+
+ // iterating through database
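+ // pg_fetch_object() takes an explicit row number, so the cursor position
+ // is kept in the global $search_counter between calls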
+ function TitleSearchNextMatch($dbi, $res) {
+ global $search_counter;
+ if($o = @pg_fetch_object($res, $search_counter)) {
+ $search_counter++;
+ return $o->pagename;
+ } else {
+ return 0;
+ }
+ }
+
+
+ // setup for full-text search
+ function InitFullSearch($dbi, $search) {
+ global $search_counter;
+ $search_counter = 0;
+ $search = strtolower($search);
+ $search = preg_replace('/(?=[%_\\\\])/', "\\", $search);
+ $search = addslashes($search);
+ $query = "select pagename,content from $dbi[table] " .
+ "where lower(content) like '%$search%'";
+
+ $res = pg_exec($dbi["dbc"], $query);
+
+ return $res;
+ }
+
+ // iterating through database
+ function FullSearchNextMatch($dbi, $res) {
+ global $search_counter;
+ if ($hash = @pg_fetch_array($res, $search_counter)) {
+ $search_counter++;
+ $page['pagename'] = $hash["pagename"];
+ $page['content'] = explode("\n", $hash["content"]);
+ return $page;
+ }
+ else {
+ return 0;
+ }
+ }
+
+
+ ////////////////////////
+ // new database features
+
+ // setup for back-link search
+ function InitBackLinkSearch($dbi, $pagename) {
+ global $WikiLinksPageStore;
+
+ $topage = addslashes($pagename);
+ $query = "SELECT DISTINCT frompage FROM $WikiLinksPageStore"
+ . " WHERE topage='$topage'"
+ . " ORDER BY frompage";
+ $res['res'] = pg_exec( $dbi["dbc"], $query);
+ $res['row'] = 0;
+ return $res;
+ }
+
+
+// iterating through database
+function BackLinkSearchNextMatch($dbi, &$res) {
+ if($a = @pg_fetch_row($res['res'], $res['row'])) {
+ $res['row']++;
+ return $a[0];
+ }
+ else {
+ return 0;
+ }
+}
+
+
+ function IncreaseHitCount($dbi, $pagename) {
+ global $HitCountPageStore;
+
+ $qpagename = addslashes($pagename);
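+ // try the update first; pg_cmdtuples() reports the rows affected, and
+ // zero means the page has no hit-count row yet, so one is inserted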
+ $query = "update $HitCountPageStore set hits=hits+1 where pagename='$qpagename'";
+ $res = pg_exec($dbi['dbc'], $query);
+
+ if (!pg_cmdtuples($res)) {
+ $query = "insert into $HitCountPageStore (pagename, hits) " .
+ "values ('$qpagename', 1)";
+ $res = pg_exec($dbi['dbc'], $query);
+ }
+
+ return $res;
+ }
+
+ function GetHitCount($dbi, $pagename) {
+ global $HitCountPageStore;
+ $qpagename = addslashes($pagename);
+ $query = "select hits from $HitCountPageStore where pagename='$qpagename'";
+ $res = pg_exec($dbi['dbc'], $query);
+ if (pg_cmdtuples($res)) {
+ $hits = pg_result($res, 0, "hits");
+ } else {
+ $hits = "0";
+ }
+
+ return $hits;
+ }
+
+
+
+ function InitMostPopular($dbi, $limit) {
+
+ global $pg_most_pop_ctr, $HitCountPageStore;
+ $pg_most_pop_ctr = 0;
+
+ $query = "select * from $HitCountPageStore " .
+ "order by hits desc, pagename limit $limit";
+ $res = pg_exec($dbi['dbc'], $query);
+ return $res;
+ }
+
+ function MostPopularNextMatch($dbi, $res) {
+
+ global $pg_most_pop_ctr;
+ if ($hits = @pg_fetch_array($res, $pg_most_pop_ctr)) {
+ $pg_most_pop_ctr++;
+ return $hits;
+ } else {
+ return 0;
+ }
+ }
+
+ function GetAllWikiPageNames($dbi) {
+ global $WikiPageStore;
+ $res = pg_exec($dbi['dbc'], "select pagename from $WikiPageStore");
+ $rows = pg_numrows($res);
+ for ($i = 0; $i < $rows; $i++) {
+ $pages[$i] = pg_result($res, $i, "pagename");
+ }
+ return $pages;
+ }
+
+ ////////////////////////////////////////
+ // functionality for the wikilinks table
+
+ // takes a page name, returns array of links
+ function GetWikiPageLinks($dbi, $pagename) {
+ global $WikiLinksPageStore;
+ $pagename = addslashes($pagename);
+
+ $res = pg_exec("select topage, score from wikilinks, wikiscore where topage=pagename and frompage='$pagename' order by score desc, topage");
+ $rows = pg_numrows($res);
+ for ($i = 0; $i < $rows; $i++) {
+ $out = pg_fetch_array($res, $i);
+ $links['out'][] = array($out['topage'], $out['score']);
+ }
+
+ $res = pg_exec("select frompage, score from wikilinks, wikiscore where frompage=pagename and topage='$pagename' order by score desc, frompage");
+ $rows = pg_numrows($res);
+ for ($i = 0; $i < $rows; $i++) {
+ $out = pg_fetch_array($res, $i);
+ $links['in'][] = array($out['frompage'], $out['score']);
+ }
+
+ $res = pg_exec("select distinct pagename, hits from wikilinks, hitcount where (frompage=pagename and topage='$pagename') or (topage=pagename and frompage='$pagename') order by hits desc, pagename");
+ $rows = pg_numrows($res);
+ for ($i = 0; $i < $rows; $i++) {
+ $out = pg_fetch_array($res, $i);
+ $links['popular'][] = array($out['pagename'], $out['hits']);
+ }
+
+ return $links;
+
+ }
+
+
+ // takes page name, list of links it contains
+ // the $linklist is an array where the keys are the page names
+
+ function SetWikiPageLinks($dbi, $pagename, $linklist) {
+ global $WikiLinksPageStore;
+ $frompage = addslashes($pagename);
+
+ // first delete the old list of links
+ $query = "delete from $WikiLinksPageStore where frompage='$frompage'";
+ //echo "$query \n";
+ $res = pg_exec($dbi['dbc'], $query);
+
+ // the page may not have links, return if not
+ if (! count($linklist))
+ return;
+
+ // now insert the new list of links
+ reset($linklist);
+ while (list($topage, $count) = each($linklist)) {
+ $topage = addslashes($topage);
+ if ($topage != $frompage) {
+ $query = "insert into $WikiLinksPageStore (frompage, topage) " .
+ "values ('$frompage', '$topage')";
+ //echo "$query \n";
+ $res = pg_exec($dbi['dbc'], $query);
+ }
+ }
+ // update pagescore
+ pg_exec("delete from wikiscore");
+ pg_exec("insert into wikiscore select w1.topage, count(*) from wikilinks as w1, wikilinks as w2 where w2.topage=w1.frompage group by w1.topage");
+
+ }
+
+
+?>
diff --git a/docroot/phpwiki/lib/savepage.php b/docroot/phpwiki/lib/savepage.php
new file mode 100755
index 0000000..28bbbc9
--- /dev/null
+++ b/docroot/phpwiki/lib/savepage.php
@@ -0,0 +1,208 @@
+";
+ $html .= gettext ("PhpWiki is unable to save your changes, because another user edited and saved the page while you were editing the page too. If saving proceeded now changes from the previous author would be lost.");
+ $html .= "
\n";
+ $html .= gettext ("In order to recover from this situation follow these steps:");
+ $html .= "\n
";
+ $html .= gettext ("Use your browser's Back button to go back to the edit page.");
+ $html .= "\n ";
+ $html .= gettext ("Copy your changes to the clipboard or to another temporary place (e.g. text editor).");
+ $html .= "\n ";
+ $html .= gettext ("Reload the page. You should now see the most current version of the page. Your changes are no longer there.");
+ $html .= "\n ";
+ $html .= gettext ("Make changes to the file again. Paste your additions from the clipboard (or text editor).");
+ $html .= "\n ";
+ $html .= gettext ("Press Save again.");
+ $html .= " \n";
+ $html .= gettext ("Sorry for the inconvenience.");
+ $html .= "
";
+
+ GeneratePage('MESSAGE', $html,
+ sprintf (gettext ("Problem while updating %s"), $pagename), 0);
+ exit;
+ }
+
+
+ if (get_magic_quotes_gpc()) {
+ $post = stripslashes($post);
+ }
+ $pagename = rawurldecode($post);
+ $pagehash = RetrievePage($dbi, $pagename, $WikiPageStore);
+
+ // if this page doesn't exist yet, now's the time!
+ if (! is_array($pagehash)) {
+ $pagehash = array();
+ $pagehash['version'] = 0;
+ $pagehash['created'] = time();
+ $pagehash['flags'] = 0;
+ $newpage = 1;
+ } else {
+ if (($pagehash['flags'] & FLAG_PAGE_LOCKED) && !defined('WIKI_ADMIN')) {
+ $html = "" . gettext ("This page has been locked by the administrator and cannot be edited.");
+ $html .= "\n
" . gettext ("Sorry for the inconvenience.");
+ GeneratePage('MESSAGE', $html, sprintf (gettext ("Problem while editing %s"), $pagename), 0);
+ ExitWiki ("");
+ }
+
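+ // the version number embedded in the edit form must still match the
+ // stored page; if it does not, someone else saved while we were editing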
+ if(isset($editversion) && ($editversion != $pagehash['version'])) {
+ ConcurrentUpdates($pagename);
+ }
+
+ // archive it if it's a new author
+ if ($pagehash['author'] != $remoteuser) {
+ SaveCopyToArchive($dbi, $pagename, $pagehash);
+ }
+ $newpage = 0;
+ }
+
+ // set new pageinfo
+ $pagehash['lastmodified'] = time();
+ $pagehash['version']++;
+ $pagehash['author'] = $remoteuser;
+
+ // create page header
+ $enc_url = rawurlencode($pagename);
+ $enc_name = htmlspecialchars($pagename);
+ $html = sprintf(gettext("Thank you for editing %s."),
+ "$enc_name ");
+ $html .= " \n";
+
+ if (! empty($content)) {
+ // patch from Grant Morgan for magic_quotes_gpc
+ if (get_magic_quotes_gpc())
+ $content = stripslashes($content);
+
+ $pagehash['content'] = preg_split('/[ \t\r]*\n/', chop($content));
+
+ // convert spaces to tabs at user request
+ if (isset($convert)) {
+ $pagehash['content'] = CookSpaces($pagehash['content']);
+ }
+ }
+
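+ // collect the numbered reference fields (r1, r2, ... up to NUM_LINKS)
+ // posted by the edit form; only links whose protocol appears in
+ // $AllowedProtocols are kept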
+ for ($i = 1; $i <= NUM_LINKS; $i++) {
+ if (! empty(${'r'.$i})) {
+ if (preg_match("#^($AllowedProtocols):#", ${'r'.$i}))
+ $pagehash['refs'][$i] = ${'r'.$i};
+ else
+ $html .= "Link [$i]: unknown protocol " .
+ " - use one of $AllowedProtocols - link discarded.
\n";
+ }
+ }
+
+ InsertPage($dbi, $pagename, $pagehash);
+ UpdateRecentChanges($dbi, $pagename, $newpage);
+
+ $html .= gettext ("Your careful attention to detail is much appreciated.");
+ $html .= "\n";
+
+ // fixme: no test for flat file db system
+ if (isset($DBMdir) && preg_match('@^/tmp\b@', $DBMdir)) {
+ $html .= "Warning: the Wiki DB files still live in the " .
+ "/tmp directory. Please read the INSTALL file and move " .
+ "the DBM file to a permanent location or risk losing " .
+ "all the pages! \n";
+ }
+
+ if (!empty($SignatureImg))
+ $html .= "
\n";
+
+ $html .= "";
+ include('lib/transform.php');
+
+ GeneratePage('BROWSE', $html, $pagename, $pagehash);
+?>
diff --git a/docroot/phpwiki/lib/search.php b/docroot/phpwiki/lib/search.php
new file mode 100755
index 0000000..42ceabc
--- /dev/null
+++ b/docroot/phpwiki/lib/search.php
@@ -0,0 +1,34 @@
+"
+ . sprintf(gettext ("Searching for \"%s\" ....."),
+ htmlspecialchars($search))
+ . "
\n";
+
+ // quote regexp chars (backends should do this...)
+ //$search = preg_quote($search);
+
+ // search matching pages
+ $found = 0;
+ if (strlen($search)) {
+ $query = InitTitleSearch($dbi, $search);
+ while ($page = TitleSearchNextMatch($dbi, $query)) {
+ $found++;
+ $html .= LinkExistingWikiWord($page) . " \n";
+ }
+ }
+ else {
+ $html .= gettext("(You entered an empty search string)") . " \n";
+ }
+
+ $html .= " \n"
+ . sprintf(gettext ("%d pages match your query."), $found)
+ . "\n";
+
+ GeneratePage('MESSAGE', $html, gettext ("Title Search Results"), 0);
+?>
diff --git a/docroot/phpwiki/lib/setupwiki.php b/docroot/phpwiki/lib/setupwiki.php
new file mode 100755
index 0000000..ba3a2ef
--- /dev/null
+++ b/docroot/phpwiki/lib/setupwiki.php
@@ -0,0 +1,118 @@
+
+" . htmlspecialchars ($pagename) . "", $version, $source);
+ print (" \n");
+
+ flush();
+ InsertPage($dbi, $pagename, $page);
+}
+
+function LoadFile ($dbi, $filename, $text, $mtime)
+{
+ set_time_limit(30); // Reset watchdog.
+ if (!$mtime)
+ $mtime = time(); // Last resort.
+
+ $defaults = array('author' => 'The PhpWiki programming team',
+ 'pagename' => rawurldecode($filename),
+ 'created' => $mtime,
+ 'flags' => 0,
+ 'lastmodified' => $mtime,
+ 'refs' => array(),
+ 'version' => 1);
+
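+ // these defaults fully describe a plain text file, and below they also
+ // fill in any header a MIME-ified page omits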
+ if (!($parts = ParseMimeifiedPages($text)))
+ {
+ // Can't parse MIME: assume plain text file.
+ $page = $defaults;
+ $page['pagename'] = rawurldecode($filename);
+ $page['content'] = preg_split('/[ \t\r]*\n/', chop($text));
+ SavePage($dbi, $page, "text file");
+ }
+ else
+ {
+ for (reset($parts); $page = current($parts); next($parts))
+ {
+ // Fill in defaults for missing values?
+ // Should we do more sanity checks here?
+ reset($defaults);
+ while (list($key, $val) = each($defaults))
+ if (!isset($page[$key]))
+ $page[$key] = $val;
+
+ if ($page['pagename'] != rawurldecode($filename))
+ printf("Warning: "
+ . "pagename (%s) doesn't match filename (%s)"
+ . " (using pagename) \n",
+ htmlspecialchars($page['pagename']),
+ htmlspecialchars(rawurldecode($filename)));
+
+ SavePage($dbi, $page, "MIME file");
+ }
+ }
+}
+
+function LoadZipOrDir ($dbi, $zip_or_dir)
+{
+ global $LANG, $genericpages;
+
+ $type = filetype($zip_or_dir);
+
+ if ($type == 'file')
+ {
+ $zip = new ZipReader($zip_or_dir);
+ while (list ($fn, $data, $attrib) = $zip->readFile())
+ LoadFile($dbi, $fn, $data, $attrib['mtime']);
+ }
+ else if ($type == 'dir')
+ {
+ $handle = opendir($dir = $zip_or_dir);
+
+ // load default pages
+ while ($fn = readdir($handle))
+ {
+ if ($fn[0] == '.' || filetype("$dir/$fn") != 'file')
+ continue;
+ $stat = stat("$dir/$fn");
+ $mtime = $stat[9];
+ LoadFile($dbi, $fn, implode("", file("$dir/$fn")), $mtime);
+ }
+ closedir($handle);
+
+ if ($LANG != "C") { // if language is not default, then insert
+ // generic pages from the English ./pgsrc
+ reset($genericpages);
+ $dir = DEFAULT_WIKI_PGSRC;
+ while (list(, $fn) = each($genericpages)) {
+ // use each file's own mtime, not the value left over from the loop above
+ $stat = stat("$dir/$fn");
+ LoadFile($dbi, $fn, implode("", file("$dir/$fn")), $stat[9]);
+ }
+ }
+ }
+}
+
+$genericpages = array(
+ "ReleaseNotes",
+ "SteveWainstead",
+ "TestPage"
+ );
+
+LoadZipOrDir($dbi, WIKI_PGSRC);
+?>
diff --git a/docroot/phpwiki/lib/stdlib.php b/docroot/phpwiki/lib/stdlib.php
new file mode 100755
index 0000000..12f62ea
--- /dev/null
+++ b/docroot/phpwiki/lib/stdlib.php
@@ -0,0 +1,521 @@
+ '') {
+ print "
" . gettext("WikiFatalError") . " \n";
+ print $errormsg;
+ print "\n