Hi all,
I've seen that the FancyCaptcha system is enabled on mediawiki.org.
Could Brion or any admin of www.mediawiki.org tell me how to make
images show up instead of the arithmetic questions of SimpleCaptcha?
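I am guessing it is something along these lines in LocalSettings.php (going
by the file names in the ConfirmEdit extension; the exact settings may well
be wrong, which is why I am asking):

# my guess so far -- paths and variable names unverified
require_once( "$IP/extensions/ConfirmEdit/ConfirmEdit.php" );
require_once( "$IP/extensions/ConfirmEdit/FancyCaptcha.php" );
$wgCaptchaClass = 'FancyCaptcha';
# FancyCaptcha apparently reads pre-generated images from a directory
$wgCaptchaDirectory = "$wgUploadDirectory/captcha";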
Thanks a lot.
François
I have two new extensions to throw into the mix:
1) Graphviz embedding extension. It generates Graphviz
(http://www.graphviz.org) images from embedded markup (see the sketch
after this list for the general shape). Although one already exists, I
couldn't find the source code, so I wrote a new one.
2) Graph generating SpecialPage. This extension allows you to query
your wiki for both XML and Graphviz markup of subsets of its content.
As an added bonus, the content can be transparently included in pages,
so dynamic graphs (images) of relations can be generated.
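For the curious: the embedding extension is an ordinary parser tag
extension. A simplified sketch of the plumbing looks like the following
(this is not the actual source; the real code also shells out to dot,
caches the rendered images, and handles errors):

$wgExtensionFunctions[] = 'wfGraphvizExtension';

function wfGraphvizExtension() {
    global $wgParser;
    # make <graphviz>...</graphviz> available in wikitext
    $wgParser->setHook( 'graphviz', 'wfRenderGraphviz' );
}

function wfRenderGraphviz( $input ) {
    # $input is the raw DOT source between the tags; render it to an
    # image file and return an <img> tag pointing at the result
    return '<img src="/images/graphs/example.png" alt="graph" />';
}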
The extensions can be downloaded from
http://opensource.case.edu/projects/MediaWikiHacks/
A live example of these extensions is the Case Western Reserve
University Wiki (http://wiki.case.edu). Some links of interest are:
http://wiki.case.edu/Special:Graphs
http://wiki.case.edu/Category:Staff
http://wiki.case.edu/Category:ITS_Employees
http://wiki.case.edu/CaseWiki:Categories
As you can see, for all smaller category pages, a graph of the category
layout is automatically generated. (Yes, I know about the font issues).
This extension is a work in progress and is a little rough around the
edges. I welcome any input others can provide and am open to
suggestions for future possibilities.
Gregory Szorc
gregory.szorc(a)case.edu
Hi all,
we were forced to reinstall one of our web servers, which contained
a MediaWiki 1.5.0 installation using MySQL for storage.
The database was backed up every day using mysqlhotcopy, run from a cron job.
After reinstalling all services, I copied the wikidb back into the MySQL
data dir. Surfing to our wiki's web page then gave me a "cannot find
wikidb.pages" error.
I realized that the installation was missing files for four tables
(only the .frm files were in the directory; the .MYD and .MYI were gone):
page
pagelinks
revision
user_groups
All other tables are intact.
As I understand the structure of the wiki, the text of the pages is stored
in the "text" table, while page and pagelinks only hold the structure.
Is there a way to recover our wiki using the text table (and other
tables if necessary)?
I really don't have a clue why mysqlhotcopy missed the .MYD and .MYI files.
Any ideas?
Thanks for any help,
gw
Hiho :)
Does anyone know how to write a special page and/or a (PHP?) script
where users can pick some categories (from all of them, or from a
specific selection created by a sysop) and then a page shows all
articles that are in the chosen categories? Sorry, but I couldn't find
any information about that on the internet.
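The SQL part I have in mind is something like the query below (guessing at
the categorylinks table; $selected is a placeholder for whatever the form
would submit), but I have no idea how to wrap it in a special page:

<?php
# sketch: find the page ids that belong to ALL of the chosen categories
$selected = array( 'Category_A', 'Category_B' );   # placeholder names
$list = "'" . implode( "','", array_map( 'addslashes', $selected ) ) . "'";
$sql = "SELECT cl_from
          FROM categorylinks
         WHERE cl_to IN ($list)
      GROUP BY cl_from
        HAVING COUNT(*) = " . count( $selected );
# each cl_from is a page_id; the titles would come from the page table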
Thanks for your help! Florian :)
Hi everyone,
I thought I would mention a couple of MediaWiki extensions I have developed in
case they might be useful to the community.
One extension is an interface that lets users export multiple wiki pages
to XML, PDF and HTML more easily by using the search function. The open
source HTML2FPDF library is used for the PDF exports; MediaWiki's native
XML exporter is used for the XML. The search works like this: the user
enters a search term, and the names of the pages in which the term appears
are appended to the list of pages to be exported. Those pages can then be
dumped at any time to any of the formats. An installation with this
extension can be seen at
http://pgrdictionary.org/en/index.php/Special:DocumentExport (this install
is a plant genetic resources dictionary, so use plant genetics terms, such
as "chromosome", to get search results).
The other extension goes through every page in your wiki, looking for wiki
page titles in the content of each page and wiki-linking those words to
their corresponding pages. It's meant to be run after you have imported
fresh content into your wiki, when that content does not yet contain any
links between pages.
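At its core it is not much more than a pass like the following (heavily
simplified; the real code also worries about namespaces, case, and not
linking text that is already inside a link):

# simplified core of the linking pass: wrap the first occurrence of
# each known page title in [[...]]
function wfAutoLinkText( $text, $allTitles ) {
    foreach ( $allTitles as $title ) {
        $pattern = '/\b' . preg_quote( $title, '/' ) . '\b/';
        $text = preg_replace( $pattern, '[[' . $title . ']]', $text, 1 );
    }
    return $text;
}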
Hopefully they are useful; please let me know if they are.
thanks,
Andrew.
I would like to be able to create wiki pages by just calling a function to
which I pass the name of the new page and the text which I want on the page.
I have managed a solution where I just hack directly into the tables
(bypassing all the MediaWiki code), but I started to hit code-translation
type problems, so I think it is best to build on all the wisdom already
built into the MediaWiki code.
The sort of "createPage()" function I have got so far, but which does not
work, is as follows:
/**
 * Pass the name of the new page and the content and create the page
 */
function createPage($wikiPageName, $pageText) {
    // Title::newFromText() is a static factory and returns null for an
    // invalid page name, so there is no need to instantiate Title first.
    $title = Title::newFromText($wikiPageName);
    if (is_null($title)) {
        return false;
    }
    $article = new Article($title);
    // insertNewArticle($text, $summary, $isMinor, $watchThis)
    $article->insertNewArticle($pageText, "No summary", false, false);
    return true;
}
Anybody done this before? Any ideas what I need to do specifically to get
this working?
I have tried to trace through what the "index.php" file does when creating
a page, but it's not easy to follow, particularly because the parameters
are often implicit (e.g. passed via POSTed form data).
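For what it's worth, I am calling the function from a standalone script, so
perhaps the missing piece is bootstrapping the wiki environment the way the
maintenance scripts seem to, e.g.:

# guesswork: run from the maintenance/ directory so the wiki context
# (database connection, $wgUser, etc.) is set up before createPage() runs
require_once( 'commandLine.inc' );
createPage( 'Test page', 'Hello from a script.' );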
Thanks for any help!
Hugh Prior
Hello,
I have developed a couple of MediaWiki extensions, and I was wondering how
I would go about finding out whether they would be useful to the MediaWiki
community and, if they are, how I might go about contributing them to the
MediaWiki project.
thanks,
Andrew.
Dear mediawiki devs,
I am happy to announce that submissions are currently being accepted
for the second annual Wikimedia Conference. The primary deadline for
submitting an abstract is April 15, 2006. The conference will be held
from August 4-6, 2006 in Cambridge, Massachusetts, USA, on the Harvard
Law School campus.
http://wikimania.wikimedia.org/wiki/Call_for_papers <-- full text of the CfP
http://cfp.wikimania.wikimedia.org/ <-- where to submit abstracts
If you have ideas for speakers, tutorials, or panels, please add
them to the list:
http://meta.wikimedia.org/wiki/Wikimania_2006/Program_ideas
The three days before the conference, August 1-3, will be Hacking Days
for a smaller group of developers, hosted at the MIT Media Lab.
Please help spread the news. :-)
Looking forward to a sweet conference,
SJ