Just found the answer...
I decided to go for simple text, but it also works with a button.
I included:
<div align="right"><a href="javascript:print()">Print</a></div>
directly under
<div id="content">
in MonoBook.php
I have some experience with this sort of thing, so thought I would add
my 2p to the information pool being shared here.
1) In general, there is no such thing as a universal format.
Having a data mediation format that spans versions is often an
intractable problem to solve. Essentially, if we can find a format that
is agnostic to any version of the application, then we would just use
that format as the data schema and not worry about data migrations for
any version change because every version uses the same format. Finding
such a format, however, nearly always precludes future application
innovation.
2) An existing standard can be settled upon that meets core needs.
In this case, the stakeholders identify a standard format that has some
level of widespread use and agree to always have the capability to
export and import in that format. This is how we individually overcome
limits in the applications we use daily. Specifically, we often search
for a Save-As format from a source application that we know is
accommodated by a destination application. The problem with this is
that although it can be lossy, it is more likely to be gainful - meaning
that the importing application has to make assumptions in order to fill
in missing data that it might need.
This solution is not ideal, primarily because there may be a
data requirement of the importing application that cannot be
algorithmically determined. As a result, human intervention might be
required for each unit of data imported. This is certainly not a
reasonable solution for even moderately sized datasets of just a few
hundred elements.
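To make the "gainful" point concrete, here is a minimal sketch in Python (the field names and defaults are invented for illustration): the destination schema requires fields the interchange format does not carry, so the importer has to fill them with assumed values.

```python
# Hypothetical sketch of a "gainful" import: the exported record lacks
# fields the destination needs, so assumed defaults fill the gaps.
ASSUMED_DEFAULTS = {"encoding": "utf-8", "author": "unknown"}

def import_record(exported):
    """Map an exported record onto the destination schema, filling gaps."""
    record = dict(ASSUMED_DEFAULTS)   # start from the assumptions
    record.update(exported)           # exported fields win where present
    return record

record = import_record({"title": "Main Page", "text": "hello"})
```

The danger described above is exactly the `"author": "unknown"` line: the importer guesses, and a human may have to review every guessed value.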
3) Look-ahead designs are used before features are implemented.
In this case, a very heavy-weight design effort attempts to
prognosticate the data design well ahead of code implementation. This
actually can be done if innovation is buffered and features are queued
and agreed upon well in advance. This is about as un-agile as software
development gets, however, and, as most software engineers know, it is
brutally difficult to design something to this level of detail so far
ahead of implementation (indeed, in my experience it almost always
fails).
4) Create a migration mechanism for each release.
This is typically what is done. The reasons are simple: the source
application data formats are well known and the destination data formats
are well known. The only thing needed is an intelligent mapping from
one to the other. As Lee has pointed out, the problem with this is that
it places a burden on the user community to stay abreast of development
whenever a migration is required.
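Option 4) is usually built as one migration step per release, composed into a chain so old data can be brought forward across several versions at once. A minimal Python sketch (version numbers and field changes are invented for illustration):

```python
# Hypothetical per-release migration chain: each function knows only the
# mapping from one version's format to the next.
def migrate_1_3_to_1_4(data):
    data = dict(data)
    data["schema"] = "1.4"
    return data

def migrate_1_4_to_1_5(data):
    data = dict(data)
    data.setdefault("namespace", 0)   # new field assumed to exist in 1.5
    data["schema"] = "1.5"
    return data

MIGRATIONS = [migrate_1_3_to_1_4, migrate_1_4_to_1_5]

def migrate(data):
    """Run every migration step in order to reach the newest format."""
    for step in MIGRATIONS:
        data = step(data)
    return data

upgraded = migrate({"schema": "1.3", "title": "Main Page"})
```

The burden Lee points out shows up here too: every release adds one more function to the chain, and users must run the chain whenever they upgrade.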
I am sure there are other analyses in the solution domain, but the above
is off the top of my head. Although certainly not empirical, I
conjecture that an industry best practice is to provide 4) as a minimum,
and support a collection of widespread formats for 2).
Sorry for rambling on about this, but this has been a problem that has
been around for a long time in software engineering circles. Comments
and criticisms welcome.
Thanks,
George
-----Original Message-----
From: mediawiki-l-bounces(a)Wikimedia.org
[mailto:mediawiki-l-bounces@Wikimedia.org] On Behalf Of Lee Daniel
Crocker
Sent: Monday, March 28, 2005 11:26 AM
To: Wikimedia developers
Cc: Mediawiki List
Subject: [Mediawiki-l] Re: [Wikitech-l] Long-term: Wiki import/export
format
On Mon, 2005-03-28 at 17:51 +0200, Lars Aronsson wrote:
> It sounds so easy. But would you accept this procedure if it requires
> that Wikipedia is unavailable or read-only for one hour? for one day?
> for one week? The conversion time should be a design requirement.
> ...
> Not converting the database is the fastest way to cut conversion time.
> Perhaps you can live with the legacy format? Consider it.
A properly written export shouldn't need to have exclusive access to the
database at all. The only thing that would need that is a complete
reinstall and import, which is only one application of the format and
should be needed very rarely (switching to a wholly new hardware or
software base, for example). In those few cases (maybe once every few
years or so), Wikipedia being uneditable for a few days would not be
such a terrible thing--better than it being down completely because the
servers are overwhelmed.
--
Lee Daniel Crocker <lee(a)piclab.com> <http://www.piclab.com/lee/>
<http://creativecommons.org/licenses/publicdomain/>
_______________________________________________
MediaWiki-l mailing list
MediaWiki-l(a)Wikimedia.org
http://mail.wikipedia.org/mailman/listinfo/mediawiki-l
I upgraded from 1.3 to 1.4, and trying to use the image gallery returns
the following error:
Fatal error: Call to a member function on a non-object in
/var/www/html/sysadminwiki/includes/ImageGallery.php on line 137
The image list works fine.
Aaron
--
____________________________________________
Aaron Macks amacks(a)techtarget.com
TechTarget PGP keyid: FBE946C5
117 Kendrick St, Suite 800 Phone: (781) 657-1519
Needham, MA 02494 Fax: (781) 657-1100
Hey everyone,
I've been happily running my MediaWiki for a while now, and I implemented a
little five-line fix so that I could do interwiki links for Google and a few
other things effectively. The catch with Google and many other services is
that they won't interpret underscores as spaces. My hack was to
leave all $1 interwiki links alone but add a $2 symbol for use when you
actually want real spaces (or %20) in your URL.
It is getting a little tedious re-applying my change to the main code with
each update, and I'd imagine this might be a decent feature or fix if
implemented correctly.
I didn't know much about the innards and figured out only enough to make
it work, so here goes:
I changed the function getFullURL() in ./includes/Title.php by replacing
this:
$url = str_replace( '$1', $namespace . $this->mUrlform, $baseUrl );
if ( '' != $this->mFragment ) {
    $url .= '#' . $this->mFragment;
}
return $url;
with this:
$urlPre = str_replace( '$1', $namespace . $this->mUrlform, $baseUrl );
$url = str_replace( '$2',
    $namespace . str_replace( '_', '%20', $this->mUrlform ),
    $urlPre );
if ( '' != $this->mFragment ) {
    $url .= '#' . $this->mFragment;
}
return $url;
I then changed all of the bits of the URLs in the database entries of the
Interwiki table which should have spaces from $1 to $2.
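The substitution can be sketched outside PHP as well; here it is in Python, for illustration only (the function name is invented): $1 keeps the underscore form of the title, while $2 substitutes a form with underscores turned into %20 for services like Google.

```python
# Hypothetical sketch of the $1/$2 interwiki idea: $1 gets the raw
# (underscore) title, $2 gets the title with underscores as %20.
def expand_interwiki(base_url, title):
    url = base_url.replace("$1", title)
    url = url.replace("$2", title.replace("_", "%20"))
    return url

url = expand_interwiki("http://www.google.com/search?q=$2", "Main_Page")
```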
What do you think?
Thad
Why not just use a spider that you program not to follow edit links?
al.
-----Original Message-----
From: Bernhard Walle [mailto:bernhard.walle@gmx.de]
Sent: Saturday, 26 March 2005 12:47 a.m.
To: mediawiki-l(a)wikimedia.org
Subject: [Mediawiki-l] Creating a static HTML version
Hello,
what are the possibilities to create a static complete HTML (or PDF)
version of a MediaWiki?
Is wiki2static the only software available for that?
Regards,
Bernhard
I wanted to add a link to this in the Quick Bar - Navigation. Adding array(
'text'=>'newimages', 'href'=>'newimages-url' ) to LocalSettings.php threw up
an error page. I had to add 'newimages-url' => 'Special:Newimages', after
'newimages' => 'New images gallery', in Language.php.
It seems to work for me, but I don't really know whether these changes have
other implications.
Ken Ross
I posted this a few days ago but didn't get a response. I'm reposting it
in case my original post was missed.
I've upgraded from 1.3.10 to 1.4.0 and one of my extensions no longer works.
I think the problem might have to do with my use of:
$wgParser->internalParse($output, 0)
While trying to get the extension to run, I got the following in my Apache
error logs:
[error] PHP Warning: end(): Passed variable is not an array or object
in /home/imbeauje/apache/htdocs/wiki/includes/Parser.php on line 425
[error] PHP Warning: end(): Passed variable is not an array or object
in /home/imbeauje/apache/htdocs/wiki/includes/Parser.php on line 425
[error] PHP Warning: end(): Passed variable is not an array or object
in /home/imbeauje/apache/htdocs/wiki/includes/Parser.php on line 425
[error] PHP Warning: end(): Passed variable is not an array or object
in /home/imbeauje/apache/htdocs/wiki/includes/Parser.php on line 425
[error] PHP Notice: Undefined index: html in
/home/imbeauje/apache/htdocs/wiki/includes/Parser.php on line 373
[error] PHP Fatal error: Unsupported operand types in
/home/imbeauje/apache/htdocs/wiki/includes/Parser.php on line 373
In case it is useful, here is my extension:
<?php
$wgExtensionFunctions[] = "wfTableSchema";

function wfTableSchema() {
    global $wgParser;
    $wgParser->setHook( "tableSchema", "renderTableSchema" );
}

function renderTableSchema( $input ) {
    global $wgTitle, $wgParser;
    $CLASSPATH = "/home/imbeauje/src/:/home/imbeauje/Tools/lib/jconn2.jar";
    $title = $wgTitle->getDBkey();
    // Title is of the form database..tableName; parse to extract the parts.
    list( $dbName, $tableName ) = explode( "..", $title );
    $dbName = strtolower( $dbName );
    // Pass the parts as parameters to a Java program that generates the table.
    $cmd = "java -classpath $CLASSPATH " .
        "wikiTools.WikiDatabaseTableGenerator $dbName $tableName";
    $output = `$cmd`;
    $output = $wgParser->internalParse( $output, 0 );
    return $output;
}
?>
I've also filed this in bugzilla at:
http://bugzilla.wikimedia.org/show_bug.cgi?id=1732
Jc
hi,
there seems to be a small glitch with the diff3 handling:
if the system has a non-GNU diff3 (as Solaris does by default), the call
to diff3 does not work, and the page is empty after the save. The Apache
log then shows "usage: diff3 file1 file2 file3".
-drio