An automated run of parserTests.php showed the following failures:
Running test TODO: Table security: embedded pipes (http://mail.wikipedia.org/pipermail/wikitech-l/2006-April/034637.html)... FAILED!
Running test TODO: Link containing double-single-quotes '' (bug 4598)... FAILED!
Running test TODO: Template with thumb image (with link in description)... FAILED!
Running test TODO: message transform: <noinclude> in transcluded template (bug 4926)... FAILED!
Running test TODO: message transform: <onlyinclude> in transcluded template (bug 4926)... FAILED!
Running test BUG 1887, part 2: A <math> with a thumbnail- math enabled... FAILED!
Running test TODO: HTML bullet list, unclosed tags (bug 5497)... FAILED!
Running test TODO: HTML ordered list, unclosed tags (bug 5497)... FAILED!
Running test TODO: HTML nested bullet list, open tags (bug 5497)... FAILED!
Running test TODO: HTML nested ordered list, open tags (bug 5497)... FAILED!
Running test TODO: Parsing optional HTML elements (Bug 6171)... FAILED!
Running test TODO: Inline HTML vs wiki block nesting... FAILED!
Running test TODO: Mixing markup for italics and bold... FAILED!
Running test TODO: 5 quotes, code coverage +1 line... FAILED!
Running test TODO: HTML Hex character encoding.... FAILED!
Running test TODO: dt/dd/dl test... FAILED!
Passed 413 of 429 tests (96.27%) FAILED!
Introducing the Hutter Prize for Lossless Compression of Human Knowledge
Artificial intelligence researchers finally have an objective and
rigorously validated measure of the intelligence of their machines.
Furthermore, the higher the measured intelligence of their machines, the
more money they can win via the Hutter Prize.
The purse for the Hutter Prize was initially underwritten with a 50,000
Euro commitment to the prize fund by Marcus Hutter of the Swiss Dalle
Molle Institute for Artificial Intelligence, affiliated with the
University of Lugano and The University of Applied Sciences of Southern
Switzerland.
The theoretical basis of the Hutter Prize is related to an insight by
the 14th century philosopher William of Ockham, called "Ockham's Razor",
sometimes quoted as: "It is vain to do with more what can be done with
less." But it was not until the year 2000 that this was mathematically
proven*, by Marcus Hutter, to be a founding principle of intelligence.
Indeed, Hutter's Razor** might be phrased, "It is truer to explain with
less that which can be explained with more."
There have been previous tests and related prizes for artificial
intelligence, such as the Turing Test and the Loebner Prize. However,
these tests suffered from subjective definitions of intelligence.
Hutter's recent theoretical breakthrough creates a mathematics of
artificial intelligence that accurately measures the degree of
intelligence possessed by an artificial agent. It does so by measuring
how succinctly it represents knowledge of the world. As Hutter has now
proven, the most succinct computer model of the world isn't just the
most aesthetic or memory-efficient out of all models of the known
observations -- it also most accurately predicts new observations. In
short, it is the most intelligent.
Artificial intelligence has thereby entered the realm of engineering:
Lossless compression of human knowledge.
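The link between prediction and compression can be seen in a toy sketch
(my own illustration, not part of the prize materials): the ideal
Shannon code length of data under a probabilistic model is the sum of
-log2(p) over its symbols, so a model that predicts the data better
compresses it into fewer bits.

```python
import math

def code_length_bits(text, model):
    """Ideal code length, in bits, of `text` under per-symbol probabilities.

    An arithmetic coder can get within a fraction of a bit of this bound,
    so better prediction translates directly into better compression.
    """
    return sum(-math.log2(model[ch]) for ch in text)

text = "aaaaaaab"                       # 7 a's, 1 b
uniform = {"a": 0.5, "b": 0.5}          # knows nothing about the data
fitted = {"a": 7 / 8, "b": 1 / 8}       # matches the symbol frequencies
print(code_length_bits(text, uniform))  # 8.0 bits
print(code_length_bits(text, fitted))   # about 4.35 bits
```

The model that better predicts the text needs roughly half the bits, which
is the sense in which the most succinct model is the best predictor.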
This is momentous because, by optimizing for rigorous metrics, the field
of artificial intelligence may finally clarify the murky waters of
inadequate definition in which it has been haphazardly swimming for the
last 50 years, and become both a hard science and a tractable
engineering discipline.
Named for the discoverer of the proof and the initial 50,000 Euro donor,
the Hutter Prize currently targets the compression of a 100 megabyte
sample of human knowledge drawn from the broadly based Wikipedia online
encyclopedia. As Moore's Law increases the capacity of machines, and as
additional donations to the prize fund increase the incentives of
contestants, the intent is to increase the amount of knowledge targeted
for compression. It is reasonable to expect that the 100 megabyte
sample will produce, at the very least, advances in linguistic
modeling. As the targeted depth and breadth of knowledge increases,
conceptual frameworks will come into play, eventually covering the range
of disciplines from political science to physics by applying theories
that prove optimal in compressing the target.
A common objection to this approach to artificial intelligence is that
it offers little that is new -- that the computational difficulty of
searching for patterns in data remains what it has always been. This
objection misses two important points:
1) Hutter's proof provides a new mathematics of intelligence allowing
for "top down" theoretic advances which may render many problems
tractable that otherwise appear intractable.
2) There is a large overlap between succinctly codified knowledge and an
intelligent compression program. Indeed, a reasonable definition of
"knowledge" is that it optimizes the compression of new observations as
instances of old patterns. This means that even if a compressor does
nothing but apply codified human knowledge, generating no new knowledge
of its own, it can still demonstrate greater intelligence than competing
programs and thereby drive measurable progress not only toward
artificial intelligence but also -- and this is key -- toward
progressively more intelligent bodies of human-generated knowledge, by
pitting those bodies of knowledge against each other in what might be
called an epistemological tournament.
The formula for winnings is modeled after the M-Prize or Methuselah
Mouse Prize, which awards money to longevity researchers for progress in
keeping mice alive the longest. Here, modified for compression ratios,
is the formula:
S = size of a program that outputs the uncompressed knowledge
Snew = new record
Sprev = previous record
P = (Sprev - Snew) / Sprev = relative improvement
Award monies:
Fund contains: Z at noon GMT on day of new record
Winner receives: Z * P
Initially Z is 50,000 Euro with a minimum payout of 500 Euro (or minimum
improvement of 1% over the prior winner).
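As a minimal sketch of the payout rule just described (parameter names
are mine; the authoritative rules are on the prize site):

```python
def hutter_award(s_prev, s_new, fund=50_000.0, min_payout=500.0):
    """Payout for a new record under the rule above: Z * P.

    s_prev, s_new: previous and new record sizes in bytes.
    fund: Z, the prize fund at noon GMT on the day of the record.
    Returns 0.0 when the relative improvement P falls below the minimum
    (initially 1%, i.e. a payout under 500 Euro).
    """
    if s_new >= s_prev:
        return 0.0                    # no improvement, no award
    p = (s_prev - s_new) / s_prev     # relative improvement
    award = fund * p
    return award if award >= min_payout else 0.0

print(hutter_award(100_000_000, 98_000_000))  # 2% better -> 1000.0 Euro
print(hutter_award(100_000_000, 99_500_000))  # 0.5% better -> 0.0 (below minimum)
```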
Donations are welcome. The history of improvement in The Calgary Corpus
Compression Challenge*** is about 3% per year. The larger the
commitment from donors to the fund, the greater the rate of progress
toward a high quality body of human knowledge and, quite possibly, the
long-held promise of artificial intelligence.
For further details of the Hutter Prize see:
http://prize.hutter1.net
For discussion of the Hutter Prize see:
http://groups.google.com/group/Hutter-Prize
-- Jim Bowery
* http://www.hutter1.net/ai/uaibook.htm
** Hutter's Razor has some caveats relating to the nature of the
universe and computability, but those conditions must be met for any
computer-based intelligence.
*** http://mailcom.com/challenge
On Thu, Aug 10, 2006 at 07:18:42PM -0400, David Spencer, MediaWiki User wrote:
> 1. I feel it is a shame that the wiki publisher does not have the option
> to code break out links to external web sites. It is very easy for those
> new to computers to head to a new link and forget what they originally
> were looking for on the original wiki. Then they leave our wiki.
>
> 2. When I send someone to an external link from my wiki, I am not
> responsible for the content of that external site. Sometimes "newbies"
> may think that I created the external link page too.
There's a maxim in the design field that you can get yourself in
trouble by over-optimising for 1) untrained users or 2) too-slow
hardware.
Ok; I just synthesized that now from things lots of other, smarter
people than me have said. But it's still true. People have a tendency
to try to make things easier for 'dummies' :-) while making much more
important things much harder for 'smart people'.
On balance, it's probably much better for the entire audience to *train
the non-savvy users*. If they can't be bothered to learn, then they're
not *entitled* to have everyone else's work made more difficult on
their behalf.
> 3. FYI... I am a Mac user using FireFox. To make a break out link, all I
> have to do is hover over the link, press lightly and choose "Open link
> in new window".
Mac mice have click-*pressure* sensors? Wow!
> 4. The wiki I am developing is here http://www.christianmedia.ca
Interesting. I don't think I've seen anyone run a primarily-MW site
where the front page *wasn't*.
> Thanks again for all of your comments and words of wisdom. This is a
> very helpful listserv!
<pedant>
"listserv" is a registered trademark of a mailing list software company
whose products we aren't using. :-)
http://www.lsoft.com/corporate/legal.asp
</pedant>
Glad we could help. And that we didn't scare you off. :-)
Cheers,
-- jra
--
Jay R. Ashworth jra(a)baylink.com
Designer Baylink RFC 2100
Ashworth & Associates The Things I Think '87 e24
St Petersburg FL USA http://baylink.pitas.com +1 727 647 1274
The Internet: We paved paradise, and put up a snarking lot.
I fixed the link that had me looking in the wrong place. Not sure if
there are other bad links out there or not...
Tim
> -------- Original Message --------
> Subject: Re: [Wikitech-l] Where are the Extensions
> From: Timwi <timwi(a)gmx.net>
> Date: Fri, August 11, 2006 8:35 am
> To: wikitech-l(a)wikimedia.org
>
> Jens Frank wrote:
> > On Mon, Aug 07, 2006 at 02:53:25PM -0700, tim(a)greenscourt.com wrote:
> >
> >>I am looking for the "Reviews" extension which I believe was written by
> >>Magnus Manske and feel lost in a maze.
> >>
> >>Several messages posted to the MediaWiki-L mailing list refer to updates
> >>to this code as follows:
> >> Update of /cvsroot/wikipedia/extensions/Review
> >> In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv22161
> >>
> >>The page at http://meta.wikimedia.org/wiki/MediaWiki_extensions links to
> >>http://cvs.sourceforge.net/viewcvs.py/wikipedia/extensions/#dirlist, but
> >>this location just times out for me.
> >>
> >>I've tried going to http://sourceforge.net/projects/wikipedia/ but I
> >>can't seem to find where the extensions are hiding there.
> >
> > We don't use sourceforge's CVS any more. Check our SVN repository at
> > http://svn.wikimedia.org/viewvc/mediawiki/trunk/extensions/
>
> Maybe someone should actually fix the links then? :-p
>
> _______________________________________________
> Wikitech-l mailing list
> Wikitech-l(a)wikimedia.org
> http://mail.wikipedia.org/mailman/listinfo/wikitech-l
August 8, 2006
Hello:
Break-out links - how are they done?
I would like to create an external link from my wiki to an external web
site. I would like a new web browser to open up.
What is the wiki code I should use to perform this action?
For example, in HTML a breakout link code would look like this.
To read more about ice cream, please visit <A
HREF="http://en.wikipedia.org/wiki/Ice_Cream" TARGET="_blank">here</A>
How do I make a break out link to open a new browser window from a wiki?
To read more about ice cream, please visit
[http://en.wikipedia.org/wiki/Ice_Cream here].
The above wiki code will not open a new web browser.
I look forward to your response!
David Spencer
Here's a loose idea if anybody wants to implement it.
In my Firefox web browser, the back and forward buttons let me go
back and forth in my browsing history and they also serve as
drop-down menus for the nearest 10 pages.
In MediaWiki, the [[Special:Allpages/Ijkl]] page lets me see the
article names immediately following Ijkl in alphabetic order.
Unfortunately, there seems to be no easy way to get the previous
page of alphabetically arranged article names, i.e. those
immediately before Ijkl. When I'm looking for articles about
Sweden, Swedes, Swedish, ... I often go to Special:Allpages/Swed
to get an overview (by typing that URL in my browser).
What if every article page served by MediaWiki contained a
previous and next arrow that let me browse the site in alphabetic
order, and where the arrow buttons also served as drop-down menus
for the nearest 10 entries? This could be especially useful for
pages that don't yet exist, e.g. [[Ijkl]], because they would link
to pages that do exist.
I realize there can be all kinds of issues with implementing such
a function, including runtime performance, caching, and the exact
definition of alphabetic order for various languages, but I don't
want to make this any more complicated than the Special:Allpages.
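The core of the idea can be sketched with two queries against a
page-title table. This is a toy, assuming a table shaped roughly like
MediaWiki's page table (a page_title column, binary ordering); the real
feature would also need namespace filtering, caching, and the
language-aware collation issues mentioned above:

```python
import sqlite3

def neighbours(conn, title, n=10):
    """Titles alphabetically around `title`: up to n before, up to n after.

    The ">=" side is what Special:Allpages already shows; the "<" side is
    the missing "previous" direction, fetched descending then reversed.
    """
    cur = conn.cursor()
    cur.execute("SELECT page_title FROM page WHERE page_title < ? "
                "ORDER BY page_title DESC LIMIT ?", (title, n))
    before = [row[0] for row in cur.fetchall()][::-1]
    cur.execute("SELECT page_title FROM page WHERE page_title >= ? "
                "ORDER BY page_title ASC LIMIT ?", (title, n))
    after = [row[0] for row in cur.fetchall()]
    return before, after

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE page (page_title TEXT PRIMARY KEY)")
conn.executemany("INSERT INTO page VALUES (?)",
                 [("Denmark",), ("Norway",), ("Sweden",),
                  ("Swedes",), ("Swedish",)])
print(neighbours(conn, "Swed", 2))
# (['Denmark', 'Norway'], ['Sweden', 'Swedes'])
```

Both queries can use the title index, so the per-page cost is two small
range scans rather than anything resembling a full table walk.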
--
Lars Aronsson (lars(a)aronsson.se)
Aronsson Datateknik - http://aronsson.se
I am looking for the "Reviews" extension which I believe was written by
Magnus Manske and feel lost in a maze.
Several messages posted to the MediaWiki-L mailing list refer to updates
to this code as follows:
Update of /cvsroot/wikipedia/extensions/Review
In directory sc8-pr-cvs1.sourceforge.net:/tmp/cvs-serv22161
The page at http://meta.wikimedia.org/wiki/MediaWiki_extensions links to
http://cvs.sourceforge.net/viewcvs.py/wikipedia/extensions/#dirlist, but
this location just times out for me.
I've tried going to http://sourceforge.net/projects/wikipedia/ but I
can't seem to find where the extensions are hiding there.
Can anyone point me to the direction for the extensions in the
repository?
Thanks!
Tim