Hello, I regularly read and contribute to Wikipedia under the name Iammaxus.
Over the last few weeks, I have been mulling over an idea on how to
significantly improve the mediawiki project. I posted the following on my
user page before finding and reading this mailing list only to find out that
this idea has been discussed here
(http://mail.wikipedia.org/pipermail/wikitech-l/2003-December/007185.html),
albeit not on the scale that I'm imagining. Because I'm not sure where to
put this in the metawiki, and I'm not sure if anyone would care if I did,
here is an excerpt from a conversation I had on 12/24/03 with a friend of
mine who is knowledgeable in computer science (note that I wasn't seriously
asking him to do this, just bouncing an idea off him):
I Am MAXUS (2:30:14 PM): yo
Chin Jut: hi
I Am MAXUS: code a meta data system for wikipedia
Chin Jut: what's that mean?
I Am MAXUS: i can't even begin to imagine the possibilities
I Am MAXUS: well first off
I Am MAXUS: organize topics
I Am MAXUS: in a tree system
I Am MAXUS: this would be part of the meta data system
I Am MAXUS: well wait
I Am MAXUS: lemme start from the beginning
Chin Jut: ok
I Am MAXUS: the overall point is to allow more machine generated info, stuff
that shouldnt be manually made like it is now, or even with one time use
scripts
I Am MAXUS: such as lists of articles
I Am MAXUS: tables of dates
Chin Jut: alright
Chin Jut: "List of famous bears", that sorta thing
I Am MAXUS: but much greater than that
Chin Jut: alright
I Am MAXUS: so u could just request a list of a certain sub tree
I Am MAXUS: etc
I Am MAXUS: then it would have more specific meta data
I Am MAXUS: such as meta data about books including the author and such
I Am MAXUS: so that this would automatically be put into an article about
the book
Chin Jut: ok
I Am MAXUS: so:
I Am MAXUS: ?
I Am MAXUS: go
I Am MAXUS: and do it
Chin Jut: well, gee
Chin Jut: that's a large project
Chin Jut: I'm not even sure what to begin with
Chin Jut: I mean, what features need to be available?
Chin Jut: (I'm not gonna be able to do this, why am I talking?)
Chin Jut: what, concretely, needs to be done?
I Am MAXUS: good question
I Am MAXUS: well first of all, learn xml and shit
I Am MAXUS: cause thats how all this junk is done
I Am MAXUS: or something
I Am MAXUS: lol
Chin Jut: But Tim Sweeney speaks disparagingly of XML...
I Am MAXUS: does he?
Chin Jut: yeah
I Am MAXUS: what does he say is bad about it?
Chin Jut: lemme see if I can find it
Chin Jut: "Does anyone else see XML as an overcomplicated solution to the
meager problem of serializing data in and out of text files?"
Chin Jut: Philip Wadler (one of the main guys behind Haskell) also bashes
XML: "So the essence of XML is this: the problem it solves is not hard, and
it does not solve the problem well."
Chin Jut: All the same, yeah, I'll learn XML
I Am MAXUS: lol
I Am MAXUS: well screw those guys
I Am MAXUS: cause everyone is using it
Chin Jut: yeah
Chin Jut: Phil Wadler goes on to say
I Am MAXUS: what does he mean "serializing data..."?
Chin Jut: It's worth studying XML just because it became popular while
better things did not
Chin Jut: Serializing data means writing it and reading it from files
Chin Jut: you take a complicated structure, like a tree
Chin Jut: and turn it into some linear sequence of bytes
Chin Jut: hence, you've turned it into a series... you've serialized it
I Am MAXUS: right
I Am MAXUS: anyway i dont know if u can use xml
I Am MAXUS: because of the database based nature of it
I Am MAXUS: but something similar
I Am MAXUS: anyway
I Am MAXUS: i dont know about searching, and thats the main function of this
system
I Am MAXUS: so u have to figure out how to do that
I Am MAXUS: so each article has info attached to it
I Am MAXUS: ideally, the types of info could be specified by ppl in a
relatively plain language way
I Am MAXUS: so that ppl could specify more types of metadata for certain
types of files
Chin Jut: what types of metadata would people specify?
Chin Jut: Like "this article is about bears"?
Chin Jut: Shouldn't that be auto-discovered by computers?
I Am MAXUS: thats beyond the scope of this
I Am MAXUS: that requires all sorts of human communication stuff
Chin Jut: ok
Chin Jut: so then what is this, exactly?
I Am MAXUS: i told you!
I Am MAXUS: for example, lets take the organism pages
Chin Jut: ok
I Am MAXUS: they all have the classification on the side
I Am MAXUS: well instead, each page would just have an "is a member of this
higher group" tag
I Am MAXUS: which wikipedia would look at
Chin Jut: I see
I Am MAXUS: and see what thats a member of
I Am MAXUS: and dynamically figure out the whole classification
I Am MAXUS: this is not such a useful example because its much more static
I Am MAXUS: but take the date pages, those are useful mommas to meta-fy
Chin Jut: ah
I Am MAXUS: so if a page is an event
Chin Jut: so people would have to say in the event
I Am MAXUS: it would include date info
I Am MAXUS: and type of event info
Chin Jut: "Date info: July 4, 1776"
I Am MAXUS: right
Chin Jut: and then the date page would say "Search for all pages with date
info: July 4, 1776"
I Am MAXUS: so it could be added to the "type of info in history" page
I Am MAXUS: that too
Chin Jut: "type of info in history" page?
I Am MAXUS: well like the music in history page
I Am MAXUS: etc
Chin Jut: ah
I Am MAXUS: so if u had a page that was under the music subtree
I Am MAXUS: and then under the band subtree
I Am MAXUS: and u had dates of existence of that band
I Am MAXUS: and then there could be an important concert subtree, etc
Chin Jut: well, to be technical, I don't think these are trees, I think
they're DAGs
Chin Jut: but it doesn't matter
I Am MAXUS: dag?
Chin Jut: directed acyclic graph
Chin Jut: in a tree, a node has only one parent
Chin Jut: (at most)
I Am MAXUS: yeah good point
Chin Jut: ok, it's sorta interesting. I have no idea how to do it
efficiently, though. But I think I might actually work on it
I Am MAXUS: lol
I Am MAXUS: i want to copy this conversation into the metawikipedia.org
somewhere
I Am MAXUS: maybe in the todo for vers 4 or 5
Chin Jut: I want to eat breakfast, because I have yet to do so
Chin Jut: bbl
I Am MAXUS: bye
Chin Jut signed off at 2:48:15 PM.
Those are our AOL Instant Messenger screen names. His Wikipedia username is
Chinju.
Thanks for reading this. I hope you will consider it, because I believe
this idea has the potential to revolutionize not just Wikipedia but all
sorts of projects.
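The system sketched in the conversation above (per-article key/value
metadata plus a category DAG, queried to build lists and classifications
automatically) can be illustrated with a short sketch. This is a
hypothetical illustration in Python with invented names, not anything from
the MediaWiki codebase:

```python
# Hypothetical sketch: articles carry key/value metadata, categories
# form a DAG (a page may have several parents), and queries walk the
# structure instead of lists being maintained by hand.

class Article:
    def __init__(self, title, metadata=None, parents=None):
        self.title = title
        self.metadata = metadata or {}   # e.g. {"date": "1776-07-04"}
        self.parents = parents or []     # DAG: multiple parents allowed

def ancestors(article):
    """Walk the category DAG upward, collecting each ancestor once."""
    seen, stack = set(), list(article.parents)
    while stack:
        node = stack.pop()
        if node.title not in seen:
            seen.add(node.title)
            stack.extend(node.parents)
    return seen

def pages_with(articles, key, value):
    """'Search for all pages with date info: July 4, 1776'."""
    return [a.title for a in articles if a.metadata.get(key) == value]
```

With pages linked this way, the classification box on each organism page
could be derived by walking `ancestors()` rather than being written out
manually on every article.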
I haven't set up Spam Assassin, but I do run a lot of mailing lists. You
can eliminate 99% of all spam on mailing lists by doing two things:
1) Only members of the list can post to the list (I don't think you're
doing this, but not sure since I'm not on the other list being discussed)
2) You must answer an email to join the list (I think you already do this)
There are other things that can be done to get rid of the last 1%, but this
gets it down to very manageable levels for most mailing lists. If the list
admin wants to take this offline, I'd be happy to discuss it and work
with him until the problem is resolved.
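Rule 1 above (member-only posting) amounts to a single check at delivery
time. A minimal sketch in Python (hypothetical function names; real list
software such as Mailman exposes this as a moderation setting rather than
code you write):

```python
# Sketch of member-only posting: accept a post only when the sender's
# address is subscribed. Addresses are compared case-insensitively,
# which is how most list software treats them in practice.

def accept_post(sender, subscribers):
    """Return True when the sender is a list member, else False."""
    return sender.strip().lower() in {s.strip().lower() for s in subscribers}
```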
-Kelly
At 11:09 AM 1/30/2004, you wrote:
>Brion Vibber idly wondered,
>
> > > Can we install Spam Assassin software on the wikien-l mailing list?
> > > How about other mailing lists?
> >
> > Do you know how to set it up properly as a machine-wide service?
>
>Are you kidding? All I know is VB and Java on Windows. The only service
>I know anything about takes place on Sundays at my church...
>
>But I know what e-mail filtering is: certain combinations of keywords
>tip off the software to flag a message as spam and automatically move
>it to another bin. You can either leave the suspected spam in the bin,
>double-check it, or just have it automatically deleted.
>
>Ed Poor
>_______________________________________________
>Wikitech-l mailing list
>Wikitech-l(a)Wikipedia.org
>http://mail.wikipedia.org/mailman/listinfo/wikitech-l
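The keyword filtering described in the quoted message can be sketched in a
few lines. This is a toy illustration with invented keywords and an
invented threshold; SpamAssassin itself combines many weighted rules, not
a bare keyword count:

```python
# Toy spam filter: count how many known spam keywords appear in a
# message and flag it once the count reaches a threshold. A flagged
# message would then be moved to a separate "bin" (folder).

SPAM_KEYWORDS = {"lottery", "winner", "free money", "act now"}

def classify(message, threshold=2):
    """Return 'spam' when enough keywords co-occur, else 'ham'."""
    text = message.lower()
    hits = sum(1 for kw in SPAM_KEYWORDS if kw in text)
    return "spam" if hits >= threshold else "ham"
```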
Hi,
some time ago I asked whether we should start using the MediaWiki namespace for
bars of links that recur on several articles; take, for example, the list of
links at the bottom of each Greek letter (http://en.wikipedia.org/wiki/Beta).
As far as I remember, it was said that the problem would be that we would get a
large number of self-links.
My suggestion to solve that problem was to have {{msg:}} automatically change
[[blahblah]] to '''blahblah''' when it is inserted into the page "blahblah".
I have attempted to code this, and I came up with the following. This patch has
the side-effect of changing all self-links to bold text, even when they're not
inserted using {{msg:}}.
I've tested this on "normal" self-links, and it appears to be working fine. I
can't test it with stuff inserted with {{msg:}} because the MediaWiki stuff
doesn't seem to work for me (see Message-ID: <bverjf$meb$1(a)sea.gmane.org> -
any help appreciated :/).
Thanks,
Timwi
Index: phase3/includes/OutputPage.php
===================================================================
RCS file: /cvsroot/wikipedia/phase3/includes/OutputPage.php,v
retrieving revision 1.79
diff -u -r1.79 OutputPage.php
--- phase3/includes/OutputPage.php 30 Jan 2004 17:07:50 -0000 1.79
+++ phase3/includes/OutputPage.php 30 Jan 2004 23:57:02 -0000
@@ -1054,6 +1054,10 @@
$s .= $sk->makeLink( $link, $text, "", $trail );
*/
}
+ if( $nt->getPrefixedText() == $wgTitle->getPrefixedText() ) {
+ $s .= "<strong>" . $text . "</strong>" . $trail;
+ continue;
+ }
if( $ns == $media ) {
$s .= $sk->makeMediaLinkObj( $nt, $text ) . $trail;
$wgLinkCache->addImageLinkObj( $nt );
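The idea in the patch is language-neutral: while rendering each link,
compare its target with the current page title and emit bold text instead
of an anchor on a match. A sketch in Python (hypothetical renderer with
simplified wikitext, no pipes or namespaces; the real logic lives in
OutputPage.php as in the patch):

```python
import re

# Sketch (not MediaWiki code): render [[Target]] links, turning a
# self-link into <strong>...</strong> just as the patch does.

def render_links(wikitext, current_title):
    def repl(match):
        target = match.group(1)
        if target == current_title:
            # Self-link: bold text instead of a hyperlink.
            return "<strong>" + target + "</strong>"
        return '<a href="/wiki/' + target + '">' + target + "</a>"
    # Simplified syntax: [[Target]] only, no pipes or namespaces.
    return re.sub(r"\[\[([^\]|]+)\]\]", repl, wikitext)
```

Because the check runs during rendering, text inserted via {{msg:}} is
treated the same as text typed directly into the page, which is why the
patch also bolds "normal" self-links.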
I'm taking a look at the new machines and starting some tests... it may
be a while before we figure out what to do with this much power! :D
Jimmy: I notice the Opteron box has 3 SCSI drives attached. sda is the
system drive, then there are sdb and sdc... all are 30 GB, but only sda
seems to be partitioned. Is this as it should be, or was there supposed
to be a RAID setup? I don't remember...
-- brion vibber (brion @ pobox.com)
Gabriel wrote:
> On Thu, 29 Jan 2004 14:01:34 +0100, Anthere wrote:
>
> > We just can't have an open list any more,
> > if a solution to found for 1) limit spam
> > 2) improve security for the recipiendaries
>
> I have spamassassin running on my server for all mail
> accounts- it works just great- even without training.
> Filtered 1600 spams in the last three months, missed
> four, no false positives.
Brion,
Can we install Spam Assassin software on the wikien-l mailing list? How
about other mailing lists?
Anthere and I have gotten tired of reading and deleting up to 30 spam,
spam, spam, ham, eggs and spam (hasn't got MUCH spam in it) messages every
day -- the joke gets old after a while.
Ed Poor
Wikien-l Admin Emeritus
Sorry for the massive crossposting, but this is big, good news.
The new colocation facility (Neutelligent/Hostway, Tampa) just called
me and they are at this moment taking delivery of 9 new servers
belonging to the Wikimedia Foundation. :-)
I'm heading over there now with Michael and we will be spending as
long as it takes to install them.
It's more up to Brion and the other developers as to when we'll be
able to go live on these. I'm just going to get them up and running
and make sure that the latest (most secure) ssh is on them.
Since I'm on my way out the door and needed a quick list of names, the new
servers are named:
suidas
beauvais
glanwilla
moreri
hoffman
bayle
coronelli
zwinger
browne
These names are taken from "Notable encyclopedists before 1700"
in the article:
http://en.wikipedia.org/wiki/Encyclopedia
--Jimbo
Hello,
I'm trying to use the MediaWiki namespace on my new installation of
MediaWiki.
Unfortunately, {{msg:Blah}} is being replaced by nothing (i.e. it is
just being removed) even though [[MediaWiki:Blah]] does contain text.
What might the cause of this be?
Thanks again for any help,
Timwi
When I try to upload a .pdf or .gif via
http://en.wikipedia.org/wiki/Special:Upload
I get:
Upload warning
".pdf" is not a recommended image file format.
But no form to submit the file anyway appears, so I cannot upload the
files. Is this a bug, the intended behavior, or a misconfiguration?
- David [[User:Nohat]]
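One plausible cause is the allowed-extensions check: MediaWiki warns about
file types outside its permitted list. If it turns out to be a
configuration matter, something along these lines in the settings file may
apply (a sketch from memory of MediaWiki defaults; verify the exact
variable name against your version's documentation):

```php
# LocalSettings.php (sketch): permit PDF and GIF uploads in addition
# to the default image types. The variable name is an assumption here;
# check your MediaWiki version before relying on it.
$wgFileExtensions = array( 'png', 'jpg', 'jpeg', 'gif', 'pdf' );
```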
Hi,
I read the INSTALL and did it all and setup seemed to go perfectly
and the main page comes up... however... When I try to do edit this page,
I get:
<br>
<b>Warning</b>: stat failed for /home/justin/justin.com/htdocs/wiki/upload/lock_yBgMBwiR (errno=2 - No such file or directory) in <b>/home/justin/justin.com/htdocs/wiki/GlobalFunctions.php</b> on line <b>161</b><br>
<br>
<b>Warning</b>: stat failed for /home/justin/justin.com/htdocs/wiki/upload/lock_yBgMBwiR (errno=2 - No such file or directory) in <b>/home/justin/justin.com/htdocs/wiki/GlobalFunctions.php</b> on line <b>161</b><br>
<br>
and some more failure lines, but these are the first two.
/home/justin/justin.com/htdocs/wiki/upload/lock_yBgMBwiR
doesn't exist.
/home/justin/justin.com/htdocs/wiki/upload
is world writeable
and I'm no php expert but have looked at line 161 in the mentioned file and I'm sure
there is nothing wrong with it.
I am using
mediawiki-1.1.0
apache-1.3.23-142
php 4.1.0
SuSE Linux 8.0
Any ideas? I don't even know much about wikis; I'm setting this up for
someone else to use -- just a dumb sysadmin here trying to finish all my
tasks for the day. Can anybody please help?
Thanks
Bill