On Sunday 28 July 2002 03:00 am, The Cunctator wrote:
> What are the articles this person has been changing?
For 66.108.155.126:
20:08 Jul 27, 2002 Computer
20:07 Jul 27, 2002 Exploit
20:07 Jul 27, 2002 AOL
20:05 Jul 27, 2002 Hacker
20:05 Jul 27, 2002 Leet
20:03 Jul 27, 2002 Root
20:02 Jul 27, 2002 Hacker
19:59 Jul 27, 2002 Hacker
19:58 Jul 27, 2002 Hacker
19:54 Jul 27, 2002 Principle of least astonishment
19:54 Jul 27, 2002 Hacker
19:52 Jul 27, 2002 Trance music
19:51 Jul 27, 2002 Trance music
For 208.24.115.6:
20:20 Jul 27, 2002 Hacker
For 141.157.232.26:
20:19 Jul 27, 2002 Hacker
Most of these were complete replacements with incoherent statements,
such as "TAP IS THE ABSOLUTE DEFINITION OF THE NOUN HACKER" for Hacker.
For the specifics follow http://www.wikipedia.com/wiki/Special:Ipblocklist
and look at the contribs.
--mav
I hereby decree, in my usual authoritarian and bossy manner, that today shall
forever be known as Magnus Manske day. Wikipedians of the distant future will
marvel at the day when the new software era dawned upon us.
Tonight at dinner, every Wikipedian should say a toast to Magnus and his many
inventions.
--Jimbo
On Wednesday 04 September 2002 10:38 am, Helga wrote:
> Hello, I am a little swamped with all the wiki list reading material and it
> seems my limited email is getting overloaded.
You might want to create an email filter to sort any emails with the string
"Helga" in the subject into a special folder (just use the help menu of
whatever email program you use and look up "filter").
Otherwise you may miss some emails that concern you.
-- Daniel Mayer (aka mav)
On Saturday 24 August 2002 12:01 pm, Karen wrote:
> Something I wondered - how do you know who the new users to greet them?
> Do you just look for user names you haven't seen before or is there some
> way to identify them? I'd be happy to do the meet-and-greet but I don't
> know how to do it.
Well - I guess I do it the hard way and scan each edit in all Recent Changes
for a 24 hour period looking for edit-link user names (a dead giveaway)
and for user names I don't remember seeing before. This works for me since I
have pretty good reading comprehension and memory.
What would be most useful is a listing of new users that can be accessed from
http://www.wikipedia.org/wiki/Special:Listusers. That way this job would be
much easier.
BTW we really /do not/ have 3498 real users -- a good many of these "users"
logged in only to abuse our upload utility or for other nefarious or
non-contributing reasons (I don't greet any user who hasn't contributed at
all). Is there a way to get rid of many of these no-longer-used user
accounts, Lee (just the ones that have been inactive for months and whose
user pages are still edit links)?
-- Daniel Mayer (aka mav)
On Monday 19 August 2002 03:41 pm, you wrote:
> Can still be done later. The problem is the lack of time. If you wait too
> long there will be too many links to the new location of the English
> wikipedia that cannot be broken. If there is no fundamental objection
> to putting the English wikipedia at en.wikipedia.org then that must be done.
> What to do with www.wikipedia.org can wait (a little).
>
> giskart
This is just silly -- we are building an encyclopedia here, not an
organization. There is nothing at all wrong with having the English wikipedia
at wikipedia.org and having all the pages that are about the English language
project in the wikipedia namespace (or in the other languages' project
namespaces). As each language figures out what to call its wikipedia we can
buy them domain names for that and make sure the xx.wikipedia.com domain
names still work.
Other than being a one-page portal to all the different language wikipedias
(which the Main Page already does -- as do most of the other language main
pages) I don't see any logic in using wikipedia.org for anything other than
the English language wikipedia.
-- Daniel Mayer (aka mav)
PyLaTeX appears to have a TeX parser (and much more!) written in Python,
and it's at
http://pylatex.sourceforge.net/
I'm not sure of the licence. I guess if we restrict the macros to a
"safe" set, we shouldn't need any more than the lexical analyser.
Neil
As I understand it, the problem with TeX is that it is an absolutely
general Turing-complete programming language, and you can change the
lexer on the fly, etc. We should use a subset of TeX with only the basic
symbol-rendering stuff, and very little else.
Here's a syntax for a subset of TeX macros in a vaguely BNF-style
notation, basically specifying anything that comes after a backslash.
http://www.csci.csusb.edu/dick/samples/comp.text.TeX.html
http://www.csci.csusb.edu/dick/samples/comp.text.TeX.Mathematical.html
Would limiting TeX to using only these macros make it
* safe?
* complete enough for our purposes?
If so, then we could write a TeX-parser that would "sanitize" (and
canonicalize, if necessary) any input TeX before letting the real TeX
interpreter see it. Probably using Bison and C.
Neil
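The whitelist idea above -- scan the input for anything that comes after a
backslash and reject any macro outside an approved set, before the real TeX
interpreter ever sees it -- could be sketched roughly like this. This is a
minimal Python sketch, not the parser Neil proposes (he suggests Bison and
C), and the SAFE_MACROS set here is a made-up sample, not an actual vetted
safe subset of TeX:

```python
import re

# Hypothetical whitelist -- a tiny sample standing in for the real
# "safe" symbol-rendering subset, which would have to be enumerated
# carefully (no \input, \def, \catcode, etc.).
SAFE_MACROS = {"alpha", "beta", "gamma", "pi", "frac", "sqrt", "sum", "int"}

# A TeX control word: a backslash followed by a run of letters.
MACRO_RE = re.compile(r"\\([A-Za-z]+)")

def sanitize(tex: str) -> str:
    """Raise ValueError if the input uses any macro outside the
    whitelist; otherwise return the input unchanged."""
    for match in MACRO_RE.finditer(tex):
        if match.group(1) not in SAFE_MACROS:
            raise ValueError("disallowed macro: \\" + match.group(1))
    return tex
```

A real sanitizer would also need to handle single-character control
sequences (\\{, \\$, etc.) and canonicalize whitespace and grouping, but
even this crude pass blocks the dangerous programmable parts of TeX, since
those are all reached through control words like \def or \input.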