I was reminded (thanks, Erik!) that the category feature in the CVS was
not working. I /think/ I fixed it (now in CVS). Try it by setting
$wgUseCategoryMagic=true in LocalSettings.
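For anyone who wants to try it, the switch goes in LocalSettings.php. A minimal sketch (only $wgUseCategoryMagic itself is from Magnus's note; the surrounding file layout is ordinary LocalSettings boilerplate):

```php
<?php
# LocalSettings.php -- enable the experimental category feature
# $wgUseCategoryMagic is the variable Magnus mentions; it is off by default,
# so existing wikis are unaffected until you opt in.
$wgUseCategoryMagic = true;
```
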
Magnus
"Karl Eichwalder" <ke(a)gnu.franken.de> schrieb:
> Jimmy Wales <jwales(a)bomis.com> writes:
>
> > :-) Well, this is amusing, but a better answer would be to serve them
> > a plaintext page. Maybe the 'printable' version would be happy enough?
>
> Maybe, but don't get me wrong. The text was meant as a
> boilerplate -- the normal article text should follow.
>
> BTW, using lynx you can read (and edit? I didn't try) the WP article
> pretty well.
If I remember correctly it was quite irritating to scroll past the top and
side bars to get to the article.
As for editing: Yes, editing is possible. There are some points though:
* Lynx does not allow one to insert new lines, nor to edit more than a
certain number (20?) of lines down the article - where every return
counts as a line
* Lynx will 'transcribe' a UTF-8 page to Latin-1 (doing a commendable job
of it, by the way; Japanese seemed to become correct romaji). It is
this Latin-1'ed page that is saved
There might well be more problems with it.
Andre Engels
The DNS update to end coronelli's idleness hasn't gone through yet; the
secondary DNS, joey.bomis.com, still lists the old data.
With the 500k-article traffic peak ahead, it would be of great benefit to
tackle this rather soon. Nearly doubling the available RAM cache size will
be good for the hit ratio and therefore for the overall speed of Wikipedia.
--
Gabriel Wicke
Nick,
We also set up MediaWiki here at Gartner a couple of months ago and are finding it extremely useful for all those things which do not belong on the official internal company website. After using it for a while within our application development area, we intend to roll it out to the rest of the company. I agree wholeheartedly with your comments about it being a "killer app".
I also find the differences between it and other wiki engines so significant as to make it almost a completely different application in terms of functionality, usefulness, and ease of use. Some people have found it harder to install, but I think the difference there is minimal and well worth it.
Kudos to the development team, and I will encourage developers here in our company to contribute back to MediaWiki.
Thanks,
Michael Richards
Gartner
-----Original Message-----
From: wikitech-l-bounces(a)Wikipedia.org
[mailto:wikitech-l-bounces@Wikipedia.org]On Behalf Of Nick Pisarro
Sent: Sunday, February 22, 2004 1:24 AM
To: Wikimedia developers
Subject: Re: [Wikitech-l] Should give some credit to the software
A lot of credit for the growth of Wikipedia has to go to the fantastic
quality of the software that you guys have created.
...
Using MediaWiki, we have set up an internal wiki site here at Aperture
Technologies, Inc. for gathering and disseminating knowledge about
technical development and other issues. In the six weeks since it was
made available internally, it has already become an indispensable part of
our development department's operation. We expect to expand its use into
other departments of the company fairly soon.
Again, I consider this "killer app" quality software.
Nick Pisarro, Jr.
Aperture Technologies, Inc.
_______________________________________________
Wikitech-l mailing list
Wikitech-l(a)Wikipedia.org
http://mail.wikipedia.org/mailman/listinfo/wikitech-l
Jimmy, we're seeing a couple of odd problems on suda. Something that may
or may not be helped by a kernel upgrade (maybe? who knows) and what
appears to be a corrupt page in the database (maybe? who knows).
We *need* a slaved server *now* in case suda goes south unrecoverably.
Is gunther available for this (if so on what IP?), or should I strip
down one of the other machines?
-- brion vibber (brion @ pobox.com)
Hi all..
I am Isam Bayazidi, a wikipedian from the Arabic Wikipedia, and a
SysOp there. There has been a rising demand for a mailing list for the
Arabic Wikipedia. I wanted to know how we can have it created: a
wikiar-l list.
Thank you very much
Yours
Isam Bayazidi
"Nicolas Weeger" <nweeger(a)noos.fr> schrieb:
> First, it would be nice to automatically change links to redirects into
> links to the 'real' article when editing. This would avoid having the
> 'redirected from' message on the linked article, thus giving the impression
> of a well-managed encyclopedia ^_-. It would also avoid some issues if the
> redirect becomes a disambiguation page.
No. If those "redirected from" messages are a problem, remove the
message. Changing the links automatically would mean that when someone
decides the redirect page should no longer be a redirect, that person
needs to go to all pages linking to it to check what they were actually
meant to link to. This is quite a problem if the redirect becomes
an article, and _even worse_ if it becomes a disambiguation page.
Rather than solving problems, it would create them.
Andre Engels
Hi,
I've granted CVS access to Nick Pisarro. He wants to commit the following
changes:
- PEAR::Mail support for authenticating against SMTP servers
- use of the previously unused $wgWhitelistEdit array to specify a list of
pages a user has read access to until they create an account
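Going by Erik's description, the second change might look something like the following in LocalSettings.php. This is a hypothetical sketch: only the $wgWhitelistEdit variable itself comes from the patch, and the page names are purely illustrative.

```php
<?php
# Hypothetical example of the whitelist described above:
# pages a visitor can still access before creating an account.
# The page names here are made up for illustration.
$wgWhitelistEdit = array( 'Main Page', 'Special:Userlogin' );
```
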
Both of these changes seem useful to me. I have also granted Nick access
to the trackers.
Regards,
Erik
Optim wrote on wikipedia-l
> I think it would be a good idea to have separate
> databases (or downloadable files) for userpages,
> talkpages and articles.
..
> Having separate databases (or downloadable files)
> will help the people who mirror our content to
> copy just what they really want (the articles)
> and not userpages and talkpages
Optim provided a non-technical rationale.
People who want to download Wikipedia for local browsing might appreciate
smaller dump sizes, so I checked the distribution of records per namespace.
Here are the figures for the most recent fr: dumps (the largest dumps that I
can download without errors).
Keep in mind that real dumps are smaller due to compression, and that some
namespaces may compress better than others due to the similarity of
subsequent versions.
I am not sure what to conclude from this, but here are the figures anyway.
CUR table
namespace: description: bytes = percentage of total - number of records
0: Articles: 58346580 bytes = 71.7% - 32549 records
1: Article discussions: 7487166 bytes = 9.2% - 3714 records
2: User pages: 1811361 bytes = 2.2% - 1254 records
3: User discussions: 3737941 bytes = 4.5% - 1379 records
4: Wikipedia: 6788373 bytes = 8.3% - 641 records
5: Wikipedia discussions: 1680691 bytes = 2% - 282 records
6: Image pages: 1020852 bytes = 1.2% - 3653 records
7: Image discussions: 48294 bytes = 0% - 62 records
8: Messages: 389328 bytes = 0.4% - 600 records
9: Message discussions: 10702 bytes = 0% - 8 records
OLD table
namespace: description: bytes = percentage of total - number of records
0: Articles: 783459699 bytes = 51.4% - 195416 records
1: Article discussions: 100863970 bytes = 6.6% - 12668 records
2: User pages: 19020815 bytes = 1.2% - 5765 records
3: User discussions: 133820662 bytes = 8.7% - 10385 records
4: Wikipedia: 455423770 bytes = 29.8% - 22493 records
5: Wikipedia discussions: 30388381 bytes = 1.9% - 2408 records
6: Image pages: 558602 bytes = 0% - 2456 records
7: Image discussions: 109211 bytes = 0% - 90 records
8: Messages: 91808 bytes = 0% - 109 records
9: Message discussions: 21505 bytes = 0% - 16 records
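The percentage column can be reproduced from the byte counts alone. A small sketch using the CUR-table figures above (the numbers are copied from the table; the script merely recomputes each namespace's share of the total):

```python
# Byte counts per namespace from the CUR table above (fr: dump).
cur_bytes = {
    "Articles": 58346580,
    "Article discussions": 7487166,
    "User pages": 1811361,
    "User discussions": 3737941,
    "Wikipedia": 6788373,
    "Wikipedia discussions": 1680691,
    "Image pages": 1020852,
    "Image discussions": 48294,
    "Messages": 389328,
    "Message discussions": 10702,
}

# Recompute each namespace's share of the total, to one decimal place,
# matching the percentage column in the table.
total = sum(cur_bytes.values())
for name, size in cur_bytes.items():
    print(f"{name}: {100 * size / total:.1f}%")
```

Articles come out at 71.7% of the CUR table, matching the figure quoted above.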
Erik Zachte
Hello.
I have some feature ideas, and I'm wondering what everyone would think about
them. Note: I'm not a developer, just a (French) Wikipedia user, so don't
count on me to implement them :)
(OK, I could maybe, if I took the time to dig into the code, but people who
know it well would probably need less time.)
First, it would be nice to automatically change links to redirects into links
to the 'real' article when editing. This would avoid having the 'redirected
from' message on the linked article, thus giving the impression of a
well-managed encyclopedia ^_-. It would also avoid some issues if the
redirect becomes a disambiguation page.
Second, it could be really nice if, during preview, the first lines of the
linked articles were displayed somewhere. Very often, people write links and
just check that the article exists, but not whether it is actually the
article they intend to link to. Displaying the first lines would make that
check fast.
Along the same lines, maybe detect & display links to disambiguation pages
in a different color?
Those could be options in preferences, for people who don't want that behaviour.
I think those ideas would make it easier to write articles, and make correct
article interlinks.
On the other hand, they would probably increase the load slightly (on the db
& server), since they require fetching & displaying the first lines of linked
articles, which may be numerous. But hopefully our new servers could handle
that, no? :)
Nicolas 'Ryo'