Though the details are not entirely settled, here is some news about a hardware donation made to the Wikimedia Foundation
-------------
An employee at HP in France has contacted us, offering us free hardware.
This is a no-strings-attached donation from a few friends at HP who want to support Wikipedia. They reserve the right to mention this donation.
The machines offered are:
* One 4U 6-CPU Xeon 750 with 1 GB of RAM
* One 4-CPU Xeon 700 with 2 GB of RAM
* One 6U 8-CPU Xeon 900 with 16 GB of RAM and a maximum of 2 SCSI drives (which might work as a web server box immediately)
* Three 1U Celeron 600 machines with a 20 GB IDE drive and 128 MB of memory each.
There has been a lot of discussion on IRC these past few days about these servers,
and many people got involved to determine what would be best to do.
First, we decided to accept the offer :-) The six servers will be picked up by two French contributors, Med and Cereal Killer, on Thursday.
There will be a small shipping cost from Grenoble (the city the servers come from) to Paris, which will be covered by the WMF.
Med will install the Celerons. They will have a memory upgrade (the WMF will pay for this).
Yann has found us a hosting company ready to host the three squids for free, http://lost-oasis.fr/ :-)
The three bigger servers will be shipped from Paris to Florida. We are currently trying to find a cheap shipping option.
Any help or suggestion for *cheap* shipping is most welcome. We'd prefer a company with good credentials,
but there is no requirement in terms of shipping speed, so we might be able to find a good price.
Please help us if you know of a good shipping solution.
-----------
I will set up a page on the future Wikimedia Foundation website (http://meta.wikimedia.org/wiki/WMF/Main Page) to begin listing corporations helping Wikimedia projects.
When developers are rewarded, one option would be to pay for the jobs that otherwise do not get done: the donkey work. Tidying up code, operational procedures, and conversions could be done for money. However, these are often considered the best tasks for breaking new programmers into the environment, so it would be a mixed blessing.
The sexy stuff, like new features, would remain as it is.
Thanks,
GerardM
Hello all,
Can anybody tell me why I can't delete this file:
http://it.wikipedia.org/wiki/Immagine:Mrdoctor.jpg
If I click the "canc" (="del") link next to it, I get
'Non è stato possibile cancellare il file "/usr/local/apache/htdocs/wikipedia.org/upload/it/d/d6/Mrdoctor.jpg"'
(='Could not delete the file ...'). Trying to open the image itself gives a 404.
I don't think this is related to the file's length of 0 bytes, as I have already successfully deleted other 0-byte uploads. Is this just a minor database inconsistency? Should I ignore it?
On a related note: is there a way of getting a list of images where the image description was deleted instead of the image itself? I can't do that working from a database dump, right? Any suggestions? (Sorry if this is a FAQ - I'd have expected it to be one, but I couldn't find an answer either on meta or in the gmane archive.)
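Or maybe the dump would do after all? Something like this against a local MySQL copy might work - a sketch only, assuming uploads live in the "image" table and description pages sit in "cur" under namespace 6, with placeholder connection details:

  <?php
  // Sketch: list uploads whose description page is missing.
  // Schema assumption: "image" table for uploads, description pages
  // stored in "cur" with cur_namespace = 6. Login details are fake.
  $db = mysql_connect( 'localhost', 'wikiuser', 'secret' );
  mysql_select_db( 'wikidb', $db );

  $res = mysql_query(
      "SELECT img_name FROM image
       LEFT JOIN cur ON cur_namespace = 6 AND cur_title = img_name
       WHERE cur_id IS NULL", $db );

  while ( $row = mysql_fetch_assoc( $res ) ) {
      echo $row['img_name'] . "\n";
  }
  ?>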
Thanks,
[[it:Utente:Leonard Vertighel]]
Hi there,
What possibilities do I have to get the plain text of a page? On the one
hand I would like to have access to the source; on the other hand I would
like to be able to get the formatted text. Are there any ways of
doing so?
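So far the only thing I have come up with is fetching over HTTP - a rough sketch, assuming the wiki supports the action=raw parameter and PHP's allow_url_fopen is enabled (the page title is just an example):

  <?php
  $title = 'Main_Page';

  // The raw wikitext source of the page:
  $source = file_get_contents(
      "http://en.wikipedia.org/w/wiki.phtml?title=$title&action=raw" );

  // The formatted page, as HTML:
  $html = file_get_contents( "http://en.wikipedia.org/wiki/$title" );

  // A crude plain-text version: strip the markup from the HTML.
  $plain = strip_tags( $html );
  ?>

Is there a cleaner way, ideally one that yields real plain text rather than stripped HTML?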
Sincerely
Charly
I hereby declare the article validation function to be operational with
basic functions, which means it works but is still full of little icky
bugs :-)
To try, use CVS HEAD and set "$wgUseValidation = true ;" in your
LocalSettings. What to vote for is determined in Language.php by
"$wgValidationTypesEn" (demo is provided in CVS).
On the default skin (no other skins yet, sorry) you'll get a "validate"
tab on all articles (no other namespaces), for both old and current
versions. On a side note, I distinguish versions by their timestamp, as
there is currently no other versioning scheme that works for both the cur
and old tables, AFAIK.
From the validation page, you also get a link to a Special Page (tm)
showing the statistics for the different versions of this article.
Breaking the statistics down further was not implemented, due to privacy
reasons and my personal laziness ;-)
There are plans for some kind of "German 1.0" in the near future, not to
mention the English one and the one-volume printed version, so it would
be nice if we could get this running soon.
Magnus
> I would like that all developers (others may
> participate as well of course) help setting up a list
> to define how development tasks completed could be
> rewarded.
>
> This will be followed by a public poll to define how
> the rewards would be perceived...
>
> and a private poll for developers to know what their
> opinion on the matter is
>
> Please, contribute to
> http://meta.wikimedia.org/wiki/Bounty
>
> thanks :-)
I don't know about the other MediaWiki developers, but the only reward I'd ask
for working on MediaWiki is that people work hard to make great articles and
strong, friendly communities with this software.
If I could ask for anything else, I'd ask that people treat each other with the
gratitude and respect they deserve. Everyone who puts work into these projects
-- editorial, technical, community-building -- deserves our appreciation and kudos.
~ESP
A couple of image-related questions...
1. Magnus mentioned that he has added code to CVS that lets you do
[[Image:blah.jpg|thumb=blah_manually_thumbnailed.jpg]]
Has that gone live on the Wikipedia?
2. How best should the
Wikipedia deal with convenient viewing of multi-megapixel images like
the image on the following page:
http://en.wikipedia.org/wiki/State_Library_of_Victoria
Seeing that we want the ability to make printed material out of Wikipedia
content, uploading full-resolution images is IMO the way to do it.
However, such images currently break the image description page.
Should the image description page automatically resize images larger
than screen width and have a link to download the full image? Would
such a feature be enough to deal with the problem? Would the automatic
resizing from "really big" to "screen size" cause the same kind of image
quality issues that Tannin mentioned with thumbnailing?
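To make the suggestion concrete, I'm imagining something like the following on the description page - hypothetical code, not anything in CVS, and the 800-pixel limit is an arbitrary guess:

  <?php
  // Hypothetical helper: scale the displayed size down to a maximum
  // width, preserving aspect ratio; the page would then link to the
  // full-size original separately.
  function fitToScreen( $width, $height, $maxWidth = 800 ) {
      if ( $width <= $maxWidth ) {
          return array( $width, $height );  // already fits
      }
      return array( $maxWidth, (int)round( $height * $maxWidth / $width ) );
  }

  // E.g. a 3072x2304 upload would display at 800x600.
  list( $w, $h ) = fitToScreen( 3072, 2304 );
  ?>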
--
---------------------------------------------------------------------------
Robert Merkel
robert.merkel(a)benambra.org
http://benambra.org
Computer games don't affect kids; I mean if Pac-Man affected us as kids
we'd be running around in darkened rooms, munching magic pills, and
listening to repetitive electronic music.
---Marcus Brigstocke
(sometimes attributed to
Kristian Wilson, Nintendo Inc, 1989 - probably apocryphal)
--------------------------------------------------------------------------
Hi,
[[meta:MediaWiki_roadmap]] states that RSS syndication is available for many
special pages, such as Special:Newpages. Which special pages are you talking
about?
* wiki.phtml?title=Special:Newpages&feed=rss
works fine, but all the other pages I tried do not:
* Special:Recentchangeslinked
* Special:Whatlinkshere
* Special:Watchlist
* Special:Categories
* ...
Have a look at
http://meta.wikipedia.org/w/wiki.phtml?title=Special:Categories
There is a link in the toolbox (RSS/Atom), but it does not work.
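For completeness, this is how I am fetching the one feed that does work - plain HTTP, nothing special:

  <?php
  // The Special:Newpages feed works; substituting the special pages
  // listed above into the same URL pattern does not.
  $feed = file_get_contents(
      'http://meta.wikipedia.org/w/wiki.phtml?title=Special:Newpages&feed=rss' );
  echo $feed;
  ?>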
Thanks,
Jakob
On Wikipedia the MAX_INCLUDE_REPEAT limit (= 5) in Parser.php is
problematic. I propose to change the sense of this constraint from
"max template inclusions" to "max different template inclusions".
To solve the memory usage problem, we could use a temporary table
containing the templates already loaded for the article. That way, each
template replicated x times in an article would generate only one SQL
request.
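A rough sketch of what I mean - hypothetical code, not the actual Parser.php, and using an in-memory array rather than a real temporary table (the effect is the same: one database fetch per distinct template):

  <?php
  // Hypothetical cache for one parse run: the first inclusion of a
  // template hits the database, repeats are served from the array.
  // Assumes an open MySQL connection; namespace 10 is Template:.
  $templateCache = array();

  function getTemplateText( $title ) {
      global $templateCache;
      if ( !isset( $templateCache[$title] ) ) {
          $res = mysql_query(
              "SELECT cur_text FROM cur
               WHERE cur_namespace = 10
                 AND cur_title = '" . mysql_escape_string( $title ) . "'" );
          $row = mysql_fetch_assoc( $res );
          $templateCache[$title] = $row ? $row['cur_text'] : '';
      }
      return $templateCache[$title];
  }

  // The inclusion limit would then count count( $templateCache )
  // distinct templates instead of total inclusions.
  ?>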
Opinions?
Emmanuel Engelhart
--
-------------------------------------------------------------------------
There are 10 categories of people: those who can count in binary and
those who cannot.
-------------------------------------------------------------------------
Emmanuel Engelhart ICQ UIN : 53388731 TEL (+49)(0)6.22.15.88.03.31
In answer to Brion Vibber,
I was referring to a post in which server-side compression was advocated based on the line/screen quality of the client side. This would happen when the details of the screen are requested. The argument was that a 5 MB picture is too big over dial-up. Consequently, the current practice of compressing once and saving would not apply.
PS Thanks for fixing wiktionary.
Thanks,
GerardM