Hello,
Microsoft has informed us via an email to OTRS (#2010011110039819) that
wikimedia.org (and presumably our other domains) will be removed from
the Compatibility View List for Internet Explorer 8 near the end of
January 2010.
More info: http://msdn.microsoft.com/en-us/library/dd567845(VS.85).aspx
The actual list:
http://www.microsoft.com/downloads/details.aspx?familyid=B885E621-91B7-432D…
Thought you should know.
-Mike
Hello,
The Hebrew Wikipedia passed the 100,000-article mark yesterday. On the main
www.wikipedia.org page, the link to he.wikipedia was appropriately moved to
the 100,000+ section, but the word for "search" was not added under the
globe.
The Hebrew word is חיפוש. Can anyone please add it?
Thanks in advance.
--
אָמִיר אֱלִישָׁע אַהֲרוֹנִי
Amir Elisha Aharoni
http://aharoni.wordpress.com
"We're living in pieces,
I want to live in peace." - T. Moore
Today Wikimedia's world-wide five-minute-average transmission rate
crossed 10gbit/sec for the first time ever, as far as I know. This
peak rate was achieved while serving roughly 91,725 requests per
second.
This fantastic news is almost coincident with Wikipedia's 9th
anniversary on January 15th.
[http://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Day ]
In casual units, a rate of 10gbit/sec is roughly equivalent to 5 of
the US Library of Congress per day (using the common 1 LoC = 20 TiB
units). Wikimedia's 24 hour average transmission rate is now over
5.4gbit/sec, or 2.6 US LoC/day.
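For the curious, the conversion behind those casual units can be checked in a few lines (assuming, as above, 1 LoC = 20 TiB; the function name is just illustrative):

```python
LOC_BYTES = 20 * 2**40  # 20 TiB, the assumed size of one US Library of Congress

def gbit_per_sec_to_loc_per_day(rate_gbit):
    """Convert a transfer rate in Gbit/s to US Libraries of Congress per day."""
    bytes_per_day = rate_gbit * 1e9 / 8 * 86_400  # bits -> bytes, seconds -> day
    return bytes_per_day / LOC_BYTES

print(round(gbit_per_sec_to_loc_per_day(10.0), 2))  # peak rate -> 4.91
print(round(gbit_per_sec_to_loc_per_day(5.4), 2))   # 24h average -> 2.65
```

which lands close to the roughly 5 and 2.6 LoC/day quoted above.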
A snapshot of the traffic graph on this historic day can be seen here:
http://commons.wikimedia.org/wiki/File:2010-01-11_wikimedia_crosses_10gbit.…
Ten years ago many traditional information sources were turning
electronic, and possibly locking out the unlimited use previously
enjoyed by public libraries. It seemed to me that closed pay-per-use
electronic databases would soon dominate all other sources of factual
information. At the same time, the public seemed to be losing much of
its interest in more intellectually active pursuits, such as
reading. So if someone told me then that within the decade one of the
most popular websites in the world would be a free content
encyclopedia, consisting primarily of text, or that the world would
soon be consuming over 50 terabytes of compressed educational material
per day—I never would have believed them.
The growth and success of the Wikimedia projects is an amazing
accomplishment, both for the staff and volunteers keeping the
infrastructure operating efficiently as well as the tens of thousands
of volunteers contributing this amazing corpus. This success affirms
the importance of intellectual endeavours in our daily lives and
demonstrates the awesome power of people working together towards a
common goal.
Congratulations to you all.
Hi everyone,
I wanted to get some idea of what y'all think of our existing bug
tracker. I know there are mixed feelings about Bugzilla: it works well
for a lot of what we do, and maybe we just need to use it more
effectively, upgrade it, and find tools that integrate with it better. But
I'd like to know whether there is any kind of consensus for sticking
with it or switching away from it, and what you would like to have that
Bugzilla is missing now (we are a few versions behind on
bugzilla.wikimedia.org).
Guillaume and Naoko have expressed a need for a project management tool,
and I thought it would be good to try a tracker with some project
management functionality, or one that integrates easily with a project
management tool.
I've started a page for gathering feedback and suggestions
http://www.mediawiki.org/wiki/TrackerPMTool
Once we have a few good candidates, I'll try to get some test instances
going for people to play with.
-p
--
Priyanka Dhanda
Code Maintenance Engineer
Wikimedia Foundation
http://wikimediafoundation.org
Hi,
I want to download the Wikipedia database dump of the English version, but
the whole database dump is 10.1 GB, which is too large for me. In fact, I
only need a part of the database, and any part is fine. Can I download
a smaller file that is a subset of the whole database dump?
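One way to get at "any part" without a smaller dump existing is to stream-parse just the beginning of the big one. A minimal sketch (the dump filename below is illustrative, and `first_page_titles` is a hypothetical helper, not an existing tool):

```python
import bz2
import io
import xml.etree.ElementTree as ET

def first_page_titles(stream, limit=10):
    """Yield the titles of the first `limit` <page> elements from a
    MediaWiki XML dump, streaming so the whole file never sits in memory."""
    seen = 0
    for _event, elem in ET.iterparse(stream, events=("end",)):
        # Dump elements carry an xmlns; strip the namespace prefix to compare.
        if elem.tag.rsplit("}", 1)[-1] == "page":
            for child in elem:
                if child.tag.rsplit("}", 1)[-1] == "title":
                    yield child.text
            seen += 1
            elem.clear()  # free memory for the processed element
            if seen >= limit:
                break

# Usage (reading the compressed dump directly, stopping after 100 pages):
# with bz2.open("enwiki-pages-articles.xml.bz2", "rb") as f:
#     for title in first_page_titles(f, limit=100):
#         print(title)
```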
Thanks for your time and help!
Best Wishes!
Hi all,
So we got some new search servers (thanks Mark & Rob) and I have deployed
them today. As a consequence, the search limit is now re-raised to 500
and interwiki search is back on all wikis. However, I would still like to
keep srmax at 50 for the API, because there seem to be quite a number of
broken bots and people experimenting...
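For bot authors reading along: a well-behaved client should clamp its own request size to that cap rather than trip over it. A sketch using the public search API's `list=search` / `srlimit` parameters (the helper itself is illustrative):

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"
SRMAX = 50  # the server-side cap on API search results mentioned above

def search_url(query, limit):
    """Build a fulltext-search API request, clamping the requested
    result count to the srmax cap the server enforces."""
    params = {
        "action": "query",
        "list": "search",
        "srsearch": query,
        "srlimit": min(limit, SRMAX),
        "format": "json",
    }
    return API + "?" + urlencode(params)

print(search_url("wikipedia day", 500))  # srlimit is clamped to 50
```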
Additionally, I've switched mwsuggest to the Lucene backend, so the AJAX
suggestions are no longer alphabetical but are ranked by the number of
links to them (and some CamelCase and similar redirects are not shown).
This has been active on en.wp for a while, but now it's on all wikis.
If you see things broken please find me on IRC, or leave a message on my
en.wp talk page.
Cheers, Robert
On Sun, Jan 10, 2010 at 5:50 PM, Robert Stojnic <rainmansr(a)gmail.com> wrote:
> Additionally, I've switched mwsuggest to lucene backend, so now the AJAX
> suggestions are no longer alphabetical but ranked according to number of
> links to them (and some CamelCase and such redirects are not shown).
If anyone feels adventurous:
http://www.joachims.org/publications/joachims_02c.pdf
http://www.cs.cornell.edu/People/tj/svm_light/svm_rank.html
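The core idea behind SVMrank in those links is reducing ranking to classifying preference *pairs*. A toy sketch of that pairwise reduction, using a plain perceptron update instead of the SVM machinery (features and data are made up for illustration):

```python
def train_pairwise(pairs, dim, epochs=50, lr=0.1):
    """Learn a linear scoring function w.x from preference pairs
    (better, worse): whenever the model scores the pair the wrong way
    round, nudge w toward the feature difference."""
    w = [0.0] * dim
    for _ in range(epochs):
        for better, worse in pairs:
            diff = [b - c for b, c in zip(better, worse)]
            margin = sum(wi * di for wi, di in zip(w, diff))
            if margin <= 0:  # ranked the wrong way round
                w = [wi + lr * di for wi, di in zip(w, diff)]
    return w

def score(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

# Toy data: feature = [inbound-link count, title length]; pages with
# more inbound links should rank higher, as mwsuggest now does.
pairs = [([10.0, 3.0], [2.0, 3.0]), ([50.0, 5.0], [7.0, 5.0])]
w = train_pairwise(pairs, dim=2)
assert score(w, [50.0, 5.0]) > score(w, [2.0, 5.0])
```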
Hi,
I have a suggestion for Wikipedia! I think the database dumps, including
the image files, should be made available via a Wikipedia BitTorrent
tracker, so that people could download the Wikipedia backups including
the images (which they currently can't do), and also so that Wikipedia's
bandwidth costs would be reduced. I think it is important that people
can download Wikipedia for offline use, now and in the future.
best regards,
Jamie Morken