It's weekly code review update time! :) Some changes of note...
User preferences:
* (bug 505) Time zones can now be specified by location in user
preferences, avoiding the need to manually update for DST. Patch by Brad
Jorsch.
Web standards behavior:
* (bug 2585) HTTP 404 return code is now given for a page view if the
page does not exist, allowing spiders and link checkers to detect broken
links.
* (bug 16459) Use native getElementsByClassName where possible, for
better performance in modern browsers
Localization/UI changes:
* (bug 16612) Fixed "noprint" class for Modern skin print style
* (bug 16712) Special:NewFiles updated to use "newer"/"older" paging
messages for clarity over "previous/next"
* (bug 16026) revision-info, revision-info-current, cannotdelete,
redirectedfrom, historywarning and difference messages now use wikitext
rather than raw HTML markup
Upload/file-handling issues:
* (bug 13835) Fix rendering of {{filepath:Wiki.png|nowiki}}
* (bug 16772) Special:Upload now correctly rejects files with spaces in
the file extension (e.g. Foo. jpg).
API changes:
* (bug 16726) siprop=namespacealiases now also lists localized aliases
* (bug 16730) Added apprfiltercascade parameter to list=allpages to
filter cascade-protected pages
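For example, cascade-protected pages can now be listed with a query
along these lines (the apprtype value here is an assumption; the new
filter piggybacks on the existing protection parameters):

  api.php?action=query&list=allpages&apprtype=edit&apprfiltercascade=cascading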
-- brion
In r44919 I've gone ahead and reimplemented bug 2585, so a page view for
a nonexistent page will return a "404 Not Found" HTTP status code
instead of a "200 OK".
This will allow bots and spiders to detect broken links from the outside
web to bad or deleted wiki pages, but should be totally transparent to
end-users.
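The heart of the change is tiny; here's a minimal sketch of the
behavior (illustrative code only, not the actual r44919 patch -- the
function and parameter names are made up):

    <?php
    // Send a 404 status line but an ordinary HTML body: browsers
    // render the body as usual, while crawlers and link checkers
    // record the status and flag the link as dead.
    function showMissingPage( $pageName ) {
        header( 'HTTP/1.1 404 Not Found' );
        header( 'Content-Type: text/html; charset=UTF-8' );
        echo '<html><body><h1>', htmlspecialchars( $pageName ), '</h1>',
            '<p>This wiki does not have a page with this name.</p>',
            '</body></html>';
    }

    showMissingPage( isset( $_GET['title'] ) ? $_GET['title'] : 'Example' );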
When we first attempted this in 2005, the 404 was also applied to
action=edit pages... We had a large number of reports from active
editors that they were having trouble creating articles, with the edit
page not loading properly.
Since this was disrupting basic site activity, we ended up reverting it
before we had a chance to really track it down -- none of the developers
could reproduce it at the time.
The size of our base HTML from even the simplest skin should be well
over the threshold (roughly 512 bytes) below which Internet Explorer
swaps in its own "friendly error pages" for an error response, and we
were unable to track down any specific browser, ISP proxy, or other
hypothetical problem source.
If people do encounter problems once this goes live, it at least
shouldn't affect primary wiki functionality while we reproduce and fix
it -- hitting a raw page view for a nonexistent page is something that
you usually have to jump through some hoops to do within the wiki.
After things have settled a bit, we'll probably start using more 4xx and
5xx status codes for various error pages.
-- brion
---- Kalan <kalan.001(a)gmail.com> writes:
> On Tue, Dec 23, 2008 at 07:59, Brion Vibber <brion(a)wikimedia.org> wrote:
>> In r44919 I've gone ahead and reimplemented bug 2585, so a page view for
>> a nonexistent page will return a "404 Not Found" HTTP status code
>> instead of a "200 OK".
>
> Do links of type http://**.wikipedia.org/wiki/redlink to nonexistent
> articles really happen that often? They have to be typed by hand;
> otherwise they will be action=edit&redlink=1 URLs, which redirect to
> a 404ing page only on the English Wikipedia and some other wikis, not
> everywhere.
They happen when third parties link to an article that is later
deleted. When search engine crawlers follow such a link, the 404 will
tell them it's a dead link.
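To make that concrete, here's a sketch of what such a link checker
does (plain PHP with the curl extension; the URL is just an example):

    <?php
    // Issue a HEAD request and report the HTTP status code. With bug
    // 2585 deployed, a deleted or never-created page reports 404 here
    // instead of 200.
    function checkLink( $url ) {
        $ch = curl_init( $url );
        curl_setopt( $ch, CURLOPT_NOBODY, true );         // HEAD only
        curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true ); // no direct output
        curl_exec( $ch );
        $status = curl_getinfo( $ch, CURLINFO_HTTP_CODE );
        curl_close( $ch );
        return $status;
    }

    echo checkLink( 'http://en.wikipedia.org/wiki/Some_deleted_page' ), "\n";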
Roan Kattouw (Catrope)
Hi!
I was wondering if there is a way to download an early English
Wikipedia database XML dump (something from the first month or so).
Currently, at download.wikimedia.org there are only a few recent dumps,
and none of them are complete.
Thanks!
~Dimo
We’ve had some pretty unpredictable code deployment schedules lately and
I know that’s been a bit confusing and frustrating (especially when bug
fixes are backed up!). To regularize this, I’m now instituting a regular
weekly update schedule.
Every Tuesday I’ll be doing a code review and deployment prep to get
updated MediaWiki code deployed on the Wikimedia sites.
This does mean that Tuesday is probably not a good time to commit large
experimental changes — but a great time to be available for cleanup and
bug fixes on the changes you’ve been making over the last week. :)
Of course there’s some relation to Bug Mondays... I’m dithering over
whether to move the bug day to Wednesday or Thursday, since I know on my
end Mondays are often taken up with catch-up and meetings, and people
are often kinda busy; further it’s hard for me to prep a Bug Monday
during work time when our European and Australian folks have already
been through most or all of Monday before I wake up. ;)
Any preferences between Wed and Thurs?
(Cross-posted w/ my blog --
http://leuksman.com/log/2008/12/16/code-review-tuesdays/ )
-- brion
http://developers.slashdot.org/article.pl?sid=08/12/17/1443216
"Facebook now has 140 million users, and in recent weeks has been
adding 600,000 new users a day. To keep pace with that growth, the
Facebook engineering team has been tweaking its use of memcached, and
says it can now handle 200,000 UDP requests per second. Facebook has
detailed its refinements to memcached, which it hopes will be included
in the official memcached repository. For now, their changes have been
released to github."
Looking to the future of MediaWiki, the base set of JavaScript will
continue to grow as client-side applications grow in complexity to
address usability improvements and new features. To this end it will
become increasingly necessary to A) have a better system for sending
out client-side JavaScript, and B) standardize around a JavaScript
helper library.
A) The improved delivery mechanism is a two-part issue: code
maintainability and client-side performance. 1) To maintain and
modularize code as the complexity of our JavaScript libraries grows, it
makes a lot of sense to split JavaScript classes and objects into their
own files and folders. Likewise, we don't want additional requests for
language code delivery. By using a server-side delivery system we can
cleanly add sets of JavaScript files to the page in a single request,
"just in time", as the user interacts with a given set of interface
components. If we don't update our JavaScript delivery mechanism, we
will end up with _lots_ of little JavaScript requests and less
maintainable, less flexible JavaScript code.
2) Furthermore, with complex JavaScript libraries we want to add
verbose comments, documentation, and debugging statements to the code
without paying a client-side performance penalty for the increased file
size. Minified JavaScript strips all of those unnecessary bits out.
I propose we implement or adopt something like:
http://code.google.com/p/minify/
This will mean sets of JavaScript files can be grabbed in a single
request: grouped, minified, cached, and gzipped (if the client supports
it). This should work fine with our reverse proxy setup, and since we
would be serving smaller files most of the time it should result in a
net decrease in cluster load. A user preference could request
uncompressed individual files, and/or a URL parameter like
?jsdebug=true could enable non-compressed output for debugging.
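As a rough illustration of the endpoint I have in mind, here is a
sketch (the file names and whitelist are made up, and the regex-based
comment stripping is just a naive stand-in for a real minifier such as
the library above):

    <?php
    // Combined-JavaScript delivery: js.php?files=a,b&jsdebug=true
    // A whitelist prevents arbitrary file reads; gzip is applied only
    // when the client advertises support for it.
    $whitelist = array( 'wikibits', 'ajax', 'mwsuggest' ); // illustrative
    $files = isset( $_GET['files'] ) ? $_GET['files'] : '';
    $debug = isset( $_GET['jsdebug'] );

    if ( !$debug ) {
        ob_start( 'ob_gzhandler' ); // gzips only if the client accepts it
    }
    header( 'Content-Type: text/javascript; charset=UTF-8' );

    foreach ( explode( ',', $files ) as $name ) {
        if ( !in_array( $name, $whitelist ) ) {
            continue; // ignore anything not on the whitelist
        }
        $js = file_get_contents( "js/$name.js" );
        if ( !$debug ) {
            // Naive minification: strip block comments and whole-line
            // // comments, then collapse runs of whitespace.
            $js = preg_replace( '#/\*.*?\*/#s', '', $js );
            $js = preg_replace( '#^\s*//[^\n]*#m', '', $js );
            $js = preg_replace( '/\s+/', ' ', $js );
        }
        echo $js, ";\n"; // ';' guards against missing trailing semicolons
    }

A real version would also emit cache headers (and an ETag) so our
reverse proxies can hang onto the combined file, but the shape of the
thing is the same.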
A library such as minify can also group and minify all the style
sheets, and minify the HTML output if we wanted, although the gains are
nowhere near as dramatic or as necessary for the HTML/CSS space.
If we can get some community consensus on this direction, that would be
good. I will start looking at integrating the above-mentioned library,
running some tests, etc.
B) We should also address converging on a JavaScript library for HTML
document traversal, event handling, interface improvements,
maintainability, flexibility, etc. All the sequencer and Metavid stuff
uses jQuery. jQuery is a GPL/MIT-licensed JavaScript library that is
emerging as the "winner" among script libraries, with very wide
adoption (Google, apple.com, digg.com, mozilla.com, etc.) and a very
small footprint. Refactoring existing MediaWiki JavaScript as
jQuery-based code would result in far fewer cross-browser hacks and
generally shorter, more maintainable code. So it seems like a good
direction to me ;)
peace,
--michael