Does the Wikimedia Foundation's technology team have any insight or
comment on the finding that, other than the Wikipedia Main Page and
the "404 error" page, the most popular page on the English Wikipedia
in September was "Mathematical descriptions of opacity", with over
5.1 million views? There was no discernible "bump" in interest in
opacity due to outside news events, or a book or movie release on
the subject.
The phenomenon is outlined here:
http://www.examiner.com/wiki-edits-in-national/wikipedia-s-top-10-most-view…
Do you think this is some sort of malicious probing activity by a hacker, or
is it perhaps deliberate testing by a developer employed by the WMF?
Thank you,
Greg
On Mon, Oct 3, 2011 at 10:15 PM, Brion Vibber <brion(a)wikimedia.org> wrote:
> I would *very* strongly recommend doing the internal refactoring before we
> get anywhere near reviewing and deploying that bad boy; otherwise we'll
> spend all the code review time pointing out things to refactor to avoid
> future maintenance problems. :)
I agree. The only question is whether the current system is running
out of steam badly enough to require a two-stage process. If the
answer is "no, we're good", then definitely FileStore first.
I've been working on a pageoutput branch, centred around an earlier RFC:
https://www.mediawiki.org/wiki/User:Dantman/Page_output_branch
https://www.mediawiki.org/wiki/Requests_for_comment/Drop_actions_in_favour_…
The general goals of the project are:
- Actions die off and instead we have SpecialPages that take on that
functionality. &action= URLs become one method of accessing certain
special pages.
- We have a new PageView system that replaces Article and handles only
the output of a page, and SpecialPage becomes an implementation of it.
- Logic for things like tabs, special links in the toolbox, canonical
URLs, etc... becomes part of the PageView implementation, and Skin only
handles the way they are laid out instead of also doing the actual creation.
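A rough sketch of the split I have in mind, purely illustrative (Python for brevity rather than MediaWiki's actual PHP; every class, method, and route name here is hypothetical):

```python
# Hypothetical sketch of the PageView/SpecialPage split described above.
# None of these names exist in MediaWiki; PHP would be the real language.

class PageView:
    """Base class responsible only for producing a page's output."""
    def render(self) -> str:
        raise NotImplementedError

    def tabs(self):
        """Tab/toolbox/canonical-URL logic lives here, not in the skin;
        the skin only decides how the returned items are laid out."""
        return ["view"]

class ArticleView(PageView):
    def __init__(self, title, text):
        self.title, self.text = title, text

    def render(self):
        return f"<h1>{self.title}</h1><div>{self.text}</div>"

    def tabs(self):
        return ["view", "edit", "history"]

class SpecialPageView(PageView):
    """Special pages become just another PageView implementation."""
    def __init__(self, name):
        self.name = name

    def render(self):
        return f"<h1>Special:{self.name}</h1>"

# An &action=history URL would route to a special page
# instead of a separate Action class hierarchy:
ACTION_ROUTES = {"history": lambda title: SpecialPageView("History/" + title)}
```

The point is just the division of labour: PageView owns output and tab logic, SpecialPage is one implementation of it, and actions become routes.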
Some of this is related to skinning and works better for the skinning
rewrite if done first, so I started on it before the template system
and the rest.
It takes a day of work to get much done in this branch, and I only have
about one day a week when I can code at that level, so the project
hasn't gotten very far yet. (Not to mention that merging is horrible;
at one point I wasted several hours that I could have spent coding on
merging conflicts from trunk.)
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
I finally got around to organizing my Skinning system page and cleaning
out the ideas sitting there that I actually dropped or replaced.
https://www.mediawiki.org/wiki/User:Dantman/Skinning_system
Some notable things:
- A template system is planned. The idea is slowly stabilizing.
- I plan to replace things like bodytext, newtalk, sitenotice, catlinks,
dataAfterContent, etc... with a 'regions' system that lets skins define
their own areas with certain parameters, extensions define things they
want in certain types of areas, and MediaWiki figures out what to put where.
- I have an idea on how to eliminate content_{actions,navigation},
personal_urls, toolbox, language_urls, etc... in favour of a more
flexible system:
https://www.mediawiki.org/wiki/User:Dantman/Skinning_system/Link_lists_rewr…
- I've dropped the idea of controlling SEARCH/LANGUAGES/TOOLBOX from
the navigation editing interface, and perhaps even of the skins defining
the navigation blocks themselves. Instead I'm thinking of letting users
define navigation blocks of different types and drop those in for use in
different skins. The sidebar would become a widgetized setup where, from
a separate interface, you can drop navigation lists in as widgets
alongside the default search/etc... and raw blocks of wikitext.
Extensions would be able to implement alternative widgets like donation
buttons or advertisement blocks, and perhaps even context-sensitive
types of navigation that can be dropped in, e.g. to make possible
something like what Blender's wiki tried to do.
An example of a possible way the Vector skin might be built with this
system:
https://gist.github.com/1239039
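For what it's worth, the 'regions' idea above could be sketched roughly like this (all names are invented for illustration, and it's Python rather than PHP purely for brevity): skins declare named areas of a given type, extensions register content for area types, and the framework matches the two up.

```python
# Hypothetical sketch of the 'regions' matching described above;
# none of these functions or names exist in MediaWiki.

skin_regions = {}      # region name -> area type the skin defines
extension_items = []   # (area type, content) pairs extensions register

def define_region(name, area_type):
    """A skin declares an area it lays out, tagged with a type."""
    skin_regions[name] = area_type

def register_item(area_type, content):
    """An extension says what it wants placed in areas of a given type."""
    extension_items.append((area_type, content))

def layout():
    """MediaWiki figures out what to put where: each registered item
    lands in every region whose declared type matches."""
    out = {name: [] for name in skin_regions}
    for area_type, content in extension_items:
        for name, region_type in skin_regions.items():
            if region_type == area_type:
                out[name].append(content)
    return out

# A Vector-like skin might define:
define_region("after-content", "content-footer")
define_region("site-wide", "notice")
# Extensions (or core) drop their pieces into matching area types:
register_item("content-footer", "catlinks")
register_item("notice", "sitenotice")
```

This is how things like catlinks and sitenotice could stop being hardcoded template keys: they just become items registered against an area type.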
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
Hello all,
It’s with great pleasure that I’m announcing Phil Chang as our new
Product Manager for Mobile, starting today. Phil will be working
closely with Tomasz’ engineering team and will report to me. As
product manager, it’s Phil’s job to define the scope and the
priorities of the improvements and feature additions that we’ll be
making to our mobile experience -- in partnership with the community
and the WMF team.
Phil’s been in the mobile space since 2002. His first product
development work on mobile was with mobile start-ups in the UK,
including a company that released StealthText, a service for
auto-expiring text messages, and later Picsel Technologies. At Picsel,
he led development of two new products, a Mobile Content Solution and
an embedded mobile web browser.
Prior to that, Phil had a long career in the Bay Area as a product
innovator at Macromedia, AT&T, and his own start-up, e-Acumen.
In his spare time, Phil enjoys art exhibitions, musical performances,
ice skating and rollerblading. During his time in the UK, he became a
British citizen and traveled throughout Europe. He has studied the
German, Japanese and Korean languages, and is married to a native of
Japan. He and his wife have a young daughter who holds three
citizenships: Japan, the US and the UK.
Working for a nonprofit again is coming full circle for Phil. For six
years during high school and university, he published and edited an
arts magazine run as an independent non-profit.
He’s passionate about making culture accessible to all human beings,
and helping us bring free knowledge to billions of people. He'll also
help us explore how our mobile user community can be directly engaged
in the projects by contributing text and media.
Please join me in welcoming Phil to the Wikimedia movement.
All the best,
Erik
--
Erik Möller
VP of Engineering and Product Development, Wikimedia Foundation
Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate
There are a few issues with how our file/image handling works that make
handling remote files a bit harder than it needs to be.
Some things don't work consistently over InstantCommons because a local
MediaHandler class is used that doesn't understand the remote data, and the
whole FileRepo architecture is pretty awkward for actually storing "local"
files outside the filesystem (e.g. the Extension:SwiftMedia backend has to
jump through a lot of hoops).
I've started collecting some notes for an RFC on the wiki:
https://www.mediawiki.org/wiki/Requests_for_comment/Refactor_on_File-FileRe…
There aren't a lot of specific work targets in there yet, as we'll want
some more data from the folks working on the alternate backends, as well
as on the frontend issues, which are mostly what I've written about.
But I would like to see if we can avoid having to have a local MediaHandler
that's compatible with a remotely-hosted file. As long as the necessary
metadata can be exposed, and any rendering can be encapsulated as a
thumbnail or an iframe, client sites *should* be able to render anything we
can push out from Commons without any special support on their end.
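As a rough illustration of what I mean (the field names here are invented for the example, and it's a Python sketch rather than anything real): if the remote repo exposes enough pre-computed metadata plus a thumbnail URL scheme, the client can render the file without any format-specific handler.

```python
# Invented example payload: the kind of metadata a remote repo could
# expose so client wikis need no format-specific MediaHandler at all.
remote_file = {
    "name": "Example.djvu",
    "mime": "image/vnd.djvu",
    "width": 2480,
    "height": 3508,
    "thumb_url_template": "https://example.org/thumb/Example.djvu/{width}px.jpg",
}

def render_remote(file_info, width):
    """Client-side rendering that trusts only the exposed metadata:
    scale the height proportionally and embed the remote thumbnail."""
    height = round(file_info["height"] * width / file_info["width"])
    src = file_info["thumb_url_template"].format(width=width)
    return f'<img src="{src}" width="{width}" height="{height}">'
```

The client never needs to know what a DjVu file is; it only needs the dimensions and a URL it can ask thumbnails from.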
-- brion
Hi everyone!
I would like to hold a brown bag lunch on Selenium unit testing for
MediaWiki extensions.
I have been using phpunit and Selenium for a few years.
Selenium Unit Testing in the core has been put on hold until we get our new
QA Lead.
I have attached an event to this email for Wednesday, September 7th,
from 12:30pm - 1:00pm.
There will be a small presentation on how we may use Selenium on the
Fundraising extensions.
Discussion/questions will follow.
Thanks!
--
Jeremy Postlethwaite
jpostlethwaite(a)wikimedia.org
515-839-6885 x6790
Backend Software Developer
Brown bag - Selenium Unit Testing
Selenium Unit Testing on MediaWiki
*When*
Wed, September 7, 12:30pm – 1:00pm GMT-07:00
*Where*
SF Office Sixth floor
*Who*
• Jeremy Postlethwaite
• wikitech-l(a)lists.wikimedia.org
Wikimedia Foundation <http://wikimediafoundation.org/>
I think I sent this to the wrong address initially, so here it is again
(sorry if I'm just getting confused).
Dan.
---------- Forwarded message ----------
From: Dan Bolser <dan.bolser(a)gmail.com>
Date: 1 October 2011 10:37
Subject: API for Extension:ExpandTemplates?
To: "mediawiki-l(a)lists.wikimedia.org" <mediawiki-l-bounces(a)lists.wikimedia.org>
Hi,
Can I get the XML parse tree from the MW API using Extension:ExpandTemplates?
http://www.mediawiki.org/wiki/Extension:ExpandTemplates
Would this ever be core functionality?
In general, how should an extension extend the API, and could this be
done for Extension:ExpandTemplates? I think it would be very useful
for third-party client applications to get programmatic access to the
MW parse tree (for convenient high-level page manipulation, such as
trivially checking for a specific value of a specific field of a
specific template and updating that template to add a new field, or
counting the length of a list in a specific section, etc.).
I've been trying to implement such a tool in Perl, but of course my
template parser doesn't (yet) fully agree with the MW template parser,
and the latter may change in the future, requiring me to constantly
maintain my Perl parser. Such an API extension would mean I could just
get access to what I need in Perl via the API.
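To show the sort of client-side handling I mean, here's a sketch in Python: the sample XML is a hand-written stand-in for roughly what the parser's XML tree for "{{Infobox|name=Foo}}" looks like, so treat its exact shape as illustrative rather than authoritative.

```python
# Walking a template parse tree on the client side, given XML like the
# kind ExpandTemplates can emit. SAMPLE is a hand-written approximation.
import xml.etree.ElementTree as ET

SAMPLE = """<root><template><title>Infobox</title>
<part><name>name</name><value>Foo</value></part>
</template></root>"""

def template_param(xml_text, template, param):
    """Return the value of `param` in the first matching `template`,
    or None if the template or parameter isn't present."""
    root = ET.fromstring(xml_text)
    for tpl in root.iter("template"):
        if tpl.findtext("title", "").strip() == template:
            for part in tpl.iter("part"):
                if part.findtext("name", "").strip() == param:
                    return part.findtext("value", "").strip()
    return None
```

With the tree exposed over the API, this kind of "check a field, then update it" logic stops depending on my own reimplementation of the template grammar.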
Here is a link to the stuff I've done in Perl in case anyone is
interested (constantly changing these days):
https://github.com/dbolser/MediaWiki--/tree/master/local-lib/lib/perl5/Medi…
(Note, API.pm isn't my work, and I'll remove it from that location soon).
Cheers,
Dan.