Hi!
What's the proper way to generate thumbnails for the Ogg media handler,
so it works the way it does on Commons?
First, I downloaded and compiled the latest ffmpeg version (from
git://git.videolan.org/ffmpeg.git) using the following configure
options:
./configure --prefix=/usr --disable-ffserver --disable-encoder=vorbis
--enable-libvorbis
The prefix matches the usual CentOS layout (which my hosting uses), and
the options for Vorbis were suggested in this article:
http://xiphmont.livejournal.com/51160.html
I downloaded Apollo_15_launch.ogg from Commons and uploaded it to my
wiki to test the Ogg handler. The file uploaded fine; however, the
thumbnail is broken - a few squares on a gray field are displayed
instead of a still image of the rocket.
In the Extension:OggHandler folder I found ffmpeg-bugfix.diff. However,
there is no libavformat/ogg2.c in the current version of ffmpeg. I did
find the function ogg_get_length() in another source file, but the code
has changed, and I am not sure that manually comparing and applying the
patch is the right approach. It seems the patch targets an ffmpeg
version from around 2007, but I was unable to find the original sources
to apply it against.
I was unable to find ffmpeg in the Wikimedia SVN repository. Is it there?
Then I tried
svn co
https://oggvideotools.svn.sourceforge.net/svnroot/oggvideotools
oggvideotools
but I am unable to compile either trunk or the branches/dev/timstarling
version; both bail out with the following errors:
-- ERROR: Theora encoder library NOT found
-- ERROR: Theora decoder library NOT found
-- ERROR: Vorbis library NOT found
-- ERROR: Vorbis encoder library NOT found
-- ogg library found
-- GD library and header found
CMake Error at CMakeLists.txt:113 (MESSAGE):
I have the following packages installed:
libvorbis-1.1.2-3.el5_4.4
libvorbis-devel-1.1.2-3.el5_4.4
libogg-1.1.3-3.el5
libogg-devel-1.1.3-3.el5
libtheora-devel-1.0alpha7-1
libtheora-1.0alpha7-1
ffmpeg compiles just fine (with yasm from an alternate repo, of course).
But there is no libtheoradec, libtheoraenc, or libvorbisenc in either
the main CentOS repository or the alternative one at
http://apt.sw.be/redhat/el5/en/i386/rpmforge/RPMS/
However, it seems there is a libtheoraenc.c in ffmpeg; what is the best
source for these libraries? It seems there is no way to find proper
RPMs for CentOS, and one needs to compile them from source?
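For reference, here is what I am considering trying, assuming building the Xiph libraries from source is acceptable: build libogg, then libvorbis, then libtheora, each with ./configure --prefix=/usr && make && make install, and then reconfigure ffmpeg. This is only a sketch; the --enable-libtheora flag is an assumption on my part and I have not verified it on CentOS yet:

```sh
# Sketch only: reconfigure ffmpeg after installing libtheora and
# libvorbis from source under /usr (flags beyond my original configure
# line are assumptions, not verified)
./configure --prefix=/usr --disable-ffserver \
  --enable-libvorbis --enable-libtheora
make && make install
```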
Dmitriy
Hello,
Our JavaScript tests are being run under TestSwarm [1], and we currently
cover most desktop browsers (thanks brion).
According to our Squid stats [2], most Wikimedia mobile traffic
comes from the following browsers (sorted by popularity):
- Safari
- Android
- Opera
- Mozilla
- Blackberry
* Safari, Opera & Mozilla for mobile: they are probably mostly the same
as the desktop versions. I have not found emulators for them.
* Android: has an emulator, but on my computer it is painfully slow and
not usable for anything.
* Blackberry: the emulator is Windows-only :-/
It would be great to enhance our TestSwarm with more browsers. Maybe we
could contact those mobile browser developers about connecting to our
TestSwarm?
:-)
[1] http://toolserver.org/~krinkle/testswarm/
[2] http://stats.wikimedia.org/wikimedia/squids/SquidReportClients.htm
--
Ashar Voultoiz
Some of you may have found that ResourceLoader's bundled & minified
JavaScript loads can be a bit frustrating when syntax errors creep into your
JavaScript code -- not only are the line numbers reported in your browser of
limited help, but a broken file can cause *all* JS modules loaded in the
same request to fail [1]. This can manifest as, for instance, a jQuery-using
gadget breaking the initial load of jQuery itself because they get bundled
together into the same request.
I've taken a copy of JSMin+ (MPL 1.1/GPL 2.0/LGPL 2.1 triple-licensed) into
our includes/libs -- it's a JS minification library that was originally
knocked out of the running for merging because it was a bit slow, but it has
the advantage of coming with an actual JavaScript parser [2].
Only the parser is being used right now, in two places:
- on the JavaScriptMinifier test cases to confirm that results are valid JS
(should be extended to a fuzz tester, probably)
- on each individual file loaded via ResourceLoaderFileModule or
ResourceLoaderWikiModule, so we can throw a JavaScript exception with
details of the parse error *with line numbers for the original input file*
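To illustrate the idea (this is just a sketch of the concept, not the JSMin+ interface): a validator only needs to attempt a parse and report the failure, roughly like this:

```javascript
// Conceptual sketch, not the JSMin+ API: check a module's source for
// syntax errors before serving it, returning the parse error message
// (or null if it parses cleanly). The Function constructor parses the
// body eagerly without executing it.
function validateJs(source) {
  try {
    new Function(source);
    return null; // parsed cleanly
  } catch (e) {
    return e.message; // details of the syntax error
  }
}
```

The actual JSMin+ parser goes further and reports line numbers for the original input file, which a simple Function-based check can't do.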
This can be disabled by turning off $wgResourceLoaderValidateJs, but I'm
setting it on by default to aid testing.
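For anyone who hits problems, the switch is just a LocalSettings.php setting (shown here with the new default):

```php
# On by default: parse each JS file served via ResourceLoaderFileModule
# or ResourceLoaderWikiModule, and on failure serve a JavaScript
# exception with the parse error and original line numbers.
$wgResourceLoaderValidateJs = true;

# Set to false to turn the validation off entirely:
# $wgResourceLoaderValidateJs = false;
```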
I'd like for folks to keep an eye out to make sure they don't get any false
positive parse errors in real-world modules, and to see if there are any
noticeable performance regressions. As with ResourceLoader's minification
itself, the validation parses are cached, so they shouldn't cause much
ongoing load, but they still take some time.
[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=28626
[2] http://crisp.tweakblogs.net/blog/1856/jsmin+-version-13.html
-- brion
Hi everyone
I've just posted postmortem notes on the MediaWiki 1.17 release here:
http://www.mediawiki.org/wiki/MediaWiki_1.17/Release_postmortem
...and since I expect there will be some editing/futzing with that
page, I've included the full wikitext below. Also, I wouldn't be
surprised if this generates some discussion on this list.
(start of wikitext):
We released [[MediaWiki 1.17]] on June 22. In the interests of doing
better next time, a small group of us (Tim, Chad, Sam, Sumana, and
RobLa) got together to brainstorm what went right and what we need to
look at. [[User:RobLa-WMF|RobLa]] then summarized that discussion
and wrote up this summary. Any first-person references are probably to
me (RobLa), and any references to "we" are probably to the group above.
See the history of this page for the raw notes.
Note: this is specifically about the MediaWiki 1.17.0 release, rather
than the 1.17 deployment.
== Timeline ==
Here is the timeline, derived from SVN commit logs:
* 2010-07-28 - MediaWiki 1.16.0 released
* 2010-12-07 - REL1_17 branched. This is the branch that MediaWiki
1.17.0 was based on.
* 2011-02-03 - 1.17wmf1 branched
* 2011-05-05 - MediaWiki 1.17.0beta1 tagged
* 2011-06-14 - MediaWiki 1.17.0rc1 released
* 2011-06-22 - MediaWiki 1.17.0 released
== How it went ==
We started by brainstorming "what went well" and "what to look at".
In the initial brainstorming, the original group had many more items
in the "what to look at" section than in the "what went well" section. I
then set about organizing things, and settled upon four categories:
substance, polish, timing, and process. What became clear was that we
felt pretty good about the substance and polish of the release (where
positives and negatives balanced out pretty well), but the timing and
process categories had the most that we needed to look at.
=== Substance and polish ===
As for the substance, it went very well. We had three large features
(ResourceLoader, category sorting and the new installer) that
complicated this release. As of this writing, it looks like these
features are in pretty good shape, and we can be pretty proud of
releasing them in the state that they're in. We fixed a lot of bugs
(207 noted in the [[Release notes/1.17|release notes]]), and made many
smaller improvements to the codebase. Everyone was right to be very
eager to get this release out.
Things of substance that didn't go so well: our PostgreSQL support
suffered until quite late in the process, and our command line
installer is incomplete in some frustrating ways. On PostgreSQL: the
developers who fixed the last of the bugs aren't people who use
PostgreSQL on a day-to-day basis. The folks who normally develop our
PostgreSQL support had other engagements, and we don't have a very
deep list of people to fall back on. We need to work out a plan for
engaging PostgreSQL users as developers in this area, or it will be
very difficult to continue support for this DB. The command line
interface to the installer just needs a little more time to mature;
there are many ways of solving this problem without delaying a
release, but I won't get overly prescriptive in this writeup.
The polish of 1.17 was superb. The release notes were well-written,
and there hasn't been an urgent need for a rapid 1.17.1 release.
We'll do one anyway, since there were a couple of niggly bugs that can
be fixed easily enough.
=== Timing ===
As noted, the biggest area for improvement is around the timing and
release process. It wasn't all bad; we did (just barely) manage to
keep the release cycle under one year. Still, that's much longer than
our aspiration of quarterly releases, or even the previous historic
norm of 2-3 releases per year. Moreover, it has been a long time
since we branched 1.17, so we already have seven months' worth of work
backed up for future releases. 1.18 was branched in early May, so in
addition to the five months of changes we have backed up for that
release, we already have two more months of changes backed up for
1.19.
The biggest thing that delayed this release (and the 1.17 deployment
in March) was the code review backlog. That topic has been covered in
many earlier threads, but a brief recap: after the 1.16 release, we
fell way behind on code review, relying solely on Tim up until that
point. We added more reviewers in October, which helped us get the
backlog down to a reasonable level by December. We branched, finished
off the 1.17-specific review, and deployed. Further minor review work
was needed prior to the 1.17 release. With more Wikimedia Foundation
developers spending 20% of their time on review, we're optimistic
we'll be able to finish off the backlog and stay on top of the review
process.
As we drew closer to the 1.17 release, we issued 1.17 beta 1. This
beta unintentionally lasted several weeks as we tried to finish off
the last of the release blockers. In particular, a security bug we
worked on during this time created an awkward situation, since we had
to iterate multiple times to fully plug the hole. The good news,
though, is that the period was long enough for us to get some good
end-user testing and bug reporting prior to the final release.
=== Process ===
Process is where we need the most work. The actual logistics of
putting up the tarball and other bits are working well (these haven't
changed in years), but everything leading up to that point could use a
lot of streamlining.
The first issue is purely one of scoping. Right now, we're not
terribly deliberate about what goes in and what is out. Part of the
problem we have here is that opinions vary as to what a reasonable
release interval is. The range of opinion seems to be anywhere from
"multiple times a day" to "every six months". It's difficult to plan
this without getting consensus on this point, and it's difficult to
get consensus without first proving that we can get on top of the code
review backlog and stay on top of it. If we go with a longer cycle,
we could consider adopting a process similar to GNOME's<ref>Example of
a GNOME release timeline: http://live.gnome.org/ThreePointOne</ref> or
Ubuntu's, or that of another project with a good track record of
sticking to regular releases. The most interesting practices there involve
having clear deadlines for proposing new features, deadlines for
features being done or pulled, and other date-risk mitigation
strategies.
As with the code review process last year, we're probably too reliant
this year on Tim to not only drive but also execute many steps. One way
we can speed up the process is to document it, making it clear where
we are in the process, and more importantly, how people can help.
"Help" can mean explicitly doing the work, but it can also be simply
"don't do things that delay the release further", or "stop others from
delaying the release". We have a wonderful [[Release checklist]], but
that list was too focused on the last steps before the release. Many
steps before the actual publication of the tarball were missing, so
they've been added into that document. More work can be done there.
Additionally, we will probably experiment with other team members
(e.g. Chad) performing at least alpha or beta releases.
During this release, we tagged many trunk revisions "1.17" for
backporting to the release branch. This process was useful, as long as
people remembered to untag once they had merged. There was some
confusion at various times about who was responsible for doing this
work. It switched sometimes between
Roan, Chad, Tim and others. Additionally, pretty much everyone felt
empowered to tag things for backporting, but there probably wasn't
enough discipline in trimming that list back before actually making
the change. Some unreviewed changes were backported (or directly
applied) to the release branch, causing confusion and delay. We have
a policy about backporting
<ref>http://www.mediawiki.org/wiki/Commit_access_requests#Guidelines_for_applyin…
- bullet points 4 & 5</ref>, but that policy wasn't followed very
closely.
The process of finding release notes that weren't added and then
backporting them was work that could have been done by people other
than Tim, but Tim ended up doing most of this. This is work that
needs to happen sooner in the process in a more distributed fashion.
Additionally, one way to avoid this extra work is to keep backporting
to a minimum in the first place.
This gets to the larger issue of communication and momentum at the end
of this process. With timezone differences, it's not sustainable to
have daily scrums all of the time, but having scrums during the last
couple of weeks or so in the process may help keep things moving to
the end.
== Recommendations ==
This section is intentionally left unfinished. The goal of this was
to establish and document what happened. To the extent anything is
incorrect or misleading above, corrections are encouraged.
Recommendations for new things to try based on lessons learned from
this release should be included below:
* ''your recommendation here''
...and possibly discussed on the talk page (suggestions above may be
ruthlessly edited; talk page is better for attribution and
preservation).
== References ==
<references/>
Thanks,
Steve
------------------
Steven Krein
CEO
OrganizedWisdom
http://OrganizedWisdom.com
917-903-4288
@stevenkrein
StartUp Health
http://startuphealth.com
On Jul 10, 2011, at 9:12 PM, wikitech-l-request(a)lists.wikimedia.org wrote:
> Send Wikitech-l mailing list submissions to
> wikitech-l(a)lists.wikimedia.org
>
> To subscribe or unsubscribe via the World Wide Web, visit
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
> or, via email, send a message with subject or body 'help' to
> wikitech-l-request(a)lists.wikimedia.org
>
> You can reach the person managing the list at
> wikitech-l-owner(a)lists.wikimedia.org
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Wikitech-l digest..."
>
>
> Today's Topics:
>
> 1. Re: testing mobile browsers? (Marco Schuster)
> 2. Parser bugs and their priority (Mark A. Hershberger)
> 3. Supporting Extension authors (Mark A. Hershberger)
> 4. Re: all those zeros in tables in text browsers (Aryeh Gregor)
> 5. Re: Parser bugs and their priority (MZMcBride)
> 6. Re: testing mobile browsers? (Tomasz Finc)
> 7. Re: testing mobile browsers? (Tomasz Finc)
> 8. Re: "Easy" Bug Triage meeting notes (MZMcBride)
> 9. Re: Wikimedia engineering report for June 2011 (MZMcBride)
> 10. Re: Wikimedia engineering report for June 2011 (Ryan Lane)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Sat, 9 Jul 2011 22:16:25 +0200
> From: Marco Schuster <marco(a)harddisk.is-a-geek.org>
> Subject: Re: [Wikitech-l] testing mobile browsers?
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
> Message-ID:
> <CAC6HepSnZjJ9gkU308hZO6eTCXz+Ew4ae5EsN1FkSwzrOfwo9Q(a)mail.gmail.com>
> Content-Type: text/plain; charset=UTF-8
>
> On Sat, Jul 9, 2011 at 8:51 PM, Håkon Wium Lie <howcome(a)opera.com> wrote:
>> Opera comes in two flavors for mobile devices: Opera Mini and Opera
>> Mobile. Opera Mobile is, indeed, close to the desktop version in the
>> sense that it runs the same display, javascript engine etc. on the
>> device.
>
> The versions of Opera Mobile floating in the wild are kinda different.
> Every HTC HD2 user with Windows Mobile 6.5 is likely to still run the
> ages-old buggy HTC version (8.x AFAIR, compared to current v10!), as
> the "official" versions STILL don't support the multi-touch features
> even though libraries exist which abstract the multi-touch -.-
>
> Marco
>
>
>
> ------------------------------
>
> Message: 2
> Date: Sat, 09 Jul 2011 17:01:33 -0400
> From: mhershberger(a)wikimedia.org (Mark A. Hershberger)
> Subject: [Wikitech-l] Parser bugs and their priority
> To: Wikitech List <wikitech-l(a)lists.wikimedia.org>
> Message-ID: <87liw7dv82.fsf(a)everybody.org>
> Content-Type: text/plain; charset=utf-8
>
>
> Recently, there was a discussion on a bug ("UNIQ key exposed"
> https://bugzilla.wikimedia.org/14562) about the priority setting I had
> given the bug.
>
> It was part of the problems I found in Bugzilla last December and
> gathered into a tracking bug (https://bugzilla.wikimedia.org/26213).
>
> It looks like I made the wrong decision on #14562 since it was part of
> an extension that, while deployed on enwiki, wasn't likely to be
> triggered.
>
> When I was discussing this with Robla, he suggested I ask about this on
> wikitech-l, so here goes:
>
> There are at least four bugs live on Wikipedia that leave really ugly
> UNIQ strings in the wikitext. I've created a demonstration of them on
> my wiki page: http://hexm.de/4x
>
> The bug numbers are on the page linked to their entry in Bugzilla.
>
> I suppose these are all linked to the parser work that Brion & co are
> currently working on, but the arrival of the new parser 6 months to a
> year or more away (http://www.mediawiki.org/wiki/Future/Parser_plan),
> I'd like to get these sort of parser issues sorted out now.
>
> For those more familiar with the current parser: how can those developers
> who are less experienced start fixing the problem? How important are
> these issues?
>
> Mark.
>
>
>
> ------------------------------
>
> Message: 3
> Date: Sat, 09 Jul 2011 17:17:08 -0400
> From: mhershberger(a)wikimedia.org (Mark A. Hershberger)
> Subject: [Wikitech-l] Supporting Extension authors
> To: Wikitech List <wikitech-l(a)lists.wikimedia.org>
> Message-ID: <87hb6vdui3.fsf(a)everybody.org>
> Content-Type: text/plain; charset=utf-8
>
>
> MediaWiki.org is great for extension authors, as far as it goes. Today,
> though, someone asked on #mediawiki how to create development branches
> for their extension in their SVN repo. I told him I didn't think it
> could be done -- that he might have to use the SVN repo as a backend to
> push to from git or bzr -- but I'm not sure that answer was correct.
>
> Anyway, as I was writing about the UNIQ tracking bug, I thought of some
> documentation and support that we should try to get in place for
> extension developers. Since Sumana is creating a lot of good
> documentation about testing lately, that is where I started:
>
> * What sort of things should they test?
>
> * Can they have tests that will continue to work against the current
> parser and the next one?
>
> * How can they write parser tests and unit tests to try out their
> code?
>
> * How can they make sure that those tests are run on the test server?
> (I think this actually requires some work on the test server, but...)
>
> Of course, that documentation would help more than just the extension
> writers who have "UNIQ" showing up in their output. What else could we
> do to support extension authors?
>
> Mark.
>
>
>
> ------------------------------
>
> Message: 4
> Date: Sun, 10 Jul 2011 10:35:42 -0400
> From: Aryeh Gregor <Simetrical+wikilist(a)gmail.com>
> Subject: Re: [Wikitech-l] all those zeros in tables in text browsers
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
> Message-ID:
> <CAKA+AxmFMi9Q_Vd=TF+Xv8DRBwnwrmzLd5DPPi0sc7NeU4AP7g(a)mail.gmail.com>
> Content-Type: text/plain; charset=UTF-8
>
> On Sat, Jul 9, 2011 at 3:49 AM, Niklas Laxström
> <niklas.laxstrom(a)gmail.com> wrote:
>> It exists and it's called data-sort-value attribute. And it's already
>> live on Wikipedia as far as I can see.
>
> data-* attributes are only valid in HTML5 and will not work until
> $wgHtml5 is set to true. As has been discussed here a number of
> times, it's been true on trunk continuously since r53142 (June 12,
> 2009) and in releases since 1.16, but it hasn't yet been enabled on
> Wikimedia sites. Apparently there are finally plans to try enabling
> it for Wikimedia in the near future.
>
>
>
> ------------------------------
>
> Message: 5
> Date: Sun, 10 Jul 2011 11:34:34 -0400
> From: MZMcBride <z(a)mzmcbride.com>
> Subject: Re: [Wikitech-l] Parser bugs and their priority
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
> Message-ID: <CA3F3ECA.12797%z(a)mzmcbride.com>
> Content-Type: text/plain; charset="US-ASCII"
>
> Mark A. Hershberger wrote:
>> Recently, there was a discussion on a bug ("UNIQ key exposed"
>> https://bugzilla.wikimedia.org/14562) about the priority setting I had
>> given the bug.
>
> [...]
>
>> I suppose these are all linked to the parser work that Brion & co are
>> currently working on, but the arrival of the new parser 6 months to a
>> year or more away (http://www.mediawiki.org/wiki/Future/Parser_plan),
>> I'd like to get these sort of parser issues sorted out now.
>>
>> For those more familiar with the current parser: how can those developers
>> who are less experienced start fixing the problem? How important are
>> these issues?
>
> Bugs have a habit of setting their own priority. If these bugs were
> regularly being hit by users, they would have been resolved ages ago (in
> theory!). But they only appear in very, very strange edge cases, which have
> the lowest user impact (not quite zero, but slightly above).
>
> My recommendation would be to add some parser tests for these bugs, mark
> them as failing, and if the rewrite doesn't take care of them, re-examine
> them then. There are so many more bugs that are affecting so many people.
> The fact that you can't combine three types of esoteric wiki syntax in
> certain ways currently really shouldn't be a very high concern.
>
> I think that's roughly what Tim was trying to say.
>
> MZMcBride
>
>
>
>
>
> ------------------------------
>
> Message: 6
> Date: Sun, 10 Jul 2011 12:57:52 -0700
> From: Tomasz Finc <tfinc(a)wikimedia.org>
> Subject: Re: [Wikitech-l] testing mobile browsers?
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
> Message-ID:
> <CAMxhqbduBc8AbVgWTYOc=xPQVORwpUyw44osfNc1-HYSPTZ1sw(a)mail.gmail.com>
> Content-Type: text/plain; charset=UTF-8
>
> That would be awesome to get in. I've been meaning to chat with Timo
> about this. Let me see if there are any others that we're seeing.
>
> --tomasz
>
> On Sat, Jul 9, 2011 at 2:30 AM, Ashar Voultoiz <hashar+wmf(a)free.fr> wrote:
>> Hello,
>>
>> Our javascript tests are being run under TestSwarm [1] and we currently
>> cover up most desktop browsers (thanks brion).
>>
>> According to our squids stats [2], most of Wikimedia mobile traffic
>> comes from the following browsers (sorted by popularity):
>>  - Safari
>>  - Android
>>  - Opera
>>  - Mozilla
>>  - Blackberry
>>
>> * Safari, Opera & Mozilla for mobile : they are probably mostly the same
>> as the desktop version. I have not found emulators for them.
>> * Android : has an emulator. On my computer it is painfully slow and not
>> usable for anything.
>> * Blackberry : emulator is Windows only :-/
>>
>> Would be great to enhance our testswarm with more browsers. Maybe we
>> could contact those mobiles developers to connect to our testswarm?
>>
>> :-)
>>
>>
>> [1] http://toolserver.org/~krinkle/testswarm/
>> [2] http://stats.wikimedia.org/wikimedia/squids/SquidReportClients.htm
>>
>> --
>> Ashar Voultoiz
>>
>>
>> _______________________________________________
>> Wikitech-l mailing list
>> Wikitech-l(a)lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>>
>
>
>
> ------------------------------
>
> Message: 7
> Date: Sun, 10 Jul 2011 12:58:50 -0700
> From: Tomasz Finc <tfinc(a)wikimedia.org>
> Subject: Re: [Wikitech-l] testing mobile browsers?
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
> Message-ID:
> <CAMxhqbesgRQpzAo50VEDhH+tFHruiG18nWO9VHakH0DebDJa3Q(a)mail.gmail.com>
> Content-Type: text/plain; charset=UTF-8
>
> On Sat, Jul 9, 2011 at 11:51 AM, Håkon Wium Lie <howcome(a)opera.com> wrote:
>> Also sprach Ashar Voultoiz:
>>
>>  > Our javascript tests are being run under TestSwarm [1] and we currently
>>  > cover up most desktop browsers (thanks brion).
>>  >
>>  > According to our squids stats [2], most of Wikimedia mobile traffic
>>  > comes from the following browsers (sorted by popularity):
>>  >    - Safari
>>  >    - Android
>>  >    - Opera
>>  >    - Mozilla
>>  >    - Blackberry
>>  >
>>  > * Safari, Opera & Mozilla for mobile : they are probably mostly the same
>>  > as the desktop version. I have not found emulators for them.
>>
>> Opera comes in two flavors for mobile devices: Opera Mini and Opera
>> Mobile. Opera Mobile is, indeed, close to the desktop version in the
>> sense that it runs the same display, javascript engine etc. on the
>> device.
>>
>> Opera Mini runs these engines in server parks in the fixed network and
>> transfers a binary representation to a small viewer on the device. We
>> currently process around 60 billion pages per month and Wikipedia is
>> typically in the top 10 lists in the top 20 countries we publish
>> statistics for:
>>
>>  http://www.opera.com/smw/2011/05/
>>
>> In the test swarm link you sent, Opera 10 and 11 are listed, but
>> not Opera Mini (which is currently at version 6). Could it be that
>> your sniffer doesn't pick up Opera Mini users?
>>
>>  http://toolserver.org/~krinkle/testswarm/
>>
>> Here's a sample UA string from a recent version of Opera Mini:
>>
>>  Opera/9.80 (Android; Opera Mini/6.24556/25.657; U; en) Presto/2.5.25 Version/10.54
>>
>> And here's the Opera Mini emulator:
>>
>>  http://www.opera.com/mobile/demo/
>>
>> While Wikipedia remains popular with Opera Mini users, there is a
>> technical problem which limits the user experience. Wikipedia uses
>> JavaScript to unfold sections in articles. Alas, executing JavaScript
>> requires a roundtrip to the server (the Opera Mini server, that is)
>> which takes time and costs money. It would be better if articles were
>> unfolded by default for Opera Mini users.
>
> Open bug already in place .. any takers?
>
> https://bugzilla.wikimedia.org/show_bug.cgi?id=29517
>
> --tomasz
>
>
>
> ------------------------------
>
> Message: 8
> Date: Sun, 10 Jul 2011 16:35:57 -0400
> From: MZMcBride <z(a)mzmcbride.com>
> Subject: Re: [Wikitech-l] "Easy" Bug Triage meeting notes
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
> Message-ID: <CA3F856D.127C1%z(a)mzmcbride.com>
> Content-Type: text/plain; charset="US-ASCII"
>
> Sumana Harihareswara wrote:
>> On 07/08/2011 02:20 PM, Mark A. Hershberger wrote:
>>> Still, despite the problems, hosting the triage in IRC meant that some
>>> developers interested in MediaWiki that Sumana invited were able to
>>> watch, and volunteer developers were able to participate.
>>
>> Mark: Thanks, as always, for running this triage. I have now added the
>> relevant bugs to
>>
>> http://www.mediawiki.org/wiki/Annoying_Little_Bug
>>
>> which you should feel free to hand out to developers interested in
>> writing their first MediaWiki patch.
>
> I sporadically made a complementary page (without references) at
> <http://www.mediawiki.org/wiki/Annoying_large_bugs> which lists more complex
> (and more exciting) bugs.
>
> The "easy" bugs list is nice, but it doesn't really spark much creative
> energy. A second page with more exciting (but harder to implement) bugs
> seemed appropriate. :-)
>
> I tried to list bugs that a lot of users are more likely to encounter, both
> on Wikimedia wikis and on their personal MediaWiki installs.
>
> MZMcBride
>
>
>
>
>
> ------------------------------
>
> Message: 9
> Date: Sun, 10 Jul 2011 21:09:01 -0400
> From: MZMcBride <z(a)mzmcbride.com>
> Subject: Re: [Wikitech-l] Wikimedia engineering report for June 2011
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
> Message-ID: <CA3FC56D.127D2%z(a)mzmcbride.com>
> Content-Type: text/plain; charset="ISO-8859-1"
>
> Guillaume Paumier wrote:
>> The report of Wikimedia engineering activities for June 2011 is now available:
>>
>> Blog version:
>> http://blog.wikimedia.org/2011/07/01/engineering-june-2011-report/
>> Wiki version:
>> http://www.mediawiki.org/wiki/Wikimedia_engineering_report/2011/June
>
> [quote]
> Summer of Research 2011
> <http://meta.wikimedia.org/wiki/Research:Wikimedia_Summer_of_Research_2011>
> -- Asher Feldman and Ryan Lane
> <http://www.mediawiki.org/wiki/User:Ryan_lane> created the systems
> infrastructure for the Summer of Research team to perform data mining and
> analysis work.
> [/quote]
>
> Do you have more info about this? It sounds like they're duplicating the
> Toolserver, but it's hard to say without knowing more.
>
> MZMcBride
>
>
>
>
>
> ------------------------------
>
> Message: 10
> Date: Sun, 10 Jul 2011 20:12:13 -0500
> From: Ryan Lane <rlane32(a)gmail.com>
> Subject: Re: [Wikitech-l] Wikimedia engineering report for June 2011
> To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
> Message-ID:
> <CALKgCA3zuZbGOQYfrwDB--Ju1Rt0f5w0ce6FEMRQHjD7VcqGeQ(a)mail.gmail.com>
> Content-Type: text/plain; charset=UTF-8
>
>> [quote]
>> Summer of Research 2011
>> <http://meta.wikimedia.org/wiki/Research:Wikimedia_Summer_of_Research_2011>
>> -- Asher Feldman and Ryan Lane
>> <http://www.mediawiki.org/wiki/User:Ryan_lane> created the systems
>> infrastructure for the Summer of Research team to perform data mining and
>> analysis work.
>> [/quote]
>>
>> Do you have more info about this? It sounds like they're duplicating the
>> Toolserver, but it's hard to say without knowing more.
>>
>
> They need access to information not available on the toolserver. This
> is just a single virtual machine that has mysql access to a database
> replica.
>
> - Ryan
>
>
>
> ------------------------------
>
> _______________________________________________
> Wikitech-l mailing list
> Wikitech-l(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
>
> End of Wikitech-l Digest, Vol 96, Issue 12
> ******************************************
This past Wednesday, we had our second IRC triage. The focus was on
triaging bugs marked “easy”, verifying that they were still bugs and
making sure they were actually something a new coder could do.
This meant we had a larger number of bugs to cover. I had planned
only 2 minutes per bug in order to get through all of them in an hour.
That short time limit created some problems, which means I'll have to
be careful to restrict the number of bugs in the future, or tell people
the triage will take longer than one hour.
Still, despite the problems, hosting the triage in IRC meant that some
developers interested in MediaWiki whom Sumana invited were able to
watch, and volunteer developers were able to participate.
Before talking about the ones that are already done or that we decided
were not something a new developer should attempt, I'll give you the
list of bugs that we confirmed as “easy”:
http://bugzilla.wikimedia.org/25909 Add a drop-down list for the tags
in RecentChanges and NewPages
http://bugzilla.wikimedia.org/26470 Add toggle for checkered or
transparent image background on file description pages
http://bugzilla.wikimedia.org/28162 Installer does not respect initial
DBport declaration
http://bugzilla.wikimedia.org/28173 Postgres defaults to a unix
socket - mention in install?
http://bugzilla.wikimedia.org/28296 Installer should honor &uselang=
parameter
http://bugzilla.wikimedia.org/28981 handle diffonly param on diffs
between deleted revision
http://bugzilla.wikimedia.org/29110 $wgFeedDiffCutoff doesn't affect
new pages
http://bugzilla.wikimedia.org/29311 [OutputPage] Create a method to
remove items from mModules
http://bugzilla.wikimedia.org/19295 Navigation headings should not be
lower-cased in German (and other languages)
http://bugzilla.wikimedia.org/8178 [low enhancement] Make table of
contents for category pages: Subcategories, Pages, Media
http://bugzilla.wikimedia.org/21511 Document all configuration
variables (usable in LocalSettings.php)
By walking through bugs in this way, we were able to close a few
right away as invalid or already fixed:
http://bugzilla.wikimedia.org/23442 javascript error in examine mode
http://bugzilla.wikimedia.org/27394 Message "lastmodifieddat" should
be in user language instead of for content
http://bugzilla.wikimedia.org/27047 Nicer design for pre elements in
Vector
http://bugzilla.wikimedia.org/25839 Add class for block ending time
<span> element
http://bugzilla.wikimedia.org/22770 wrap rights on
GlobalGroupPermissions into a CSS class
(Had a patch that looked usable, but yesterday, after the triage,
DieBuche pointed out that creating a new CSS class wasn't
necessary.)
These next ones had patches that needed to be reviewed and
applied:
http://bugzilla.wikimedia.org/28838 Remove redundant IE-CSSFix rules
from monobook/main.css
Applied in r91742
http://bugzilla.wikimedia.org/28147
MediaWiki:Centralauth-login-global - rewording suggested to make
it more concrete.
Applied in r91706, fixed up by SPQRobin in r91733.
http://bugzilla.wikimedia.org/27894 Move edit on double-click event
listener down to div#bodyContent
Brion and Krinkle have been reviewing the patch but since the
original submitter hasn't replied for 2 months (despite other
recent activity on the bug), this is a good place for another
developer to jump in and take over the work.
Of course, when you're looking over “easy” bugs, some of them get
fixed soon after:
http://bugzilla.wikimedia.org/23086 [extensions] AbuseFilter config
diff date and time should use user preference instead of UTC
SPQRobin fixed in r91619
And then there were the ones that were marked “easy” but, after
discussion, really weren't:
http://bugzilla.wikimedia.org/13862 Add a horizontal table of contents
to Special:SpecialPages
http://bugzilla.wikimedia.org/20781 Move 'mainpagetext' message to
installer's .i18n file once it exists
(Yes, this is easy, except that it needs coordination with
Translatewiki.net. If we could move all of the MessagesXX
translations to Installer.i18n.php in one go, it wouldn't cause
any headaches for translators.)
http://bugzilla.wikimedia.org/24159 Remove uses of the error
suppression operator
(Most of the low-hanging fruit has been done here. The ones
remaining can be a bit tricky.)
http://bugzilla.wikimedia.org/27501 List categories from foreign file
repositories on the File description page
(Need to rewrite the way the description page is fetched from the
foreign repo. Can be done via the API.)
http://bugzilla.wikimedia.org/27545 Extension:reCAPTCHA has badly
worded message
http://bugzilla.wikimedia.org/12532 File links on image description
page should be in alphanumeric (alphabetic) order
(Could be easy, but you have to make sure you don't do
something stupid with the db query. Sorting all linked pages A-Z
is slow/unindexed, while sorting only the first (random) 200
entries is confusing for the user.)
http://bugzilla.wikimedia.org/29290 continue does not work in same
cases for prop=iwlinks&iwcontinue and list=iwbacklinks&iwblcontinue
http://bugzilla.wikimedia.org/26597 Allow toggling of persistent
cookies ("remember me") in API action=login
(Sam's initial fix was reverted, so maybe not so easy.)
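For bug 27501 above, "can be done via the API" refers to fetching the
rendered description page from the foreign repo through its api.php
endpoint instead of scraping the page. A rough sketch of building such a
request (the action=parse parameters are standard MediaWiki API usage,
but the helper name and example title are mine, for illustration only):

```python
from urllib.parse import urlencode

def foreign_description_url(api_base: str, title: str) -> str:
    """Build an api.php URL returning the parsed HTML of a file
    description page on a foreign repo (e.g. Commons).
    Helper name is hypothetical; the API parameters are standard."""
    params = {
        "action": "parse",   # render a page to HTML
        "page": title,       # the file description page title
        "prop": "text",      # we only want the rendered text
        "format": "json",
    }
    return api_base + "?" + urlencode(params)

url = foreign_description_url(
    "https://commons.wikimedia.org/w/api.php",
    "File:Apollo_15_launch.ogg",
)
print(url)
```

The caller would then fetch that URL server-side and extract the
categories from the returned HTML (or, better, from a separate
prop=categories query).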
Thanks, everyone, for helping out!
Mark.
I don't see why Wikipedia looks 99% fabulous in lynx and w3m, but
falls down on something as simple as dates in tables,
$ lynx -dump http://en.wikipedia.org/wiki/List_of_social_networking_websites
Please note the list is not exhaustive, and is limited to notable,
well-known sites.
Name Description/Focus Date launched [12]Registered users Registration
Global [13]Alexa^[14][1] Page ranking
[15]Academia.edu Social networking site for academics/researchers
02008-09-01September 2008 &0000000000211000000000211,000^[16][2] Open
&00000000000092940000009,294^[17][3]
[18]Advogato [19]Free and [20]open source software developers 01999
1999 &000000000001357500000013,575^[21][4] Open...
Why can't they go that extra one percent and clean all those silly zeros
out of their tables?
Sure, text-browser users can be denied the extra sorting features, but
can't they at least be given something readable? Search engines would
thank you too.
Sure you can say "not a MediaWiki problem, go contact the website", but
if MediaWiki gave them the proper tools to make their templates, they
wouldn't need to make such a mess.
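Those "silly zeros" are, as far as I can tell, zero-padded sort keys
that sorting templates emit inside a CSS-hidden <span>; graphical
browsers hide the span, but lynx and w3m print it raw. A minimal sketch
of the two approaches (both function names are made up, and the
data-attribute variant is just one alternative a template could use, not
something MediaWiki did at the time):

```python
def sortkey_cell(value: int, text: str) -> str:
    """Old approach: prepend a zero-padded number in a hidden span.
    CSS hides the span for graphical browsers; text browsers show it."""
    key = f"{value:022d}"  # pad so lexicographic sort == numeric sort
    return f'<td><span style="display:none">{key}</span>{text}</td>'

def data_sort_cell(value: int, text: str) -> str:
    """Alternative: carry the sort key in a data attribute, invisible
    to any renderer but still readable by a sorting script."""
    return f'<td data-sort-value="{value}">{text}</td>'

old = sortkey_cell(211000, "211,000")
new = data_sort_cell(211000, "211,000")
print(old)  # contains the zero run a text browser would display
print(new)  # nothing extra for a text browser to show
```

With the first form, lynx dumps the padded key right next to the visible
number, which is exactly the garbage shown in the table above; the
second form leaves nothing for a text browser to render.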