> Message: 8
> Date: Wed, 23 May 2012 21:49:57 +0200
> From: Platonides <Platonides(a)gmail.com>
> To: wikitech-l(a)lists.wikimedia.org
> Subject: Re: [Wikitech-l] HTMLMultiSelectField as <select multiple>
> Message-ID: <jpjf1s$b23$1(a)dough.gmane.org>
> Content-Type: text/plain; charset=ISO-8859-1
> On 23/05/12 19:16, Daniel Werner wrote:
> > Right now I am implementing a new option (as part of
> > https://bugzilla.wikimedia.org/show_bug.cgi?id=36425) for which I'd
> > use a <select multiple="multiple"/> html element with options. Right now
> > MediaWiki always generates a list of selectboxes instead of that when
> > using the HTMLMultiSelectField class. We are talking about 280+
> > selectable options here, so we came to the conclusion that a real multi
> > <select/> would be nicer and less space consuming for now.
> > I have already managed to implement this multiple select by
> > modifying HTMLMultiSelectField, adding a new option 'usecheckboxes' which
> > can be set to false to disable the known behavior and use a select
> > instead.
> > In case JS is enabled, we could still do something nicer, for example
> > with something like the jQuery Chosen plugin here.
> > My question would just be how I should implement these changes.
> > Is it ok with the new option for HTMLMultiSelectField or should this be a
> > new class inheriting from HTMLMultiSelectField? I think
> > HTMLMultiSelectField sounds more like describing what I just implemented
> > rather than a bunch of select boxes, but of course renaming the existing
> > one could "break" extensions (even though both are fully compatible and
> > interchangeable). So one option would be simply naming the new one
> > HTMLMultiSelectField2 if we don't want to stick with an additional option
> > here.
> No. You shouldn't need to know that HTMLMultiSelectField2 is a
> MultiSelect but HTMLMultiSelectField uses checkboxes.
> Your useCheckboxes looks good.
> I recommend making it a tri-state value, so you could force checkboxes,
> force a select, or let it decide (e.g. checkboxes for < 100 elements, a
> select for more).
Alright, just submitted this for review to gerrit:
I implemented it as tri-state now. By default 'usecheckboxes' will be true,
not set to a number. This could be changed (it would make sense imo), but for
now I didn't want to do this since it could, for example, affect the default
search namespace user preference in wikis with many search namespaces. I
think the plain multiple select HTML element is not that nice because it is
not very obvious that you can do multiple selections by holding the control
key, so it should probably be enhanced with something like the jQuery Chosen
plugin for users having JS enabled, I think, before using this as the default
for huge multiselect options. I think if all of that were implemented, 15 or
20 would be a good default value for the option.
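For illustration, a minimal sketch of the kind of progressive enhancement
mentioned above, assuming the jQuery Chosen plugin has been loaded as a
module or gadget (the selector and option values are just examples, not part
of the patch):

  // Illustrative sketch only: enhance plain multi-selects once the DOM is
  // ready, assuming the Chosen plugin is available.
  $( function () {
      $( 'select[multiple]' ).chosen( {
          no_results_text: 'No matching option',
          width: '20em'
      } );
  } );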
Sorry about the length of this mail, it reads faster than it looks.
I am working with the recentchanges and the cu_changes (checkuser)
mediawiki SQL tables. I would like to be able to filter bot activity, but
unfortunately I am increasingly confused.
Things that I think I know:
- In the recentchanges table
<http://www.mediawiki.org/wiki/Manual:Recentchanges_table>
there is a `rc_bot` flag that should indicate whether the edit comes from a
bot.
- The checkuser table (cu_changes)
is not documented on the mediawiki database layout page. It
contains mostly the same information as the recentchanges table but for a
longer period of time. However, there is no bot flag as there is on the
recentchanges table - I don't know why not.
- There is a `bot` entry in the `ug_group` column of the user_groups table.
A revision/recentchanges/cu_changes entry can be identified as bot by
joining the original table with user_groups on the user id and by setting
the condition ug_group = 'bot'.
- The user_groups way of identifying bots is inefficient and the
data seems incomplete. For some other projects we have used various other
bot tables created by hand (on db1047: halfak.bot used during WSOR 2011 or
declerambaul.erik_bots containing the bots identified by Erik Zachte).
I would like to know the answers to the following questions:
1. *What is the meaning/purpose of the rc_bot flag on recentchanges?* There
are entries in the recentchanges table from editors that are flagged as
bots in the user_groups and the other bot tables but still have the rc_bot
flag set to 0.
mysql> select rc.rc_user_text from recentchanges rc join user_groups ug ON
(rc.rc_user=ug.ug_user) WHERE ug.ug_group = 'bot' and rc.rc_bot=0 limit 1;
+--------------+
| rc_user_text |
+--------------+
| ClueBot NG   |
+--------------+
2. *Why is there no bot flag in the checkuser table?* A lot of the other
fields seem to be copied from the recentchanges table, so why not the rc_bot
field? The checkuser table contains both entries that are flagged as bots
in the recentchanges table and entries that are flagged as bots in the
user_groups table:
mysql> select cuc.cuc_user_text from recentchanges rc join cu_changes cuc
ON (rc.rc_user=cuc.cuc_user) WHERE rc.rc_bot=1 limit 1;
+---------------+
| cuc_user_text |
+---------------+
| MiszaBot III  |
+---------------+
mysql> select cuc.cuc_user_text from cu_changes cuc join user_groups ug ON
(cuc.cuc_user=ug.ug_user) WHERE ug.ug_group = 'bot' limit 1;
+---------------+
| cuc_user_text |
+---------------+
| Robbot        |
+---------------+
3. *Am I missing some fundamental information about how bots are handled?*
This is a frequently recurring request for data analytics and it seems the
data is inconsistent.
What is the most convenient, sane way to classify bot activity as such? Are
there any projects underway that aim to improve the situation? Any input,
pointers and recommendations are much appreciated.
Thanks a lot! Regards,
>"When a page reaches X level of quality, that version becomes the default." >When creating all the interface message for editing, viewing, and history,
>this is definitely not easy to get right and keep simple for new users. >Anyway, to be clear, you can make the "latest version" the default for all
>pages and manually make the "latest reviewed" version the default on a
>per-page basis already. You just can't use "quality versions" as the default version.
Hi, and thanks for the info. On an immediate practical level I will inform them of this at the wiki, and we will unfortunately be forced to turn off the last approved version as default so as not to put off new users.
In terms of the future, I respectfully disagree with you that there is any essential real-life problem for users if the extension were to be implemented in the way I described. Even for those who share your evaluation (which I don't), there should be no reason not to provide such basic functionality as at least an option. Furthermore, to manually make the "latest reviewed" version the default on a per-page basis is extremely cumbersome (even using bots) and not a reasonable solution to the problem.
To conclude, I personally think the extension you have developed (I was only recently reminded that you are the primary developer) is already one of the most important developments for the future of MediaWiki (and often underappreciated as such). Please let me know if there is any positive way I can become involved in helping this functionality become part of the program.
A little while ago Trevor Parscal changed our jsMessage setup to be a
floating auto-hiding notification bubble.
The end implementation felt half-baked to me, since it just swapped the text
of the existing message for each new notification, didn't support multiple
notifications, and even reused the same id as the previous message, which
was pretty much a completely different concept.
So I spent a night implementing a fully featured notification bubble
system. Something that should work for watchlists, VisualEditor, and
perhaps some other things like LQT, and perhaps anything we want to start
making more dynamic. Same goes for anyone with a good Gadget idea that
could use better notifications.
Here's a demo video of the new notification system:
The changeset is https://gerrit.wikimedia.org/r/#/c/19199/
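For anyone who wants to try it from a gadget, here is a rough usage sketch;
the module and function names (mediawiki.notification, mw.notify) and the
options shown are my guess at what the changeset exposes and should be
treated as assumptions until it is merged:

  // Rough usage sketch; names and options are assumptions based on the
  // changeset above and may still change during review.
  mw.loader.using( 'mediawiki.notification', function () {
      // A simple auto-hiding bubble.
      mw.notify( 'Your watchlist was updated.' );
      // Several bubbles can be shown at once; this one stays visible.
      mw.notify( 'Saving your edit...', { autoHide: false, tag: 'save' } );
  } );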
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
This is something I've been meaning to bring up for some time, but have
just been delaying getting it done. For a bunch of reasons, we need to
look at disabling direct pushing on the master branch for all extensions in
Gerrit. This doesn't affect other branches, just master. There's a couple
of big reasons this is a problem right now, and why we need to change:
1) Eventually, we'll be running tests for all extensions (at the very least
linting if no phpunit tests have been written). Jenkins doesn't have the
chance to -1 a commit if you skip review.
2) It doesn't give anyone a place to complain about the patch. Every
commit to master needs a place to say "Hey wait a minute" -- even if it's
already been merged.
3) Changes that are directly pushed aren't searchable from Gerrit. This is
more a feature request for Gerrit, but one that's easily worked around by
just pushing through Gerrit.
I realize that feature branches don't necessarily need the same level of
scrutiny, but the "primary" branch for a repository needs to be as public
as possible--direct pushing makes this difficult. I don't plan on changing
this requirement for any branch that's not "master" or "wmf/" (the latter
relating to deployment config). Before I make the change though, I
wanted to ask about it publicly to make sure there are no major blockers
to me doing so.
Thanks for any feedback you can give.
As you know, wikisource needs robust, well-defined data, and there's a
strict, deep relationship between wikisource and Commons, since Commons
hosts images of books as .djvu or .pdf files. Commons shares both the images
and the contents of the images' information pages, so that any wiki project
can visualize a view-only "pseudo-page" by accessing a local page named
after the file on Commons.
While working on self-made data semantization on it.wikisource using a lot
of creative tricks, we discovered that it's hard/almost impossible to read
the contents of pages of other projects by AJAX calls, because of the
well-known same-origin policy, but that local File: pages are considered as
coming from the "same origin", so they can be read like any other local
page. An AJAX call asking for the content of such a File: page gives back
the html text of the local, view-only File: page, and this means that any
data stored in the information page on Commons is freely accessible by any
project; anything put into the information page and/or (much better) the
Book and Creator templates can be retrieved and parsed.
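A minimal sketch of such a call, assuming jQuery and the mw config object
are available; "Example_book.djvu" is a hypothetical file name:

  // Minimal sketch: fetch the local, view-only File: page, which mirrors the
  // Commons description page, and parse what you need out of its HTML.
  $.get( mw.config.get( 'wgScript' ), { title: 'File:Example_book.djvu' },
      function ( html ) {
          console.log( 'Fetched ' + html.length + ' characters of HTML' );
      }
  );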
Has this been described/used before? It seems a plain, simple way to share
and disseminate good, consistent metadata to any project, and it works
today, without any change to the current wiki software.
If you like, I'm sharing a practical test use of this trick on
wikisource.org too: you can import User:Alex brollo/Library.js and a lot of
small, original scripts will be loaded; click the "metadata" button on any
page connected to a File: page (namespaces Index, Page) and you'll see a
result coming from such an AJAX call.
Alex brollo, from it.wikisource
CentralAuth has been around for about 5 years now and we still lack an
API to interact with it. There is no
blocking/unblocking/locking/unlocking ability at all; see
https://bugzilla.wikimedia.org/show_bug.cgi?id=23821. Who do I need to
bribe/torture/put a fire underneath in order to get basic access to this?
We have had a pretty productive week with the Wikidata team. The most pressing
issues concerning the Foundation team are however still the same:
The Sites table is pending finalization. It is coming along nicely, and I
think we will have a new version early next week. Development is happening
on a separate development branch called "sites"; any feedback is appreciated.
The Wikidata branch with the ContentHandler and related changes has seen some
improvements over the week, and I have once more merged the latest master into
the branch. However, I'm wondering how best to proceed now.
Until a few weeks ago, I let development on that branch rest, awaiting feedback
so I could be sure I was moving in the right direction. This didn't work out,
since the code was still too incomplete for a full review. I have now tied down
most loose ends, but I'm still getting no feedback. Would it be best to halt
development again? It's never going to be *finished*, there's always *something*
left to do.
Version with helpful links:
1) Write small commits.
It's easier for other people to review small changes that only change
one thing. We'd rather see five small commits than one big one.
2) Respond to test failures and feedback.
Check your Gerrit settings and make sure you're getting email
notifications. If your code fails automated tests, or you got some
review already, respond to it in a comment or resubmission. Or hit the
Abandon button to remove your commit from the review queue while you work on it.
(To see why automated tests fail, click on the link in the "failed"
comment in Gerrit, hover over the failed test's red dot, wait for the
popup to show, and then click "console output.")
3) Don't mix rebases with changes.
When rebasing, only rebase. That makes it easier to use the "Old Version
History" dropdown, which greatly quickens reviews. If non-rebase changes
are made inside a rebase changeset, you have to read through a lot more
code to find them, and it's non-obvious.
4) Add reviewers.
I try to help with this. If I notice an unreviewed changeset lingering,
then I add a review request or two. (These are requests -- there's no
way to assign a review to someone in Gerrit.) But it's faster if you do
it right after committing. Some tricks:
* Click the name of the repository ("Gerrit project"), e.g.
operations/debs/squid , and remove "status:open" from the search box to
find other changesets in that repository. The people who write and
review those changesets would be good candidates to add as reviewers.
* Search through other commit summaries and changesets. Example:
Matmarex and Foxtrott are interested in reviewing frontend changes, so I
search for "message:css" to find changesets that mention CSS in their
commit summaries, and I add them as reviewers to those. You can use this
and regexes to find
changes that touch the same components you're touching, to find likely
reviewers. Learn more at
5) Review more.
Many eyes make bugs shallow. Read the code review guide and help out
with comments, "+1", and "-1". Those are nonbinding, won't cause merges
or rejections, and have no formal effect on the code review. But you'll
learn, gain reputation, and get people to return the favor by reviewing
you in the future. "How to review code in Gerrit" has the step-by-step
explanation. Example Gerrit search for MediaWiki commits that have not
had +1, +2, -1, or -2 reviews yet:
Engineering Community Manager
Wikimedia is at 1.20/wmf10 now. That means that it has been working
with 1.20 alpha for the past 20 weeks. Isn't it about time we start
preparing something usable called 1.20, 2.0, or whatever, for the
outside world, too? Previous experience tells us that getting
something release-ready takes at least 6 weeks, so if we want to
have a stable release by the end of October, we'll have to start doing
something very soon.
Product Manager Localisation
M: +31 6 50 69 1239
Support Free Knowledge: http://wikimediafoundation.org/wiki/Donate