> Message: 8
> Date: Wed, 23 May 2012 21:49:57 +0200
> From: Platonides <Platonides(a)gmail.com>
> To: wikitech-l(a)lists.wikimedia.org
> Subject: Re: [Wikitech-l] HTMLMultiSelectField as <select
> multiple="multiple"/>
> Message-ID: <jpjf1s$b23$1(a)dough.gmane.org>
> Content-Type: text/plain; charset=ISO-8859-1
>
> On 23/05/12 19:16, Daniel Werner wrote:
> > Right now I am implementing a new option (as part of
> > https://bugzilla.wikimedia.org/show_bug.cgi?id=36425) for which I'd
> > like to use a <select multiple="multiple"/> HTML element with options.
> > Right now MediaWiki always generates a list of checkboxes instead when
> > using the HTMLMultiSelectField class. We are talking about 280+
> > selectable items here, so for now we came to the conclusion that a
> > real multi <select/> would be nicer and less space consuming.
> > I have already managed to implement this multiple select by modifying
> > HTMLMultiSelectField, adding a new option 'usecheckboxes' which can be
> > set to false to disable the known behavior and use a select element
> > instead.
> >
> > This would mainly be for the JavaScript-less UI. If JavaScript were
> > enabled, we could still do something nicer, for example with something
> > like the jQuery Chosen plugin.
> >
> > My question is just how I should preferably implement these changes.
> > Is it OK with the new option for HTMLMultiSelectField, or should this
> > be a new class inheriting from HTMLMultiSelectField? I think
> > HTMLMultiSelectField sounds more like a description of what I just
> > implemented rather than a bunch of checkboxes, but of course renaming
> > the existing one could "break" extensions (even though both are fully
> > compatible and interchangeable). So one option would be simply naming
> > the new one HTMLMultiSelectField2, if we don't want to stick with an
> > additional option here.
>
> No. You shouldn't need to know that HTMLMultiSelectField2 is a
> multi-select while HTMLMultiSelectField uses checkboxes.
> Your useCheckboxes option looks good.
> I recommend making it a tri-state value, so you can force checkboxes,
> force a select, or let it decide (e.g. checkboxes for fewer than 100
> elements, a select for more).
Alright, just submitted this for review to Gerrit:
https://gerrit.wikimedia.org/r/#/c/8924/
I implemented it as a tri-state now. By default, 'usecheckboxes' will be
true rather than set to a number. This could be changed (and would make
sense, imo), but for now I didn't want to do so, since it could affect,
for example, the default search-namespace user preference on wikis with
many search namespaces. I think the plain multiple-select HTML element
is not that nice, because it is not very obvious that you can make
multiple selections by holding the Control key. Before using a select as
the default for huge multi-select options, there should be some
JavaScript UI element replacing it for users with JS enabled. If all of
that were implemented, I think 15 or 20 would be a good default value
for the option.
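For illustration, the tri-state 'usecheckboxes' behaviour could be sketched roughly like this (Python for brevity; the actual implementation is PHP inside HTMLMultiSelectField, and the threshold of 100 is just Platonides' example):

```python
def render_mode(use_checkboxes, option_count):
    """Decide how to render a multi-select form field.

    use_checkboxes may be True (force checkboxes), False (force a
    <select multiple>), or an integer threshold: render checkboxes
    only while the number of options stays below that threshold.
    """
    if use_checkboxes is True:
        return "checkboxes"
    if use_checkboxes is False:
        return "select"
    # Tri-state: a number means "let the field decide".
    return "checkboxes" if option_count < use_checkboxes else "select"


print(render_mode(100, 280))  # select: 280+ items exceed the threshold
```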
Cheers
Daniel
I would like to open some discussion about
https://bugzilla.wikimedia.org/show_bug.cgi?id=40329
This bug is about the fact that we currently do a 'partial' transform of
the HTML5-invalid attribute 'align'.
We all agree that this is bad; what we need to figure out is what to do
next:
1: Disable the transform and output the align attribute even though it's
not valid HTML5. Solve validity later.
2: Remove the attribute from the HTML5 output and 'break' the content.
Fix by users (or a bot).
3: Disable HTML5, correct the content of the wikis (possibly with a
bot), remove the attribute in HTML5 mode, then re-enable HTML5.
4: Fix the transform (not that easy).
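For context, a full transform (option 4) would have to map each HTML5-invalid align value onto the right CSS for the element it sits on, which is part of why it is "not that easy". A rough, illustrative sketch (Python, not the actual MediaWiki sanitizer code; the tag-to-CSS mapping here is an assumption):

```python
def transform_align(tag, attrs):
    """Rewrite an HTML5-invalid align="..." attribute as inline CSS.

    A sketch only: tables and images need different CSS than block
    text, which is one reason the full transform is hard.
    """
    attrs = dict(attrs)
    align = attrs.pop("align", None)
    if align is None:
        return attrs
    style = attrs.get("style", "")
    if tag in ("table", "img"):
        # align positions the element itself, not its contents.
        if align == "center":
            style += "margin-left: auto; margin-right: auto;"
        else:
            style += f"float: {align};"
    else:
        # Block elements: align refers to the text inside.
        style += f"text-align: {align};"
    attrs["style"] = style
    return attrs


print(transform_align("p", {"align": "center"}))
# {'style': 'text-align: center;'}
```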
My personal preference is 1, since this is causing trouble now, and with
1 we solve the immediate problems; we merely add to the lack of valid
HTML5 output that we already have. In my opinion, 2 would be too
disruptive and 3 would take too long.
Danny is of the opinion that we should never transform on the parser
side and that we should fix the content instead (2 or 3).
So, how best to fix the issue, and what should our strategy be in
general with regard to content that is not valid HTML5?
<Discuss>
DJ
Hi,
Sorry about the length of this mail; it reads faster than it looks.
I am working with the recentchanges and the cu_changes (checkuser)
mediawiki SQL tables. I would like to be able to filter bot activity,
unfortunately I am increasingly confused.
Things that I think I know:
- In the recentchanges table
(http://www.mediawiki.org/wiki/Manual:Recentchanges_table) there is an
`rc_bot` flag that should indicate whether the edit comes from a bot.
- The CheckUser table cu_changes
(http://www.mediawiki.org/wiki/Extension:CheckUser), which is not
documented on the MediaWiki database layout page
(http://www.mediawiki.org/wiki/Manual:Database_layout), contains mostly
the same information as the recentchanges table, but for a longer period
of time. However, there is no bot flag as there is on the recentchanges
table; I don't know why not.
- There is a `bot` entry in the user_groups.ug_group field
(http://www.mediawiki.org/wiki/Manual:User_groups_table). A
revision/recentchanges/cu_changes entry can be identified as a bot edit
by joining the original table with user_groups on the user id and
filtering on ug_group = 'bot'.
- The user_groups way of identifying bots is inefficient, and the data
seems incomplete. For some other projects we have used various other bot
tables created by hand (on db1047: halfak.bot, used during WSOR 2011, or
declerambaul.erik_bots, containing the bots identified by Erik Zachte).
I would like to know the answers to the following questions:
1. *What is the meaning/purpose of the rc_bot flag on recentchanges?*
There are entries in the recentchanges table from editors that are
flagged as bots in user_groups and in the other bot tables, but that
still have the rc_bot flag set to 0.
mysql> select rc.rc_user_text from recentchanges rc join user_groups ug ON
(rc.rc_user=ug.ug_user) WHERE ug.ug_group = 'bot' and rc.rc_bot=0 limit 1;
+--------------+
| rc_user_text |
+--------------+
| ClueBot NG |
+--------------+
2. *Why is there no bot flag in the checkuser table?* A lot of the other
fields seem to be copied from the recentchanges table, so why not the
rc_bot field? The cu_changes table contains both entries that are
flagged as bots in the recentchanges table and entries that are flagged
as bots in user_groups.
mysql> select cuc.cuc_user_text from recentchanges rc join cu_changes cuc
ON (rc.rc_user=cuc.cuc_user) WHERE rc.rc_bot=1 limit 1;
+---------------+
| cuc_user_text |
+---------------+
| MiszaBot III |
+---------------+
mysql> select cuc.cuc_user_text from cu_changes cuc join user_groups ug ON
(cuc.cuc_user=ug.ug_user) WHERE ug.ug_group = 'bot' limit 1;
+---------------+
| cuc_user_text |
+---------------+
| Robbot |
+---------------+
3. *Am I missing some fundamental information about how bots are
handled?* This is a frequently recurring request for data analytics,
and the data seems inconsistent.
What is the most convenient, sane way to classify bot activity as such? Are
there any projects underway that aim to improve the situation? Any input,
pointers and recommendations are much appreciated.
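One pragmatic interim approach, for illustration, is to count an edit as bot activity if either signal fires: rc_bot is set, or the user is in the 'bot' group. A self-contained sketch against an in-memory SQLite copy of the two tables (table and column names follow the MediaWiki schema; the sample rows are made up):

```python
import sqlite3

# Toy copies of the two MediaWiki tables, just enough to show the join.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE recentchanges (rc_user INTEGER, rc_user_text TEXT, rc_bot INTEGER);
CREATE TABLE user_groups   (ug_user INTEGER, ug_group TEXT);
INSERT INTO recentchanges VALUES (1, 'ClueBot NG', 0), (2, 'MiszaBot III', 1), (3, 'Alice', 0);
INSERT INTO user_groups   VALUES (1, 'bot'), (3, 'sysop');
""")

# An edit counts as bot activity if EITHER rc_bot is set OR the user is
# in the 'bot' group -- this catches ClueBot NG, whose rc_bot is 0.
rows = conn.execute("""
SELECT DISTINCT rc.rc_user_text
FROM recentchanges rc
LEFT JOIN user_groups ug ON rc.rc_user = ug.ug_user AND ug.ug_group = 'bot'
WHERE rc.rc_bot = 1 OR ug.ug_user IS NOT NULL
ORDER BY rc.rc_user_text
""").fetchall()
print([r[0] for r in rows])  # ['ClueBot NG', 'MiszaBot III']
```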
Thanks a lot! Regards,
Fabian
I'm planning to deploy Sender Policy Framework (SPF) for the
wikimedia.org domain on Wednesday, October 5. SPF is a framework that
lets receiving mail servers check that mail claiming to come from a
domain was sent by hosts the domain has authorized, which gives the
receiving side useful information for spam filtering. The main goal is
to cause spoofed @wikimedia.org mail to be correctly identified as such.
It should also improve our odds of getting fundraiser mailings into
inboxes rather than spam folders.
The change should not be noticeable, but the most likely problem would be
legitimate @wikimedia.org mail being treated as spam. If you hear of this
happening please let me know.
Technical details are below for anyone interested . . .
Thanks,
jg
Jeff Green
Operations Engineer, Special Projects
Wikimedia Foundation
149 New Montgomery Street, 3rd Floor
San Francisco, CA 94105
jgreen(a)wikimedia.org
. . . . . . .
SPF overview http://en.wikipedia.org/wiki/Sender_Policy_Framework
The October 8 change will simply be a matter of adding a TXT record to
the wikimedia.org DNS zone:
wikimedia.org IN TXT "v=spf1 ip4:91.198.174.0/24 ip4:208.80.152.0/22
ip6:2620:0:860::/46 include:_spf.google.com ip4:74.121.51.111 ?all"
The record is a list of the subnets we identify as senders (all WMF
subnets, Google Apps, and the fundraiser mailhouse). The "?all" is a
"neutral" policy: it doesn't state either way how mail from other hosts
should be handled. Eventually we'll probably bump "?all" to the stricter
"~all", aka SoftFail, which tells the receiving side that only mail
coming from the listed subnets is valid. Most ISPs will route 'other'
mail to a spam folder based on SoftFail.
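For anyone curious about the mechanics: an SPF record is just a space-separated list of mechanisms, each with an optional qualifier ('+' pass, '-' fail, '~' softfail, '?' neutral; '+' is the default). A rough parser for the record above (illustration only; real SPF evaluation, including resolving 'include:', is specified in RFC 4408):

```python
QUALIFIERS = {"+": "pass", "-": "fail", "~": "softfail", "?": "neutral"}

def parse_spf(record):
    """Split an SPF TXT record into (qualifier, mechanism) pairs."""
    terms = record.split()
    assert terms[0] == "v=spf1", "not an SPF record"
    parsed = []
    for term in terms[1:]:
        if term[0] in QUALIFIERS:
            parsed.append((QUALIFIERS[term[0]], term[1:]))
        else:
            parsed.append(("pass", term))  # default qualifier is '+'
    return parsed

record = ("v=spf1 ip4:91.198.174.0/24 ip4:208.80.152.0/22 "
          "ip6:2620:0:860::/46 include:_spf.google.com "
          "ip4:74.121.51.111 ?all")
print(parse_spf(record)[-1])  # ('neutral', 'all')
```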
Please bug me with any questions/comments!
Dear Markus, Yury, Semantic MediaWikians, and Wikitechians,
Thanks for your feedback about Semantic MediaWiki and mapping data
coordinate-wise, in terms of colors:
"SMW does not have a special datatype for representing colours. You
can encode wavelengths as numbers. Sound data is not supported, nor is
any such support planned right now." (Markus K.)
In thinking through broadly, and planning for, for example, how both
modeling a virtual classroom (e.g. a chemistry classroom, in something
like an interactive, movie-realistic OpenSim or WoW) as well as a
virtual universe might work, coordinate-wise for color and sound,
vis-a-vis World University and School, as well as with Wikidata and
Wikibase, I'm curious about a number of coordinate-related questions,
particularly cross-language-wise, in Semantic Mediawiki's "annotating
semantic data within wiki pages, thus turning a wiki that incorporates
the extension into a semantic wiki" (from Wikipedia) for the future.
For example, will Google Translate on Android be able to interoperate,
code-wise, with 1) Semantic MediaWiki and 2) Wikidata and Wikibase, say,
ten years in the future, for color as well as sound translation, for a
universal translator
(http://worlduniversity.wikia.com/wiki/WUaS_Universal_Translator)? For
example, when viewing Monet's sunrise -
http://www.med.yale.edu/neurobio/mccormick/fill_in_seminar/Slide2.JPG -
in the Sinhala language (from Sri Lanka), asking questions of the
painting, or adding a Semantic MediaWiki annotation (say, between the
Sinhala language and the Hungarian language, in text or voice), or,
further, even listening to the sounds of the sun (presuming this
generates sound), 92 million miles away, while watching its colors
change, and annotating this, at some point in the future. Even though
Semantic MediaWiki has no plans to support sound, if Wikidata and
Wikibase have plans to support sound files (or don't), say, for wiki
editing about the colors of the sun, or Monet's painting, via
coordinates, is there a way for World University and School, as we grow
(here's WUaS's Computational Linguistics wiki subject, with a number of
MIT OCW courses to begin with, for example), to build on Semantic
MediaWiki, Wikidata and Wikibase, but particularly Wikidata, for sound?
What URL might point me to Semantic MediaWiki's, Wikidata's, and
Wikibase's plans, but particularly Wikidata's, for coordinate mapping
and sound development?
Coding for sound support seems potentially very valuable in the
development of Semantic MediaWiki, Wikidata, and Wikibase, in addition
to developing sophisticated coordinates for, for example, mapping the
sun as the internet itself develops. But the Semantic MediaWiki and
Wikidata projects are immense as they are, so I'm glad you're leading
them so knowledgeably and skillfully.
Thanks and cheers,
Scott
http://scottmacleod.com
http://worlduniversity.wikia.com/wiki/World_University
On Mon, Sep 24, 2012 at 6:57 AM, Yury Katkov <katkov.juriy(a)gmail.com> wrote:
> Dear Scott,
> I would recommend encoding the color as RGB/CMYK coordinates and
> connecting it to a widget that can represent a color based on its RGB
> coordinates.
> -----
> Yury Katkov
>
>
>
>
> On Fri, Sep 21, 2012 at 8:57 PM, Markus Krötzsch
> <markus(a)semantic-mediawiki.org> wrote:
>>
>> Dear Scott,
>>
>> SMW does not have a special datatype for representing colours. You can
>> encode wavelengths as numbers. Sound data is not supported, nor is any
>> such support planned right now.
>>
>> Markus
>>
>>
>> On 18/09/12 20:55, Scott MacLeod wrote:
>> > Markus and Semantic MediaWikians,
>> >
>> > Thinking broadly, is there a way to map to, for example, a specific
>> > color/hue in a specific painting, and even changing-colors in a
>> > kaleidoscopic exhibit, in a specific museum, in a hypothetical
>> > all-museums-in-all-languages' Museum (see World University and School
>> > beginning Museums' wiki Subject page -
>> > http://worlduniversity.wikia.com/wiki/Museums - for free, Creative
>> > Commons' content)?
>> >
>> > Or, similarly, to map to a specific color in the sun, 92 million miles
>> > away (http://worlduniversity.wikia.com/wiki/Astronomy)?
>> >
>> > Might it be fruitful, similarly, in Semantic MediaWiki to be able to
>> > map to sound with coordinates ... for example in the Music School at
>> > WUaS -
>> > http://worlduniversity.wikia.com/wiki/World_University_Music_School
>> > ... or to a Symphonic production -
>> >
>> > http://worlduniversity.wikia.com/wiki/Symphony_Orchestra_at_World_Universit…
>> > ?
>> >
>> > Is any of this, or related, possible, planned for, or sensible?
>> >
>> > Scott
>> >
>> >
>> >
>> >
>> >
>> > On Tue, Jun 19, 2012 at 2:27 AM, Kim Eik <kim(a)heldig.org> wrote:
>> >> I seem to have isolated the issue to the method getPropertyValues in
>> >> the class SqlStubSemanticData. Here it seems that there is some kind
>> >> of mixup going on when I have defined a container to hold coordinates.
>> >> According to the code, it tries to fetch a given array entry with the
>> >> key of the property in mStubPropVals. I have defined the property
>> >> "Polygon", which is of type _gpo (Graphical Polygon); this type holds
>> >> a list of _geo properties, which is also defined as a semantic
>> >> property "Coordinates".
>> >>
>> >> However, when this instance of Property is loaded in
>> >> SqlStubSemanticData, I can see that mStubPropVals is populated with
>> >> the key "Coordinates" while the code is looking for the key "Polygon".
>> >> Any ideas on what is going on here?
>> >>
>> >> Can someone please elaborate on the usage of SqlStubSemanticData and
>> >> how it is implemented in the project?
>> >>
>> >>
>> >> _______________________________________________
>> >> Semediawiki-devel mailing list
>> >> Semediawiki-devel(a)lists.sourceforge.net
>> >> https://lists.sourceforge.net/lists/listinfo/semediawiki-devel
>> >
>> >
>> >
>>
>>
>>
>> _______________________________________________
>> Semediawiki-user mailing list
>> Semediawiki-user(a)lists.sourceforge.net
>> https://lists.sourceforge.net/lists/listinfo/semediawiki-user
>
>
--
http://scottmacleod.com/worlduniversityandschool.htm
This email is intended only for the use of the individual or entity to
which it is addressed and may contain information that is privileged
and confidential. If the reader of this email message is not the
intended recipient, you are hereby notified that any dissemination,
distribution, or copying of this communication is prohibited. If you
have received this email in error, please notify the sender and
destroy/delete all copies of the transmittal. Thank you.
Git, Gerrit, and You! A Tutorial
Where: IRC/SIP/SSH
We want all our developers to feel comfortable with Git, git-review, and
Gerrit. So saper is leading a hands-on online training:
https://www.mediawiki.org/wiki/Project:WikiProject_Extensions/MediaWiki_Wor…
. Check [[Git/Workshop]] for testing access to the conference & lab setup.
Saper will be available for 3 hours, and there'll be a break in the
middle. Absolute beginners with Git might want to stay for the whole
three hours; people with some experience won't need as long.
Answer this poll to help saper choose a date: 17:30 UTC on 26 September,
27 September, 2 October, or 3 October.
http://www.doodle.com/pbdbrcrh5gdpvrfu
If you want to attend, please also answer this question: how much do you
already know?
http://www.doodle.com/zhn7buksgrg8e8rx
Thanks for doing this, saper.
--
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation
Hi all,
here's our weekly mail on core related stuff for Wikidata. Thanks to
everyone giving feedback and reviews, most notably Chris, Chad, Tim,
Matmarex, DJ, and Krinkle!
* the ContentHandler branch is being reviewed to land in Core next
week, as Rob said. There is a separate thread on that. Further
comments are welcome ->
https://bugzilla.wikimedia.org/show_bug.cgi?id=38622
* the Sites management branch has been reviewed by Chad and got a +1
and is awaiting a review with respect to its DB impact by Asher before
it can be merged. Further comments are welcome ->
https://gerrit.wikimedia.org/r/#/c/23528/
* The sort in jQuery has been further improved based on incoming
comments and reviews. As requested by Krinkle, instead of using an
event, a method can be used. So if anyone can do some JavaScript
reviewing here, it would be appreciated. ->
https://gerrit.wikimedia.org/r/#/c/22562/
We also have branched off the Wikibase extension itself to what is
destined to become Wikibase 0.1. This is undergoing security and
further review right now, so that it can be deployed. Comments are
welcome for any blockers or problems with that. Phase 1 is basically
done. -> https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/extensions/Wikibase.git;a…
We are continuing to work on the master branch with developing new
features and are working on Phase 2. Input is welcome.
Thanks for all the feedback!
Cheers,
Denny
--
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 B. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.