I'd like to invite you to the hackathon the Wikimedia Foundation is putting
on next month in San Francisco:
https://www.mediawiki.org/wiki/San_Francisco_Hackathon_January_2012
We're focusing on outreach for this one, teaching new developers how to
build stuff using Gadgets, our web-accessible API, and PhoneGap. A
bunch of WMF's San Francisco engineers will be there, plus (we're
planning) Derk-Jan Hartman, Maarten Dammers, Daniel Kinzler, and a few
more experts from out of town. Whatever your level of expertise, we
welcome you, as a learner and as a hacker.
Please spread the word and please consider coming. Thanks!
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
Hello,
(sorry if you already got this on another list)
Please try out the Sztakipedia toolbar, which is meant to show how AI
technology could help with Wikipedia editing.
The 90-second intro video explains everything:
http://www.youtube.com/watch?v=8VW0TrvXpl4
Alternatively, please check out the tool's home page,
http://pedia.sztaki.hu, for installation instructions, etc.
Please give me feedback, good or bad, at the talk page of the tool:
http://en.wikipedia.org/wiki/User_talk:MHeder/Sztakipedia
For me, your feedback is the most essential thing. As a PhD student, my
research topic is how to bring some AI into the editor interfaces where
people create knowledge. My primary target was Wikipedia, even though I'm
only an occasional contributor. Dealing with the wiki system is no easy
matter, though, so the system I'm introducing today does not implement the
whole vision I had when some other students and I started the development.
Not even close.
However, I hope that the new visual editor the Foundation is working on
could open up new options for integrating AI. So even if you do not like
the current form of Sztakipedia, please share your vision of how we should
do this in the future.
Many thanks,
Mihály Héder
Computer and Automation Research Institute
Hungarian Academy of Sciences
Hey,
I recently added a bunch of wfDeprecated calls to deprecated functions in
core, using the new version argument, and got a lot of flak for this, since
it caused notices for a lot of people. What a lot of people told me to do
is wait one or two releases before adding these, or at the very least
replace all callers with equivalent new code first. This might seem
reasonable at first glance, but it sort of undoes the whole point of
wfDeprecated, IMO. As an extension author, I want to be able to see if my
code is using any deprecated functions, not just functions deprecated more
than two releases ago. Furthermore, expecting people to replace all callers
is often unrealistic, since the replacement code might be context-dependent
and might break compatibility with older versions of MediaWiki in
extensions where this is not wanted. So lots of deprecated methods end up
without a wfDeprecated call at all, causing stuff to suddenly break. Those
are two very good reasons why wfDeprecated calls should be added
immediately after deprecating a function.
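For illustration, the pattern in question looks something like this;
wfSomeOldFunction and SomeClass::newMethod are made-up names, and only the
wfDeprecated call with its version argument is the actual mechanism under
discussion:

/**
 * @deprecated since 1.19; use SomeClass::newMethod() instead.
 */
function wfSomeOldFunction( $arg ) {
    // Emits a deprecation notice tagged with the release in which
    // the function was deprecated.
    wfDeprecated( __METHOD__, '1.19' );
    return SomeClass::newMethod( $arg );
}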
So what about the notices? Since we have $wgDeprecationReleaseLimit, you
can set it to two releases back and achieve exactly the same result as
delaying the wfDeprecated calls, without all the problems associated with
that approach.
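That is, something like this in LocalSettings.php (the release numbers are
just an example, assuming the current release is 1.19):

# Only generate deprecation notices for functions deprecated in or
# before 1.17, i.e. two or more releases ago; newer deprecations
# stay silent.
$wgDeprecationReleaseLimit = '1.17';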
This appears to be too much effort for people, though, so what about
changing the default value of $wgDeprecationReleaseLimit from false to
$rel - 2 (the current release minus two)? Any objections to that?
Cheers
--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--
We gave two new committers access to extensions:
* Nils (kroocsiogsi) will be working on the extension SoundManager2Button.
* Kaseluris-Nikos (synagonism) will be working on the "synagonism" skin.
Welcome!
(By the way, I added a few items to my commit access checklist at
https://wikitech.wikimedia.org/view/Svn.wikimedia.org#Add_users . Every
time I add a committer, I ensure that I link their mediawiki.org
username to their commit username so they get code review emails, give
the mediawiki.org username coder rights, and add the committer to
http://www.mediawiki.org/wiki/Developers . If there are other gotchas I
missed, let me know.)
--
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
Hi,
I have created a proof of concept of this, which contains the
information only for categorymembers. The code is at [1] and it looks
something like this (XML formatted):
<props>
  <prop name="ids">
    <properties>
      <property name="pageid" type="integer" />
    </properties>
  </prop>
  <prop name="title">
    <properties>
      <property name="ns" type="namespace" />
      <property name="title" type="string" />
    </properties>
  </prop>
  <prop name="type">
    <properties>
      <property name="type">
        <type>
          <t>page</t>
          <t>subcat</t>
          <t>file</t>
        </type>
      </property>
    </properties>
  </prop>
</props>
What do you think?
One problem I'm aware of is that the output uses “prop” and “property”
with a different meaning than they already have elsewhere in the API. Do
you have any suggestions for better naming?
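(For reference, I would expect this to show up in the output of a request
like api.php?action=paraminfo&querymodules=categorymembers, though the
exact placement is still open.)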
After this, I will add the necessary information to the rest of the
API modules and then post a patch to bugzilla.
[1] https://github.com/svick/mediawiki/commit/868910637445ea0dcf3ad84bc1ee9fc33…
Petr Onderka
[[en:User:Svick]]
On Thu, Nov 10, 2011 at 11:37, Roan Kattouw <roan.kattouw(a)gmail.com> wrote:
> On Wed, Nov 9, 2011 at 11:36 PM, Petr Onderka <gsvick(a)gmail.com> wrote:
>> Is this information available somewhere? Is trying the query and
>> seeing what properties are returned the best I can do currently?
> Unfortunately, no, at least not programmatically.
>
>> Do
>> you think it would be a good idea if I (or someone else) modified
>> “action=paraminfo” to include this information in some form?
>>
> Yes, please do! Patches are very welcome.
>
> Roan
>
Hey All,
There's a somewhat urgent issue being discussed on English Wikipedia at the
moment relating to offensive emails from a certain domain.
Discussion is here:
<http://en.wikipedia.org/wiki/Wikipedia:Administrators%27_noticeboard/Incide…>
I wonder if any ops people are able to quickly address this, perhaps by
blacklisting a domain (or specific email addresses) server-side. Obviously,
harassment of editors is a big problem, so I'm going for a proactive "find
someone to fix it" approach :)
If not, anyone have suggestions?
Tom
Our wfArrayToCGI and wfCgiToArray conversion functions (they transform
data between 'foo=bar&hello=world' and array( 'foo' => 'bar', 'hello' =>
'world' ) formats) seem to be fairly messed up.
In a discussion over r104518, about the way it built query parameters to
pass to getLocalURL, I noticed our lack of sane null handling in
wfArrayToCGI. However, after examining it further, I found that the output
of this area of code seems completely messed up.
Some examples of how these behave:
> echo wfArrayToCGI(array('foo' => 'bar', 'baz' => '', 'asdf' => null,
> 'qwerty' => false));
foo=bar&asdf=&qwerty=
# In array -> cgi conversion we omit the key for an empty string, but
# yield an empty value for null and false.
# Frankly, it should be the other way around: an empty string is reasonable
# input that should produce an empty but present cgi key.
# By the way, this is our treatment of an empty value in the method that
# goes in the opposite direction:
> var_dump(wfCgiToArray('foo=bar&asdf='));
array(2) {
["foo"]=>
string(3) "bar"
["asdf"]=>
string(0) ""
}
# Naturally, you can guess how the round trip gets screwed up:
> var_dump(wfArrayToCGI(wfCgiToArray('foo=bar&asdf=')));
string(7) "foo=bar"
# Because we treat an empty string as an omission instead of a value, the
# round trip drops something we had in the first place.
# The cgi -> array conversion could use some proper handling of an edge
# case too:
> var_dump(wfCgiToArray('foo=bar&asdf=&qwerty'));
PHP Notice: Undefined offset: 1 in
/Users/daniel/Workspace/mediawiki/trunk/phase3/includes/GlobalFunctions.php
on line 388
array(3) {
["foo"]=>
string(3) "bar"
["asdf"]=>
string(0) ""
["qwerty"]=>
string(0) ""
}
# Personally I think that 'foo=bar&asdf=&qwerty' should yield an array like
# array( 'foo' => 'bar', 'asdf' => '', 'qwerty' => '' );
Does anyone think anything would break if I re-coded these two deep core
methods to work in a more sane way?
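Roughly what I have in mind, as a minimal sketch (the *Sane names are just
placeholders, and this ignores the array-valued parameters the real
functions also support):

// null and false mean "omit the key entirely"; an empty string survives.
function wfArrayToCgiSane( array $params ) {
    $pairs = array();
    foreach ( $params as $key => $value ) {
        if ( $value === null || $value === false ) {
            continue; // omitted entirely
        }
        $pairs[] = urlencode( $key ) . '=' . urlencode( $value );
    }
    return implode( '&', $pairs );
}

// A bare key like 'qwerty' maps to an empty string instead of
// triggering an undefined offset notice.
function wfCgiToArraySane( $query ) {
    $result = array();
    foreach ( explode( '&', $query ) as $pair ) {
        if ( $pair === '' ) {
            continue;
        }
        $bits = explode( '=', $pair, 2 );
        $result[urldecode( $bits[0] )] =
            isset( $bits[1] ) ? urldecode( $bits[1] ) : '';
    }
    return $result;
}

With that, wfArrayToCgiSane( wfCgiToArraySane( 'foo=bar&asdf=' ) ) would
round-trip to 'foo=bar&asdf=' instead of dropping the empty value.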
[r104518] https://www.mediawiki.org/wiki/Special:Code/MediaWiki/104518
--
~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://daniel.friesen.name]
At the GLAMCamp DC (February 10-12, Washington, DC, USA), we will be
working on improvements to Wikimedia technology -- see
https://meta.wikimedia.org/wiki/GLAMcamp_DC#Tasks . Specifically, the
GLAM folks are working on better mass upload tools for museums to use,
better reporting metrics, and building a glamwiki.org website. If you
want to help, let them know you're interested in attending. More info
below.
(I'll almost certainly be there, and it looks like at least one or two
other Foundation folks will be as well.)
best,
Sumana Harihareswara
Volunteer Development Coordinator
Wikimedia Foundation
-------- Original Message --------
Subject: [cultural-partners] GLAMcamp DC
Date: Wed, 30 Nov 2011 20:08:03 -0500
From: Lori Phillips <lori.byrd.phillips(a)gmail.com>
Hello everyone,
On the cusp of GLAMcamp Amsterdam, we’re excited to announce that GLAMcamp
DC will be hosted by the National Archives and Records Administration from
February 10-12, 2012.
As with past GLAMcamps, GLAMcamp DC will be a meeting/workshop for
Wikimedians active in outreach within cultural institutions (as well as for
those interested in becoming more active). The event will target a small
group of community-focused and technology-focused Wikimedians in the United
States who will continue and solidify the key elements of the GLAM-Wiki
project.
What makes this GLAMcamp different is its focus on the US. This event will
essentially serve as the kick-off of US GLAM-Wiki organizing, and will
help to fine-tune the scope and goals of the US Cultural Partnerships
Coordinator and associated projects. One of the primary goals of GLAM-Wiki
in the US is to establish a formal listing of volunteers who can be called
upon to assist GLAMs, and this event will do much to galvanize such a group.
GLAMcamp DC will of course continue the good work that comes out of
GLAMcamp Amsterdam and will work to further improve the tools and
documentation associated with the wider GLAM-Wiki community.
While all are welcome to apply to attend, there are a limited number of
spots available. Please go here to apply:
http://meta.wikimedia.org/wiki/GLAMcamp_DC/Application
Or see the event planning page for more details:
http://meta.wikimedia.org/wiki/GLAMcamp_DC
Thanks so much,
Lori Phillips [[User:LoriLee]]
Sarah Stierch [[User:SarahStierch]]
Pete Forsyth [[User:Peteforsyth]]
--
Lori Phillips
Web Content Specialist | Wikipedian-in-Residence
The Children's Museum of Indianapolis
(phone number elided by Sumana) http://loribyrdphillips.com/
Hey all.
I'm looking for comments regarding an extension I knocked up today and its
viability for Wikimedia wikis:
https://github.com/Jarry1250/TranslateSvg
The extension removes the need to duplicate a file (an administrative
nightmare) when you want to translate it. It does this by creating an extra
information flow:
[[File:Example.svg|thumb|120px|lang=en]]
  => rsvg "...Example.svg" "120px-en-Example.svg" (lang='en')
  => displays in English
[[File:Example.svg|thumb|120px|lang=fr]]
  => rsvg "...Example.svg" "120px-fr-Example.svg" (lang='fr')
  => displays in French
It means rsvg's SVG-to-PNG rendering will finally "understand" the
<switch> tag, enabling fully translatable SVGs [1]. It would eventually
work in tandem with a translation interface, e.g. through TranslateWiki
and/or a local special page.
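To illustrate how <switch> works, here's a minimal hand-written SVG (not
taken from the extension): the renderer picks the first <text> whose
systemLanguage matches the requested language, falling back to the
unconditional one.

<svg xmlns="http://www.w3.org/2000/svg" width="120" height="40">
  <switch>
    <text systemLanguage="fr" x="10" y="25">Bonjour</text>
    <text systemLanguage="en" x="10" y="25">Hello</text>
    <!-- no systemLanguage attribute: this is the fallback -->
    <text x="10" y="25">Hello</text>
  </switch>
</svg>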
The only issue I know of is that the renderer will create a new file every
time you change the language parameter, putting a drain on storage space.
How do we handle this for arbitrary thumb sizes at the moment?
All comments appreciated
Harry (User:Jarry1250)
[1] https://developer.mozilla.org/en/SVG/Element/switch