Hello!
The Wikimedia Developer Summit
<https://www.mediawiki.org/wiki/Wikimedia_Developer_Summit> is the annual
meeting to push the evolution of MediaWiki and other technologies
supporting the Wikimedia movement. The next edition will be held in San
Francisco on January 9-11, 2017.
We welcome all Wikimedia technical contributors, third party developers,
and users of MediaWiki and the Wikimedia APIs. We specifically want to
increase the participation of volunteer developers and other contributors
dealing with extensions, apps, tools, bots, gadgets, and templates.
Important deadlines:
- Monday, October 24: This is the last day to request travel
sponsorship. Applying takes less than five minutes.
- Monday, October 31: This is the last day to propose an activity. Bring
the topics you care about!
Subscribe to weekly updates:
https://www.mediawiki.org/wiki/Topic:Td5wfd70vptn8eu4
Please feel free to forward this email to anyone who might be interested in
attending!
Thanks,
Srishti
--
Srishti Sethi
ssethi(a)wikimedia.org
Folks: so Stas and I have spent two days talking to the Wikidata folks and
I've learned some fun stuff. I'm sending this email to this mailing list,
because, well, I just have to write this down and this list cares about
things. Maybe they'll care about this thing. I'm not sending this to
wikidata's mailing list because I'm not really confident enough to send
this to experts yet.
Anyway, the most interesting bit that I've learned so far is that the Wikidata
team claims that Wikidata doesn't describe truth. That might seem like a
silly distinction at first, but you start to get into trouble when you query
it without understanding that. Think about it this way: there are multiple
values for Jesus's birthday
<https://www.wikidata.org/wiki/Q302#P569>, and Wikidata doesn't actually
claim that any of them is true, just that each is true *according to some
sources*. Look also at George Washington's spouse
<https://www.wikidata.org/wiki/Q23#P26>. That statement has a qualifier: the
date of their marriage. These qualifiers are like preconditions to the truth.
Kinda. They aren't always used that way, but you can sort of pretend.
But we can emerge from Cartesian doubt! Wikidata has some concept of "true
enough for most uses" called "best rank"
<https://www.wikidata.org/wiki/Help:Ranking>. It's a reasonably simple
concept that amounts to "the community decides". So the plan is to
implement queries against that first. This should be good enough for
WikiGrok initially, and faster to implement and query because it lets us
ignore things like qualifiers and references.
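For the curious, here's roughly what "best rank" boils down to if you pull
statements straight from the API. This is just a quick Python sketch to make
the rule concrete; the wbgetclaims call and the response fields are my
reading of the API, and the "prefer preferred, else normal, never deprecated"
logic is my paraphrase of the Help:Ranking page, not the actual query
implementation we'd build.

    import requests

    API = "https://www.wikidata.org/w/api.php"

    def get_claims(entity_id):
        """Fetch all statements for an entity from the Wikidata API."""
        resp = requests.get(API, params={
            "action": "wbgetclaims",
            "entity": entity_id,
            "format": "json",
        })
        resp.raise_for_status()
        return resp.json()["claims"]

    def best_rank(statements):
        """Keep the preferred statements if any exist, otherwise the
        normal ones; deprecated statements are always dropped."""
        preferred = [s for s in statements if s["rank"] == "preferred"]
        if preferred:
            return preferred
        return [s for s in statements if s["rank"] == "normal"]

    # Example: George Washington (Q23), spouse (P26), best-rank only.
    claims = get_claims("Q23")
    for statement in best_rank(claims.get("P26", [])):
        print(statement["mainsnak"]["datavalue"]["value"]["id"])

The point being that a best-rank query can throw away the deprecated and
non-preferred statements up front, which is exactly why it's cheaper than the
full qualifiers-and-references treatment.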
Nik
Hello,
The MediaWiki core tests are a bit of a mess: they tend to be slow and
require a working MediaWiki installation.
We could certainly extract the tests that exercise a single function and do
not depend on having MediaWiki installed, i.e. unit tests. Doing so, we could
have devs and CI run the lightweight, fast unit tests and skip the slow
tests, hopefully saving a lot of time.
Another benefit is that the resulting unit test suite would give us a
good overview of our code coverage. We could probably generate a
coverage report quite quickly, tricking people into writing more unit tests
to improve the metric.
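Just to illustrate the idea (in Python terms rather than PHPUnit, purely as
an analogy): pointing the runner at only the standalone unit tests and
wrapping it in coverage.py gives a fast run plus a coverage report in one
go. The paths and package name below are made up.

    import coverage
    import unittest

    # Measure coverage of the code under test (package name is hypothetical).
    cov = coverage.Coverage(source=["mypackage"])
    cov.start()

    # Discover and run only the fast, standalone unit tests; the slow tests
    # that need a full installation live elsewhere and are simply not loaded.
    suite = unittest.defaultTestLoader.discover("tests/unit")
    unittest.TextTestRunner(verbosity=1).run(suite)

    cov.stop()
    cov.save()
    cov.report()

For MediaWiki the equivalent would presumably be a separate PHPUnit suite or
group for the installation-free tests, with its coverage output wired into CI.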
I am not sure whom to bring the subject to, nor how to get a champion
elected to pursue that Augean cleaning. I am sure the MediaWiki Core team
will have some good ideas :-]
There is a task https://phabricator.wikimedia.org/T87781
cheers,
--
Antoine "hashar" Musso
Chris has an epic in the mw-core backlog tracking OAuth fixes that we
could focus on instead of SOA Auth.
https://phabricator.wikimedia.org/T86869
There are a few things about a shift of focus to this that I find
compelling:
* We already have a list of things to work on!
* Erik is *really* interested in improving OAuth
* We started this project and know that there are things we'd like to polish up
* This has a more visible impact than core code cleanup
* We can probably come up with metrics to go along with it
The downsides:
* Authn/z will need work eventually
* We need to keep working on the SOA Auth RfC either way
Thoughts? I'd be happy to have a conference call with the current team
and anyone else who is interested to discuss this if it seems like
that would be more efficient.
Bryan
--
Bryan Davis Wikimedia Foundation <bd808(a)wikimedia.org>
[[m:User:BDavis_(WMF)]] Sr Software Engineer Boise, ID USA
irc: bd808 v:415.839.6885 x6855
I was in a meeting with Damon today and he brought up something that
might be interesting for us as a team to try out. He suggested that
all gerrit commits should be traceable to a phabricator ticket.
This is something that I have actually done at other companies to the
point of having it be required for git to accept the patch. The point
wouldn't be to have a direct one-to-one correlation between phab
tickets and each commit, but to have some traceability about the
larger goals that each unit of work is intended to serve. Obviously,
when we are fixing things in response to open issues, this is easy. It
takes a bit more practice to make sure that the spontaneous "I have to
clean this up because my eyes are bleeding" commits have something in
phab to associate the commit with.
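For what it's worth, enforcing this at the git level is usually just a
commit-msg hook. Here's a rough Python sketch of what that might look like;
the "Bug: Tnnn" footer format and the exact policy (e.g. whether to allow
exceptions) would be ours to decide, so treat all of it as illustrative.

    #!/usr/bin/env python3
    # Sketch of a commit-msg hook that rejects commits with no Phabricator
    # task reference. Save as .git/hooks/commit-msg and mark it executable.
    # The "Bug: T123" footer convention here is just an example format.
    import re
    import sys

    def has_task_reference(message):
        # Look for a "Bug: T<number>" footer line anywhere in the message.
        return re.search(r"^Bug: T\d+\s*$", message, re.MULTILINE) is not None

    def main():
        with open(sys.argv[1], encoding="utf-8") as f:
            message = f.read()
        if not has_task_reference(message):
            sys.stderr.write("No 'Bug: Tnnn' line found; please reference "
                             "a Phabricator task in the commit message.\n")
            return 1
        return 0

    if __name__ == "__main__":
        sys.exit(main())

Git passes the path of the commit message file as the first argument and
aborts the commit if the hook exits non-zero, so this would catch the "eyes
are bleeding" commits before they ever reach gerrit.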
Thoughts?
Bryan
--
Bryan Davis Wikimedia Foundation <bd808(a)wikimedia.org>
[[m:User:BDavis_(WMF)]] Sr Software Engineer Boise, ID USA
irc: bd808 v:415.839.6885 x6855
Hi Team!
My mom had hand surgery today and it's hit her a bit harder than we expected, so
I might be on nursing duty some tomorrow. I'll be online but might not be
as responsive as I usually am.
Nik
Anyone building HHVM for install in /usr/local should get this patch
first:
https://github.com/hhvm/hhvm-third-party/pull/39
Before that change, installation would put a broken libpcre.a into
/usr/local/lib, which would then be used in subsequent builds. Hence
the errors "this version of PCRE is compiled without UTF support" we
were seeing yesterday.
FindPCRE.cmake would only enable unicode support in the bundled
library if it was actually going to link against it. So if there was a
system library available, it would compile and install the bundled
library with the default configuration.
-- Tim Starling
I have WP:BOLDly added a new "archive" column to the team phab board
that is hidden by default. The reason for this is to give us a way to
keep track of what has been finished in a given week without having to
stop ourselves and others from closing tasks as they are completed, and
without needing to create a new sprint tracking board each week.
There are lots of closed tasks in the done column at the moment, but I
will be moving them to the archive column as fast as my "omg irc spam"
feelings will let me. Once this has been done we should be able to
look at what was completed each week by viewing the board with the
"All tasks" filter set. I'll take responsibility for being the grunt
that moves things from done to archive following the weekly meeting
each week.
Bryan
--
Bryan Davis Wikimedia Foundation <bd808(a)wikimedia.org>
[[m:User:BDavis_(WMF)]] Sr Software Engineer Boise, ID USA
irc: bd808 v:415.839.6885 x6855
(and of course I used the wrong MW Core list the first time)
On Fri, Nov 14, 2014 at 9:00 AM, Greg Grossmeier <greg(a)wikimedia.org> wrote:
> https://bugzilla.wikimedia.org/show_bug.cgi?id=69362
>
> It looks like Aaron was/is/did write a script to find disappeared
> files but there hasn't been an update for a while.
>
> Help?
>
> --
> Greg Grossmeier
> Release Team Manager
--
Greg Grossmeier
Release Team Manager