Hi,
Following up on a conversation on the gendergap email list, I am discussing
with Freenode the possibility of changing the default web client to one
that is friendlier and has a less technical feel, primarily for the benefit
of new users who access #wikipedia-en-help by clicking on a link. The
likely candidate for a new IRC client is Kiwi. If Freenode wants to
maintain their current default web client we can still use Kiwi if we run
it on Wikimedia pages. Would WMF or the volunteer dev community be willing
to implement this? If so, is filing a Bugzilla bug the best way to get the
wheels of progress to turn?
Pine
Toward the end of Wikimania I started working on some codesniffer stuff
that could be run against extensions, skins, etc. to see if a change in
mw-core has broken them.
My current WIP patch can be seen on gerrit and all comments are welcome!
https://gerrit.wikimedia.org/r/#/c/153399/
My plan would be to have this run as a nightly job on Jenkins across all
extensions / skins in gerrit; thus every day we would clearly be able to see
if we had broken anything!
The codesniffer stuff is also something that other extension developers
could pull into their integration system (say using github and travis) and
run across their extensions.
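The nightly-job logic could be sketched roughly like this (a minimal Python sketch; `run_sniffer` is a stand-in for an actual PHP_CodeSniffer invocation, and all names here are illustrative, not taken from the patch):

```python
# Sketch of the nightly job: run a sniffer over every extension and
# report which ones a core change has broken. `run_sniffer` is a
# placeholder for shelling out to phpcs against that extension's checkout.
from typing import Callable, Dict, List

def nightly_report(extensions: List[str],
                   run_sniffer: Callable[[str], List[str]]) -> Dict[str, List[str]]:
    """Return a map of extension name -> list of sniff errors (absent = clean)."""
    report = {}
    for ext in extensions:
        errors = run_sniffer(ext)
        if errors:
            report[ext] = errors
    return report
```

A CI system (Jenkins, or Travis for third-party extensions) would then fail the build whenever the report is non-empty.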
--
Addshore
On a lot of wikis I would benefit from receiving the contents of each new category member in my inbox. Has this been considered before?
Example use-case below: these pages all belong to a category. (Another use case is patrolling new pages in a category.)
(There may be a need to allow people to edit by email, outside of a web browser, but that's a bit complicated and I can't think of /how/ to do it, yet.)
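A minimal sketch of the diffing logic such a notifier would need (names are illustrative; a real tool would fetch the member list via the MediaWiki API's list=categorymembers and send the mail itself):

```python
# Sketch: compare two polls of a category's member list and build the
# digest that would be mailed out. Purely illustrative; fetching and
# mailing are out of scope here.

def new_members(previous, current):
    """Return pages that joined the category since the last poll."""
    return set(current) - set(previous)

def build_digest(category, added):
    """Assemble a plain-text email body listing the new pages."""
    lines = ["New pages in [[Category:%s]]:" % category]
    lines += [" - %s" % title for title in sorted(added)]
    return "\n".join(lines)
```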
svetlana
On Tue, 12 Aug 2014, at 17:44, Dan Garry wrote:
> On 12 August 2014 02:39, svetlana <svetlana(a)fastmail.com.au> wrote:
> >
> > There needs to be a central place, like the Wikimedia blog, but dedicated
> > to tech things - actively announcing everything WM ENGINEERING are doing,
> > both in products and in core.
>
>
> There is. It's called the monthly report. See here for July's, for
> example: https://www.mediawiki.org/wiki/Wikimedia_Engineering/Report/2014/July
>
> --
> Dan Garry
> Associate Product Manager, Mobile Apps
> Wikimedia Foundation
Use-case: someone is being annoying, but is willing to ignore me voluntarily on-wiki.
Are there any software tools which may assist them in this where doing so is O.K. (such as not showing my edits in the review queue where a page defaults to the old stable version anyway, hiding my edits from recent changes and watchlists, etc.)?
In such cases there is a potential that I do some harm behind their back (such as malicious edits), so there needs to be a fine line: it is only safe where the target user is, for example, a reviewer.
I searched on the web but I couldn't find past related software, documentation, or discussions.
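To make the question concrete, here is a minimal sketch of the kind of filtering such a tool would do (hypothetical names; this is not an existing MediaWiki feature):

```python
# Sketch: hide edits by muted users from one viewer's feed.
# `changes` is a recent-changes-style list of dicts with a 'user' key;
# `muted_by` maps a viewer to the set of users they have muted.

def filter_changes(changes, muted_by, viewer):
    """Return the changes list with the viewer's muted users' edits dropped."""
    hidden = muted_by.get(viewer, set())
    return [c for c in changes if c["user"] not in hidden]
```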
svetlana
For very small abandoned projects, nobody is around to do anything, not even to review a patch.
For a slightly bigger project, someone is around to review a patch and explain to you how to write one, and where to get started. The process of you writing it is slow, and the process of them reviewing it may also be slow.
For the next bigger project, 2-3 volunteer maintainers are around. They hear every word of feedback and prioritise it based on how easy it is to work around, and how well it fits the project scope. (Usually a lot of things fit the project scope as long as they are designed and coded well.)
As a project grows bigger, more time is dedicated to unit tests. A volunteer would read feedback and review patches, but more effort would be spent on making the project not fall apart. Code review, unit tests, an issue tracker - and again, everything people request eventually gets in somewhere. Again someone is around who can explain to you where to get started if you'd like to pick up your own request or someone else's.
As a project grows even bigger, it usually remains modular. It is relatively easy for you to get started by reading a very small module and coding your thing in. Again someone is around who sees you as a potential new volunteer, and is happy to explain to you how things work.
...
When a project has an employee (or a GSoC student) coming in, something needs to prioritise their work. What issues should be handed over to them so they can complete them quickly?
...
The answer to this question may vary. It is hard to answer it right - there is more than one way to do it.
Some things come to mind about what should not be done:
1/ Missing community feedback. [For bigger projects, you /will/ have to consciously skip (note: this does not mean miss; it is a different word) some, so as not to make the thing a mess of features. Simplicity is the key to success.]
2/ Come up with big non-modular projects.
3/ Lose consistency by working on big enhancements out of the blue, leaving the rest of functionality or interface behind.
4/ Write code which fails to be reusable.
5/ Fail to write documentation.
6/ Fail to showcase, reveal, and expose the features the employee added, which are reusable.
7/ Fail to support community work.
8/ Fail to meet original project mission.
Could someone please point me specifically to where (4) and (6) are not failing within Wikimedia Engineering?
In other words:
- What reusable things come out of each Wikimedia Engineering project?
- Where can I find out about them easily without asking you to find them for me by hand?
svetlana
If we're going to store JavaScript gadgets and Lua modules in a central
wiki (this is planned, I suppose), some coding guidelines would certainly
be useful.
Should a Request for Comment be created for that? We can reuse MediaWiki
conventions for JavaScript, of course.
Moreover, I think there should be a way to run linters automatically
over JavaScript gadgets and Lua modules.
Even though CodeEditor already has this feature, it would be helpful to
have a service (on Labs?) that periodically analyzes scripts on a wiki
and reports errors, maybe notifying the author of a breakage via e-mail.
Here are some already existing tools for similar purposes:
https://tools.wmflabs.org/stylize/
http://lintbridge.wmflabs.org
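The periodic lint-and-notify service described above could be sketched roughly like this (the `lint` and `notify` callables are placeholders for a real linter, e.g. JSHint for gadgets, and a mailer; all names are illustrative):

```python
# Sketch of the periodic service: scan each script page on a wiki,
# run a linter over its source, and notify the author on breakage.
# `pages` maps a page title to an (author, source) pair.

def lint_wiki(pages, lint, notify):
    """Return a map of broken page title -> lint errors, notifying authors."""
    broken = {}
    for title, (author, source) in pages.items():
        errors = lint(source)
        if errors:
            broken[title] = errors
            notify(author, title, errors)
    return broken
```

A Labs cron job could run this over all pages in the Gadget and Module namespaces and mail the authors of anything that regressed.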
This is an idea I've had for a while, and I'd like to see if there's any
interest, or on the contrary concerns, about it. I would like to explore
(and if I have official blessing, champion) the idea of asking corporations
with software engineering staff if they would be willing to let their
employees volunteer their expertise and time to mediawiki while ideally
still being on their employer's payroll. I mean engineering in the wide
sense of the term, including Ops, QA, etc. and maybe even UX.
This would allow engineers to take a break for a predetermined duration
from their usual work duties and contribute in a very productive manner to
our open source projects. And maybe to other open source projects than
mediawiki, but I think our project in particular is a great starting point.
I see this as a flexible scheme. It doesn't really matter if people can do
it for a day, a month, or a year, I believe that these inter-organization
exchanges could have great value.
*Background*
Earlier this year the WMF's Multimedia team, which I'm a part of, had a
volunteer working full-time with us, Aaron Arcos. Aaron used to work at
Google and left to spend a year offering his software engineering skills to
several non-profits. His work with us, bringing his experience from large
projects at Google, was invaluable. He mentioned that when he told his
former Google co-workers about his idea, some were interested and tempted
to follow his example.
As some of you may know, Facebook is currently lending the WMF engineering
resources in order to help with our HHVM deployment to production.
From my subjective perspective, as someone who's paid to be a software
engineer, I would definitely enjoy the ability to do something like that at
certain points of my career. There's always a lot to learn by being thrown
into the deep end of another organization's software development.
In fact, in the corporate world, Twitter and Etsy have identified these
benefits and are doing this between themselves:
http://thenextweb.com/insider/2012/09/11/twitter-etsy-run-engineer-exchange…
In our own wiki world, we have Wikipedians in residence:
https://en.wikipedia.org/wiki/Wikipedian_in_residence
*The idea*
This is my take on it, and I'm really interested to hear some feedback and
brainstorm on this. I think that starting to talk to interested parties
is what will give shape and structure to the idea.
First and foremost I would see this outreach aimed at the engineers
themselves. In the worst-case scenario, if their employer isn't willing to
donate continued payroll for that person while they're in residence, we
should facilitate people like Aaron Arcos who are willing to donate their
time and skills entirely for free. There may be engineers out there at the
Googles and Facebooks of the world who don't know or might forget that they
could help projects like mediawiki greatly if they took a break from their
job and worked on open source for a while.
Secondly, I think that such a scheme would be easily pitched to companies
(including other non-profits) as a training opportunity. As much as
experienced engineers coming into the project have a lot to teach us, we
also have a lot of interesting knowledge to teach in return, and the
experience of working on this codebase alone, the scale of the traffic
we're dealing with, etc., can have incredible training value.
I imagine this scheme as being entirely flexible. For a short period or a
long period, still paid by their former employer or not, we should foster
experienced engineers participating in our project for a period of time. We
already do outreach to less experienced developers through GSoC and
similar programs (maybe we're not doing enough of that for some
people, but that's another topic!), and I think there is an unexplored
opportunity in trying to do this with experienced folks.
Lastly, while everything I describe here is probably possible on an
individual basis and does happen occasionally, I believe that having a
catchy name (e.g. "engineers in residence"), and an official scheme for it
would greatly increase the frequency of it happening.
I could keep going on and on about this, but let's see what others think
based on this rough idea. And if you're at Wikimania right now and are
interested in discussing this topic, find me.
Hi,
the discussion stopped with the question:
How do we test the new rendering on Beta Labs?
So, does anyone know the answer?
Best
Moritz
On Tue, Jul 15, 2014 at 12:19 PM, Moritz Schubotz <schubotz(a)tu-berlin.de> wrote:
> Hi Chris,
>
> By default the Math extension uses the dedicated host
> mathoid.testme.wmflabs.org; this host is accessible from the beta cluster.
> A bug related to the beta cluster is available here
>
> https://bugzilla.wikimedia.org/show_bug.cgi?id=66516
>
> But I have no clue how to change the config for betalabs.
>
> Best
>
> Moritz
> Am 15.07.2014 17:06 schrieb "Chris McMahon" <cmcmahon(a)wikimedia.org>:
>
>> On Tue, Jul 15, 2014 at 1:10 AM, Moritz Schubotz <physik(a)physikerwelt.de>
>> wrote:
>>
>> > Hi Chris,
>> >
>> > me too.
>> > How can I implement it in beta labs?
>> >
>>
>> I'd say to start by filing a bugzilla ticket for Wikimedia
>> Labs/deployment-prep.
>> Then it is a matter of registering the proper extensions and config in
>> puppet.
>> Would mathoid need a dedicated host?
>>
>>
>> >
>> > Best
>> > Moritz
>> >
>> > On Mon, Jul 14, 2014 at 6:35 PM, Chris McMahon <cmcmahon(a)wikimedia.org>
>> > wrote:
>> > > I would really like to see this follow the standard deploy scheme:
>> > > implement it in beta labs; then enable it for mediawiki.org and
>> > test2wiki;
>> > > then enable it on production cluster nodes.
>> > > -Chris
>> > >
>> > >
>> > > On Mon, Jul 7, 2014 at 3:07 AM, Moritz Schubotz
>> > > <physik(a)physikerwelt.de>
>> > > wrote:
>> > >
>> > >> Hi,
>> > >>
>> > >> during the last year the Math extension achieved a goal defined back
>> > >> in 2003: support of MathML. In addition there is SVG support for
>> > >> MathML-disabled browsers. (See http://arxiv.org/abs/1404.6179 for the
>> > >> details)
>> > >> I would like to give Wikipedia users a chance to test this long-awaited
>> > >> new feature. Therefore we would need a mathoid instance that is
>> > >> accessible from the production cluster. Greg Grossmeier already created
>> > >> the required table in the database. (Sorry for the "friction" connected
>> > >> with this process.)
>> > >> Currently the MathJax team is working on a PhantomJS-less method to
>> > >> render texvc to MathML and SVG. Some days ago I tested it, and it works
>> > >> quite well. I would appreciate a discussion with ops to figure out how
>> > >> this can go to production. The original idea was to use Jenkins to
>> > >> build the mathoid Debian package. Even though the Debian package builds
>> > >> without any issues in the Launchpad PPA repo, Jenkins cannot build the
>> > >> package. If there is a reference project that uses Jenkins to build
>> > >> Debian packages that go to production, it would really help to figure
>> > >> out what is different for mathoid and why the package building does not
>> > >> work even though it works on Launchpad.
>> > >>
>> > >> Best
>> > >> Physikerwelt
>> > >>
>> > >> PS: I was informed that there is a related RT ticket that I cannot access
>> > >> https://rt.wikimedia.org/Ticket/Display.html?id=6077
>> > >>
>> > >> --
>> > >> Mit freundlichen Grüßen
>> > >> Moritz Schubotz
>> > >>
>> > >> _______________________________________________
>> > >> Wikitech-l mailing list
>> > >> Wikitech-l(a)lists.wikimedia.org
>> > >> https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>> >
>> >
>> >
>> > --
>> > Mit freundlichen Grüßen
>> > Moritz Schubotz
>> >
>> > Telefon (Büro): +49 30 314 22784
>> > Telefon (Privat):+49 30 488 27330
>> > E-Mail: schubotz(a)itp.physik.tu-berlin.de
>> > Web: http://www.physikerwelt.de
>> > Skype: Schubi87
>> > ICQ: 200302764
>> > Msn: Moritz(a)Schubotz.de
>> >
>> >
Hi!
I made an XML dump with the --current option, then replaced some of the
external domain links [http://somedomain.org] in it. When I import the dump
back, these pages aren't updated. I think that's because text processors /
editors do not update the sha1 / timestamp fields. Why doesn't
maintenance/importDump.php recalculate and compare the sha1 of the actual
page content? How should I touch the timestamp / sha1 XML field text in the
modified dump? Is there a ready-made solution? Or shall I use pywikibot
instead (which will be longer and slower)?
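For reference, MediaWiki stores the revision hash as the base-36 encoding of the SHA-1 of the content, zero-padded to a fixed width (31 characters, if I remember the padding correctly; treat that detail as an assumption worth verifying against your MediaWiki version). A sketch of recomputing the <sha1> field after editing a revision's text:

```python
# Sketch: recompute a dump revision's <sha1> value the way MediaWiki
# does it - SHA-1 of the UTF-8 text, re-encoded in base 36 and
# zero-padded to 31 characters (padding width assumed, please verify).
import hashlib

def mw_sha1_base36(text: str) -> str:
    """Base-36 SHA-1 of revision text, as stored in rev_sha1 / <sha1>."""
    n = int(hashlib.sha1(text.encode("utf-8")).hexdigest(), 16)
    digits = "0123456789abcdefghijklmnopqrstuvwxyz"
    out = ""
    while n:
        n, r = divmod(n, 36)
        out = digits[r] + out
    return out.rjust(31, "0")
```

With this you could rewrite each modified revision's <sha1> element (and bump its <timestamp>) before re-importing, so importDump.php sees the page as changed.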
Dmitriy