Today the pywikipedia project is migrating from Subversion to Git. If
you operate a bot, and you are currently checking out pywikipedia's
source code from Subversion, please see
http://www.mediawiki.org/wiki/Manual:Pywikipediabot/Gerrit to learn how
to switch.
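If you are not sure which system your current checkout came from, a quick way to tell (a small sketch, not part of pywikipedia itself) is to look for the VCS metadata directory:

```python
import os

def checkout_type(path):
    """Guess whether a checkout came from Subversion or Git by
    looking for the VCS metadata directory it leaves behind."""
    if os.path.isdir(os.path.join(path, ".svn")):
        return "svn"
    if os.path.isdir(os.path.join(path, ".git")):
        return "git"
    return "unknown"
```

An ".svn" directory means the checkout still needs to be switched as described in the manual above.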
Thank you, pywikipediabot community, for this important tool! I hope
you get even more contributions after switching to Git.
-Sumana
-------- Original Message --------
Subject: [Wikitech-ambassadors] Pywikipediabot is migrating to git
Date: Tue, 23 Jul 2013 20:38:18 +0430
From: Amir Ladsgroup <ladsgroup(a)gmail.com>
Reply-To: Coordination of technology deployments across
languages/projects <wikitech-ambassadors(a)lists.wikimedia.org>
To: wikitech-ambassadors(a)lists.wikimedia.org
Hi folks,
As you probably know, pywikipedia is migrating to Git. After 26 July,
SVN checkouts can no longer be updated, so if bot operators don't
switch to Git, their bots might not work properly.
I already sent a global message to all targets, but it's good to
inform local bot operators and to put a message on bots' noticeboards.
There is a blog post:
http://blog.wikimedia.org/2013/07/23/pywikipediabot-moving-to-git-on-july-2…
A technical manual for bot operators:
http://www.mediawiki.org/wiki/Manual:Pywikipediabot/Gerrit
If people have questions, they can ask on the pywikipedia mailing list
or in its IRC channel.
Best
--
Amir
_______________________________________________
Wikitech-ambassadors mailing list
Wikitech-ambassadors(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-ambassadors
Hi,
as a result of the discussion in [1] we now have a PATCH_TO_REVIEW
status in Bugzilla, which has replaced the "patch-in-gerrit" keyword.
The plan is to soon make Gerrit set this status automatically in a bug
report in Bugzilla (via the patch in [2]) when a patch mentions a bug
number (see the commit message guidelines in [3]).
Hence, finding out which bug reports already have some kind of patch to
work with will become less error-prone and less "untidy".
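For reference, a commit message following the guidelines in [3] links a change to a Bugzilla report via a "Bug:" footer line, roughly like this (the summary and bug number here are made up; the Change-Id is added automatically by the Gerrit commit hook):

```
Fix section edit links in printable view

Section edit links were shown in action=print output even
though they cannot be used there.

Bug: 12345
Change-Id: I0123456789abcdef0123456789abcdef01234567
```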
As before, people working on fixing a bug report are very welcome to set
themselves as the assignee of the bug report.
andre
PS: Related: The legacy keywords "patch", "patch-need-review" [4] and
"patch-reviewed" continue to exist for older tickets which have old
patches in Bugzilla (which could be reviewed and transferred to Gerrit
if they still make sense), as there's no easy solution to that problem.
[1] http://lists.wikimedia.org/pipermail/wikitech-l/2013-June/069805.html
[2] https://gerrit.wikimedia.org/r/#/c/75834/
[3] http://www.mediawiki.org/wiki/Gerrit/Commit_message_guidelines
[4] https://bugzilla.wikimedia.org/buglist.cgi?keywords=patch-need-review&keywo…
--
Andre Klapper | Wikimedia Bugwrangler
http://blogs.gnome.org/aklapper/
On Wed, Jul 24, 2013 at 2:06 AM, Subramanya Sastry
<ssastry(a)wikimedia.org> wrote:
> Hi John and Risker,
>
> First off, I do want to once again clarify that my intention in the previous
> post was not to claim that VE/Parsoid is perfect. It was more that we've
> fixed sufficient bugs at this point that the most significant "bugs" (bugs,
> not missing features) that need fixing (and are being fixed) are those that
> have to do with usability tweaks.
How do you know that? Have you performed automated tests on all
Wikipedia content? Or are you waiting for users to find these bugs?
> My intention in that post was also not
> one to put some distance between us and the complaints, just to clarify that
> we are fixing things as fast as we can and it can be seen in the recent
> changes stream.
>
> John: specific answers to the edit diffs you highlighted in your post. I
> acknowledge your intention to make sure we don't make false claims about
> VE/Parsoid's usability. Thanks for taking the time to dig them up.
> My answers below are made with an intention of figuring out what the issues
> are so they can be fixed where they need to be.
>
>
> On 07/23/2013 02:50 AM, John Vandenberg wrote:
>>
>> On Tue, Jul 23, 2013 at 4:32 PM, Subramanya Sastry
>> <ssastry(a)wikimedia.org> wrote:
>>>
>>> On 07/22/2013 10:44 PM, Tim Starling wrote:
>>>>
>>>> Round-trip bugs, and bugs which cause a given wikitext input to give
>>>> different HTML in Parsoid compared to MW, should have been detected
>>>> during automated testing, prior to beta deployment. I don't know why
>>>> we need users to report them.
>>>
>>>
>>> 500+ edits are being done per hour using Visual Editor [1] (less at this
>>> time given that it is way past midnight -- I have seen about 700/hour at
>>> times). I did go and click on over 100 links and examined the diffs. I
>>> did that twice in the last hour. I am happy to report clean diffs on
>>> all edits I checked both times.
>>>
>>> I did run into a couple of nowiki-insertions, which are, strictly
>>> speaking, not erroneous (they follow from user input), but are more
>>> of a usability issue.
>>
>> What is a dirty diff? One that inserts junk unexpectedly, unrelated
>> to the user's input?
>
>
> That is correct. Strictly speaking, yes: any changes to the wikitext
> markup in places the user didn't edit.
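The automated round-trip check Tim describes can be sketched like this (the parse/serialize pair here are placeholders passed in as functions, not Parsoid's actual API):

```python
import difflib

def find_dirty_diff(wikitext, parse, serialize):
    """Round-trip wikitext through a parser/serializer pair and return
    any diff lines, i.e. changes the user did not make themselves."""
    roundtripped = serialize(parse(wikitext))
    return list(difflib.unified_diff(
        wikitext.splitlines(), roundtripped.splitlines(),
        fromfile="original", tofile="roundtripped", lineterm=""))

# With a well-behaved (here: identity) pair the diff is empty, i.e. clean:
assert find_dirty_diff("== Heading ==\ntext", lambda t: t, lambda t: t) == []
```

Running this over a large sample of existing pages, rather than waiting for editors to save, is exactly the kind of testing being asked for here.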
>
>> The broken table injection bugs are still happening.
>>
>>
>> https://en.wikipedia.org/w/index.php?title=Sai_Baba_of_Shirdi&curid=144175&…
>>
>> If the parser isn't going to be fixed quickly to ignore tables it
>> doesn't understand, we need to find the templates and pages with these
>> broken tables, preferably using SQL and heuristics, and fix them. The
>> same needs to be done for all the other wikis; otherwise they are
>> going to have the same problems happening randomly, causing lots of
>> grief.
>
>
> This may be related to
> https://bugzilla.wikimedia.org/show_bug.cgi?id=51217 and I have a tentative
> fix for it as of yesterday.
Fixes are of course appreciated. The pace of bugfixes is not the problem ...
> VE and Parsoid devs have put in a lot of effort to recognize broken
> wikitext source, fix it or isolate it,
My point was that you don't appear to be doing analysis of how much of
all Wikipedia content is broken; at least I don't see a public document
listing which templates and pages are causing the parser problems, so
the communities on each Wikipedia can fix them ahead of deployment.
I believe there is a bug about automated testing of the parser against
existing pages, which would identify these problems.
I scanned the Spanish 'visualeditor' tag's 50 recentchanges earlier
and found a dirty diff, which I believe hasn't been raised in Bugzilla
yet:
https://bugzilla.wikimedia.org/show_bug.cgi?id=51909
50 VE edits on eswp is more than one day of recentchanges. Most of
the top 10 wikis have roughly the same level of testing going on.
That should be a concern. The number of VE edits is about to increase
on another nine Wikipedias, with very little real impact analysis
having been done. That is a shame, because the enwp deployment has
provided us with a list of problems which will impact those wikis if
they are using the same syntax, be it weird or broken or otherwise
troublesome.
> and protect it across edits, and
> roundtrip it back in original form to prevent corruption. I think we have
> been largely successful but we still have more cases to go that are being
> exposed here, which we will fix. But occasionally these kinds of errors do
> show up -- and we ask for your patience as we fix them. Once again, this
> is not a claim to perfection, but a claim that this is not a significant
> source of corrupt edits. But yes, even a 0.1% error rate does mean a big
> number in absolute terms when thousands of pages are being edited -- and we
> will continue to pare this down.
Is 0.1% a real data point, or a stab in the dark? Because I found two
in 100 on enwp; Robert found at least one in 200 on enwp; and I found
1 in 50 on eswp.
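For comparison, the spot-check rates just mentioned work out as follows (a back-of-the-envelope calculation, not a rigorous measurement):

```python
# Dirty-diff rates from the spot checks above, vs. the quoted 0.1% figure.
samples = {
    "enwp (John)":   (2, 100),
    "enwp (Robert)": (1, 200),
    "eswp (John)":   (1, 50),
}
for label, (dirty, total) in samples.items():
    rate = dirty / total
    print(f"{label}: {rate:.1%} ({rate / 0.001:.0f}x the claimed 0.1%)")
```

Even the smallest of these samples is an order of magnitude above 0.1%, though the sample sizes are of course tiny.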
>> In addition to nowikis, there are also wikilinks that are not what the
>> user intended
>>
>>
>> https://en.wikipedia.org/w/index.php?title=Ben_Tre&curid=1822927&diff=56543…
>>
>> https://en.wikipedia.org/w/index.php?title=Celton_Manx&curid=28176434&diff=…
>
>
> You are correct, but this is not a dirty diff. I don't want to claim this
> is entirely a user error -- rather a combination of user and software error.
fwiw, I wasn't claiming these or the ones that followed were dirty
diffs; these are other problems which the software is a contributor
to, *other* than the nowiki cases we know so well.
>> Here are three edits trying to add a section header and a sentence,
>> with a wikilink in the section header.
>> (In the process they added other junk into the page, probably
>> unintentionally.)
>>
>>
>> https://en.wikipedia.org/w/index.php?title=Port_of_Davao&action=history&off…
>
> What is the problem here exactly? (That is a question, not a challenge.)
> The user might have entered those newlines as well.
The VE UI is confusing, and did many silly things during those edits.
The user had to resort to editing in source editor to clean it up.
Step through the diffs.
--
John Vandenberg
It appears that CustomEditor is no longer in includes/Wiki.php like the
documentation says it is.[0]
Any idea where this hook relocated to?
Thank you,
Derric Atzrott
Computer Specialist
Alizee Pathology
[0]: https://www.mediawiki.org/wiki/Manual:Hooks/CustomEditor
Hi,
A project I've been working on for the last three months, via a Wikimedia
Individual Engagement Grant, finally had its first release today. It's the
Miga Data Viewer, and it provides a lightweight framework for browsing
and navigating through structured data in CSV files, allowing for easy
browsing through, among other things, Wikipedia and Wikidata data. You can
read more about it here:
http://wikiworks.com/blog/2013/07/23/announcing-miga/
...and on the Miga homepage, where the software can also be downloaded:
http://migadv.com
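To give a flavor of the kind of data this targets (a generic illustration with made-up rows, not Miga's own code or schema):

```python
import csv
import io

# A tiny structured-data table of the kind one might export from Wikidata.
data = """name,type,country
Eiffel Tower,monument,France
Brandenburg Gate,monument,Germany
"""

rows = list(csv.DictReader(io.StringIO(data)))
french = [r["name"] for r in rows if r["country"] == "France"]
print(french)  # ['Eiffel Tower']
```

Miga's value-add is doing this kind of filtering and navigation in the browser, without the user writing any code.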
Thanks,
Yaron
--
WikiWorks · MediaWiki Consulting · http://wikiworks.com
You may be interested in this.
Cristian
---------- Forwarded message ----------
From: Karthik Nadar <karthikndr(a)wikimedia.in>
Date: 2013/7/23
Subject: [Wiki Loves Monuments] Job Offer: Technical Position for WLM 2013
To: Wiki Loves Monuments Photograph Competition
<wikilovesmonuments(a)lists.wikimedia.org>
Dear WLMers,
We at the Wiki Loves Monuments 2013 international coordination team
are looking to hire a contractor to take care of the maintenance of
the infrastructure behind Wiki Loves Monuments, and everything that is
needed to run the contest smoothly. Yes, this will be a paid contract,
and we would expect the person to work for us for three months, from
August through October.
We are looking for a candidate of either gender with the following skills:
Python — at an experienced level;
MySQL — at an experienced level;
PHP — at an experienced level;
CSS/JS/HTML — at a basic level;
Be a quick learner: need to learn basic MediaWiki code;
Among the tasks the contractor will need to perform are: running
“Erfgoedbot” and adding countries to the monuments’ database. They
will also need to write PHP-based tools (statistics) using data from a
MySQL database.
The contractor, when selected, will be working for the Wiki Loves
Monuments international coordination team, and will sign a contract
with Wikimedia Nederland, the fiscal sponsor of the Wiki Loves
Monuments 2013 international project.
If you are interested, please forward your resume and your portfolio
to cristian.consonni(a)wikimedia.it. The deadline to send applications
is 29 July 2013. The Wiki Loves Monuments international team consists
of volunteers, so it might take some time for us to review your
application, but we will make sure it doesn't take more than a week.
Regards,
Karthik Nadar and Cristian Consonni,
On behalf of the WLM international team.
_______________________________________________
Wiki Loves Monuments mailing list
WikiLovesMonuments(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikilovesmonuments
http://www.wikilovesmonuments.org
Afternoon,
Anyone else having issues getting at Gitblit? I've not been able to get at it
for the past half hour or so.
Thank you,
Derric Atzrott
Computer Specialist
Alizee Pathology
https://meta.wikimedia.org/wiki/Grants:IdeaLab
IdeaLab is an incubator for people to share ideas to improve Wikimedia
projects and collaboratively develop them into plans and grant
proposals.
I'm cross-posting to the developer and researcher lists because I
could imagine some of you following this path:
idea for research -> IdeaLab -> learning to use publicly available
data sources -> quick prototyping via User Metrics, replicated
databases in Labs, stats.wikimedia.org, and Limn -> idea for a bigger
project with research & editor engagement implications -> idea
refinement in IdeaLab -> grant proposal
Now is a good time to start so you can get a grant proposal in by the
30 September deadline, requesting up to USD 30,000. More information:
https://meta.wikimedia.org/wiki/Grants:IEG#ieg-learn
Hope this is helpful!
--
Sumana Harihareswara
Engineering Community Manager
Wikimedia Foundation