Hey,
I am looking for a way to obtain some information on the commits made to
some WMF-hosted git repo within the last n seconds. My current solution
uses GitHub's "since" parameter as follows:
https://api.github.com/repos/wikimedia/mediawiki-extensions-Wikibase/commit…
This, however, does not really work for me, since there is a lag of a
minute or two before a commit gets pushed to GitHub. So if I ask for the
commits made in the last 30 seconds, I will never get anything back.
Is there a similar API in gitweb, gerrit or something else accessible over
http?
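For reference, the query I'm using is built roughly like this (a minimal
Python sketch of the URL construction; the repo name and 30-second window
are just the values from my example, and the replication-lag problem
remains regardless):

```python
from datetime import datetime, timedelta, timezone
from urllib.parse import urlencode

def commits_since_url(owner, repo, seconds):
    """Build a GitHub API URL listing commits made in the last `seconds`
    seconds. GitHub's `since` parameter expects an ISO 8601 UTC timestamp."""
    cutoff = datetime.now(timezone.utc) - timedelta(seconds=seconds)
    query = urlencode({"since": cutoff.strftime("%Y-%m-%dT%H:%M:%SZ")})
    return "https://api.github.com/repos/%s/%s/commits?%s" % (owner, repo, query)

print(commits_since_url("wikimedia", "mediawiki-extensions-Wikibase", 30))
```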
Cheers
--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--
* I got the idea for this extension via the mailing list itself.
* I plan on implementing a *pronunciation recording extension*.
* More details can be found at
https://www.mediawiki.org/wiki/User:Rahul21/Gsoc
* You can contact me on IRC (Rahul_21) or email me back.
Thank You
Rahul Maliakkal
On 04/05/2013 03:01 PM, Isabel Gancedo wrote:
> Everything seems to be working fine; jobs run at maximum warp and the queue
> is becoming tiny.
Excellent. I see this line in the "New features" list from the Release
notes:
The Job system was refactored to allow for different backing stores
for queues as well as cross-wiki access to queues, among other
things. The schema for the DB queue was changed to support better
concurrency and reduce deadlock errors.
Perhaps that deserves more attention?
> There is just one issue that I would like to mention: in my database
> (MySQL) pre-upgrade jobs have job_random set to 0 and do not seem to be
> picked up, not even when I use the option --type=replaceText.
>
> I can repeat the replace text operations, so this is not a big problem for
> me. However, if this is the normal behaviour for an upgrade, maybe it
> should be mentioned in the notes somewhere.
This may be a bug that was introduced during the refactoring. Could you
file one?
https://bugzilla.wikimedia.org/enter_bug.cgi?product=MediaWiki&cc=mah@every…
Thank you for your helpful comments!
--
http://hexmode.com/
It is not the case that simple clear questions have simple clear
answers, not even in the world of pure ideas, and much less so in
the messy real world of everyday life.
-- Gregory Chaitin, “Paradoxes of Randomness”
Hey,
I'm curious what the list thinks of deprecating and eventually removing the
Hooks class. Some relevant info:
/**
* Hooks class.
*
* Used to supersede $wgHooks, because globals are EVIL.
*
* @since 1.18
*/
https://github.com/wikimedia/mediawiki-core/blob/master/includes/Hooks.php#…
I personally find the comment hilarious, and hope you see why when looking
at the "class". Usage in core and extensions looks not too extensive, so
switching to something more sane seems quite feasible.
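To illustrate the kind of "more sane" replacement I have in mind: a small
registry object that holds its handlers as instance state instead of in a
global. This is only an illustrative sketch in Python (all names here are
made up; a real MediaWiki version would of course be PHP):

```python
class HookRegistry:
    """Minimal hook registry: handlers live on an instance, not in a global."""

    def __init__(self):
        self._handlers = {}

    def register(self, name, handler):
        # Append the handler to the list for this hook name.
        self._handlers.setdefault(name, []).append(handler)

    def run(self, name, *args):
        """Call all handlers registered for `name`, in registration order.

        Mirrors the usual hook convention: a handler returning False
        aborts the run."""
        for handler in self._handlers.get(name, []):
            if handler(*args) is False:
                return False
        return True

hooks = HookRegistry()
hooks.register("ArticleSave", lambda title: print("saving", title))
hooks.run("ArticleSave", "Main_Page")
```

The point is simply that the registry can be injected and mocked in tests,
which a global like $wgHooks makes painful.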
Cheers
--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--
Hello,
The Wikimedia Language Engineering team [1] invites everyone to join the
team’s monthly office hour on April 10, 2013. We have some exciting updates
about our ongoing projects, some of which have also been shared in our
recent blog posts[2]. During this session we would like to walk through
some of them. The team would also like to introduce a new outreach program
which was mentioned in the last office hour held on 13th March 2013 [3].
Event details and the general agenda are mentioned below.
See you all at the IRC office hour!
regards
Runa
Event Details:
==========
Date: 2013-04-10 (Wednesday)
Time: 1700 UTC, 1000 PDT
IRC channel: #wikimedia-office on irc.freenode.net
Agenda:
1. Introductions
2. Translate UX - Deployment and other news
3. Language Mavens - an outreach initiative with the Wikimedia language
communities
4. MediaWiki Language Extension Bundle (MLEB) Release
5. Q/A - We shall be taking questions during the session. Questions can
also be sent to runa at wikimedia dot org before the event and can be
addressed during the office hour.
[1] http://wikimediafoundation.org/wiki/Language_Engineering_team
[2]
http://blog.wikimedia.org/c/technology/features/internationalization-and-lo…
[3] http://meta.wikimedia.org/wiki/IRC_office_hours/Office_hours_2013-03-13
--
Language Engineering - Outreach and QA Coordinator
Wikimedia Foundation
Hey,
I have some extensions that make use of PHP libraries that do not depend
on MediaWiki. These libraries come with PHPUnit tests that can be run
standalone. I'd like them to run together with the tests for the
extensions that use them whenever I invoke
phase3/tests/phpunit/phpunit.php. So far I have been doing this by
registering the tests via the UnitTestsList hook as if they were regular
extension tests. This is not ideal, though, since I then need to maintain
a list of tests and update it whenever the third-party library changes.
The library itself does not provide a list of test files; its runner just
runs everything in the relevant test directory, much as MW does for core
tests. Since maintaining such lists by hand is error-prone, I am looking
for a way to avoid it. Is there a way to register a whole directory so
that the MW phpunit.php script runs all the tests in it?
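The behaviour I'm after is essentially what the library's own runner does:
collect every test file under a directory instead of consulting a
hand-maintained list. The collection step, sketched in Python (the
*Test.php naming convention is just the usual PHPUnit one; this is the
idea, not MW code):

```python
import glob
import os

def collect_test_files(test_dir):
    """Recursively collect all *Test.php files under `test_dir`,
    so the list never has to be maintained by hand."""
    pattern = os.path.join(test_dir, "**", "*Test.php")
    return sorted(glob.glob(pattern, recursive=True))
```

A hook callback could then simply merge such a listing into the files
array it is given.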
Cheers
--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil.
--
Hi all,
tl;dr: I've cleaned up the mediawiki/core repo, and performance for fetch/clone
operations should be noticeably faster.
So, due to some recent upgrades in Gerrit, we've now got GC support in
JGit. Gerrit exposes this functionality, which greatly reduces the size
of our repositories on disk. It also makes use of an improved bitmap
algorithm for fetches & clones, making them wayyy faster. I've now run
this on mediawiki/core, and the repo went from 3.0G down to ~620M on
disk. You should now notice all read actions (over https & ssh) to be
way faster--a fresh clone should now be limited only by your bandwidth,
not by how fast Gerrit can serve the repo from disk.
If you notice *any* problems with mediawiki/core, please let me know
immediately. If everything looks good, we'll set this up as a weekly
cron or something for all repositories.
-Chad
I recently tried to create a small JavaScript tool to parse user signature
timestamps on talk pages [1]. With it, readers would have been able to see
a signature's timestamp in their preferred timezone (and, as a side
effect, it would be consistent with the revision history) without breaking
caching.
The main problem I had was that MediaWiki didn't provide a class attribute
wrapping the date, nor the whole signature. There is a bug report on that
topic: https://bugzilla.wikimedia.org/show_bug.cgi?id=25141
After some time spent looking for a system message, it seems the only
solution is to edit includes/parser/Parser.php. I added a span with a
class at this line:
https://gerrit.wikimedia.org/r/gitweb?p=mediawiki/core.git;a=blob;f=include…
It works fine (at least in my own - limited - case).
Some questions related to what I've done:
- Would it be possible to wrap dates, or at least signatures, in a class,
as suggested in bug #25141? It may be useful for others too.
-- Or perhaps to have a system message for dates (I'm not sure it's a good
idea).
- Is Parser.php the right place for that?
- Would a similar solution make its way into MediaWiki some day? Or are
there problems I'm not seeing? (I'm not suggesting my script would be it;
it's a quick and dirty hack created with very limited knowledge.)
[1] using Moment.js <http://momentjs.com/>
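For anyone curious what the script does conceptually: it parses
MediaWiki's default English signature timestamp and re-renders it in the
reader's timezone. A rough Python equivalent of that conversion (assuming
the default "(UTC)" signature format; my actual script uses Moment.js):

```python
from datetime import datetime, timedelta, timezone

def parse_signature_timestamp(text):
    """Parse a default English MediaWiki signature timestamp,
    e.g. '15:04, 5 April 2013 (UTC)', into an aware datetime."""
    naive = datetime.strptime(text, "%H:%M, %d %B %Y (UTC)")
    return naive.replace(tzinfo=timezone.utc)

def in_reader_timezone(text, utc_offset_hours):
    """Render the timestamp at the reader's fixed UTC offset."""
    local = parse_signature_timestamp(text).astimezone(
        timezone(timedelta(hours=utc_offset_hours))
    )
    return local.strftime("%H:%M, %d %B %Y")
```

A class attribute on the rendered date is exactly what would let a script
find these strings reliably instead of pattern-matching the page text.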