Interesting email coming through Chad about a tool being developed to
retrieve metrics from Gerrit.
I wonder how this effort compares to gerrit-stats. Overlapping,
complementary... I will look further, but if the Gerrit experts in the
house can have a look, all the better.
--
Quim
---------- Forwarded message ----------
From: Lundh, Gustaf <Gustaf.Lundh(a)sonymobile.com>
Date: Fri, Oct 12, 2012 at 7:30 AM
Subject: RE: Gerrit Metrics
To: Remy Bohmer <linux(a)bohmer.net>, Janne Hellsten <jjhellst(a)gmail.com>
Cc: Stephen Roberts <stephen.roberts4(a)gmail.com>, Repo and Gerrit
Discussion <repo-discuss(a)googlegroups.com>,
"jose.lobato(a)isis-papyrus.com" <jose.lobato(a)isis-papyrus.com>
For some of my Gerrit metrics I'm using the Gerrit-event module in
Gerrit-trigger [1] to collect events through the stream-events SSH
interface.
I collect the interesting data and feed it to a Graphite instance.
This way I can plot real-time data in terms of:
Changes per hour (or minute) and also the rate of comments/merges/etc.
One of my colleagues has also created a Python interface for listening
to and parsing the events. I'm not sure if it has been open-sourced yet.
[1] https://github.com/jenkinsci/gerrit-trigger-plugin
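Not part of the thread, but a minimal sketch of the approach Gustaf describes, assuming a Gerrit SSH endpoint and a Graphite carbon plaintext socket (the host, user, and the "eventCreatedOn" field are assumptions, not taken from the thread): read JSON lines from "gerrit stream-events" and emit one Graphite metric per event.

```python
import json
import subprocess
import time

def event_to_metric(line, prefix="gerrit"):
    """Turn one stream-events JSON line into a Graphite plaintext metric,
    e.g. 'gerrit.comment-added 1 1350000000'. Returns None for junk lines."""
    try:
        event = json.loads(line)
    except ValueError:
        return None
    kind = event.get("type")
    if not kind:
        return None
    # Fall back to local time if the event carries no timestamp.
    ts = int(event.get("eventCreatedOn", time.time()))
    return "%s.%s 1 %d" % (prefix, kind, ts)

def stream_to_graphite(host, carbon_sock, port=29418, user="metrics"):
    """Pipe 'gerrit stream-events' over SSH into a connected carbon socket."""
    proc = subprocess.Popen(
        ["ssh", "-p", str(port), "%s@%s" % (user, host),
         "gerrit", "stream-events"],
        stdout=subprocess.PIPE)
    for line in proc.stdout:
        metric = event_to_metric(line.decode("utf-8"))
        if metric:
            carbon_sock.sendall((metric + "\n").encode("utf-8"))
```

Graphite then aggregates the per-event counters into the changes-per-hour and comments/merges rates mentioned above.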
Best regards
Gustaf
-----Original Message-----
From: repo-discuss(a)googlegroups.com
[mailto:repo-discuss@googlegroups.com] On Behalf Of Remy Bohmer
Sent: den 29 december 2011 22:58
To: Janne Hellsten
Cc: Stephen Roberts; Repo and Gerrit Discussion;
jose.lobato(a)isis-papyrus.com
Subject: Re: Gerrit Metrics
Hi,
2011/12/29 Janne Hellsten <jjhellst(a)gmail.com>:
>>> I've written Haskell code to talk to Gerrit via SSH and parse the
>>> JSON responses to Haskell data structures. This is pretty handy for
>>> further data mining in Haskell. Ping me if you're interested, I can
>>> make the code available if someone finds it useful.
>>
>> Here is the ping ;-)
>> I find it useful, where can I find the code?
>
> You can find the GerritJson module here:
> https://github.com/nurpax/gerrit-json
>
> I only recently went back to Haskell so there are probably many ways
> in which the code can be improved.
Thanks for sharing!
I will look into it in detail next year/week ;-)
Kind regards,
Remy
--
To unsubscribe, email repo-discuss+unsubscribe(a)googlegroups.com
More info at http://groups.google.com/group/repo-discuss?hl=en
Hi!
I could use some help looking over extensions.
Since the ContentHandler stuff has been merged into the core, several much-used
functions and hooks have been deprecated. I have tried to find and replace all
calls in core, but a lot of extensions are still using the old stuff. They will
still work for all text-based content, but will generate a ton of warnings, and
will fail tests (and make core tests fail).
I have already posted updates for three extensions on gerrit:
* Gadgets: I3d28a3b8
* TitleBlacklist: Ib3a00d89
* SpamBlacklist: I72d9ad58
So, it would be great if you (yes, you!) could help out by looking for
deprecated stuff in extensions. Grep for it and/or set $wgDevelopmentWarnings to
true on your wiki. Run test cases. See what causes warnings, and fix them.
Most prominently, the following functions have been deprecated:
* WikiPage::getText() was replaced by WikiPage::getContent()
* Revision::getText() was replaced by Revision::getContent()
* Article::getContent() was replaced by WikiPage::getContent()
The new functions all return Content objects instead of text.
Similarly, the ArticleSaveComplete hook and several more have been replaced by
hooks that use Content objects instead of page text.
Please have a look at docs/contenthandler.txt for an overview of the
architecture and a list of deprecated hooks.
-- daniel
PS: I'm unsure what level of backwards compatibility is desired for extensions.
I have tried to be B/C in the SpamBlacklist extension, but didn't bother with it
for TitleBlacklist and Gadgets... I could fix that, if it's required.
> Hello,
> On Wikipedia the right to import articles via Special:Import is bound to
> the user group 'administrator' by default.
>
> On nds.wp we discussed that it would be useful for us if we could give
> import rights to some trusted users without giving them admin rights. It
> would be unnecessary overhead if these users have to contact an admin
> every time they want to translate an article from another wiki.
>
> Could you give me some technical advice how we could do this and how
> much effort it is? What would be the easiest way to achieve this and
> what exactly would I have to request on Bugzilla to get the
functionality?
>
> There seems to be the user group 'importer' but apparently it has
> additional rights to the import rights of an admin and cannot be used to
> achieve what we want, am I right? So I guess we need to create a new
> separate user group?
>
> Thank You
> User:Slomox
> Marcus Buck
Quickest and easiest is to simply get the community's approval for both
the rights (point to the existing discussion) and the people who are going
to hold them (again, point to the discussion), and then ask a steward (1)
to assign the rights. I doubt you will have huge numbers of requests, and
someone needs to assign the rights anyway, so a Bugzilla ticket is often
superfluous compared with the simpler task of asking a steward. The right
already exists and can be assigned; it just needs someone with the
capability to do it.
(1) https://meta.wikimedia.org/wiki/Steward_requests/Permissions
Regards, Andrew
Hello,
I updated one of my wikis today from f2138b1 to 9299bab032a85c1a421436da04a595b79f2b9d6c (git master as I write this) and after running update.php
I get this:
A database error has occurred. Did you forget to run maintenance/update.php after upgrading? See: https://www.mediawiki.org/wiki/Manual:Upgrading#Run_the_update_script
Query: SELECT page_id,page_len,page_is_redirect,page_latest,page_content_model FROM "page" WHERE page_namespace = '0' AND page_title = 'Test11' LIMIT 1
Function: LinkCache::addLinkObj
Error: 42703 ERROR: column "page_content_model" does not exist
LINE 1: ...*/ page_id,page_len,page_is_redirect,page_latest,page_conte...
^
Backtrace:
#0 /usr/home/saper/public_html/pg/w/includes/db/DatabasePostgres.php(477): DatabaseBase->reportQueryError('ERROR: column ...', '42703', 'SELECT page_id...', 'LinkCache::addL...', false)
#1 /usr/home/saper/public_html/pg/w/includes/db/Database.php(942): DatabasePostgres->reportQueryError('ERROR: column ...', '42703', 'SELECT page_id...', 'LinkCache::addL...', false)
#2 /usr/home/saper/public_html/pg/w/includes/db/Database.php(1367): DatabaseBase->query('SELECT page_id...', 'LinkCache::addL...')
#3 /usr/home/saper/public_html/pg/w/includes/db/Database.php(1458): DatabaseBase->select('page', Array, Array, 'LinkCache::addL...', Array, Array)
#4 /usr/home/saper/public_html/pg/w/includes/cache/LinkCache.php(222): DatabaseBase->selectRow('page', Array, Array, 'LinkCache::addL...', Array)
#5 /usr/home/saper/public_html/pg/w/includes/Title.php(2895): LinkCache->addLinkObj(Object(Title))
#6 /usr/home/saper/public_html/pg/w/includes/Title.php(4320): Title->getArticleID()
#7 /usr/home/saper/public_html/pg/w/includes/WikiPage.php(416): Title->exists()
#8 /usr/home/saper/public_html/pg/w/includes/WikiPage.php(465): WikiPage->exists()
#9 /usr/home/saper/public_html/pg/w/includes/WikiPage.php(204): WikiPage->getContentModel()
#10 /usr/home/saper/public_html/pg/w/includes/WikiPage.php(190): WikiPage->getContentHandler()
#11 /usr/home/saper/public_html/pg/w/includes/Action.php(92): WikiPage->getActionOverrides()
#12 /usr/home/saper/public_html/pg/w/includes/Action.php(139): Action::factory('view', Object(WikiPage))
#13 /usr/home/saper/public_html/pg/w/includes/Wiki.php(144): Action::getActionName(Object(RequestContext))
#14 /usr/home/saper/public_html/pg/w/includes/Wiki.php(528): MediaWiki->getAction()
#15 /usr/home/saper/public_html/pg/w/includes/Wiki.php(447): MediaWiki->main()
#16 /usr/home/saper/public_html/pg/w/index.php(59): MediaWiki->run()
#17 {main}
Looks like LinkCache.
Can this be fixed quickly, or do we need to revert this?
//Saper
Hello,
On Wikipedia the right to import articles via Special:Import is bound to
the user group 'administrator' by default.
On nds.wp we discussed that it would be useful for us if we could give
import rights to some trusted users without giving them admin rights. It
would be unnecessary overhead if these users have to contact an admin
every time they want to translate an article from another wiki.
Could you give me some technical advice how we could do this and how
much effort it is? What would be the easiest way to achieve this and
what exactly would I have to request on Bugzilla to get the functionality?
There seems to be the user group 'importer' but apparently it has
additional rights to the import rights of an admin and cannot be used to
achieve what we want, am I right? So I guess we need to create a new
separate user group?
Thank You
User:Slomox
Marcus Buck
Something changed in git this month that makes this hook stop working:
function JidanniLessRedNavigation( $sktemplate, $links ) {
	foreach ( $links['namespaces'] as $ns => &$value ) {
		// Hide red talk tabs from users who cannot create talk pages
		if ( $value['context'] == 'talk' && $value['class'] == 'new'
			&& !$sktemplate->getTitle()->quickUserCan( 'createtalk' ) ) {
			unset( $links['namespaces'][$ns] );
		}
		// Show missing category pages as selected-but-not-red, and drop
		// the watch tab for them
		if ( $ns == 'category' && $value['class'] == 'selected new' ) {
			$value['class'] = 'selected';
			unset( $links['actions']['watch'] );
		}
	}
	return true;
}
$wgHooks['SkinTemplateNavigation'][] = 'JidanniLessRedNavigation';
Maybe I should just give up on trying to use this, as the internals it
depends on keep changing.
Some MediaWiki forms have elements with type=email. Up until 2 weeks
ago, en-wiki would strip that out as it's invalid in old-school HTML.
But now that $wgHtml5 is true, it flows to the browser.
A nifty result is mobile browsers will use a custom on-screen keyboard
with @ and .com in it for type=email. A new result is HTML5 browsers
will not submit a form with an invalid e-mail in it. A strange result
is, Special:ChangeEmail [1] already does client-side validation of
e-mail, so users on HTML5 browsers get duelling tooltips! (See
screenshot [2] attached to Bug 40909 [3].)
Other MediaWiki forms that use Html or HTMLForm to specify HTML5 field
types will also have behavior changes. If you don't want the
browser's validation, Munaf Assaf figured out you can disable it [4]
or you can sort-of override the styles for the browser's validation
tooltip [5]. Does anyone have experience doing client-side
validation in conjunction with the browsers' own validation? Ideally
a jQuery plug-in would build client-side validation on top of the
browser's HTML5 support while hiding the big differences in
implementation.
Thanks in advance,
[1] https://en.wikipedia.org/wiki/Special:ChangeEmail
[2] http://bug-attachment.wikimedia.org/attachment.cgi?id=11173
[3] https://bugzilla.wikimedia.org/show_bug.cgi?id=40909
[4] http://www.w3.org/TR/html5/attributes-common-to-form-controls.html#attr-fs-…
[5] http://jsfiddle.net/trixta/qTV3g/
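For reference, the HTML5 spec defines what a browser accepts as a valid e-mail address for type=email with a regular expression (deliberately simpler than RFC 5322). A server-side mirror of that check, sketched in Python so server and browser agree on validity:

```python
import re

# The regular expression the HTML5 spec gives for a valid e-mail address
# in an input with type=email (a wilful violation of RFC 5322).
HTML5_EMAIL_RE = re.compile(
    r"^[a-zA-Z0-9.!#$%&'*+/=?^_`{|}~-]+@"
    r"[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?"
    r"(?:\.[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?)*$")

def is_valid_html5_email(addr):
    """Mirror the browser's type=email validity check on the server side."""
    return bool(HTML5_EMAIL_RE.match(addr))
```

Keeping the server-side check identical to the browser's avoids the duelling-tooltips problem: both sides accept and reject the same strings.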
--
=S Page software engineer on E3
Hi all!
As discussed last week with Rob, I have now prepared a merge request that
introduces the ContentHandler into MediaWiki core. This is a major building
block for the Wikidata project. I hope the merge will be completed soon, since
this will grow stale fast.
The merge request is here: https://gerrit.wikimedia.org/r/27194
Since Gerrit doesn't show nice diffs for merges,
here's a squashed version: https://gerrit.wikimedia.org/r/27191
Please let us know very soon if there are any serious problems. The branch has
been reviewed before, and I resolved several remaining issues over the last
few days, so I hope there are no big issues left.
-- daniel
Hello,
Scribunto currently has only some basic APIs to do essential things like
access the parser frame and write to the debug log. Intending to
change that, I wrote an API specification for an in-script API which Lua
scripts should be able to use to access certain MediaWiki
features and interfaces. I previously submitted this specification
to the mailing list during the hackathon early this summer:
<https://www.mediawiki.org/wiki/Extension:Scribunto/API_specification>
Now I have collected the feedback, fixed the specification accordingly,
and intend to move on and actually implement it, except for the mw.query
part (which requires more benchmarking and more design consideration).
If you have comments regarding the specification, I would highly
appreciate them now, before I actually write the code.
—Victor.