Hi guys,
Hmmm, I am in digest mode on the mailing list and I am not sure how to respond to the digest email efficiently. Let's try.
Message: 1
Date: Thu, 07 Nov 2013 18:52:06 +0100
From: Alvaro del Castillo <acs@bitergia.com>
To: mediawiki-api@lists.wikimedia.org
Subject: [Mediawiki-api] Working in a MediaWiki activity analyzer
Message-ID: <1383846726.14529.135.camel@lenovix>
Hi guys,
We are working on an analyzer that gathers activity from MediaWiki sites using the cool MediaWiki API.
Right now the idea is to get the list of wiki pages using:

action=query&list=allpages&aplimit=500

and then, for each page, to get all of its revisions with:

action=query&prop=revisions&titles=Main%20Page&rvlimit=500

so we have the whole revision activity (pretty similar to having all commits to source code) for a MediaWiki site.
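The two queries above can be chained into a small crawler. Here is a minimal sketch in Python, assuming the standard api.php endpoint and the API's JSON continuation mechanism; API_URL, the function names, and the injectable fetch hook are illustrative placeholders, not the actual MediaWikiAnalysis code:

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API_URL = "https://www.mediawiki.org/w/api.php"  # placeholder: any api.php endpoint

def api_get(params):
    """One GET against api.php, returning the decoded JSON response."""
    with urlopen(API_URL + "?" + urlencode(params)) as resp:
        return json.load(resp)

def get_all_pages(fetch=api_get):
    """Yield every page title via list=allpages, following continuation."""
    params = {"action": "query", "list": "allpages",
              "aplimit": 500, "format": "json"}
    while True:
        data = fetch(params)
        for page in data["query"]["allpages"]:
            yield page["title"]
        if "continue" not in data:
            break
        params.update(data["continue"])  # resume where the last batch ended

def get_revisions(title, fetch=api_get):
    """Yield (timestamp, user) for every revision of one page."""
    params = {"action": "query", "prop": "revisions", "titles": title,
              "rvprop": "timestamp|user", "rvlimit": 500, "format": "json"}
    while True:
        data = fetch(params)
        for page in data["query"]["pages"].values():
            for rev in page.get("revisions", []):
                yield (rev["timestamp"], rev.get("user"))
        if "continue" not in data:
            break
        params.update(data["continue"])
```

The fetch hook makes the pagination logic testable without network access; in production the default api_get does the real HTTP requests.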
We are doing it as Open Source at:
https://github.com/MetricsGrimoire/MediaWikiAnalysis
Any comments are welcome!
Once we have all the data, we plan to use:

https://github.com/VizGrimoire/VizGrimoireR

for the data analysis (an SQL+R combination)
and, for the visualization:
https://github.com/VizGrimoire/VizGrimoireJS
Kudos to Sumana Harihareswara for pointing me to this list!
Cheers
|_____/|   Alvaro del Castillo San Félix
[o] [o]    acs@bitergia.com - Chief Technical Officer (CTO)
|  V  |    http://www.bitergia.com
|     |    "Bridging the gap between developers and stakeholders"
-ooo-ooo-
Message: 2
Date: Thu, 7 Nov 2013 13:29:56 -0500
From: "Brad Jorsch (Anomie)" <bjorsch@wikimedia.org>
To: "MediaWiki API announcements & discussion" <mediawiki-api@lists.wikimedia.org>
Subject: Re: [Mediawiki-api] Working in a MediaWiki activity analyzer
Message-ID: <CAEepRSswUtAT7UXP7A8QAv6rXD34F6A0gG2O2d_OmAWeVbJN1Q@mail.gmail.com>
If you're using WMF sites, you'll want to download database dumps instead. See http://dumps.wikimedia.org/.
No, we are not using WMF sites. This is for getting metrics for any website that uses MediaWiki as its engine.
On Thu, Nov 7, 2013 at 12:52 PM, Alvaro del Castillo <acs@bitergia.com> wrote:
[snip]
Mediawiki-api mailing list
Mediawiki-api@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
Message: 3
Date: Thu, 07 Nov 2013 22:40:14 +0100
From: Platonides <platonides@gmail.com>
To: MediaWiki API announcements & discussion <mediawiki-api@lists.wikimedia.org>
Subject: Re: [Mediawiki-api] Working in a MediaWiki activity analyzer
Message-ID: <527C08BE.5010607@gmail.com>
On 07/11/13 19:29, Brad Jorsch (Anomie) wrote:
If you're using WMF sites, you'll want to download database dumps instead. See http://dumps.wikimedia.org/.
Indeed. Moreover, for your use case, stub dumps may be enough.
Message: 4
Date: Thu, 7 Nov 2013 23:02:28 -0500
From: Nathan Larson <nathanlarson3141@gmail.com>
To: "MediaWiki API announcements & discussion" <mediawiki-api@lists.wikimedia.org>
Subject: Re: [Mediawiki-api] Working in a MediaWiki activity analyzer
Message-ID: <CAF-JeUwx-VA1Hs3wcNjo-GP7JsVECrdki-Bs5o91uVZYT+D1xQ@mail.gmail.com>
On Thu, Nov 7, 2013 at 12:52 PM, Alvaro del Castillo <acs@bitergia.com> wrote:
Hi guys,
We are working on an analyzer that gathers activity from MediaWiki sites using the cool MediaWiki API.
Sounds like an interesting project. I am working on something similar that will start with the most recent dump and do API queries to stay up to date. Perhaps we can collaborate? I created https://www.mediawiki.org/wiki/MediaWikiAnalysis as a starting point for further on-wiki discussion, if you're interested. Thanks.
We have no dumps at all, so we must use the API. I have finished a first version of the tool; you can see it in action in this sample:
http://bitergia.com/projects/redhat-rdo-dashboard/browser/mediawiki.html
It shows the evolution over time of revisions (would "edits" be a better term?) and authors ("editors"?). The panel also shows some global data about the MediaWiki site and the Top Authors (Editors).
Our idea is to continue the development in order to visualize the MediaWiki website and the evolution of its community.
I am taking a look at:
https://www.mediawiki.org/wiki/MediaWikiAnalysis
Cool! I plan to keep working on it, and I'll try to share the work on this wiki page.
Next goals: show the evolution of pages, continue with a new query when the limit is reached, and make the process incremental so we can minimize the effort of updating the info.
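For the incremental part, one possible approach (a sketch, not the actual tool's code) is to remember the newest revision timestamp already stored and ask the API only for newer revisions, using rvdir=newer together with rvstart. The helper below just builds such a query; the function name and the idea of a stored "last seen" timestamp are illustrative assumptions:

```python
def incremental_params(title, last_seen=None):
    """Build a prop=revisions query that skips revisions already fetched.

    With rvdir=newer the API enumerates revisions oldest-first, and
    rvstart marks where enumeration starts; passing the newest timestamp
    we have stored means only newer edits come back (plus the boundary
    revision itself, which the caller should de-duplicate).
    """
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "timestamp|user",
        "rvlimit": 500,
        "rvdir": "newer",
        "format": "json",
    }
    if last_seen is not None:
        params["rvstart"] = last_seen  # resume from the newest stored revision
    return params
```

On the first run last_seen is None and the full history is fetched; later runs pass the stored timestamp and only pay for what changed.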
Guys, I plan to create such a visualization also for mediawiki.org, under the project:
http://korma.wmflabs.org/browser/
Cheers!