[Labs-l] Labs newb

Golden Ring goldenring.wp at gmail.com
Mon Apr 6 08:13:05 UTC 2015


Many thanks for taking time to give good answers.  Lots of useful stuff in
there I'll have to process.

Regards,
GoldenRing

On 6 April 2015 at 17:11, Ricordisamoa <ricordisamoa at openmailbox.org> wrote:

>  Il 06/04/2015 08:27, Golden Ring ha scritto:
>
>  On 6 April 2015 at 14:03, Ricordisamoa <ricordisamoa at openmailbox.org>
> wrote:
>
>>  Il 06/04/2015 02:18, Golden Ring ha scritto:
>>
>> I've been thinking recently about how to do recent changes patrol
>> better.  I've prototyped a tool, which you can see at http://recent-changes.appspot.com/.
>>
>>
>>  Nice. It reminds me of rech <https://tools.wmflabs.org/pltools/rech/>...
>>
>>
>  Yes, very similar concept.  Is there a reason that rech is
> Wikidata-only?
>
>
> I guess that is because its author, Pasleim
> <https://www.wikidata.org/wiki/User:Pasleim>, is mainly active there and
> didn't think it would be useful to other projects. Or maybe it's
> intended to gain some Wikidata-specific features in the future.
>
>     This is currently implemented on Google AppEngine, basically
>> because that's what I had to hand when I set out and already knew
>> something about using.  It uses the MediaWiki API to retrieve diffs.
>> This is not ideal for a few reasons, not least because it wouldn't
>> take very heavy use of the tool before I'd have to start paying for
>> it, which would probably mean putting ads on it.  I can't be dealing
>> with all that.
>>
>>
>>  I suppose the cost is related to Google charging for bandwidth use
>> beyond a threshold?
>> Since the app needs JavaScript anyway, you could simply retrieve recent
>> changes on the client, thus avoiding much of the server-side traffic.
>>
>>
>  Yes, Google charges for both CPU time and bandwidth use beyond the free
> quota (1GB bandwidth either way + 28 instance-hours per day).
>
>  Retrieving changes from the client side was what I attempted first, but
> of course it has to be hosted somewhere, and unless that's on the wiki
> concerned, you have to deal with the cross-site nature of the API
> requests.  My impression is that this requires the wiki to be configured to
> explicitly allow requests from the domain serving the page.
>
>
> Just pass dataType: "jsonp" to $.ajax() and it will magically work! (
> background <https://www.mediawiki.org/wiki/Manual:Ajax#Limitations>)
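>
> For example, a minimal sketch (the wiki and the query parameters here are
> only illustrative):
>
>     // Cross-site request to the MediaWiki API via JSONP: jQuery appends a
>     // callback parameter and the API wraps its JSON response in that
>     // callback, so the same-origin policy doesn't get in the way.
>     $.ajax({
>         url: 'https://en.wikipedia.org/w/api.php',
>         dataType: 'jsonp',
>         data: {
>             action: 'query',
>             list: 'recentchanges',
>             rcprop: 'title|ids|user|comment|timestamp',
>             rclimit: 25,
>             format: 'json'
>         }
>     }).done(function (data) {
>         $.each(data.query.recentchanges, function (i, rc) {
>             console.log(rc.timestamp + '  ' + rc.title + '  (' + rc.user + ')');
>         });
>     });
>
> Keep in mind that JSONP requests are always unauthenticated and read-only,
> but that's enough for listing recent changes.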
>
>
>  [snip]
>
>>  Labs is precisely for external tools, and I'd say Tool Labs best fits
>> your needs.
>> To enhance MediaWiki's built-in patrolling functionality, you should read
>> this <https://www.mediawiki.org/wiki/How_to_become_a_MediaWiki_hacker>
>> instead.
>> Use your judgement and common sense
>> <https://lists.wikimedia.org/pipermail/wikimedia-l/2014-October/075137.html>
>> to decide whether it's better to develop your tool on Tool Labs or as part
>> of MediaWiki (either core or an extension).
>>
>>
>  I'm not absolutely clear on the best choice here.   On one hand, I'd
> like the tool to end up something like Special:NewPagesFeed
> <https://en.wikipedia.org/wiki/Special:NewPagesFeed>.  I guess this
> points towards developing it as built-in MediaWiki functionality, rather
> than an external tool.  On the other hand, I've developed it because I
> actually want to use it; my impression is that getting changes into
> MediaWiki, and then deployed onto the English Wikipedia, is not easy.  Probably for
> pretty good reasons, but still not easy.
>
>
> Only you know what your tool will look like :-)
> Special:NewPagesFeed is provided by the PageTriage extension
> <https://www.mediawiki.org/wiki/Extension:PageTriage>, so if your goal is
> to improve it or build closely upon it, here you go.
> If you're still undecided, I advise you to install MediaWiki+PageTriage on
> your device and have a look at the code while experimenting with Tool Labs.
>
>
>  If I go down the external tool route, then I guess the tool gets hosted
> at eg. tools.wmflabs.org; is that right?  On the other hand, an external
> tool hosted there doesn't have access to the production wikipedia databases
> and would have to continue getting data through the mediawiki API; is that
> right?
>
>
> #1: yes. Follow the links at tools.wmflabs.org to create a Labs account
> (if you don't have one yet), request access to the Tools project and create
> a new tool. It will be hosted at tools.wmflabs.org/<yourtoolname>.
> #2: no. Tools accounts have access to replicas of the production databases
> with private data redacted: read Connecting to the database replicas
> <https://wikitech.wikimedia.org/wiki/Help:Tool_Labs/Database#Connecting_to_the_database_replicas>
> for access. However, some data are only exposed via the API, and (my
> understanding is that) databases are generally only used directly in
> performance-critical applications which benefit from the power of SQL.
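>
> If you do end up wanting the replicas, a rough sketch of a query from a
> tool account could look like this (assuming Node.js with the 'mysql'
> package is available to you; the host and database names follow the
> conventions on the help page above, and the credentials come from the
> auto-generated ~/replica.my.cnf):
>
>     var fs = require('fs');
>     var mysql = require('mysql');
>
>     // ~/replica.my.cnf holds the tool's replica user and password.
>     var cnf = fs.readFileSync(process.env.HOME + '/replica.my.cnf', 'utf8');
>     function grab(key) {
>         // Pull a value out of the INI file, stripping optional quotes.
>         return new RegExp(key + "\\s*=\\s*'?([^'\\s]+)").exec(cnf)[1];
>     }
>
>     var connection = mysql.createConnection({
>         host: 'enwiki.labsdb',   // replica serving the English Wikipedia
>         user: grab('user'),
>         password: grab('password'),
>         database: 'enwiki_p'     // the '_p' view has private data stripped
>     });
>
>     connection.query(
>         'SELECT rc_timestamp, rc_title FROM recentchanges ' +
>         'ORDER BY rc_timestamp DESC LIMIT 10',
>         function (err, rows) {
>             if (err) throw err;
>             rows.forEach(function (row) {
>                 console.log(row.rc_timestamp + '  ' + row.rc_title);
>             });
>             connection.end();
>         }
>     );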
>
>
>  TBH I'm not sure I've got a lot of clue about the architecture of
> MediaWiki; is it described anywhere, beyond "It uses PHP, MySQL and
> jQuery"?
>
>
> It seems you're asking for the Developer hub
> <https://www.mediawiki.org/wiki/Developer_hub>, though it may be hard to
> grasp for a 'newb' ;-)
>
>
>  Sorry for having so many questions!
>
>
> You're welcome!
>
>
>  Regards,
> GoldenRing
>
>>
>
>
>
>
> _______________________________________________
> Labs-l mailing list
> Labs-l at lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/labs-l
>
>

