On 17/02/13 00:44, Luca Martinelli wrote:
As of now we write templates and put data into them article by article, but this is going to change during 2013 with the implementation of Wikidata, since we'll put the data in the common repository and then just call it from the local projects.
Is this new programming language going to affect this? I mean, will it help us write new templates that fetch that data more easily?
I believe that some members of the Wikidata team are interested in allowing Lua modules to fetch data directly from Wikidata. We have added a couple of hooks to Scribunto to support this, but the library hasn't been designed or implemented yet, as far as I can tell.
-- Tim Starling
On 18.02.2013, at 13:06, Tim Starling tstarling@wikimedia.org wrote:
It is in the works, and I would love it if you could find the time to review it once it's up on Gerrit (which should happen this week).
Lua scripting is indeed a big deal for us at Wikidata: structured data from Wikidata is available as JSON, which can be thrown around and iterated over as Lua tables. This has the potential to unleash some niftiness in Wikidata-based infobox templates.
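For illustration, a module built on such a library might look roughly like this. This is a minimal sketch only: mw.wikidata.entity() is a hypothetical accessor (the actual library hadn't been designed yet), assumed to return the item's JSON already decoded into a Lua table.

    -- Minimal sketch of a Scribunto module consuming Wikidata JSON.
    -- mw.wikidata.entity() is hypothetical; assume it returns the
    -- item's JSON already decoded into a Lua table.
    local p = {}

    function p.labels(frame)
        local item = mw.wikidata.entity("Q64")  -- hypothetical call; Q64 is Berlin
        if not item or not item.labels then
            return ""
        end
        local out = {}
        -- the decoded JSON is an ordinary Lua table, so pairs() works
        for lang, label in pairs(item.labels) do
            out[#out + 1] = lang .. ": " .. label.value
        end
        table.sort(out)
        return table.concat(out, ", ")
    end

    return p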
On Mon, Feb 18, 2013 at 1:15 PM, Jens Ohlig jens.ohlig@wikimedia.de wrote:
To expand on what Jens said: there will be two main ways to include Wikidata's data in Wikipedia articles. For the simple cases there will be a template-like syntax, and for the more complex things Lua will be the way to get it. The syntax for the former is at http://meta.wikimedia.org/wiki/Wikidata/Notes/Inclusion_syntax_v0.2
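In article wikitext the simple case might look something like {{#property:population}} (a hypothetical illustration of the general shape only; the notes page linked above has the actual proposal).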
Cheers Lydia
-- Lydia Pintscher - http://about.me/lydia.pintscher Community Communications for Wikidata
On Mon, Feb 18, 2013 at 1:20 PM, Lydia Pintscher lydia.pintscher@wikimedia.de wrote:
The syntax for the former is at http://meta.wikimedia.org/wiki/Wikidata/Notes/Inclusion_syntax_v0.2
Sorry, the current version is at http://meta.wikimedia.org/wiki/Wikidata/Notes/Inclusion_syntax
Cheers Lydia
How useful would it be for Lua to have access to the query/content/parser API?
I suspect there could be a lot of creative uses of this, including getting data from Wikidata (which wouldn't have to do anything special to enable it).
For later. As discussed before, access via HTTP is hardly an option for the Wikimedia wikis (and they are our priority), but for other wikis it will be crucial.
Cheers, Denny
On 18/02/13 23:24, Yuri Astrakhan wrote:
How useful would it be for Lua to have access to the query/content/parser API?
I suspect there could be a lot of creative uses of this, including getting data from Wikidata (which wouldn't have to do anything special to enable it).
If you provided full access to ApiQuery*, resource limiting would be fairly difficult, since it provides lots of ways to do unlimited table scanning regardless of result set size. As for the Wikidata application -- the interface would be awkward compared to something made specifically for interfacing Wikidata with Lua.
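(For instance, a list=allpages query with a restrictive apminsize/apmaxsize filter can scan an arbitrary number of rows while returning almost nothing, so capping the result set size alone doesn't bound the work done.)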
-- Tim Starling
I totally agree about the convenience of a Wikidata-specific interface, but I suspect it might not cover all usages, in which case it will be a good way for users to start implementing workarounds, and for us to notice the need and meet it with new features.
I don't know enough about template performance impact, although I do hope that a low-limit API call does not cost much more than a complex template with many links/templates/categories. In any case, let's put this on the back burner for later evaluation.
On 02/18/2013 04:29 AM, Tim Starling wrote:
As for the Wikidata application -- the interface would be awkward compared to something made specifically for interfacing Wikidata with Lua.
I am still not convinced that the interface would be awkward. A general method like
dataTable = mw.data.wikidata({ param1="foo", param2="bar" })
looks pretty simple to me. Maybe the wikidata-specific interface will be more convenient to use, but I doubt that the difference will be significant.
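For comparison, the two shapes might look roughly like this (all names here are hypothetical, since neither library existed yet):

    -- generic passthrough: the module unwraps the raw API envelope
    local result = mw.data.wikidata({ action = "wbgetentities", ids = "Q64" })
    local label = result.entities.Q64.labels.en.value

    -- dedicated accessor: the library hands the entity back directly
    local entity = mw.wikidata.getEntity("Q64")
    local label2 = entity.labels.en.value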
Gabriel
On Mon, Feb 18, 2013 at 7:24 AM, Yuri Astrakhan yuriastrakhan@gmail.com wrote:
How useful would it be for Lua to have access to the query/content/parser API?
I suspect there could be a lot of creative uses of this, including getting data from Wikidata (which wouldn't have to do anything special to enable it).
Personally, I'd be extremely wary of allowing Lua (which is run in the middle of the page parse, remember) to issue HTTP requests on WMF sites. Especially after seeing how slow ForeignAPIRepo file access makes my local test wiki.[1]
IMO it would be better for accesses to go through interfaces designed for the purpose, where the back end can optimize, cache, and explicitly limit the access, instead of opening things up to arbitrary API calls in the middle of the parse. This back end could of course work like ForeignDBRepo on WMF sites and ForeignAPIRepo on non-WMF sites, transparently to the end user.
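A rough sketch of that shape, with hypothetical names on both sides (the point being that the module author's call never changes):

    -- the Lua-facing call stays the same everywhere (hypothetical name):
    local entity = mw.wikidata.getEntity("Q64")
    -- behind it, the back end picks its data source:
    --   WMF sites:         read the repo's database directly,
    --                      as ForeignDBRepo does for Commons files
    --   third-party wikis: cached HTTP requests to the remote API,
    --                      as ForeignAPIRepo does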
On Mon, Feb 18, 2013 at 7:48 AM, Yuri Astrakhan yuriastrakhan@gmail.com wrote:
in which case it will be a good way for users to start implementing workarounds, and for us to notice the need and meet it with new features.
"Workarounds" like enwiki's Template:Str sub?[2] There will probably be some of that anyway, but we don't need to encourage people to do so.
[1]: For those that don't know, ForeignAPIRepo allows one MediaWiki wiki to include files from another by making calls to the prop=imageinfo API, in a similar manner to how WMF wikis use ForeignDBRepo to include Commons images by looking directly in Commons's database.
[2]: It's possible to implement {{str left|text|len}} using the {{padleft}} parser function. So enwiki has a template {{str index|text|pos}} that "gets" the character at 'pos' by basically taking {{str left|text|pos}} and seeing whether it's equal to {{str left|text|pos-1}}A, {{str left|text|pos-1}}B, {{str left|text|pos-1}}C, {{str left|text|pos-1}}D, and so on. And then {{str sub|text|start|len}} is implemented by calling {{str index|text|pos}} for every position between start and start+len.
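(A worked example: {{str index|CAT|2}} computes {{str left|CAT|2}} = "CA" and compares it with {{str left|CAT|1}} followed by each letter in turn; "C" plus "A" gives "CA", a match, so the answer is "A". That's up to 26 comparisons per character, and {{str sub}} repeats the whole dance for every position in the requested range.)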
On 18.02.2013, 18:57 Brad wrote:
Personally, I'd be extremely wary of allowing Lua (which is run in the middle of the page parse, remember) to issue HTTP requests on WMF sites. Especially after seeing how slow ForeignAPIRepo file access makes my local test wiki.[1]
No HTTP access is needed. However, this changes nothing: allowing API calls from Lua opens such a nice can of worms that we don't want to do it at the same time as the Lua rollout. Too much fun.