Hi,
Lydia mentioned in her summary a major discussion about Wikidata in the Hebrew Wikipedia. The discussion was in Hebrew, of course, so here's a short summary of it.
Eleven people supported the installation of Wikidata. Nobody objected \o/
Despite the wide support, some issues and questions were raised:
1. How is the coordination with interwiki link bot operators progressing? Will the bots be smart enough not to do anything to articles that are already listed in the repository and have the correct links displayed? Will the bots be smart enough to update the repo in the transition period, when some Wikipedias have Wikidata and some don't? Will the bots be smart enough not to do anything with articles that have interwiki conflicts (multiple links, non-1-to-1 linking etc.)?
2. What are the numbers after the Q in the titles on the repo site? - I replied that they are just sequential identifiers without any additional meaning. Maybe this could be added to the FAQ.
3. Several people complained about instability in the link-editing pages in the repo: they saw messages about network problems when they tried to edit links. I experienced this a couple of times, too. I also saw a complete crash with a "memory full" error once.
4. Somebody noticed that the testing sites don't support unified accounts (CentralAuth). The production system will, right?
5. Somebody complained that it's too easy to remove a link from the repo: clicking the "remove" link is enough. I mentioned it in a bug report: https://bugzilla.wikimedia.org/show_bug.cgi?id=40200
6. And this is probably the biggest issue: the workflow for adding an interlanguage link is cumbersome, and in some cases the interface elements are hard to discover.
-- Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי http://aharoni.wordpress.com “We're living in pieces, I want to live in peace.” – T. Moore
On Fri, Oct 12, 2012 at 4:45 PM, Amir E. Aharoni amir.aharoni@mail.huji.ac.il wrote:
Hi,
Lydia mentioned in her summary a major discussion about Wikidata in the Hebrew Wikipedia. The discussion was in Hebrew, of course, so here's a short summary of it.
Thanks! Google Translate isn't exactly amazing for Hebrew -> English (but of course still a lot better than nothing) ;-)
Eleven people supported the installation of Wikidata. Nobody objected \o/
\o/
Despite the wide support, some issues and questions were raised:
- How is the coordination with interwiki link bot operators progressing? Will the bots be smart enough not to do anything to articles that are already listed in the repository and have the correct links displayed? Will the bots be smart enough to update the repo in the transition period, when some Wikipedias have Wikidata and some don't? Will the bots be smart enough not to do anything with articles that have interwiki conflicts (multiple links, non-1-to-1 linking etc.)?
We'll have to put more work into that over the next few weeks. The current status is that some bots are running nicely on the demo system. Some information is at http://meta.wikimedia.org/wiki/Wikidata/Bots
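To give a rough idea of the kind of check involved: before touching anything, a bot can ask the repo whether an article already has an item. A minimal Python sketch, assuming the standard Wikibase wbgetentities module and the test-repo URL mentioned later in this thread (the bots on the demo system are of course more elaborate):

import requests

REPO_API = "http://wikidata-test-repo.wikimedia.de/w/api.php"

def has_repo_item(site, title):
    # wbgetentities answers with {"entities": {"-1": {"missing": ""}}}
    # when no item is linked to the given site/title pair.
    params = {
        "action": "wbgetentities",
        "sites": site,        # e.g. "hewiki"
        "titles": title,
        "props": "sitelinks",
        "format": "json",
    }
    entities = requests.get(REPO_API, params=params).json()["entities"]
    return not any("missing" in e for e in entities.values())

# A careful bot would then simply skip such articles:
# if has_repo_item("hewiki", title): continue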
- What are the numbers after the Q in the titles on the repo site? - I replied that they are just sequential identifiers without any additional meaning. Maybe this could be added to the FAQ.
OK, I'll add it to http://meta.wikimedia.org/wiki/Wikidata/Deployment_Questions as soon as I can. Anyone want to link that page from the FAQ?
- Several people complained about instability in the link-editing pages in the repo: they saw messages about network problems when they tried to edit links. I experienced this a couple of times, too. I also saw a complete crash with a "memory full" error once.
Yes, that should be better now. There is a bug that's being worked on to fix it: https://bugzilla.wikimedia.org/show_bug.cgi?id=40823
- Somebody noticed that the testing sites don't support unified
accounts (CentralAuth). The production system will, right?
Yes. We didn't set this up for the demo system; it will be set up for the real deal.
- Somebody complained that it's too easy to remove a link from the repo: clicking the "remove" link is enough. I mentioned it in a bug report: https://bugzilla.wikimedia.org/show_bug.cgi?id=40200
Thanks for filing. Not sure what the plan there is.
- And this is probably the biggest issue: the workflow for adding an interlanguage link is cumbersome, and in some cases the interface elements are hard to discover.
Do you have specifics about which elements are hard to discover?
Cheers, Lydia
2012/10/12 Lydia Pintscher lydia.pintscher@wikimedia.de:
On Fri, Oct 12, 2012 at 4:45 PM, Amir E. Aharoni amir.aharoni@mail.huji.ac.il wrote:
- Somebody complained that it's too easy to remove a link from the repo: clicking the "remove" link is enough. I mentioned it in a bug report: https://bugzilla.wikimedia.org/show_bug.cgi?id=40200
Thanks for filing. Not sure what the plan there is.
It is very high on our priority list. We expect to tackle it this month.
- And this is probably the biggest issue: the workflow for adding an interlanguage link is cumbersome, and in some cases the interface elements are hard to discover.
Do you have specifics about which elements are hard to discover?
I think what Amir means is that for articles that do not have a language link yet, it is basically impossible to add one. (I.e., the workflow is: go to Wikidata, create a new item, then link that back to your article. Not cool.)
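In API terms the current workflow amounts to roughly the following. This is only a sketch, assuming the wbeditentity module and newer-style token handling; the details on the demo system may differ:

import json
import requests

REPO_API = "http://wikidata-test-repo.wikimedia.de/w/api.php"
session = requests.Session()  # assumed to be logged in already

def create_item_with_sitelink(site, title, label, lang):
    # Fetch an edit token (newer MediaWiki: meta=tokens; older
    # versions used action=query&prop=info&intoken=edit instead).
    r = session.get(REPO_API, params={
        "action": "query", "meta": "tokens", "format": "json",
    }).json()
    token = r["query"]["tokens"]["csrftoken"]
    # One new item carrying one label and one sitelink back to the article.
    data = {
        "labels": {lang: {"language": lang, "value": label}},
        "sitelinks": {site: {"site": site, "title": title}},
    }
    return session.post(REPO_API, data={
        "action": "wbeditentity",
        "new": "item",
        "data": json.dumps(data),
        "token": token,
        "format": "json",
    }).json()

The complaint is that a user currently has to do the equivalent of this by hand, in two places, just to add one language link.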
We have started on this story in this sprint, and expect to have something by the end of the month as well. This means that, if everything goes well, it would be there in time for deployment to he.wikipedia.
No promises, though :) -- but we are doing our best.
I hope that helps, Cheers, Denny
2012/10/12 Denny Vrandečić denny.vrandecic@wikimedia.de:
2012/10/12 Lydia Pintscher lydia.pintscher@wikimedia.de:
On Fri, Oct 12, 2012 at 4:45 PM, Amir E. Aharoni amir.aharoni@mail.huji.ac.il wrote:
- Somebody complained that it's too easy to remove a link from the repo: clicking the "remove" link is enough. I mentioned it in a bug report: https://bugzilla.wikimedia.org/show_bug.cgi?id=40200
Thanks for filing. Not sure what the plan there is.
It is very high on our priority list. We expect to tackle it this month.
Thank you!
- And this is probably the biggest issue: the workflow for adding an interlanguage link is cumbersome, and in some cases the interface elements are hard to discover.
Do you have specifics about which elements are hard to discover?
I think what Amir means is that for articles that do not have a language link yet, it is basically impossible to add one. (I.e., the workflow is: go to Wikidata, create a new item, then link that back to your article. Not cool.)
Yep, this one.
-- Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי http://aharoni.wordpress.com “We're living in pieces, I want to live in peace.” – T. Moore
2012/10/12 Lydia Pintscher lydia.pintscher@wikimedia.de:
Ok will add it to http://meta.wikimedia.org/wiki/Wikidata/Deployment_Questions as soon as I can. Anyone want to link that page from the FAQ?
I linked it.
-- Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי http://aharoni.wordpress.com “We're living in pieces, I want to live in peace.” – T. Moore
Just to answer these questions for my _Java_ interwiki bot, MerlIwBot:
On 12.10.2012 16:45, Amir E. Aharoni wrote:
Will the bots be smart enough not to do anything to articles that are already listed in the repository and have the correct links displayed?
Will the bots be smart enough not to do anything with articles that have interwiki conflicts (multiple links, non-1-to-1 linking etc.)?
Yes, because this is Wikidata-independent for my Java bot and already worked this way before, so running my bot on these wikis is always safe. Non-1-to-1 langlinks can be partly moved to Wikidata, because mixing Wikidata and local langlinks is possible. But please note that since change https://gerrit.wikimedia.org/r/#/c/25232/ , which is live with 1.21wmf2, multiple langlinks are no longer displayed.
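To illustrate the simplest kind of conflict check - two or more langlinks to the same language in one article's wikitext - here is a rough Python sketch (my bot is written in Java and also does graph-wide consistency checks; the regex is a simplification):

import re
from collections import Counter

# Loose pattern; a real bot uses the wiki's actual list of language
# codes instead of this character class (which would also match
# things like [[file:...]]).
LANGLINK = re.compile(r"\[\[([a-z-]{2,12}):[^\]|]+\]\]")

def conflicting_languages(wikitext):
    counts = Counter(LANGLINK.findall(wikitext))
    return [lang for lang, n in counts.items() if n > 1]

# if conflicting_languages(text):
#     skip the article - a human must resolve the conflict first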
Will the bots be smart enough to update the repo in the transition period, when some Wikipedias have Wikidata and some don't?
My bot can update the repository, but doing so will cause a lot of load on the local wikis. That's because there is currently no way to know whether a langlink is stored locally or at Wikidata. Local langlinks can be stored on the main page or on any transcluded page, so the whole source code of every transcluded page must be checked first. I created a feature request that could solve this problem: https://bugzilla.wikimedia.org/show_bug.cgi?id=41345 . I hope this will be added before Wikidata goes live.
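To make the load problem concrete, here is a rough Python sketch of what "check the source code of every transcluded page" means in API terms: one request for the list of transcluded pages, plus one content fetch per page (names and limits are simplified; my actual bot is Java):

import re
import requests

LANGLINK = re.compile(r"\[\[([a-z-]{2,12}):[^\]|]+\]\]")  # same rough pattern as above

def local_langlinks(api_url, title):
    session = requests.Session()

    def wikitext(page):
        r = session.get(api_url, params={
            "action": "query", "titles": page, "prop": "revisions",
            "rvprop": "content", "format": "json",
        }).json()
        page_data = next(iter(r["query"]["pages"].values()))
        revs = page_data.get("revisions")
        return revs[0].get("*", "") if revs else ""

    # List every page the article transcludes (templates etc.).
    r = session.get(api_url, params={
        "action": "query", "titles": title, "prop": "templates",
        "tllimit": "max", "format": "json",
    }).json()
    page_data = next(iter(r["query"]["pages"].values()))
    transcluded = [t["title"] for t in page_data.get("templates", [])]

    # One content request per page - this is the expensive part.
    links = []
    for page in [title] + transcluded:
        links += LANGLINK.findall(wikitext(page))
    return links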
To update the repository, my bot needs to know the corresponding repository script URL for each local wiki. Currently http://wikidata-test-repo.wikimedia.de/w/api.php is hard-coded in my bot framework, but this repository URL will change for hewiki. There is currently no way to request this information via the API, so I created a feature request for this as well: https://bugzilla.wikimedia.org/show_bug.cgi?id=41347
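Until such an API exists, the workaround is a hand-maintained mapping, roughly like this sketch (only the test-repo URL is real; the hewiki entry is deliberately empty because the production URL is not known yet):

REPO_API_BY_WIKI = {
    "testwiki": "http://wikidata-test-repo.wikimedia.de/w/api.php",
    "hewiki": None,  # to be filled in by hand once announced
}

def repo_api_for(dbname):
    url = REPO_API_BY_WIKI.get(dbname)
    if not url:
        raise RuntimeError("repository URL for %s is not configured" % dbname)
    return url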
Merlissimo
Is this helpful? http://www.mediawiki.org/wiki/Manual:Pywikipediabot/Wikidata
On 24.10.2012 14:07, Amir Ladsgroup wrote:
Is this helpful? http://www.mediawiki.org/wiki/Manual:Pywikipediabot/Wikidata
How is this page supposed to help me? I think the Pywikipediabot interwiki bot has the same problems. It actually has even more, because the non-API version, which most interwiki bot operators use, isn't able to detect all currently included langlinks. How do you plan to update langlinks on wikis where Wikidata is only partly in use? Perhaps it would be useful if somebody answered these three questions for the Pywikipediabot interwiki bot, too.