Forwarding this to the discussion list for Wikidata.
Sven
On Sat, Apr 27, 2013 at 10:53 AM, Maarten Dammers <maarten(a)mdammers.nl> wrote:
> Hi everyone,
>
> Now is a good time to start adding information about our cultural heritage
> to Wikidata. Not everything is possible yet, but we can at least start with
> the simple things that need to be done anyway:
> * Have items for every monument article and list
> * Add claims to every monument article (P31) and list (P360), linking them
> to the item about the local cultural heritage designation (example: Rijksmonument)
> * Add country (P17) to all lists and articles
>
> Some things that are also possible (but not in all countries yet):
> * Add the identifier
> * Add the type of building
> * Add the administrative unit (state/province/municipality, etc)
> * Image
> * Commons category
>
> To keep track of this, we created
> https://www.wikidata.org/wiki/Wikidata:Cultural_heritage_task_force
> Who wants to help out? You don't need to be a bot or a Wikidata wizard;
> just start by looking up the article about cultural heritage in your
> region. You can see several examples in the list.
>
> For bot owners: I'm using claimit.py. It's in the rewrite branch of
> Pywikipedia. You can use it to mass-add claims to Wikidata. For example, for
> the lists of Rijksmonumenten:
> claimit.py -catr:Lijsten_van_rijksmonumenten -namespace:0 P360 Q916333
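For those without the rewrite branch, the claim-adding step boils down to one wbcreateclaim API call per page's item. A minimal sketch of the request parameters (wbcreateclaim and its parameters are the real API module; the helper name and token handling are illustrative):

```python
import json

def build_claim_params(entity_id, prop, target_qid, token):
    """Build the POST parameters for one wbcreateclaim API call,
    i.e. roughly what claimit.py issues for each page it processes.
    Helper name and token handling are illustrative."""
    return {
        "action": "wbcreateclaim",
        "entity": entity_id,   # e.g. "Q1234", the item for one monument list
        "property": prop,      # e.g. "P360"
        "snaktype": "value",
        # Item-type values are passed as JSON with the numeric ID.
        "value": json.dumps({"entity-type": "item",
                             "numeric-id": int(target_qid.lstrip("Q"))}),
        "token": token,
        "format": "json",
    }
```

Each dict would then be POSTed to www.wikidata.org/w/api.php with an edit token.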
>
> Maarten
>
>
> _______________________________________________
> Wiki Loves Monuments mailing list
> WikiLovesMonuments(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikilovesmonuments
> http://www.wikilovesmonuments.org
On Sun, Apr 28, 2013 at 5:05 AM, Paul Selitskas <p.selitskas(a)gmail.com> wrote:
> How about if I don't want such fallback to work for me? What if I'd
> like to see what is labeled and what is not? Have you considered making
> this a user option with a flexible fallback scheme, or a site-wide
> preference with a fixed one?
>
> In general, this would be a very good Wikidata feature, but it's not yet
> implemented. And thanks for raising the category redirects once again! :)
>
I guess being able to see what's translated and what's not could be
handled by appending the language name to a label whenever it falls
back to another language.
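That idea can be sketched in a few lines; the function name and the tag format are illustrative, not the Wikibase implementation:

```python
def render_label(labels, chain):
    """Pick a label along the fallback chain; append the language code
    whenever the label is not in the user's first language, so that
    fallbacks stay visible instead of silently blending in."""
    for code in chain:
        if code in labels:
            if code == chain[0]:
                return labels[code]
            return f"{labels[code]} [{code}]"
    return None  # nothing usable; caller falls back to the item ID
```

A German user would then see "Berlin [en]" when only the English label exists, making untranslated labels easy to spot.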
On Sun, Apr 28, 2013 at 5:55 AM, Lukas Benedix
<benedix(a)zedat.fu-berlin.de> wrote:
> Hi,
>
> If I understand your proposal on User:Liangent/wb-lang right, you want to
> write an extension or implement the language fallback directly in Wikibase.
>
> I thought about tackling the missing-language issue myself, and I think a
> gadget would have a better chance of being deployed on wikidata.org. Users
> who don't like it could easily disable the gadget in their preferences.
I guess I prefer to patch Wikibase directly.
>
> You should keep in mind that the user interface for such a feature is not
> easy to design.
>
That's why my proposal sets aside two weeks for collecting feedback on designs.
On Sun, Apr 28, 2013 at 10:03 AM, Daniel Friesen
<daniel(a)nadir-seen-fire.com> wrote:
> We already have a way to handle that kind of thing with normal language
> fallbacks. &uselang=qqx.
> Wikidata should be able to do something similar trivially.
>
I don't understand how &uselang=qqx would work for this. Any explanation?
-Liangent
Hello,
I've drafted my proposal about language fallback and conversion issues
for Wikidata at [1].
Currently Wikidata stores multilingual content. Labels (names,
descriptions, etc.) are expected to be written in every language, so
every user can read them in their own language. But there are currently
some problems:
* If some content doesn't exist in a specific language, users with
that exact language set in their preferences see something meaningless
(the item's ID instead). This makes languages with fewer users (and
thus fewer labels filled in) all but unusable.
* Some similar languages often share the same value. Populating
strings for every language one by one wastes effort and lets them
drift out of sync later.
* Even for languages which are not "that similar", MediaWiki already
has a facility to transliterate (i.e. convert) content from a sister
language (a "variant"), which can be used to provide better results
for users.
This proposal aims to resolve these issues by displaying content
from another language based on user preferences (some users know more
than one language), language similarity (the language fallback chain),
or the possibility of transliteration, and by allowing proper editing
of that content.
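The resolution order described above can be sketched as follows; the function name, return shape, and transliteration hook are assumptions for illustration, not the proposal's actual API:

```python
def resolve_label(labels, preferred, fallback_chain, transliterate=None):
    """Resolve a display label: the preferred language first, then the
    fallback chain, then an optional transliteration from a variant.
    Returns (text, source_language); (None, None) means the caller
    shows the item ID, as happens today."""
    for code in [preferred] + fallback_chain:
        if code in labels:
            return labels[code], code
    if transliterate is not None:
        for code, text in labels.items():
            converted = transliterate(code, preferred, text)
            if converted is not None:
                return converted, preferred
    return None, None
```

Returning the source language alongside the text is what lets the UI mark fallback labels and route edits back to the right language.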
Although Wikidata is still in a stage of fast development, lots of data
has already been added to it. The later we resolve these issues, the
more duplication will be created, which will require more cleanup work
in the future, like what we had to face before and while the language
converter (that transliteration system) was introduced on the Chinese
Wikipedia. So I'm planning to do this project this summer.
There's also a backup proposal about category redirects at [2]. I
wrote it because I really want to see it implemented too, either by me
or by someone else. Some of its contents may also be useful for other
participants willing to do this project.
Comments are welcome and appreciated.
[1] https://www.mediawiki.org/wiki/User:Liangent/wb-lang
[2] https://www.mediawiki.org/wiki/User:Liangent/cat-redir
-Liangent
Hello,
I've been having discussions about my GSoC 2013 project with the Wikidata
group on IRC (#mediawiki-wikidata) for a few days and have completed the
first draft of my proposal. I'd really appreciate some feedback on it. I
welcome any queries that you may have and would love to get tips on how to
improve it.
http://www.mediawiki.org/wiki/User:Pragunbhutani/GSoC_2013_Proposal
On Sumanah's suggestion, I've limited the scope of the project to 6 weeks
to allow time for code review and bug fixes. I thought it best to run it by
wikidata-l before sharing it with wikitech-l for their opinion.
Many thanks!
--
Pragun Bhutani
http://pragunbhutani.in
Skype : pragun.bhutani
Heya folks :)
Lots of good stuff happened around Wikidata this week. Your summary
is here: http://meta.wikimedia.org/wiki/Wikidata/Status_updates/2013_04_26
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Technical Projects
Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
CCing wikidata.
I don't think this is a good approach. We shouldn't be breaking the API just
because there is a new under-the-hood feature (Wikibase). From the API
client's perspective, it should work as before, plus there should be an
extra flag indicating whether the sitelink is stored in Wikidata or locally.
Sitelinks might be the first such change, but not the last - e.g. categories,
etc.
As for the implementation, it seems the hook approach might not satisfy all
the usage scenarios:
* Given a set of pages (a pageset), return all the sitelinks (possibly
filtered by a set of wanted languages). Rendering a page for the UI would
use this approach with just one page.
* langbacklinks - get a list of pages linking to a site.
* Filtering other modules based on having/not having a specific langlink,
e.g. list all pages that have/don't have a link to site X.
* alllanglinks (not yet implemented, but it might be, to match the
corresponding allcategories, etc.) - list all existing langlinks on the site.
We could debate the need for some of these scenarios, but I feel that we
shouldn't be breaking the existing API.
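The scenarios above all go through the standard query API. For reference, a minimal prop=langlinks request looks like this (lllang and lllimit are the real parameter names; the flag for including Wikidata-provided links was still in review at the time, so it is not shown; the helper itself is illustrative):

```python
def langlinks_query(titles, lang=None):
    """Parameters for a prop=langlinks request against the MediaWiki
    API. Illustrative helper; the parameter names are the real ones."""
    params = {
        "action": "query",
        "prop": "langlinks",
        "titles": "|".join(titles),  # API accepts pipe-separated titles
        "lllimit": "max",            # fetch all links per page
        "format": "json",
    }
    if lang is not None:
        params["lllang"] = lang      # restrict to one target language
    return params
```

Whether such a query returns only in-page langlinks or also Wikidata-provided ones is exactly the behavioral change under discussion below.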
On Thu, Apr 25, 2013 at 2:24 PM, Brad Jorsch <bjorsch(a)wikimedia.org> wrote:
> Language links added by Wikidata are currently stored in the parser
> cache and in the langlinks table in the database, which means they
> work the same as in-page langlinks but also that the page must be
> reparsed if these wikidata langlinks change. The Wikidata team has
> proposed to remove the necessity for the page reparse, at the cost of
> changing the behavior of the API with regard to langlinks.
>
> Gerrit change 59997[1] (still in review) will make the following
> behavioral changes:
> * action=parse will return only the in-page langlinks by default.
> Inclusion of Wikidata langlinks may be requested using a new
> parameter.
> * list=allpages with apfilterlanglinks will only consider in-page
> langlinks.
> * list=langbacklinks will only consider in-page langlinks.
> * prop=langlinks will only list in-page langlinks.
>
> Gerrit change 60034[2] (still in review) will make the following
> behavioral changes:
> * prop=langlinks will have a new parameter to request inclusion of the
> Wikidata langlinks in the result.
>
> A future change, not coded yet, will allow for Wikidata to flag its
> langlinks in various ways. For example, it could indicate which of the
> other-language articles are Featured Articles.
>
> At this time, it seems likely that the first change will make it into
> 1.22wmf3.[3] The timing of the second and third changes are less
> certain.
>
>
> [1]: https://gerrit.wikimedia.org/r/#/c/59997
> [2]: https://gerrit.wikimedia.org/r/#/c/60034
> [3]: https://www.mediawiki.org/wiki/MediaWiki_1.22/Roadmap
>
> --
> Brad Jorsch
> Software Engineer
> Wikimedia Foundation
>
> _______________________________________________
> Mediawiki-api-announce mailing list
> Mediawiki-api-announce(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce
>
Heya folks :)
The start of phase 2 has just been deployed on all 274 remaining Wikipedias \o/
http://blog.wikimedia.de/2013/04/24/wikidata-all-around-the-world/
Cheers
Lydia
I am completely amazed by a particularly brilliant way that Wikipedia uses
Wikidata. Instead of simply displaying the data from Wikidata and removing
the local data, a template and workflow are proposed which:
* grab the relevant data from Wikidata
* compare it with the data given locally in the Wikipedia
* display the Wikipedia data
* add a maintenance category in case the data differs
This allows both communities to check the maintenance category, provides a
safety net against vandalism, still makes it noticeable when some data has
changed, etc. -- and lets the local data be phased out over time when
editors get comfortable, and if they want to. It is a balance of
maintenance effort and data quality.
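The compare-and-flag logic of that workflow can be sketched like this (the function and category names are illustrative; the real template is implemented in wikitext):

```python
def commons_category_check(local_value, wikidata_value):
    """Prefer the locally given value for display, but flag a
    maintenance category when local and Wikidata values disagree.
    Names are illustrative, not the template's actual code."""
    display = local_value if local_value is not None else wikidata_value
    needs_check = (local_value is not None
                   and wikidata_value is not None
                   and local_value != wikidata_value)
    category = "Commons category mismatch" if needs_check else None
    return display, category
```

Editors then patrol the maintenance category rather than every article, which is what makes the workflow scale.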
I am not saying this is the right solution in every use case, for every
topic, or for every language. But it is a perfect example of how the
community will surprise us by coming up with ingenious solutions if given
enough flexibility, powerful tools, and enough trust.
Yay, Wikipedia!
The workflow is described here:
<http://en.wikipedia.org/wiki/Template_talk:Commons_category#Edit_request_on…>
There is an RFC currently going on about whether and how to use Wikidata
data in the English Wikipedia, coming out of the discussion that was here a
few days ago. If you are an English Wikipedian, you might be interested:
<http://en.wikipedia.org/wiki/Wikipedia:Requests_for_comment/Wikidata_Phase_2>
--
Project director Wikidata
Wikimedia Deutschland e.V. | Obentrautstr. 72 | 10963 Berlin
Tel. +49-30-219 158 26-0 | http://wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 B. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/681/51985.
Hey all,
I just got a warning from Ops that our log table is growing extremely fast.
A write-up about this is here:
<https://bugzilla.wikimedia.org/show_bug.cgi?id=47415>
Basically, the vast majority of edits on Wikidata are written to the log
table because they are autopatrolled. And since we have a lot of edits,
this makes the table grow very quickly.
We would like to:
* stop logging so many edits
* drop those logs that are already there about patrolling
We want to understand how this would influence your workflows and what we
can do about it.
Please speak up if this change would be an issue.
Cheers,
Denny
To be honest, I see this the same way as Adrian and Vito. MF-W has even
posted Wikidata's current OS stats on the project chat, which in my opinion
clearly indicates no need for oversighters at Wikidata for now. That's also
the reason why I didn't support the requests for oversight permissions. It
is clearly not the case that there are many requests which cannot be
handled by stewards in time anymore.

Honestly, I'm also a bit disappointed that we are focusing so much on what
are, in my opinion, pointless rights discussions. Currently, the most
common task is to suppress revealed admin IP addresses. There's an easy
solution for that: check whether you are logged in, and read the messages
which show up if you try to edit a page while logged out.
The target of Wikidata is still to develop a free knowledge base.
Regards,
Leon
> ---------- Forwarded message ----------
> From: Adrian Raddatz <ajraddatz(a)gmail.com>
> Date: 23 April 2013 21:51
> Subject: Re: [Wikidata-l] Oversight nominations
> To: "Discussion list for the Wikidata project." <wikidata-l(a)lists.wikimedia.org>
>
>
> Meh. The oversight right in general is pointless, just a legal separation that for some reason can't be handled by admins. Oh well, I don't make the rules, and if people really want more flags we might as well let them ask. At least we aren't replacing the 40+ stewards with just two oversighters, and we have a whopping seven candidates. I'm not sure what to think of that, actually... especially since that's more than the number of past OS requests handled on Wikidata.
>
>
> On Tue, Apr 23, 2013 at 4:35 PM, Vito <vituzzu.wiki(a)gmail.com> wrote:
> Il 23/04/2013 21:28, Adrian Raddatz ha scritto:
>
> I completely agree, and I don't know why there is such a move towards localizing oversight (I imagine checkuser will come next). It should be noted that stewards can still oversight as part of the cross-wiki work they are doing, per the voted-upon policy.
>
> Maybe I'm a bit biased since I'm a stewie, but frankly I don't see much need for local oversighters. Local checkusers might be OK (though so far I've only seen cross-wiki issues needing a check on Wikidata), but oversighters are IMHO pointless.
>
>
>
> Vito
>
>
> _______________________________________________
> Wikidata-l mailing list
> Wikidata-l(a)lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata-l
>
>
>
> --
> Adrian Raddatz, R.C.N.
> Naval Cadet
> Royal Military College Saint-Jean
> Government of Canada
>