Hello folks,
Our next Wikidata office hour will take place on September 27th, between
18:00 and 19:00 (Berlin time, UTC+2), on the #wikimedia-office
<irc://irc.freenode.net/wikimedia-office> IRC channel.
Please bring any topic you want to discuss with Lydia, me or other members
of the Wikidata team :)
See you soon,
--
Léa Lacroix
Community Communication Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg unter
der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das Finanzamt für
Körperschaften I Berlin, Steuernummer 27/029/42207.
I've been experimenting a bit with queries that contain regular expressions and have noticed that they seem to be triggering 502 Bad Gateway errors. Or perhaps it's just a coincidence and there were other things going on around 10AM GMT?
Here's an example query where I'm looking for cities that start with "Silver":
---
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX wd: <http://www.wikidata.org/entity/>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
SELECT ?s ?label
WHERE {
?s wdt:P31 wd:Q515 .
?s rdfs:label ?label
FILTER(regex(?label, "^Silver"))
}
LIMIT 15
---
Am I doing something wrong in the query? Occasionally it seems to work, but most of the time it waits for a while and then I get the 502 error. Any guidance you may have would be appreciated.
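For reference, the workaround usually suggested for regex timeouts on the query service is to restrict labels to a single language and use STRSTARTS instead of regex, which avoids running the regex engine over every label. A sketch of the same "Silver" query under that approach (not benchmarked against the live endpoint):

```sparql
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX wd: <http://www.wikidata.org/entity/>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
SELECT ?s ?label
WHERE {
  ?s wdt:P31 wd:Q515 .
  ?s rdfs:label ?label .
  FILTER(LANG(?label) = "en")
  FILTER(STRSTARTS(STR(?label), "Silver"))
}
LIMIT 15
```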
//Ed
I think this question has come up before, but I was wondering if anyone has a working Wikidata Suggest widget of some kind that works similarly to Freebase Suggest, which has finally been turned off completely:

https://developers.google.com/freebase/v1/suggest
I've been using Freebase entities in the past to provide some measure of data normalization, and am in the middle of converting my entities over to Wikidata. The specific application is a jobs board that needs to control the names for Organizations, Subjects and Locations [1,2].
I need to create suggest-like functionality in the application and was wondering if anyone else had already done something like Freebase Suggest. If this is a gap, I was thinking of creating a React component that uses the Wikidata SPARQL endpoint [3], which I was pleased to see already supports CORS. But I'm not married to React, so if someone has another JavaScript library that does this, I'd love to hear from you.
//Ed
[1] https://github.com/code4lib/shortimer/issues/38
[2] http://jobs.code4lib.org
[3] https://query.wikidata.org/bigdata/namespace/wdq/sparql
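Besides the SPARQL endpoint, prefix search can also go through the MediaWiki action API's wbsearchentities module, which supports CORS via origin=*. A minimal sketch of building such a request in Python (the URL-building helper name is my own; a React component would do the equivalent with fetch()):

```python
# Sketch of a suggest-style lookup against Wikidata's wbsearchentities
# action API module (an alternative to the SPARQL endpoint for prefix
# search). Only builds the request URL; fetch it with urllib.request
# or from the browser.
from urllib.parse import urlencode

API = "https://www.wikidata.org/w/api.php"

def suggest_url(prefix, language="en", limit=10):
    """Build a prefix-search URL; origin=* enables CORS for browser use."""
    params = {
        "action": "wbsearchentities",
        "search": prefix,
        "language": language,
        "limit": limit,
        "format": "json",
        "origin": "*",
    }
    return API + "?" + urlencode(params)

# e.g. fetching suggest_url("Silver Spr") returns JSON whose "search"
# array holds the matching entities with ids, labels and descriptions.
```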
Hello,
My name is Kaushal, and currently a friend and I are working on AskPlatypus
as part of our master's thesis. We want to add a module to AskPlatypus that
answers mathematical questions using Wikidata. As a first step we
want to add more mathematical formulas to Wikidata. We extracted a lot of
them from Wikipedia; there are 17838 formulas now. It would be great to get
them uploaded into the primary sources tool.
The list of formulas in primary sources tool syntax is attached here.
Please have a look. It would be great if someone could upload them into the
primary sources tool.
Greetings
Kaushal
Hey everyone :)
Wiktionary is our third-largest sister project, both in terms of active
editors and readers. It is a unique resource, with the goal to provide
a dictionary for every language, in every language. Since the
beginning of Wikidata, and increasingly over the past months, I have
been getting more and more requests to support Wiktionary and
lexicographical data in Wikidata. Having this data available openly
and freely licensed would be a major step forward in automated
translation, text analysis, text generation and much more. It will
enable and ease research. And most importantly it will enable the
individual Wiktionary communities to work more closely together and
benefit from each other’s work.
With this and the increased demand to support Wikimedia Commons with
Wikidata, we have looked at the bigger picture and our options. I am
seeing a lot of overlap in the work we need to do to support
Wiktionary and Commons. I am also seeing increasing pressure to store
lexicographical data in existing items (which would be bad for many
reasons).
Because of this we will start implementing support for Wiktionary in
parallel to Commons based on our annual plan and quarterly plans. We
contacted several of our partners in order to get funding for this
additional work. I am happy that Google agreed to provide funding
(restricted to work on Wikidata). With this we can reorganize our team
and set up one part of the team to continue working on building out
the core of Wikidata and support for Wikipedia and Commons and the
other part will concentrate on Wiktionary. (To support and to extend
our work around Wikidata with the help of external funding sources was
our plan in our annual plan 2016:
https://meta.wikimedia.org/wiki/Grants:APG/Proposals/2015-2016_round1/Wikim…)
As a next step I’d like us all to have another careful look at the
latest proposal at
https://www.wikidata.org/wiki/Wikidata:Wiktionary/Development. It has
been online for input in its current form for a year and the first
version is 3 years old now. So I am confident that the proposal is in
good shape to start implementation. However, I'd like to do a last
round of feedback with you all to make sure the concept really is
sane. To make it easier to understand, there is now also a PDF
explaining the concept in a slightly different way:
https://commons.wikimedia.org/wiki/File:Wikidata_for_Wiktionary_announcemen…
Please do go ahead and review it. If you have comments or questions
please leave them on the talk page of the latest proposal at
https://www.wikidata.org/wiki/Wikidata_talk:Wiktionary/Development/Proposal….
I’d be especially interested in feedback from editors who are familiar
with both Wiktionary and Wikidata.
Getting support for Wiktionary done - just like for Commons - will
take some time but I am really excited about the opportunities it will
open up especially for languages that have so far not gotten much or
any technological support.
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Hi!
Right now we have a SPARQL query examples page at
https://www.mediawiki.org/wiki/Wikibase/Indexing/SPARQL_Query_Examples,
and it has gathered a large number of excellent contributions (for which I'd
like to take this opportunity to thank everybody who helped!)
However, the MediaWiki site is really not the best wiki for this page; it
should properly be on Wikidata, IMO. And we have a copy of it at
https://www.wikidata.org/wiki/Wikidata:SPARQL_examples - except that it
is not up to date, is still not used by the WDQS GUI, and many people don't
actually know it exists.
So, I'd like to propose a migration plan:
0. Put a message on MediaWiki page warning of migration and directing
people to use Wikidata one. Maybe even protect the page for a short
time? This of course will be done only after we decide to perform the
migration.
1. Sync the Wikidata page with the MediaWiki one (that'll require some work
due to template differences, etc., but I can do it in a couple of hours).
2. Switch WDQS GUI to use the Wikidata page.
3. Replace the MediaWiki page with a redirect to the Wikidata one, with an
appropriate message.
I can do all of it (except maybe for protecting part, I'd need to ask
some admins I guess), but I wanted to make a public announcement /
request for feedback before doing anything big. So please comment if you
see any problems with the proposed plan, or have any objections. If no
objections are raised, I currently plan to do it sometime around Wed-Thu
this week.
Thanks,
--
Stas Malyshev
smalyshev(a)wikimedia.org
Hi Gerard (and everyone interested in list generation)
Dynamic lists are an interesting point, since currently most (manually
generated) lists on Wikipedia are static.
I would love to hear more about the issues and workflows around dynamic
lists to understand their use better. Some things that are on my mind:
- What are things that a dynamic list helps you with? (do you have some
real-life examples?)
- Where might a static list be a better approach? (do you have some
real-life examples?)
- What are current workflows with Listeria? I think I remember that people
in our interviews (thanks to everyone who participated!) mentioned that the
generation process currently needs to be triggered manually. Is that just
because there is no automatic way or does it provide you with advantages,
too?
If there are any other things that might help us understand your needs
better, please share, ideally with an example (in my experience, this makes
it much easier to understand the workflow, motivation and its context)
Jan
> Message: 6
> Date: Fri, 2 Sep 2016 11:18:13 +0200
> From: Gerard Meijssen <gerard.meijssen(a)gmail.com>
> To: "Discussion list for the Wikidata project."
> <wikidata(a)lists.wikimedia.org>
> Subject: Re: [Wikidata] List generation input
> Message-ID:
> <CAO53wxWaQ2vyVwg5f94X3G0E3T91gFAuA27uzCYo793RXApEBQ@mail.
> gmail.com>
> Content-Type: text/plain; charset="utf-8"
>
> Hoi,
> I learned that for writing about gender issues the lists generated by
> Listeria are used. The big advantage is that these lists are not fixed.
> Regularly people write new articles about women and consequently these
> lists change.
>
> It is this functionality that is really needed, and I do not get it from
> the example lists, which need such regular updates.
> Thanks,
> GerardM
>
Hi,
I'd like to write integration tests for https://www.wikidata.org/wiki/User:FLOSSbot and use test.wikidata.org for that. I figured out how to create properties[1] with pywikibot and search for them[2]. But I was not able to figure out how to remove them afterwards, to not clutter test.wikidata.org. The https://www.mediawiki.org/wiki/Wikibase/API page does not show anything obvious.
Any ideas?
Cheers
[1]
import pywikibot

site = pywikibot.Site("test", "wikidata")
print(str(site.editEntity(
    {'new': 'property'},
    {'datatype': 'item',
     'labels': {'en': {'language': 'en', 'value': 'FLOSS3property'}}})))
[2]
import pywikibot

site = pywikibot.Site("test", "wikidata")
for e in site.search_entities('FLOSS', 'en', type='property'):
    print(str(e))
--
Loïc Dachary, Artisan Logiciel Libre