I'm pleased to announce a massive data donation (our biggest yet?)
from Quora. At my request, they have given us over 1.5 million topic
IDs, which Magnus has kindly loaded into Mix'n'Match for checking and
import as Quora topic ID (P3417). The data is in catalogue 319:
https://tools.wmflabs.org/mix-n-match/?mode=catalog_details&catalog=319
As you can see, there are over 259K automated matches. They include some
duplicates, typos, and non-English alternative names. Those of you
who use Quora may choose to resolve (or report) them there. Otherwise,
you may simply "remove" them in Mix'n'Match.
Now you can't complain that you have nothing to do over the holiday period ;-)
Merry Christmas!
--
Andy Mabbett
@pigsonthewing
http://pigsonthewing.org.uk
Hello all,
The SPARQL endpoint we are running at http://query.wikidata.org has several
measures in place to ensure it stays up, running, and available
for everyone, for example the 30-second query timeout. This is necessary, but
it also prevents some useful queries from being run. One way around this is
Linked Data Fragments, which allows some of the query computation to be
done on the client side instead of on our server.
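To make the idea concrete, here is a minimal sketch (mine, with hard-coded stand-in data rather than the real TPF protocol) of what "computation on the client side" means: the server only answers simple triple-pattern requests, and the client performs the join itself.

```python
# Minimal sketch of a client-side join over two triple-pattern fragments.
# The "fragments" below are hard-coded stand-ins for what a Triple Pattern
# Fragments server would return for the patterns
# (?person, occupation, painter) and (?work, creator, ?person).

# Fragment for pattern: ?person occupation painter
painters = [("alice", "occupation", "painter"),
            ("bob", "occupation", "painter")]

# Fragment for pattern: ?work creator ?person
creations = [("work1", "creator", "alice"),
             ("work2", "creator", "carol"),
             ("work3", "creator", "bob")]

def join_on_person(frag_a, frag_b):
    """Join the two fragments on the shared variable (?person)."""
    people = {s for s, _, _ in frag_a}
    return [(work, person) for work, _, person in frag_b if person in people]

print(join_on_person(painters, creations))
# [('work1', 'alice'), ('work3', 'bob')]
```

A real LDF client does the same kind of join, but streams fragment pages over HTTP instead of holding everything in memory.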
We have now set this up for testing and would appreciate your
feedback. You can find out more about Linked Data Fragments
<http://linkeddatafragments.org/concept/> and documentation for our
installation
<https://www.mediawiki.org/wiki/Wikidata_query_service/User_Manual#Linked_Da…>.
Also, you can see a demo of client-side SPARQL evaluation and LDF server
usage here: http://ldfclient.wmflabs.org/
Please note: this is in no way a production service, just a
proof-of-concept deployment of the LDF client. If you like how it works, you
can get it from the source
<https://github.com/LinkedDataFragments/jQuery-Widget.js> and deploy it on
your own setup.
Feel free to ask Stas (Smalyshev (WMF)) any further questions!
--
Léa Lacroix
Community Communication Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Society for the Promotion of Free Knowledge (e. V.).
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as charitable by the
Finanzamt für Körperschaften I Berlin, tax number 27/029/42207.
Anyway, this is great news! I hope that it gets adopted by the community.
Congratulations, Yuri!
I was going to suggest a Wikidata property, but I see that the data type
for datasets is not there yet:
https://phabricator.wikimedia.org/T151334
On Thu, Dec 22, 2016 at 8:48 PM, Yuri Astrakhan <yastrakhan(a)wikimedia.org>
wrote:
> Micru, thanks, I think Datasets sounds like a good name too!
>
> On Thu, Dec 22, 2016 at 2:44 PM David Cuenca Tudela <dacuetu(a)gmail.com>
> wrote:
>
> > On Thu, Dec 22, 2016 at 8:38 PM, Brad Jorsch (Anomie) <
> > bjorsch(a)wikimedia.org
> > > wrote:
> >
> > > On Thu, Dec 22, 2016 at 2:30 PM, Yuri Astrakhan <
> > yastrakhan(a)wikimedia.org>
> > > wrote:
> > >
> > > > Gift season! We have launched structured data on Commons, available
> > from
> > > > all wikis.
> > > >
> > >
> > > I was momentarily excited, then I read a little farther and discovered
> > this
> > > isn't about https://commons.wikimedia.org/wiki/Commons:Structured_data
> .
> > >
> >
> > Same here, I think it needs a better name...
> >
> > What about calling it datasets or structured datasets?
> >
> > Cheers,
> > Micru
> > _______________________________________________
> > Wikitech-l mailing list
> > Wikitech-l(a)lists.wikimedia.org
> > https://lists.wikimedia.org/mailman/listinfo/wikitech-l
>
--
Etiamsi omnes, ego non
On 22 December 2016 at 18:56, Stas Malyshev <smalyshev(a)wikimedia.org> wrote:
> If it were *less*
> results, I'd assume it is some network/delivery problem, but don't know
> yet what to think about 3 more results.
>
> I get 128 and 131 results intermittently, so I suspect some kind of bug,
> not sure where yet.
As Ruben already pointed out, I suspect the pagination of TPF in this case.
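For what it's worth, one way such off-by-a-few counts can happen is bindings repeating (or being dropped) at fragment page boundaries; a defensive client can deduplicate while paging. A hypothetical sketch (the page contents here are made up):

```python
def collect_results(pages):
    """Walk paginated result pages, deduplicating bindings that may
    repeat across page boundaries (a hypothetical defensive client)."""
    seen = set()
    results = []
    for page in pages:
        for binding in page:
            if binding not in seen:
                seen.add(binding)
                results.append(binding)
    return results

# Two pages that overlap at the boundary: "Q3" appears on both.
pages = [["Q1", "Q2", "Q3"], ["Q3", "Q4"]]
print(len(collect_results(pages)))  # 4, not 5
```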
Regards,
Jan
Hi Léa,
As one of the researchers working on Linked Data Fragments,
I want to congratulate you and Wikidata on your Triple Pattern Fragments endpoint!
I've been playing around with it a bit,
and wanted to send you this cool federated query:
http://bit.ly/wikidata-dbpedia-viaf
It combines data from Wikidata, DBpedia, and VIAF live from the Web
to answer the question: which works were created by cubist painters?
I hope you all have fun with these new query opportunities!
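The shortened link hides the query itself, but a federated query of that general shape could look roughly like this (a sketch of mine, not the actual query behind the link; the Cubism item ID is deliberately left as a placeholder, and the DBpedia SERVICE block is only indicated):

```sparql
# Sketch: works (P170, creator) by painters of the cubist movement (P135).
# ?cubism stands in for the Wikidata item for Cubism (ID omitted here).
SELECT ?work ?painter WHERE {
  ?painter wdt:P135 ?cubism .      # movement
  ?work wdt:P170 ?painter .        # creator
  SERVICE <https://dbpedia.org/sparql> {
    # additional painter data from DBpedia would be matched here
  }
}
```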
Best,
Ruben
--
Ruben Verborgh
Postdoctoral Researcher in Semantic Hypermedia
Ghent University – imec – IDLab
https://ruben.verborgh.org/ – @RubenVerborgh
I have not maintained WDQ (http://wdq.wmflabs.org/) for many months. It
just keeps "ticking away", with the occasional restart required. Its data
is likely out of sync with Wikidata proper, and most of its functionality
is better served by SPARQL (https://query.wikidata.org/).
So we (Tool Labs tech and I) want to shut it down over the next few months.
I did some previous analysis (http://magnusmanske.de/wordpress/?p=410) and
it looks like WDQ is pretty much obsolete. The one thing people seem to
like about it is the simple syntax, but that alone doesn't seem to be a
valid reason to keep it around, especially with a converter (
https://tools.wmflabs.org/wdq2sparql/w2s.php) in place.
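To illustrate the syntax gap the converter bridges (an example of mine, not taken from the analysis; WDQ's CLAIM[...] form is its documented query syntax):

```
# WDQ: all items that are instances of (P31) human (Q5)
CLAIM[31:5]

# One equivalent SPARQL query:
SELECT ?item WHERE { ?item wdt:P31 wd:Q5 . }
```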
If there are other reasons to keep WDQ alive, please let me know ASAP!
Hello all,
We have been working on improving the way you can add references on
Wikidata, and I'm glad to announce that the Citoid script can now be tested.
Citoid is a user script for editing references that can automatically
populate parts of a reference using the Citoid API if the reference
contains a reference URL (P854), PubMed ID (P698), or DOI (P356). If one of
those properties is present, an "autofill" link is added next to the
"remove" link/button.
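For the curious, the autofill lookup can be imagined roughly like this (a hypothetical sketch, not the script's actual code; the REST path is my assumption about the Citoid API, so check the documentation before relying on it):

```python
from urllib.parse import quote

def citoid_request_url(identifier, fmt="mediawiki"):
    """Build a Citoid REST request URL for a URL, DOI, or PubMed ID.
    The endpoint path is an assumption, not taken from the script itself."""
    base = "https://en.wikipedia.org/api/rest_v1/data/citation"
    return f"{base}/{fmt}/{quote(identifier, safe='')}"

print(citoid_request_url("10.1000/182"))
# https://en.wikipedia.org/api/rest_v1/data/citation/mediawiki/10.1000%2F182
```

The JSON returned by such a request would then be mapped onto the reference's statements.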
You can try the script and give your feedback here.
<https://www.wikidata.org/wiki/Wikidata:Project_chat#Citoid_script>
Thanks a lot to Katie for working on this!
--
Léa Lacroix
Community Communication Manager for Wikidata
Wikimedia Deutschland e.V.
Dear SPARQL team,
In SQID, I now frequently see HTTP 429 responses from WDQS when trying to
load a page (doing so usually issues a few dozen queries for
larger pages). How many SPARQL queries are users allowed to make in a
given period of time, and how should tools behave when they hit this limit?
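In case it helps the discussion: until there is official guidance, a common client-side response to 429 is to back off and retry. A minimal sketch (mine, with the query function injected so the retry logic is self-contained; not WDQS-specific):

```python
import time

def query_with_backoff(run_query, max_retries=3, base_delay=1.0):
    """Run a query, backing off exponentially on HTTP 429 responses.
    `run_query` returns a (status, body) pair; this is a generic sketch."""
    for attempt in range(max_retries + 1):
        status, body = run_query()
        if status != 429:
            return status, body
        # Throttled: wait before retrying, doubling the delay each time.
        time.sleep(base_delay * (2 ** attempt))
    return status, body  # still throttled after all retries

# Fake endpoint: rejects the first two calls, then succeeds.
responses = iter([(429, ""), (429, ""), (200, "results")])
status, body = query_with_backoff(lambda: next(responses), base_delay=0.01)
print(status, body)  # 200 results
```

A real client would also honor a Retry-After header if the server sends one.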
Best regards,
Markus