Hey folks :)
As requested, Commons is the next sister project to get access to the
data on Wikidata. We'll be doing this on December 2nd. Please help
update and expand https://commons.wikimedia.org/wiki/Commons:Wikidata
and https://www.wikidata.org/wiki/Wikidata:Wikimedia_Commons. Two
caveats: 1) This is restricted to accessing data from the item
connected to the page via sitelink. Access to data from arbitrary
items will follow in January/February. 2) This is not for storing
metadata about individual files. That will come later as part of the
structured data on Commons project
(https://commons.wikimedia.org/wiki/Commons:Structured_data) and be
stored on Commons itself.
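To make caveat 1 concrete: a gallery or category page that is sitelinked
to an item will be able to pull statements from that item the same way
the Wikipedias do, e.g. with the {{#property:P31}} parser function or
from Lua (the property ID here is just for illustration).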
Looking forward to seeing what great things this will make possible again!
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as charitable
by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Hi Wikidata folks,
I've been struggling against bug 72348 [1], which causes the dumps to
contain both old- and new-style records. This makes parsing extra
challenging.
I tried to follow the bugzilla trail, but it isn't clear where the bug
stands. Do you have any guesses about when this issue is likely to get
resolved? Days? Weeks? Months?
Thanks for your help!
-Shilad
[1] https://bugzilla.wikimedia.org/show_bug.cgi?id=72348
--
Shilad W. Sen
Associate Professor
Mathematics, Statistics, and Computer Science Dept.
Macalester College
ssen(a)macalester.edu
http://www.shilad.com
https://www.linkedin.com/in/shilad
651-696-6273
https://lists.wikimedia.org/pipermail/wikitech-l/2014-November/079546.html
----
Stas will be working in the MediaWiki Core team, mostly focused on
performance issues (though right now he is getting up to speed on the
Wikidata Query Service[4], figuring out what we need to do to make it
suitable for widespread deployment of WikiGrok[5]).
----
Nice!
Nemo
Hi,
I haven't seen this mentioned in the context of Wikidata yet, so here:
The latest beta version of the Wikipedia app for iOS (iPhone, iPad, iPod)
shows descriptions from Wikidata as summaries in the search results.
If you have an iOS device and want to see it in action, see the
instructions here:
https://www.mediawiki.org/wiki/Wikimedia_Apps#Stay_on_the_Cutting_Edge
--
Amir Elisha Aharoni · אָמִיר אֱלִישָׁע אַהֲרוֹנִי
http://aharoni.wordpress.com
“We're living in pieces,
I want to live in peace.” – T. Moore
I'm bringing this up as my proof-by-construction answer to a
knock-down-drag-out thread earlier where people complained about the
difficulty of running queries against DBpedia and Wikidata.
I think some people will find the product described below to be a faster
road to where they are heading in the short term. In the longer term I am
thinking a v4 or v5 Infovore may be able to evaluate the contexts of facts
in Wikidata and thus create a world view which can be quality-controlled
for particular outcomes.
-----
Well, Infovore 3.1 happened quickly after the previous release because I
made a quick attempt to get my Jena dependency up to date, found it was
easy, and so I did. The important thing is that there is a lot of cool
stuff going on
with Jena, such as the RDFThrift serialization format, and also some
Hadoop I/O tools written by Rob Vesse, and tracking the latest version
helps us connect with that. Release page here:
https://github.com/paulhoule/infovore/releases/tag/v3.1
Infovore 3.1 was used to process the Freebase RDF Dump to create a
quality-controlled RDF data set called :BaseKB; generally queries look
the same on Freebase and :BaseKB, but :BaseKB gives the right answers,
faster, and with less memory consumption. This week's release is in the
AWS cloud:
s3://basekb-now/2014-11-09-00-00/
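(If you have the AWS command line tools installed, something like
"aws s3 cp s3://basekb-now/2014-11-09-00-00/ . --recursive" should pull
the whole set down; treat that as a sketch and adjust paths to taste.)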
Something very close to this is going to become :BaseKB Gold 2. It is a
simpler and better product than the last Gold release from Spring 2014.
Here are a few reasons:
* Unicode escape sequences in Freebase are now converted to Unicode
characters in RDF
* The rejection rate of triples has dramatically dropped, because of both
changes to Infovore and improvements in Freebase content
* The product is now packaged as a set of files partitioned and sorted on
subject; this means you can download one file and get a sample of facts
about a given topic, and the "horizontal division" of earlier releases is
gone
Between duplicate fact filtering and compression, :BaseKB Now is nearly
half the size of the Freebase RDF Dump.
If you're interested please join the mailing list at
https://groups.google.com/forum/#!forum/infovore-basekb
Sorry, not sure if this is the right place to post this bug report?
https://www.wikidata.org/wiki/Wikidata:List_of_properties/all#Medicine
reports quite a few messages like:
*The time allocated for running scripts has expired.*
(that same message, repeated more than a dozen times)
Some questions:
I was looking through the configuration trying to debug my issues from my last
email and noticed the list of blacklisted IDs. They appear to be numbers with
special meaning. I was curious about two things: why they are blacklisted, and
what the one number I couldn't place means.
* 1: I imagine that this just refers to #1
* 23: Probably refers to the 23 enigma
* 42: Life, the universe, and everything
* 1337: leet
* 9001: ISO 9001, which deals with quality assurance
* 31337: Elite
The only number that left me lost was 720101010. I couldn't figure this one
out. This list is located in
extensions/Wikibase/repo/config/Wikibase.default.php
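For reference, here is roughly what the entry looks like (reproduced
loosely from that file, so treat the exact key name and comments as
approximate):

    // IDs that will never be auto-assigned to newly created entities
    'idBlacklist' => array(
        1,         // the first item
        23,        // the 23 enigma
        42,        // life, the universe, and everything
        1337,      // leet
        9001,      // ISO 9001?
        31337,     // elite
        720101010, // no idea
    ),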
------------
Doing a quick grep for jquery.ui.menu within the Wikidata extension folder,
I came up with the following results:
./extensions/ValueView/lib/resources.php: 'jquery.ui.autocomplete',
// needs jquery.ui.menu
./lib/resources/jquery.wikibase/resources.php: 'jquery.ui.menu',
./lib/resources/jquery.wikibase/resources.php: 'jquery.ui.menu',
Just fooling around, I decided to comment out the two lines requiring
jquery.ui.menu in the jquery.wikibase/resources.php file. Refreshing the page,
I ran into the same error as before, but this time with a module called JSON.
Does the Wikidata extension depend on another extension that adds these
ResourceLoader modules? According to ResourceLoader/Default_modules on
MediaWiki.org, MediaWiki does not ship with jquery.ui.menu. This does seem a
bit weird, because I believe the comment that jquery.ui.autocomplete requires
jquery.ui.menu is correct and ResourceLoader includes jquery.ui.autocomplete by
default.
Am I missing something? I'm definitely confused at this point.
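For what it's worth, the workaround I was planning to try next was to
register the module myself from LocalSettings.php, along these lines (an
untested sketch; the script path is a guess for wherever a copy of jQuery
UI's menu widget ends up):

    // Hypothetical manual registration of the missing module
    $wgResourceModules['jquery.ui.menu'] = array(
        'scripts' => array( 'resources/lib/jquery.ui/jquery.ui.menu.js' ),
        'dependencies' => array(
            'jquery.ui.core',
            'jquery.ui.widget',
            'jquery.ui.position',
        ),
        'localBasePath' => $IP,
        'remoteBasePath' => $wgScriptPath,
    );

No idea yet whether that is a real fix or just papers over a deeper problem.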
------------
As I wrote this email I was called into my manager's office to be informed that
I am being laid off as part of a corporate restructuring. They've decided to
outsource all of IT. So while I am still curious about the answers to the
questions laid out in this email and the previous, this is no longer a project
I will be actively working on. I may still fool around with trying to get a
Wikibase installation set up on my own.
My email for this mailing list will also be changing to my personal email of
zellfaze(a)zellfaze.org beginning some time in the next few days.
Thank you,
Derric Atzrott
Hey all,
so I see there is some work being done on mapping the Wikidata data
model to RDF [1].
Just a thought: what if you had actually used RDF, with Wikidata's
concepts modeled in it, right from the start? And used standard RDF
tools, APIs, and the query language (SPARQL) instead of building the
whole thing from scratch?
Is it just me, or was this decision really a colossal waste of resources?
[1] http://korrekt.org/papers/Wikidata-RDF-export-2014.pdf
Martynas
http://graphityhq.com
I'm trying to set up Wikibase, and I might have something
misconfigured somewhere, or I may have stumbled upon a bug.
I've installed Wikibase per the instructions, made some
very minor tweaks to the configuration and added my first
property and item.
Going to the item page (with ?debug=true), I get an error
message in Firebug:
Error: Unknown dependency: jquery.ui.menu
throw new Error( 'Unknown dependency: ' + module );
http://hostname/load.php?debug=true&lang=en&modules=jquery%2Cmediawiki&only…
pts&skin=vector&version=20141107T203616Z
I can't seem to add statements to items. I can't see
what I may have done wrong, but I am working off the
assumption that I made a mistake somewhere and that this
isn't a bug.
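One thing that might help whoever knows ResourceLoader better than I do:
if I understand it right, running mw.loader.getState( 'jquery.ui.menu' )
in the browser console should show whether the module is registered at
all (I believe it returns null for modules the server never registered).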
Thank you,
Derric Atzrott
Computer Specialist
Alizee Pathology