Hey :)
It seems that badges support is stalled on a decision about how
exactly to define the set of available badges if I read
https://bugzilla.wikimedia.org/show_bug.cgi?id=40810 correctly. Can we
make a decision and move forward? It's the most-voted-on bug we have.
Cheers
Lydia
--
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Technical Projects
Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
Hey,
Regarding https://gerrit.wikimedia.org/r/#/c/77862/4
The approach implemented in PS4 is different from the one I originally had
in mind when we discussed this. It has
{
    qualifiers: { snaks: {}, order: [] }
}
rather than
{
    qualifiers: {},
    qualifiers-order: []
}
While I agree that the former is nicer, it is a breaking change in our API.
I'm thus not merging it until it is clear that we are all aware of this
breakage and consider it an acceptable price for the nicer format. If we
decide to go forward with this, it should also be announced to our API
users and properly documented.
Cheers
--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
Hi everyone,
Summary: We are looking for a way to map individual triples from DBpedia
to something that can be consumed by the Wikidata API.
Long story:
Denis Lukovnikov is a Google Summer of Code student working on a new and
prettier user interface for DBpedia.
A demo of the work in progress can be found at [1].
Besides the new design, a further goal of the project is to offer a
framework for triple-level actions.
One of the planned actions is about pushing DBpedia triples to Wikidata:
For example, I notice that the Wikidata page for my home town "Berndorf
in Lower Austria" does not contain the population:
http://www.wikidata.org/wiki/Q666615
Looking at the corresponding DBpedia entry, this information actually
exists there:
http://dbpedia.org/resource/Berndorf,_Lower_Austria
The new DBpedia interface should offer a button next to the "population
8728" triple which enables transfer of this information to Wikidata.
In another GSoC project, Hady Elsahar is working on mappings between the
Wikidata RDF vocabulary and the DBpedia vocabulary.
This means we can, in principle, map DBpedia RDF data to Wikidata RDF.
However, looking at the Wikidata API [2] there is
action=wbcreateclaim *
with the example:
api.php?action=wbcreateclaim&entity=q42&property=p9001&snaktype=novalue&token=foobar&baserevid=7201010
So the core question is: how can we map properties such as
wikidata:population (if that existed) to their respective Wikidata
property identifiers (e.g. P12345)?
This goes for any property that may occur in an RDF dump, such as:
http://www.wikidata.org/wiki/Special:EntityData/Q666615.nt
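To make the requirement concrete, here is a rough Python sketch of the mapping table plus request-building step we would need. The mapping entries are hypothetical placeholders, and the datavalue encoding is a guess; only the wbcreateclaim parameter names come from the API [2]:

```python
import json

# Hypothetical mapping from RDF predicates to Wikidata property ids; the
# entries here are assumptions for illustration, not a real mapping table.
PROPERTY_MAP = {
    "http://dbpedia.org/ontology/populationTotal": "P1082",
}

def build_createclaim_params(entity_id, predicate, value, token, baserevid):
    """Build POST parameters for action=wbcreateclaim, or return None if
    the predicate has no known Wikidata property."""
    prop = PROPERTY_MAP.get(predicate)
    if prop is None:
        return None
    return {
        "action": "wbcreateclaim",
        "entity": entity_id,
        "property": prop,
        "snaktype": "value",
        # The datavalue serialization depends on the property's datatype;
        # this plain JSON encoding is only a placeholder.
        "value": json.dumps(value),
        "token": token,
        "baserevid": baserevid,
        "format": "json",
    }
```

The open question is where PROPERTY_MAP would come from, which is exactly what the rest of this mail is about.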
Ideally, I think we are looking for either a Wikidata service or a
dataset similar to [3], which is what we have done in the LinkedGeoData
project:
In this project, we map OpenStreetMap (OSM) data to RDF, and we maintain
our RDF mappings in the database [3] together with the original OSM data.
Then we use an RDB2RDF mapper configured with the view definitions in
[4] to expose the relational database as Linked Data [5] and a virtual
SPARQL endpoint [6].
Cheers,
Claus
[1] WiP demo:
http://dbpv.cstadler.aksw.org/#/page/Berndorf,_Lower_Austria (currently
loading may take a while for pages with many inverse statements).
[2] WikiData API: http://www.wikidata.org/w/api.php
[3] Raw-Data to RDF mappings for OpenStreetMap:
https://github.com/GeoKnow/LinkedGeoData/blob/master/linkedgeodata-core/src…
[4] RDB2RDF view definitions:
https://github.com/GeoKnow/LinkedGeoData/blob/master/linkedgeodata-core/src…
[5] Eiffel Tower resource: http://linkedgeodata.org/page/triplify/way5013364
[6] Explanation of the underlying SQL:
http://linkedgeodata.org/vsnorql/?query=Explain+Select+*+{+%3Chttp%3A%2F%2F…
--
Dipl. Inf. Claus Stadler
Department of Computer Science, University of Leipzig
Research Group: http://aksw.org/
Workpage & WebID: http://aksw.org/ClausStadler
Phone: +49 341 97-32260
Hi,
there seems to be a problem with the dumps or with the way in which we
interpret the dumps.
Currently, daily dumps come with a maxrevid.txt file that is supposed to
give the largest revision number in the dump. For example, the daily dump
of 1 Aug 2013 has max id 62860640 [1]. I guess that's true.
The wda scripts (that also create the statistics and digested dumps we
publish) use this number to figure out if a daily is still relevant or
if it is already contained in the latest full dump. For this, we need to
get the maximal revision id of the full dump. We do this by reading the
file site_stats.sql.gz, where we look for the line starting with INSERT
INTO `site_stats` and take a revision number from there (third number in
the insert tuple). For example, for the dump of 27 July 2013, this
number is 63069374 [2].
There is a problem here, since the maximal revision in the dumps of 27
July 2013 is not actually that high (the history dump of that date is
incomplete, but the current revs dump is done and has max rev 61983867
[3]). Thus, our scripts ignore several days of dailies.
Before I go and work on this, my question is whether this is an error in
our script (i.e., the number we take from site_stats is not supposed to
be the max revision) or an error in the dumps (i.e., site_stats was
exported wrongly).
Cheers,
Markus
[1] http://dumps.wikimedia.org/other/incr/wikidatawiki/20130801/maxrevid.txt
[2]
http://dumps.wikimedia.org/wikidatawiki/20130727/wikidatawiki-20130727-site…
[3] This can be seen in the comments for the dump at
http://dumps.wikimedia.org/wikidatawiki/20130727/
--
Markus Kroetzsch, Departmental Lecturer
Department of Computer Science, University of Oxford
Room 306, Parks Road, OX1 3QD Oxford, United Kingdom
+44 (0)1865 283529 http://korrekt.org/
Hey,
I wanted to play a bit with some JavaScript things and decided to start
implementing the Ask language in JS. Though I have not gotten far with
the actual implementation yet, I had a lot of fun setting up the build
process and CI tools.
https://github.com/JeroenDeDauw/AskJS
The tests can be run on the command line (unlike the ones we have for
Wikibase); they are run by TravisCI as well, and code coverage reports are
created for each Travis build using coveralls.io \o/
I plan to try out Grunt and see how it compares to Jake, and to look into
making this work in browsers as well, setting up the CI so it verifies
that it does.
(I am currently doing this in my free time, not as WMDE employee)
Cheers
--
Jeroen De Dauw
http://www.bn2vs.com
Don't panic. Don't be evil. ~=[,,_,,]:3
--
My take on assertions, which I also tried to stick to in Wikibase, is as follows:
* A failing assertion indicates a "local" error in the code or a bug in PHP;
assertions should not be used to check preconditions or validate input. That's
what InvalidArgumentException is for (and I wish type hints would trigger that,
and not a "fatal error"). Precondition checks can always fail; never trust the
caller. Assertions are things that should *always* be true.
* Use assertions to check postconditions (and perhaps invariants). That is, use
them to assert that the code in the method (and maybe class) that contains the
assert is correct. Do not use them to enforce caller behavior.
* Use boolean expressions in assertions, not strings. The speed advantage of
strings is not big, since the expression should be a very basic one anyway, and
strings are awkward to read, write, and, as mentioned before, potentially
dangerous, because they are eval()ed.
* The notion of "bailing out" on "fatal errors" is a misguided remnant from the
days when PHP didn't have exceptions. In my mind, assertions should just throw
a (usually unhandled) exception, like Java's AssertionError.
I think if we stick with this, assertions are potentially useful, and at worst
harmless. But if there is consensus that they should not be used anywhere,
ever, we'll remove them. I don't really see how the resulting boilerplate
would be cleaner or safer:
if ( $foo > $bar ) {
    throw new OMGWTFError();
}
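The precondition/postcondition split described above translates directly to other languages; here is a small Python sketch of the same policy (the function itself is a made-up example):

```python
def clamp(value, low, high):
    # Precondition check: never trust the caller. A bad argument is the
    # caller's bug, so raise a normal exception, not an assertion.
    if low > high:
        raise ValueError("low must not exceed high")
    result = max(low, min(value, high))
    # Postcondition: if this ever fails, clamp() itself is buggy, and the
    # failure surfaces as an (usually unhandled) AssertionError.
    assert low <= result <= high
    return result
```

In Python, as in Java, a failing assert raises an exception rather than bailing out of the process, which is exactly the behavior argued for here.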
-- daniel
Am 31.07.2013 00:28, schrieb Tim Starling:
> On 31/07/13 07:28, Max Semenik wrote:
>> I remember we discussed using asserts and decided they're a bad
>> idea for WMF-deployed code - yet I see
>>
>> Warning: assert() [<a href='function.assert'>function.assert</a>]:
>> Assertion failed in
>> /usr/local/apache/common-local/php-1.22wmf12/extensions/WikibaseDataModel/DataModel/Claim/Claims.php
>> on line 291
>
> The original discussion is here:
>
> <http://thread.gmane.org/gmane.science.linguistics.wikipedia.technical/59620>
>
> Judge for yourself.
>
> -- Tim Starling
>
Hi folks,
On our WMF/WMDE Wikidata coordination call, Daniel suggested we
revisit bug 44098 [1], asserting that /entity/Q1234 (for example)
should trigger a 303, not a 302 HTTP status code.
Reading the HTTP spec on this, it would appear that 303 expressly
prohibits caching the response [2]. I'm not sure whether that's going to
be a problem, since I don't know if we do any 302 response caching (note
that it's just the 303 redirect itself that can't be cached, not the
target). However, the spec is also pretty ambiguous about whether 302 or
303 is more appropriate.
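For clarity, the caching rules in RFC 2616 sections 10.3.3 and 10.3.4 boil down to something like this (a paraphrase in code, not production logic):

```python
def redirect_response_cacheable(status, has_explicit_freshness=False):
    """Per RFC 2616: a 302 response is cacheable only if a Cache-Control
    or Expires header says so; a 303 response MUST NOT be cached at all.
    In both cases this is about the redirect response itself; the target
    of the redirect may still be cached on its own terms."""
    if status == 303:
        return False
    if status == 302:
        return has_explicit_freshness
    raise ValueError("only 302/303 are considered here")
```

So switching to 303 would turn off caching of the redirect regardless of any freshness headers we send.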
At any rate, Daniel was concerned that this has been an issue for a
while, but the only artifact I'm seeing of active conversation on this
topic is bug 44098, which is closed. Daniel, to make sure we track
this, assuming we don't quickly come to a resolution in this thread,
could you file a new bug requesting the 302 response be updated to
303?
Thanks
Rob
[1] https://bugzilla.wikimedia.org/44098
[2] http://tools.ietf.org/html/rfc2616#section-10.3.4
A quick heads up:
I just tried running our phpunit tests with PHPUnit 3.6.10, and got about 60 of
these errors:
InvalidArgumentException: You must not expect the generic exception class.
So, apparently, phpunit 3.6 doesn't want the expected exception to be
"Exception". That's annoying. Apparently this will be fixed in 3.7...
Well, I thought I'd let you know.
-- daniel
On Wed, Jul 31, 2013 at 7:28 PM, Tim Starling <tstarling(a)wikimedia.org>wrote:
> The php.ini option assert.bail is 0 by default.
So? It's the same way in Java. You have to turn on assertions. It's kind of
natural to assume that if assertions are off, they won't cause fatal errors.
--
Tyler Romeo
Stevens Institute of Technology, Class of 2016
Major in Computer Science
www.whizkidztech.com | tylerromeo(a)gmail.com