Hey Steffen and Andy,
Continuing what I started on Twitter here, as some more characters might be needed.
It seems that both our projects (FLOW3 and Wikidata) are in a similar
situation. We are using Gerrit as CR tool, and TravisCI to run our tests.
And we both want to have Travis run tests for all patchsets submitted to
Gerrit, and then +1 or -1 on verified based on the build passing or
failing. To what extent have you gotten such a thing to work on your
project? Is there code available anywhere? If both projects can use the
same code for this, I'd be happy to contribute to what you already have.
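For reference, the feedback step we are after could be sketched roughly like this, assuming Gerrit's REST set-review endpoint; the host, change, and revision identifiers below are placeholders, not anything either project has set up yet:

```python
import json
import urllib.request


def verified_vote(build_passed):
    """Map a Travis build result to a Gerrit 'Verified' vote payload."""
    return {
        "labels": {"Verified": 1 if build_passed else -1},
        "message": "Build passed" if build_passed else "Build failed",
    }


def post_review(host, change_id, revision_id, review):
    """POST the vote to Gerrit's set-review endpoint (authentication omitted)."""
    url = "https://%s/a/changes/%s/revisions/%s/review" % (
        host, change_id, revision_id)
    req = urllib.request.Request(
        url,
        data=json.dumps(review).encode(),
        headers={"Content-Type": "application/json"})
    # An HTTPBasicAuthHandler (or digest auth) would be wired in here.
    return urllib.request.urlopen(req)
```

The interesting part is only the mapping from build result to Verified label; everything else is plumbing that a shared script could hide.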
Jeroen De Dauw
Don't panic. Don't be evil. ~=[,,_,,]:3
the ApiWikibase class currently mainly supports the editing of single
entities or getting claims etc, but still only for one single entity.
Api modules like GetEntities do not really fit into this class. Also,
there will be lots of Query Api modules in the future (see
https://bugzilla.wikimedia.org/show_bug.cgi?id=55967) which need their own
Api base class.
Thus I propose to rename the current ApiWikibase class to something like
ApiWikibaseSingle (I know the name is horrible, but it shows what I mean)
and create another class called ApiWikibaseQuery or something like that for
the query modules. Finally, there could be a base class called
ApiWikibase again that provides general functions like getVersion() and so on.
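The proposed hierarchy would look something like the sketch below. The class names come from the proposal above; the method bodies are placeholders, not actual Wikibase code:

```python
class ApiWikibase:
    """Shared base class: general helpers such as getVersion()."""

    def get_version(self):
        return "ApiWikibase sketch 0.1"  # placeholder value


class ApiWikibaseSingle(ApiWikibase):
    """Base for modules that edit or read one single entity
    (e.g. the claim-editing modules would subclass this)."""

    def get_entity(self, entity_id):
        raise NotImplementedError


class ApiWikibaseQuery(ApiWikibase):
    """Base for query modules such as GetEntities."""

    def execute_query(self, params):
        raise NotImplementedError
```

The point is just that modules like GetEntities stop inheriting single-entity machinery they do not use, while still sharing the general helpers.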
We have deployed new code to Wikidata. This includes various bug fixes and
some new features.
Changes of note include:
* Item ID is displayed next to the item label (see paper cut)
* Improved appearance of Wikimedia Commons site link section
In the API:
* EditEntity module allows editing / adding claims and creating entities
* API has a new merge items module.
* Precision for time values is now validated. The API no longer accepts
time values more precise than a day, as such values are handled
inconsistently in the UI. (see bug 54939)
* Coordinate values with null precision are no longer accepted by the API
for new claims.
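The two validation changes amount to checks along these lines, assuming Wikibase's numeric precision levels where 11 means "day" and larger numbers mean finer granularity (12 = hour, 13 = minute, 14 = second); this is an illustration, not the actual validator code:

```python
# Wikibase time precision: higher number = more precise.
PRECISION_DAY = 11


def is_time_precision_allowed(precision):
    """Reject time values more precise than a day (bug 54939)."""
    return precision <= PRECISION_DAY


def is_coordinate_precision_allowed(precision):
    """Reject coordinate values with null precision for new claims."""
    return precision is not None
```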
In the clients (Wikivoyage/Commons now, Wikipedia on Thursday):
* The Wikidata flag 'D' on watchlist and recentchanges is styled.
Wikimedia Germany e.V. | NEW: Obentrautstr. 72 | 10963 Berlin
Phone (030) 219 158 26-0
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e.V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under the number 23855 B. Recognized as charitable
by the tax office for corporations I Berlin, tax number 27/681/51985.
Hi! I'm working on a new version of wikibase-api-php based on the
> WikibaseDataModel component (the idea is to let the bots based on
> wikibase-api-php edit the Entity object and, when it's done, push back the
> changes to wikidata with the wbeditentity api). But, in order to make the
> conversion between Entity object and API representation (in the two
> directions) I need the code that is in the \Wikibase\lib\Serializer
> namespace of
> the Wikibase extension. Do you think it's possible to move these classes
> into a standalone component? They don't look to have strong
> dependencies on the other parts of the Wikibase extension.
It would be nice to see some data access abstraction mechanism on top of
our web API. So great to hear you are working on that \o/
There have been some emails about the serialization code mentioned lately,
as it is causing a number of problems and is, as you noticed, not really
reusable in its current state. What I'd like to see happen is this code
being refactored and put into its own component. We now have a much better
idea of how to do this cleanly going on our experiences with the
serialization code for the Ask library. So while it is not an extremely
tricky thing to do, it does require quite some work. And for now, we are sort
of blocked on creating new components due to requirements of the WMF
deployment process. It might still be some time before that is resolved.
Quoting myself from an email that was sent last month on the internal list:
The old serialization and deserialization code (both db and external)
> is causing a lot of problems. The remaining bad dependencies in DataModel
> are caused by it, and cannot be removed without design flaws in the
> serialization code being fixed. New code, such as the datatype id stuff
> described above, can also not be implemented properly without refactoring
> first. A lot of the code is static, it all violates SRP and does not do
> proper dependency injection. I suggest creating a new serialization and
> deserialization component for the objects defined by the DataModel. This
> component could be shipped together with DataModel itself (and reside in
> the same git repo). It would be built on top of the Serialization
> component and would thus work similarly to the
> serialization and deserialization code in Ask and WikibaseQuery, which
> has no known issues. This new component would then be used by Wikibase Repo
> and Wikibase Client instead of the code in lib, which would be removed.
> Some code built on top of the new component might need to be added in lib,
> as we might already have, or later on need, (de)serialization functionality
> that does not fit in the new component, which would only be dependent on
> DataModel (and its dependencies).
> It has been clear to me for quite some time that this code is not ideal (I
> already sent some mail about this in the context of WikibaseQuery). Only
> recently I realized that fixing this essentially blocks removal of the last
> really bad dependencies in DataModel. Since we are running into situations
> where we need to choose between increasing the mess or fixing it, I think
> it is high time we do the latter.
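The shape the quoted email argues for could be sketched like this: small serializer objects with injected collaborators, instead of static helpers. The class and method names here are illustrative, not the ones any real component would necessarily use:

```python
class Serializer:
    """One serializer per DataModel object type; no static state."""

    def serialize(self, obj):
        raise NotImplementedError


class SiteLinkSerializer(Serializer):
    def serialize(self, sitelink):
        return {"site": sitelink["site"], "title": sitelink["title"]}


class ItemSerializer(Serializer):
    # The sitelink serializer is injected, so this class stays small (SRP)
    # and testable in isolation, with no hidden static dependencies.
    def __init__(self, sitelink_serializer):
        self._sitelink_serializer = sitelink_serializer

    def serialize(self, item):
        return {
            "id": item["id"],
            "sitelinks": [self._sitelink_serializer.serialize(s)
                          for s in item["sitelinks"]],
        }
```

Such a component would only know about DataModel objects, which is exactly what makes it reusable from something like wikibase-api-php.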
Jeroen De Dauw
Don't panic. Don't be evil. ~=[,,_,,]:3
Today I spotted some new code on gerrit breaking basic dependency rules,
making it clear that at least in Wikibase.git, the number of such
violations is not going down. It might in fact be increasing. Hence I'll
outline the basics here, in the hope people stop introducing these violations.
* Wikibase Repo depends on WikibaseLib
* Wikibase Client depends on WikibaseLib
* Client and Repo do not depend on each other
From this follows:
* Using Repo code in Lib is not allowed
* Using Client code in Lib is not allowed
* Using Client code in Repo is not allowed
* Using Repo code in Client is not allowed
There is no good reason to violate these points. "But", you might say,
"there are valid use cases for doing so". And yeah, if you want to follow
the Big Ball of Mud design pattern, you'd be right. If you run into a case
where at first glance it looks like you need to violate one of these rules,
* your design is very likely flawed in some way - violating these rules is
a strong design smell
* the flaw might well be solved by applying inversion of control
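As a toy illustration of the inversion-of-control fix, with purely hypothetical names: instead of Lib calling into Repo (a forbidden upward dependency), Lib defines the interface and Repo hands Lib an implementation.

```python
# --- WikibaseLib (lower layer): owns the interface ---
class EntityLookup:
    def get_entity(self, entity_id):
        raise NotImplementedError


class ChangeNotifier:
    """Lib code depends only on the EntityLookup interface above."""

    def __init__(self, lookup):
        self._lookup = lookup

    def describe(self, entity_id):
        return "changed: %s" % self._lookup.get_entity(entity_id)


# --- Wikibase Repo (higher layer): provides the implementation ---
class RepoEntityLookup(EntityLookup):
    def get_entity(self, entity_id):
        return "entity " + entity_id  # would hit the repo's store


# The wiring happens in Repo, so the dependency still points downward.
notifier = ChangeNotifier(RepoEntityLookup())
```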
Rethinking your design might take some effort and time, especially if you already
committed to it by writing a pile of code. Laziness is however not an
excuse for creating design problems and ignoring things such as the acyclic
dependencies principle. To minimize the amount of work everyone needs to
do, including yourself, try to take these basic rules into account when
designing your code. They are not complicated, and the benefit well
outweighs the little effort.
I expect people to adhere to these rules when submitting code, and when
reviewing. If new violations keep being introduced, we'll have to look at
enforcing these things automatically. I do hope we can muster the little
discipline to avoid the need for this.
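The automatic enforcement mentioned above would not need to be fancy; a minimal sketch, assuming we just grep each component's sources for namespaces it must not reference (the patterns and layout here are assumptions, not an existing Wikibase tool):

```python
import re

# Namespaces each component is forbidden to reference.
FORBIDDEN = {
    "lib":    [r"Wikibase\\Repo\\", r"Wikibase\\Client\\"],
    "client": [r"Wikibase\\Repo\\"],
    "repo":   [r"Wikibase\\Client\\"],
}


def find_violations(component, source):
    """Return the forbidden namespace patterns found in one source file."""
    return [pattern for pattern in FORBIDDEN[component]
            if re.search(r"use\s+" + pattern, source)]
```

Hooked into CI, a non-empty result would simply fail the build before review even starts.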
Jeroen De Dauw
Don't panic. Don't be evil. ~=[,,_,,]:3
Just created a wmf-like deployment schedule for Wikidata, making the
"hardcoded" board in the office obsolete.
We should link to a page with a proper changelog, list of potential
breaking changes, new features, etc. for each deployment. I've linked it
to aude's "deployment highlights" page for now until someone comes up with
a better idea.
Software Developer - Wikidata - http://www.wikidata.org
Imagine a world, in which every single human being can freely
share in the sum of all knowledge. That's our commitment.
The EasyRdf submodule was updated to a new version to avoid
a bug with the serialization of integers in turtle format.
https://gerrit.wikimedia.org/r/#/c/86858/ has been merged
This will require you to run 'git submodule update' after
pulling master; otherwise the RDF unit tests will fail on master.
(crossposting from http://blog.wikimedia.de/?p=17250)
In early 2010 I met Denny and Markus for the first time in a small
room at the Karlsruhe Institute of Technology to talk about Semantic
MediaWiki, its development and its community. I was intrigued by the
idea they'd been pushing for since 2005 - bringing structured data to
Wikipedia. So when the time came to assemble the team for the
development of Wikidata and Denny approached me to do community
communications for it there was no way I could have said no. The
project sounded amazing and the timing was perfect since I was about
to finish my studies of computer science. In the one and a half years
since then we have achieved something amazing. We've built a great
technical base for Wikidata and much more importantly we've built an
amazing community around it. We've built the foundation for something
extraordinary. On a personal level I could never have dreamed where
this one meeting in a small room in Karlsruhe has taken me now.
From now on I will be taking over product ownership of Wikidata as its product manager.
Up until today we've built the foundation for something extraordinary.
But at the same time there are still a lot of things that need to be
worked on by all of us together. The areas that we need to focus on are:
* Building trust in our data. The project is still young and the
Wikipedia editors and others are still wary of using data from
Wikidata on a large scale. We need to build tools and processes to
make our data more trustworthy.
* Improving the user experience around Wikidata. Building Wikidata to
the point where it is today was a tremendous technical task that we
achieved in a rather short time. This, though, meant that in places the
user experience has not gotten as much attention. We need to make the
experience of using Wikidata smoother.
* Making Wikidata easier to understand. Wikidata is a very geeky and
technical project. However, to be truly successful, the ideas behind it
will need to be easy to grasp.
These are crucial for Wikidata to have the impact we all want it to
have. And we will all need to work on those - both in the development
team and in the rest of the Wikidata community.
Let's make Wikidata a joy to use and get it used in places and ways we
can't even imagine yet.
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata
Wikimedia Deutschland e.V.