Hi all,
I am happy to announce the release of Wikidata Toolkit 0.5.0 [1], the Java library for programming with Wikidata and Wikibase.
The most prominent new feature of this release is Wikibase API support, which allows you to create Java programs that read and write data on Wikidata (or any other Wikibase site). Before making an edit, the API write functionality checks the live data, merging statements for you and detecting edit conflicts. New example programs illustrate this functionality. Overall, we think this will make WDTK interesting for bot authors.
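For a rough idea of how this looks in code, here is a minimal sketch of reading and editing via the new API support. The class and method names (ApiConnection, WikibaseDataFetcher, WikibaseDataEditor) are my recollection of the 0.5.0 wikibaseapi module and may not match exactly; check the JavaDocs [5] and the bundled example programs before relying on them, and note that the login credentials below are placeholders:

```java
// Sketch only: assumes the wdtk-wikibaseapi module of WDTK 0.5.0 is on the
// classpath; names should be verified against the JavaDocs [5].
import org.wikidata.wdtk.datamodel.interfaces.EntityDocument;
import org.wikidata.wdtk.wikibaseapi.ApiConnection;
import org.wikidata.wdtk.wikibaseapi.WikibaseDataEditor;
import org.wikidata.wdtk.wikibaseapi.WikibaseDataFetcher;

public class ApiSketch {
    public static void main(String[] args) throws Exception {
        // Connect to Wikidata.org; any other Wikibase API URL works too
        ApiConnection connection = ApiConnection.getWikidataApiConnection();

        // Read: fetch the current revision of an item
        WikibaseDataFetcher fetcher = new WikibaseDataFetcher(
                connection, "http://www.wikidata.org/entity/");
        EntityDocument doc = fetcher.getEntityDocument("Q42");
        System.out.println("Fetched: " + doc.getEntityId());

        // Write: log in first; the editor re-checks the live data so that
        // statements are merged and edit conflicts are detected
        connection.login("my-bot-account", "password"); // placeholder credentials
        WikibaseDataEditor editor = new WikibaseDataEditor(
                connection, "http://www.wikidata.org/entity/");
        // editor can then create or update entity documents
    }
}
```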
Other prominent features include:
* Unit support (just in time before it is enabled on Wikidata.org ;-)
* Processing of local dump files not downloaded from Wikimedia (useful for other Wikibase users)
* New builder classes to simplify construction of the rather complex data objects we have in Wikidata
* WorldMapProcessor example (the code used to build the Wikidata maps)
* Improved output file naming for examples, taking dump date into account
* Several improvements in RDF export (but the general RDF structure is as in 0.4.0; updating this to the new structure we have for the official SPARQL endpoint is planned for the next release)
Maven users can get the library directly from Maven Central (see [1]); this is the preferred method of installation. It might still take a moment until the new packages become visible in Maven Central. There is also an all-in-one JAR on GitHub [3], and of course the sources [4] and updated JavaDocs [5].
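A Maven dependency would then look roughly like this. The coordinates are an assumption on my part (the groupId and the per-module artifactIds such as wdtk-wikibaseapi should be confirmed on the project page [1] or on Maven Central):

```xml
<!-- Hypothetical coordinates; verify the exact artifactId on Maven Central -->
<dependency>
  <groupId>org.wikidata.wdtk</groupId>
  <artifactId>wdtk-wikibaseapi</artifactId>
  <version>0.5.0</version>
</dependency>
```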
Feedback is very welcome. Developers are also invited to contribute via GitHub.
Cheers,
Markus
[1] https://www.mediawiki.org/wiki/Wikidata_Toolkit [2] https://www.mediawiki.org/wiki/Wikidata_Toolkit/Client [3] https://github.com/Wikidata/Wikidata-Toolkit/releases [4] https://github.com/Wikidata/Wikidata-Toolkit/ [5] http://wikidata.github.io/Wikidata-Toolkit/
Hoi, it reads like a lot of work went into this. Congratulations!
I understand that it uses the live data and that it can use dumps from other installations :) Now how do I use it from Labs? Thanks, GerardM
On 1 September 2015 at 23:12, Markus Kroetzsch < markus.kroetzsch@tu-dresden.de> wrote:
-- Markus Kroetzsch Faculty of Computer Science Technische Universität Dresden +49 351 463 38486 http://korrekt.org/
Wikidata mailing list Wikidata@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata
On 02.09.2015 08:11, Gerard Meijssen wrote:
Hoi, it reads like a lot of work went into this. Congratulations!
I understand that it uses the live data and that it can use dumps from other installations :) Now how do I use it from Labs?
Labs has Java installed. You can just copy any Java application to your home directory there and run it. We are running a WDTK-based application on Labs to generate the RDF exports, for example.
Markus
On 02.09.2015 10:17, Markus Krötzsch wrote:
P.S. For computing-intensive jobs, you should of course not run the application directly from your home directory but submit it as a job. I guess Labs users know about this. Most bot-like applications won't need many resources (since they are network-I/O bound), but one should be considerate about these things when using Labs.
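On 2015-era Tool Labs, submitting to the grid is typically done with jsub; a hedged sketch (the job name, memory limit, and jar file below are placeholders for illustration, and the exact jsub options should be checked in the Tool Labs help pages):

```shell
# Submit the bot to the Labs grid instead of running it in the login shell.
# Job name, memory limit, and jar name are placeholders.
jsub -N my-wdtk-bot -mem 2g java -jar my-wdtk-bot.jar
```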
Markus
Hoi, so how do I get to use this tool on Labs, and what makes a job computing-intensive? After all, I need to learn a tool first before I know such distinctions. Thanks, GerardM
On 2 September 2015 at 10:20, Markus Krötzsch <markus@semantic-mediawiki.org> wrote:
On 02.09.2015 11:12, Gerard Meijssen wrote:
Hoi, so how do I get to use this tool on Labs, and what makes a job computing-intensive? After all, I need to learn a tool first before I know such distinctions.
Wikidata Toolkit is a programming library. You use it to develop software, for example a bot that corrects errors on Wikidata, or a tool that creates a world map from all items on Wikidata. You could compare it to tools like pywikibot or the Java Wiki Bot Framework.
To learn how to use such tools, it is best to look at example programs, and to ask specific questions if you get stuck. It also helps to have a rough idea of what you want to achieve.
Markus
Appreciating Wikidata's and your generativity. Congratulations on this great toolkit update.
Scott
It's present week around Wikidata ;-) All of the new shiny things. Congrats on the release!
Cheers, Lydia