We have been working for some months on a Wikidata project, and we have run into an issue with edit performance. I began with the Wikidata Java API, but when I tried to increase the edit rate the system throttled the edits and inserted delays, which reduced edit throughput as well.
I then tried editing with Pywikibot, but in my experience that reduced the edit rate even further.
In the end we used the procedure indicated here:
With multithreading, we reach a maximum of 10.6 edits per second.
My question is: does anyone have experience achieving a higher edit rate?
Currently we need to write 1,500,000 items, which at that rate is roughly 39 hours of continuous editing, or about 5 working days for the task.
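For reference, the multithreaded pattern we use looks roughly like this (a minimal sketch: `edit_item` is a stand-in for the real `wbeditentity` API call, which would need a session, CSRF token, and bot credentials, and the worker count is an assumption):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def edit_item(item_data):
    # Stand-in for the real wbeditentity POST request; the actual
    # call would perform the HTTP request and return the API result.
    return {"id": item_data["id"], "success": True}

def edit_all(items, workers=8):
    # Dispatch edits across a pool of worker threads; each worker
    # issues its API requests independently, so network latency on
    # one edit does not block the others.
    results = []
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(edit_item, item) for item in items]
        for future in as_completed(futures):
            results.append(future.result())
    return results

items = [{"id": f"Q{i}"} for i in range(100)]
results = edit_all(items)
print(len(results))  # 100
```

The overall rate is then bounded by whatever per-client throttling the server applies, not by the client's thread count.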
Senior Java Developer
(Semantic Web Developer)
In 2016 I wrote a small Android app that uses the Wikipedia Action API to search for articles near the user's current location.
Due to legal considerations I am currently trying to take down the app.
It’s not available any more in the Google PlayStore, but there are still installations out there.
That’s why I want to make these installations unusable by deactivating all backend services that the app uses.
Unfortunately the app (partially) communicates directly with Wikipedia servers and not via a proxy under my control.
The app sends a special User-Agent HTTP header with every request to identify itself:
tagorama/v184.108.40.2063-release (http://tagorama.rocks/; info(a)tagorama.rocks)
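For illustration, a server-side filter could recognize the app by a prefix match on that header (a hypothetical sketch only; I don't know what blocking mechanism Wikimedia actually uses):

```python
import re

# Hypothetical filter: match any release version of the app's
# User-Agent prefix, regardless of the version number.
APP_UA = re.compile(r"^tagorama/v[\d.]+-release\b")

def should_block(user_agent: str) -> bool:
    # Return True if the request comes from the app to be blocked.
    return bool(APP_UA.match(user_agent))

print(should_block("tagorama/v184.108.40.2063-release (http://tagorama.rocks/)"))  # True
print(should_block("Mozilla/5.0"))  # False
```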
Is there any way for you to block requests from this app?
Who would I contact?
Thanks for your help,