Hello,
regarding the current discussion of this French radio transmission station, I looked up the article on Wikipedia. On enwiki it says there is only a French article ([1]) on this topic, but Wikidata ([2]) shows a German one as well, which is also listed on frwiki.
So, is this a bug, or will it simply take a while until the page is re-rendered on the Squid cache?
[1] https://en.wikipedia.org/wiki/Military_radio_station_of_Pierre-sur-Haute [2] https://www.wikidata.org/wiki/Q10369016
Cheers,
Marco
On Sat, Apr 6, 2013 at 2:55 PM, Marco Fleckinger marco.fleckinger@wikipedia.at wrote:
> So, is this a bug or will it simply take a while, until the page will be rerendered on the squid?
Hey :)
A purge of the page (adding ?action=purge to the URL) fixes it. The issue is that the dispatch lag is too large at the moment: it should ideally be only a few minutes, but it seems the system is still catching up on the massive bot run from a few days ago: http://www.wikidata.org/wiki/Special:DispatchStats Once the lag is back down to a few minutes, pages get purged automatically and links show up almost immediately.
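For anyone scripting this, the purge step is easy to automate. A minimal sketch, assuming the standard Wikimedia URL layout (the `purge_url` helper is invented here for illustration, not part of any client library):

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def purge_url(page_url: str) -> str:
    """Rewrite a /wiki/Title page URL into its index.php?action=purge form."""
    parts = urlsplit(page_url)
    title = parts.path.rsplit("/", 1)[-1]  # last path segment is the page title
    query = urlencode({"title": title, "action": "purge"})
    return urlunsplit((parts.scheme, parts.netloc, "/w/index.php", query, ""))

print(purge_url("https://en.wikipedia.org/wiki/Military_radio_station_of_Pierre-sur-Haute"))
# → https://en.wikipedia.org/w/index.php?title=Military_radio_station_of_Pierre-sur-Haute&action=purge
```

On Wikimedia wikis simply appending ?action=purge to the /wiki/ URL works as well; the index.php form above is just the canonical equivalent.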
Cheers Lydia
-- Lydia Pintscher - http://about.me/lydia.pintscher Community Communications for Wikidata
Wikimedia Deutschland e.V. Obentrautstr. 72 10963 Berlin www.wikimedia.de
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht Berlin-Charlottenburg under number 23855 Nz. Recognized as charitable by the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.
Hey
On 04/06/2013 03:16 PM, Lydia Pintscher wrote:
> On Sat, Apr 6, 2013 at 2:55 PM, Marco Fleckinger marco.fleckinger@wikipedia.at wrote:
>> So, is this a bug or will it simply take a while, until the page will be rerendered on the squid?
> A purge of the page (adding ?action=purge to the URL) fixes it. [...]
This was the kind of solution I also thought of. I didn't do it because I was unsure whether it was a bug.
Cheers
Marco
On Apr 6, 2013 9:34 AM, "Marco Fleckinger" marco.fleckinger@wikipedia.at wrote:
> This was kind of solution I also thought of. I didn't do it, because I
> was unsure if it's a bug.
Good. Better to let the people you're reporting the bug to see it for themselves firsthand.
-Jeremy
As the dispatch lag is growing again (nearly one day), should Wikidata maybe stop all bot work for a day and discuss stricter rules for bots? Or will there be a software solution to handle all edits on Wikidata?
Sk!d
*Severin Wünsch*
Wikidata-l mailing list Wikidata-l@lists.wikimedia.org https://lists.wikimedia.org/mailman/listinfo/wikidata-l
On Sun, Apr 7, 2013 at 9:43 PM, swuensch swuensch@gmail.com wrote:
> As the dispatch lag is getting bigger again (nearly one day) should wikidata maybe stop all botwork for one day an discuss stricter rules for bots? Or will there be software solution to handle all edits on wikidata?
For the short term it might be a good idea to get the lag down again... Long term of course we need to be able to deal with that.
Cheers Lydia
Lydia Pintscher, 07/04/2013 23:59:
> On Sun, Apr 7, 2013 at 9:43 PM, swuensch swuensch@gmail.com wrote:
>> As the dispatch lag is getting bigger again (nearly one day) should wikidata maybe stop all botwork for one day [...]
> For the short term it might be a good idea to get the lag down again... Long term of course we need to be able to deal with that.
The bot was blocked, and the backlog still grows by 400,000 items every 24 hours, so I doubt that's the point.
Nemo
On Mon, Apr 8, 2013 at 12:13 AM, Federico Leva (Nemo) nemowiki@gmail.com wrote:
> Lydia Pintscher, 07/04/2013 23:59:
>> For the short term it might be a good idea to get the lag down again... Long term of course we need to be able to deal with that.
> The bot was blocked and backlog grows by 400.000 items every 24h, so I doubt that's the point.
It is no longer blocked afaik. And there are more bots :)
Cheers Lydia
Lydia Pintscher, 08/04/2013 00:14:
>> The bot was blocked and backlog grows by 400.000 items every 24h, so I doubt that's the point.
> It is no longer blocked afaik. And there are more bots :)
But they're not editing faster than in the last few months, so something is broken. There was that change to preserve en.wiki's DB, after all; it's not unrealistic that things slowed down.
Nemo
On Sun, Apr 7, 2013 at 2:43 PM, swuensch swuensch@gmail.com wrote:
> As the dispatch lag is getting bigger again (nearly one day) should wikidata maybe stop all botwork for one day an discuss stricter rules for bots? Or will there be software solution to handle all edits on wikidata?
That is not at all a workable solution. What about the bots that are currently updating sitelinks for moved pages? Or the ones adding missing interwikis that were never fully in the circle?
We expect to deploy phase 2, yet most of the data hasn't even been imported, simply because we don't have enough bots running fast enough. My bot alone already has over 300k claims queued and waiting to go, but it can't edit faster than 1 claim/sec (bandwidth, etc.).
The solution here has to be on the software side. No single bot is the cause (though some *cough* have made it worse), and shutting them down is just not a solution.
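For what it's worth, the 1 claim/sec pacing amounts to a tiny client-side throttle. A hypothetical sketch (real frameworks such as pywikibot ship their own throttling; the clock and sleep are injectable so it can be simulated):

```python
import time

class Throttle:
    """Client-side rate limiter: at most one edit per `interval` seconds."""

    def __init__(self, interval=1.0, clock=time.monotonic, sleep=time.sleep):
        self.interval = interval
        self.clock = clock
        self.sleep = sleep
        self._last = None  # timestamp of the previous edit, if any

    def wait(self):
        """Block until at least `interval` seconds have passed since the last edit."""
        now = self.clock()
        if self._last is not None:
            remaining = self.interval - (now - self._last)
            if remaining > 0:
                self.sleep(remaining)
                now = self.clock()
        self._last = now
```

A bot would call `throttle.wait()` before each claim; the first call returns immediately, later calls pad out the gap to one second.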
-- Legoktm http://enwp.org/User:Legoktm
The problem just wasn't visible before; nothing suddenly broke.
Last Sunday I counted 600 edits/min in one interval, and a lot of those fanned out across several client sites. If I remember correctly, Daniel said a week or two ago that we could handle about 7k changes per minute. If our present edits are split across individual sites, then it is not unlikely we are simply way above what we can handle.
The simplest way to solve this is to use a queuing order that works for the bots.
Implement a maxlag for change dispatching and set it to 5 minutes by default for bots, or something like that. All editing modules should check the maxlag. A bot that hits the maxlag should incrementally add some delay to its edits and slow down; a successful edit allows it to decrease the delay again. This is slightly different from how other modules handle it. All bots would then adapt to the maximum throughput with an acceptable lag.
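A rough sketch of what that bot-side behaviour could look like (all names, factors, and thresholds here are invented for illustration; nothing in the actual software implements this):

```python
import time

class MaxlagBackoff:
    """Adaptive per-bot edit delay: grow on maxlag rejections, shrink on success."""

    def __init__(self, min_delay=1.0, max_delay=300.0, grow=2.0, shrink=0.9):
        self.delay = min_delay
        self.min_delay = min_delay
        self.max_delay = max_delay
        self.grow = grow
        self.shrink = shrink

    def on_maxlag(self):
        # Server reported dispatch lag above the bot's maxlag: back off.
        self.delay = min(self.delay * self.grow, self.max_delay)

    def on_success(self):
        # Edit went through: cautiously speed back up.
        self.delay = max(self.delay * self.shrink, self.min_delay)

def run_edits(edits, try_edit, backoff, sleep=time.sleep):
    """Attempt each edit, retrying after a pause whenever maxlag is hit."""
    for edit in edits:
        while not try_edit(edit):   # try_edit returns False on a maxlag error
            backoff.on_maxlag()
            sleep(backoff.delay)
        backoff.on_success()
```

With every bot running something like this, the fleet as a whole settles at whatever edit rate keeps the lag under the threshold.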
An alternative is to not service an edit request until the delay (or some delay) has been imposed server-side. There are several variations.
One way to enforce the delay is to serve back a special token, a waiting ticket, which the bot can then use to get its request handled. Without the token the bot would wait forever.
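A toy sketch of that waiting-ticket variant, with the lag probe and the clock stubbed out so the behaviour can be simulated (entirely hypothetical; no such API exists in the actual software):

```python
import itertools
import time

class TicketDispatcher:
    """Server-side sketch: while lag is high, edit requests get a ticket
    with an earliest-service time instead of being processed immediately."""

    def __init__(self, lag_fn, max_lag=300.0, delay=60.0, clock=time.monotonic):
        self.lag_fn = lag_fn      # returns current dispatch lag in seconds
        self.max_lag = max_lag
        self.delay = delay        # enforced wait per ticket
        self.clock = clock
        self._ids = itertools.count(1)
        self._tickets = {}        # ticket id -> earliest service time

    def request_edit(self, ticket=None):
        """Return ('ok', None) if served, or ('wait', ticket_id) otherwise."""
        now = self.clock()
        if ticket is not None:
            # Only honour the ticket once its wait has elapsed.
            ready_at = self._tickets.get(ticket)
            if ready_at is not None and now >= ready_at:
                del self._tickets[ticket]
                return ("ok", None)
            return ("wait", ticket)
        if self.lag_fn() <= self.max_lag:
            return ("ok", None)
        # Lag too high: hand out a ticket; a client without one waits forever.
        tid = next(self._ids)
        self._tickets[tid] = now + self.delay
        return ("wait", tid)
```

A well-behaved client would hold on to the ticket and retry after the delay; impatient retries without it never get served.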
If more iron is added later, the throughput will increase and the maxlag will be hit less and less often.
Hello Marco,
Here I see a lot of articles in the ''Other languages'' section. Most likely an older version of the page is shown due to caching somewhere. If not all languages are shown, purging the page (go to the history page and change "=history" to "=purge" in the URL) usually helps.
Greetings, Romaine
--- On Sat, 4/6/13, Marco Fleckinger marco.fleckinger@wikipedia.at wrote:
From: Marco Fleckinger marco.fleckinger@wikipedia.at
Subject: [Wikidata-l] Interlanguage-link-bug?
To: "Discussion list for the Wikidata project." Wikidata-l@lists.wikimedia.org
Date: Saturday, April 6, 2013, 12:55 PM

> Hello,
> regarding to the actual discussion of this French Radio Transmission station [...] is this a bug or will it simply take a while, until the page will be rerendered on the squid?
> [1] https://en.wikipedia.org/wiki/Military_radio_station_of_Pierre-sur-Haute [2] https://www.wikidata.org/wiki/Q10369016
> Cheers,
> Marco