[MediaWiki-l] [Cargo] Duplicate Cargo Rows
yaron at wikiworks.com
Sun Dec 13 13:39:48 UTC 2015
Yes, it probably is due to a race condition.
Hopefully, there's a better solution than using unbuffered queries - if
that is a solution.
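One alternative worth noting (not proposed in this thread, just an illustration): rather than fighting the race at the query-buffering level, enforce uniqueness in the table itself so a re-queued job cannot insert a second copy. A minimal sketch with SQLite, using a hypothetical `cargo_pages` table keyed by page id:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE cargo_pages (
        page_id INTEGER PRIMARY KEY,  -- unique key blocks duplicate rows
        data    TEXT
    )
""")

def store_row(conn, page_id, data):
    # INSERT OR REPLACE is idempotent: if a second job for the same page
    # runs after a re-edit, it overwrites the row instead of duplicating it.
    conn.execute(
        "INSERT OR REPLACE INTO cargo_pages (page_id, data) VALUES (?, ?)",
        (page_id, data),
    )
    conn.commit()

store_row(conn, 1, "first write")
store_row(conn, 1, "second write from a re-queued job")
row = conn.execute("SELECT COUNT(*), data FROM cargo_pages").fetchone()
print(row)  # one row remains, holding the latest data
```

The table name and schema here are made up for the example; the point is only that a primary or unique key makes the second insert harmless regardless of job-queue timing.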
On Sun, Dec 13, 2015 at 12:51 AM, Ed <edward.hoo at gmail.com> wrote:
> The duplicate rows seem to result from a race condition: a new record has
> been queued, and somewhere in the process we re-edit or regenerate the page
> and queue a second one.
> I'm not sure if it is buffering, using multiple connections, or something else.
> I was going to mess around with the DBO_NOBUFFER setting but the comments
> in "includes/db/Database.php" are a bit ominous:
> * Unbuffered queries are very troublesome in MySQL:
> *   - If another query is executed while the first query is being read
> *     out, the first query is killed. This means you can't call
> *     MediaWiki functions while you are reading an unbuffered query result
> *     from a normal wfGetDB() connection.
> *   - Unbuffered queries cause the MySQL server to use large amounts of
> *     memory and to hold broad locks which block other queries.
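The restriction that comment warns about can be modeled in a few lines (a toy sketch, not MediaWiki or MySQL code): with unbuffered results, issuing a second query on the same connection invalidates the first result set mid-read.

```python
class Connection:
    """Toy stand-in for a single DB connection."""
    def __init__(self):
        self.active = None  # the result set currently being streamed, if any

class UnbufferedResult:
    def __init__(self, rows, conn):
        self.rows = iter(rows)
        self.conn = conn
        self.killed = False

    def fetch(self):
        if self.killed:
            raise RuntimeError("result was killed by a later query on this connection")
        return next(self.rows, None)

class UnbufferedCursor:
    def __init__(self, conn):
        self.conn = conn

    def execute(self, rows):
        # Starting a new query while an unbuffered result is still being
        # read kills that earlier result -- the pitfall the comment describes.
        if self.conn.active is not None:
            self.conn.active.killed = True
        result = UnbufferedResult(rows, self.conn)
        self.conn.active = result
        return result

conn = Connection()
cur = UnbufferedCursor(conn)
r1 = cur.execute([1, 2, 3])
print(r1.fetch())       # reads the first row fine
r2 = cur.execute([9])   # second query on the same connection kills r1
try:
    r1.fetch()
except RuntimeError as e:
    print("first query:", e)
```

A buffered query avoids this by copying the whole result set to the client up front, at the cost of memory, which is why it is the default despite the server-side memory and locking downsides of the unbuffered mode.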