[Labs-l] Performance of crosswiki query tools

Jesse (Pathoschild) pathoschild at gmail.com
Sun Sep 1 18:00:47 UTC 2013


2013/9/1 Paul Selitskas <p.selitskas at gmail.com>:
> Have you tried using several concurrent db connections instead of one
> connection & `use` statements? I'm just curious how costly it is (cpu/mem).

Opening a new connection to each wiki takes ≈70 seconds. It seems
`use` statements are still much faster than new connections.
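
Roughly, the difference is between the two approaches sketched below. This
is a simplified sketch with placeholder hosts, credentials, and wiki lists,
not the actual tool code:

    <?php
    $user = 'example';
    $password = 'example';
    $sql = 'SELECT user_name FROM user LIMIT 1';

    // Approach 1: a new connection per wiki (the slow case).
    $wikiHosts = ['enwiki.labsdb' => 'enwiki_p', 'dewiki.labsdb' => 'dewiki_p'];
    foreach ($wikiHosts as $host => $db) {
        $pdo = new PDO("mysql:host=$host;dbname=$db", $user, $password);
        $rows = $pdo->query($sql)->fetchAll(PDO::FETCH_ASSOC);
    }

    // Approach 2: one connection per slice, then `use` to switch wikis.
    $sliceDbs = ['s1.labsdb' => ['enwiki_p'], 's2.labsdb' => ['dewiki_p']];
    foreach ($sliceDbs as $host => $dbs) {
        $pdo = new PDO("mysql:host=$host", $user, $password);
        foreach ($dbs as $db) {
            $pdo->exec("USE $db");
            $rows = $pdo->query($sql)->fetchAll(PDO::FETCH_ASSOC);
        }
    }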


2013/9/1 Marc A. Pelletier <marc at uberbox.org>:
> the command line client takes 2-3 seconds when it
> connects to a database to fetch table names and schemas -- perhaps your
> library is doing something comparable behind the scenes?

I stripped the code down to PHP's PDO. A quick bit of research didn't
find any indication that PDO does that.
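
A bare PDO connection and query looks something like this (a sketch with a
placeholder host and credentials), so there isn't much room for hidden
schema fetching:

    <?php
    // Nothing here asks for table names or schemas.
    $pdo = new PDO('mysql:host=s1.labsdb', 'example', 'example');
    $rows = $pdo->query('SELECT 1')->fetchAll(PDO::FETCH_ASSOC);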


2013-09-01 Marc A. Pelletier <marc at uberbox.org>:
> One thing that might slow things down for you is that, at this time,
> labs is in a /different/ datacenter from where the replicas are. [...]
> At least one dev who had a performance issue with lazily fetching a
> result one row at a time found a very significant boost simply by using
> fetch_many instead.

I'm guessing that's the cause. The queries already use fetch_many, but
there's a separate query for each wiki. I could theoretically generate
per-slice SQL queries that UNION every wiki, but I'd prefer to just
wait until the infrastructure is combined instead. I'll keep an eye on
the mailing list to know when that's done. Thanks. :)
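
(For the record, a generated per-slice query would look something like this
sketch; the slice-to-database map and the queried fields are just examples,
not the real layout:)

    <?php
    $user = 'example';
    $password = 'example';
    $targetUser = 'Example user';
    $dbsBySlice = ['s1.labsdb' => ['enwiki_p'], 's2.labsdb' => ['dewiki_p', 'frwiki_p']];

    foreach ($dbsBySlice as $host => $dbs) {
        $pdo = new PDO("mysql:host=$host", $user, $password);
        $name = $pdo->quote($targetUser);

        // Build one UNION ALL query covering every wiki on this slice.
        $parts = [];
        foreach ($dbs as $db) {
            $parts[] = "SELECT '$db' AS wiki, user_editcount"
                     . " FROM $db.user WHERE user_name = $name";
        }

        $rows = $pdo->query(implode(' UNION ALL ', $parts))->fetchAll(PDO::FETCH_ASSOC);
    }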

-- 
Jesse Plamondon-Willard (Pathoschild)


