-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
so, it's quite common that people want various kinds of reports on wikis, which are usually generated by a standard SQL query. in fact, this is so common that we have the "query service" to handle these requests.
this isn't the most efficient way to do it, though; a toolserver user has to run the actual query, and if it's meant to be a regular report, they need to crontab it to run as needed. there's a lot of duplication of effort.
instead, what about a collaborative 'report' tool? i would envisage this working as follows: each query (for example, '100 users with most edits') is described in a file. the exact format isn't that important, but for the sake of example, the file might look like this:
name=Top 100 editors
# For a slow query, run it once a day from crontab. Faster queries could be
# done on demand (for example, 'when=immediately'), or cached for a certain
# period of time (for example, 'cache=1h').
when=nightly
query=SELECT ...
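parsing files like that is trivial; a sketch of what the framework might do (hypothetical code, nothing exists yet -- the key=value-with-'#'-comments format is just the example above, and the comment handling here is deliberately naive):

```python
# sketch of a parser for the report description files (hypothetical
# format: key=value lines, '#' starts a comment)
def parse_report(text):
    report = {}
    for line in text.splitlines():
        # naive comment stripping: everything after '#' is dropped,
        # so a query containing '#' would need escaping in a real tool
        line = line.split("#", 1)[0].strip()
        if not line:
            continue
        key, _, value = line.partition("=")
        report[key.strip()] = value.strip()
    return report

desc = """\
name=Top 100 editors
# run nightly; the query is slow
when=nightly
query=SELECT user_name FROM user ORDER BY user_editcount DESC LIMIT 100
"""
print(parse_report(desc)["when"])  # nightly
```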
then for users, there's a web interface where they can select the wiki and the report they want to run. for queries that need parameters (e.g. those that report on a particular article), they could select the article (preferably with a nice ajaxy input box). then the SQL might look like:
SELECT ... WHERE page_namespace=<NAMESPACE> AND page_title=<TITLE>
<NAMESPACE> and <TITLE> would be filled in by the report generator.
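a sketch of how the report generator might do that substitution (hypothetical code; it uses sqlite3 only to keep the demo self-contained -- the real tool would talk to the wiki databases). the important point is that the placeholders become DB-API parameters, so user-supplied values are escaped by the driver rather than pasted into the SQL text:

```python
# sketch of placeholder substitution in the report generator
# (hypothetical; sqlite3 is used purely for a self-contained demo)
import re
import sqlite3

def run_report(conn, template, params):
    """Replace <NAME> placeholders with DB-API '?' parameters so the
    driver escapes user input, then run the query."""
    names = re.findall(r"<([A-Z_]+)>", template)
    sql = re.sub(r"<[A-Z_]+>", "?", template)
    args = [params[n] for n in names]
    return conn.execute(sql, args).fetchall()

# demo with a tiny fake 'page' table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE page (page_namespace INTEGER, page_title TEXT)")
conn.execute("INSERT INTO page VALUES (0, 'Foo'), (1, 'Foo'), (0, 'Bar')")

rows = run_report(
    conn,
    "SELECT page_namespace, page_title FROM page "
    "WHERE page_namespace=<NAMESPACE> AND page_title=<TITLE>",
    {"NAMESPACE": 0, "TITLE": "Foo"},
)
print(rows)  # [(0, 'Foo')]
```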
the web interface would display the result of the query in a nice, easy-to-read table (and probably with some kind of XML or CSV export feature). any project developer would be able to add new queries, which users could request in JIRA.
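the CSV export side of that is only a few lines with python's csv module; something like this (hypothetical function name, example column names):

```python
# sketch of the CSV export feature (hypothetical function and
# example column names)
import csv
import io

def rows_to_csv(header, rows):
    """Render query results as a CSV string for download."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)
    writer.writerows(rows)
    return buf.getvalue()

csv_text = rows_to_csv(
    ["user_name", "user_editcount"],
    [("Alice", 1234), ("Bob", 987)],
)
print(csv_text)
```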
as the project would have many developers, i envisage it running on the stable server. if people are actually interested in doing this, i'd be willing to create at least the basic framework (hopefully the interface would have several maintainers who would add nice features).
opinions?
- river.