Could we add some way for this query to get the combined edits of two
or more users on the same page? I know you can query several pages at
once but I'd want something like
&users=Tom|Dick|Harry&titles=The_weather_in_London if possible.
First post to the list. I've got a bunch of questions, and I hope this is
the right place to ask them.
I'm interested in the idea of wiki 'mirroring': updating a second wiki ('B')
periodically with content from wiki A. (There's of course some discussion of
this on the web, so I'm aware that there's been quite a bit of thinking on
this already, but I couldn't quite find the solution I was looking for.)
A first stab at mirroring would be to do a Special:Export on the whole of A,
and then do a Special:Import on B. But this becomes impractical for larger
wikis: Ideally, I just want to update what needs updating.
The best way to do this would probably be something like list=recentchanges
(going back to the date of last transfer). Of course this doesn't work,
because recent changes are periodically purged, so they cannot be used between
arbitrary dates. The log doesn't seem to record edits (is this correct?), so
this can't be used to get a list of changes between two arbitrary dates.
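(For reference, the kind of query I have in mind is roughly the following --
I'm writing the parameters from memory, so treat it as a sketch:

  api.php?action=query&list=recentchanges&rcstart=20090501000000&rcend=20090401000000&rcdir=older&rclimit=max&rcprop=title|ids|timestamp

i.e. everything between the last transfer and now, if only the entries were
kept that long.)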
So, question 1: Is it possible to get a list of all changes (including
edits) between two dates (in a single query)?
If one wanted the complete version history, then another way to do this
would be to get all revisions since the last transfer was made, i.e. something
like the per-page query sketched below (then transform the XML to
Special:Import format, and upload). Together with a
query of the log, this would give you all changes.
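(Concretely, I was thinking of something like the following per page -- again,
parameter names from memory, and Foo is just a placeholder title:

  api.php?action=query&titles=Foo&prop=revisions&rvstart=20090401000000&rvdir=newer&rvlimit=max&rvprop=ids|timestamp|user|comment|content

combined with list=logevents (lestart/leend) to pick up moves, deletions,
uploads and so on.)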
But suppose the wiki is very active or you don't have much bandwidth or you
simply don't want the whole version history, but just the latest versions
(since the last transfer). The only way I can see is to do something like
- 1. Fetch the list of namespaces
- 2. Get the list of revisions in each namespace
(action=query&prop=revisions&generator=allpages for each namespace; a rough sketch follows after this list)
- 3. See what needs updating, and then fetch all the changed pages.
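(A rough sketch of step 2, assuming I've remembered the parameter prefixes
correctly:

  api.php?action=query&generator=allpages&gapnamespace=0&gaplimit=max&prop=revisions&rvprop=ids|timestamp

run once per namespace number returned by meta=siteinfo&siprop=namespaces, and
then the returned timestamps compared against the date of the last transfer.)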
Question 2: Can you see a better way of doing this? Also, why won't
generator=allpages work across namespaces? (I guess there may be a reason why
that isn't possible to do easily.)
One way would be to try something like:
but this doesn't work.
So, my question 3: Do you know why this doesn't work? I assume there isn't
an efficient mysql query to accomplish this, or are there other reasons?
Finally, I guess I am wondering whether there are people actively interested
in discussing issues around wiki mirroring/synchronisation more. If so,
what's the best mailing list for this?
Sorry, the post got a bit longer than I expected - thanks for considering
All the best,
I've just started on a MediaWiki extension that uses the MW API for
bulk-editing articles. Unfortunately, for some reason, after making the call to
edit the pages, instead of seeing whatever I have written with
$wgOut->addHTML(), the browser gets redirected to the last page that was edited.
This is the code I am using to perform the edit, where $p is a result returned
from a previous Query API call (to get the list of pages that need editing.)
$req = new FauxRequest( array(
    'action' => 'edit',
    'bot' => true,
    'token' => $p['edittoken'],
    'title' => $p['title'],
    'summary' => $this->strSummary,
    'text' => $newContent,
    'basetimestamp' => $p['starttimestamp']
), true ); // second argument marks the faux request as POSTed
$processor = new ApiMain( $req, true ); // true = enable write actions
$processor->execute();
If I comment out the execute() line then I see my summary Special page, but
with execute() present I get pushed onto the last article edited instead
(although every edit does go through successfully.) The other API call (for
querying the page content) works fine; it's only the Edit call that seems to
exhibit this behaviour.
Does anyone have any idea what's going on here? I'm running MW 1.14.0.
I'm trying to use Special:Export to export certain revisions of pages.
I've tried the code written in these pages
It seems I can only get either the most current revision, or all of them.
I need to take exactly some particular set of consecutive revisions
starting from a date or an ID number.
Is that possible ?
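(To make it concrete, what I'm after is something along the lines of the
query below, where Foo and 12345 are just placeholders and I may well have
the parameter names wrong:

  api.php?action=query&titles=Foo&prop=revisions&rvstartid=12345&rvdir=newer&rvlimit=10

but ideally returning the same XML that Special:Export produces rather than
the normal API result.)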
I've also taken a look at the page here describing the MediaWiki API, where
the export parameters are described as:
  export - Export the current revisions of all given or generated pages
  exportnowrap - Return the export XML without wrapping it in an XML
    result (same format as Special:Export). Can only be used with export.
This again seems to emphasize that it's not possible.
Is it true that it's not possible to get at particular revisions
inside an article through the API?
Why not allow arbitrary SQL queries on most of the database tables?
Let's see, only a few tables, like the user table, have much confidential
information, and even then only a few columns.
So api.php could drop its read privileges for (parts of?) that table
before running any queries.
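(Or, if dropping privileges on the fly turns out to be awkward, a dedicated
restricted account would give the same effect -- the account and database
names below are just for illustration:

  CREATE USER 'apiread'@'localhost' IDENTIFIED BY 'secret';
  GRANT SELECT ON wikidb.page TO 'apiread'@'localhost';
  GRANT SELECT ON wikidb.revision TO 'apiread'@'localhost';
  GRANT SELECT ON wikidb.externallinks TO 'apiread'@'localhost';
  -- and so on for the harmless tables, but no grant at all on wikidb.user

api.php would then run any submitted query as that account.)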
There comes a time when all websites should check for link lint.
OK, so I need a list of external links that are present in my wikis.
$ echo "SELECT DISTINCT el_to FROM wiki_externallinks ORDER BY el_to;"|
mysql -B my_database
gets them all for me with one command.
Can api.php get all the external links, for all namespaces, all in one shot?
Or can Special:Linksearch get them all in one shot?
The sysop could also customize what tables/columns to restrict, and
how many rows to output. Also set the total row output limit too.
No need for only allowing SELECT, as api.php would first drop all
privileges other than read-only ones, including the privilege to
GRANT its privileges back to itself... No need to even filter against
SQL injection attacks (but as I don't even know how to spell SQL,
don't quote me on that.)
Anyway, being able to do arbitrary SQL would greatly simplify many
api.php queries. Let's see, for the URL perhaps use:
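Something in this spirit, with the action name entirely made up:

  api.php?action=sql&query=select+distinct+el_to+from+externallinks+order+by+el_to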
(maybe use no CAPS in the examples to "sell the ease of the idea".)