Per the RfC a while back,[1] we've now removed the wddx and dump output
formats from the API. This is expected to be a very low-impact change, as
logs don't show anyone actually using these anymore.
This change should be deployed to WMF wikis with 1.26wmf13, see
https://www.mediawiki.org/wiki/MediaWiki_1.26/Roadmap for the schedule.
Note that the yaml, txt, and dbg output formats are scheduled for removal
in November 2015. Users of yaml should find it easy to switch to json, as
the output is identical.
[1]:
https://www.mediawiki.org/wiki/Requests_for_comment/Ditch_crappy_API_formats
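For clients still requesting one of the removed or soon-to-be-removed formats, the fix is usually a one-line change to the format parameter. A minimal sketch (the parameter names follow standard api.php conventions; the helper name is mine):

```python
def migrate_format(params):
    """Rewrite an api.php parameter dict so it avoids the removed
    (wddx, dump) and soon-to-be-removed (yaml, txt, dbg) formats.
    yaml output was identical to json, so json is a drop-in swap."""
    dead_formats = {"wddx", "dump", "yaml", "txt", "dbg"}
    if params.get("format") in dead_formats:
        params = dict(params, format="json")
    return params

print(migrate_format({"action": "query", "meta": "siteinfo", "format": "yaml"}))
```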
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
_______________________________________________
Mediawiki-api-announce mailing list
Mediawiki-api-announce(a)lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/mediawiki-api-announce
Note that formatversion=2 is still considered slightly experimental, in
that backwards-incompatible changes like this can still happen. If you see
any other changes like this that should be made, please report them!
Starting in 1.26wmf11, the following changes will be made in the output
from meta=siteinfo when formatversion=2 is in use:
- Output from siprop=namespaces will be an array, rather than an object
indexed by namespace number. See T102645.[1]
- The 'add', 'remove', 'add-self', and 'remove-self' subarrays in
siprop=usergroups will always be arrays, never objects with numeric keys.
- The (sub)arrays in the output from siprops restrictions,
extensiontags, functionhooks, variables, protocols, and showhooks are now
guaranteed to be arrays. Getting objects with numeric keys from these seems
to have been unlikely or impossible anyway, but now it's guaranteed.
[1]: https://phabricator.wikimedia.org/T102645
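Client code that indexed the siprop=namespaces output by namespace number can adapt by rebuilding that mapping from the new array form. A hedged sketch (the "id" field name matches recent formatversion=2 output, but check it against your actual responses; the helper name is mine):

```python
def namespaces_by_id(namespaces):
    """Normalize meta=siteinfo siprop=namespaces output to a dict keyed
    by namespace id, whether the API returned the old object form
    (stringified ids as keys) or the new formatversion=2 array."""
    if isinstance(namespaces, dict):
        return {int(k): ns for k, ns in namespaces.items()}
    return {ns["id"]: ns for ns in namespaces}

# Old shape (object indexed by namespace number) and new shape (array)
# normalize to the same mapping:
old = {"0": {"id": 0, "name": ""}, "1": {"id": 1, "name": "Talk"}}
new = [{"id": 0, "name": ""}, {"id": 1, "name": "Talk"}]
assert namespaces_by_id(old) == namespaces_by_id(new)
```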
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
Hi,
I was using the Wikimedia API to fetch the introductory content of articles.
But all of a sudden I get a "connection timed out" error, and only on the server: the same code works fine on localhost, and the URL alone works fine in a browser.
I have tried many other REST URLs; this behaviour happens only with the Wikipedia URL.
I have also posted a question on Stack Overflow: http://stackoverflow.com/questions/30915871/connection-timed-out-only-for-w…
Has anybody had a similar experience? Any ideas welcome.
With thanks,
Venkat
Hi, I just noticed that "captchaid" and "captchaword" were removed from the
"edit" API response some time ago (
https://phabricator.wikimedia.org/rSVN104064 ).
Are there no longer conditions where edits require captchas?
If not, we'll gladly remove our app's edit captcha handling code :)
-Monte
Hi,
The meta query siteinfo allows retrieving a list of magic words with basic
information about them (their aliases) [1]. Is there any way to retrieve more
information about them, especially how they are supposed to be used: which
ones should be used with {{#...}}, which ones are image parameters, and so on?
That would help a lot in coding an external parser for wikitext.
Thanks in advance
Nico
[1]
https://fr.wikipedia.org/w/api.php?continue=&siprop=magicwords&action=query…
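The query in question can be sketched as follows (a minimal request builder assuming the standard api.php endpoint; as noted above, the response lists magic words and their aliases but not how each one is used):

```python
from urllib.parse import urlencode

def magicwords_url(api="https://fr.wikipedia.org/w/api.php"):
    """Build the siteinfo request that returns the list of magic
    words and their aliases."""
    params = {
        "action": "query",
        "meta": "siteinfo",
        "siprop": "magicwords",
        "format": "json",
        "continue": "",
    }
    return api + "?" + urlencode(params)

print(magicwords_url())
```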
See the announcement at
https://lists.wikimedia.org/pipermail/wikimedia-l/2015-June/078214.html
For the avoidance of confusion: Yes, the move to HTTPS will affect API
requests as well. Your HTTP library should be handling HTTPS for you
transparently, but if for some reason it doesn't you may have to update
your client.
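For clients with hard-coded endpoint URLs, the update can be as small as rewriting the scheme. A minimal sketch (the helper name is mine):

```python
def force_https(url):
    """Rewrite a plain-HTTP API endpoint to HTTPS; URLs that are
    already HTTPS are returned unchanged."""
    prefix = "http://"
    if url.startswith(prefix):
        return "https://" + url[len(prefix):]
    return url

print(force_https("http://en.wikipedia.org/w/api.php"))
```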
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
On Wed, Jun 3, 2015 at 7:29 AM, John Mark Vandenberg <jayvdb(a)gmail.com>
wrote:
> If possible, could you compile a list of bots affected at a lower
> threshold - maybe 1,000. That will give us a better idea of the scale
> of bot operators that will be affected when this lands - currently in
> one month's time.
>
I already have the list of *accounts* affected: there are 510 with between
1000 and 10000 hits. Of those, 454 do not contain "bot" (case
insensitively), so they might be human users with user scripts, or AWB if
that's not fixed (someone please check!), or the like. For comparison, in
the over-10000 group there were 30 such that I filtered out.
I'll want to check with Legal to make sure the additional release of
account names is still compliant with the privacy policy (I'm almost but
not entirely sure it would be ok).
> Will the deploy date be moved back if the impact doesn't diminish by
> bots being fixed?
>
That's not impossible, but I wouldn't count on it.
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation
Dear Wiki API team,
I work for GlaxoSmithKline, a pharmaceutical company, and am currently preparing for a project we are planning to carry out using Wikipedia.
We will use our computational tools to assess the accuracy of the drug-related information found on Wikipedia, using the open database Open PHACTS as a "gold standard" for comparison.
On the drug pages of Wikipedia there is normally a data table on the right side with the drug/chemical/pharma-related information. Our plan, if it is possible to carry out, is to assess the accuracy of this information and, where necessary, correct/update it from our database. If time constraints allow, I would also like to automatically write some very basic articles on drugs which currently have no entry on Wikipedia.
My questions are the following: How do I obtain an API key? On the API home page I saw that I may need a special key to make so many queries. What are the limitations? Is it possible to carry out the process I have described, or should I find a different approach? Previously I thought about using SPARQL to query DBpedia, but I found that in the conversion process many of the strings which are important to us chemists (SMILES representations of chemical compounds) get changed because of the special characters.
Thank you very much for your response.
Best wishes,
Luca Bartek
Computational Scientist
Complementary Worker on Assignment at GSK
GSK
Medicines Research Centre, Gunnels Wood Road, Stevenage, Hertfordshire, SG1 2NY, UK
Email luca.x.bartek(a)gsk.com
Tel +44 1438 762 778
If you are using action=parse with prop=modules and not including either
jsconfigvars or encodedjsconfigvars in the prop parameter, you will start
receiving a warning suggesting that you do so.
Also, action=expandtemplates will now have prop=modules, jsconfigvars, and
encodedjsconfigvars available.
This change should be deployed to WMF wikis with 1.26wmf9, see
https://www.mediawiki.org/wiki/MediaWiki_1.26/Roadmap for the schedule.
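To avoid the new warning, requests that ask for modules can simply add jsconfigvars to the prop list. A hedged sketch of the request parameters (the helper name and the inclusion of text in the prop list are my choices):

```python
def parse_modules_params(page):
    """Parameters for action=parse requesting page modules together
    with their config vars, so the warning about a missing
    jsconfigvars/encodedjsconfigvars prop does not fire."""
    return {
        "action": "parse",
        "page": page,
        "prop": "text|modules|jsconfigvars",
        "format": "json",
    }

print(parse_modules_params("Main Page"))
```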
--
Brad Jorsch (Anomie)
Software Engineer
Wikimedia Foundation