Per the below, protocol-relative URLs are now enabled on
test.wikipedia.org and will be rolled out to the rest of the wikis
over the course of the next few weeks. What this means is that URLs
used in the interface will now look like //example.com instead of
http://example.com, so we can support both HTTP and HTTPS without
splitting our cache.
The API, in most cases, will not output protocol-relative URLs, but
will continue to output http:// URLs no matter whether you call it
over HTTP or HTTPS. This is because we don't expect API clients to be
able to resolve these correctly, and because the context of these URLs
(which is needed to resolve them) will frequently get lost along the
way. And we don't wanna go breaking clients, now, do we? :)
The exceptions to this, as far as I am aware, are:
* HTML produced by the parser will have protocol-relative URLs in <a
href="..."> tags etc.
* prop=extlinks and list=exturlusage will output URLs verbatim as they
appear in the article, which means they may output protocol-relative
URLs (see the sketch below this list)
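For API clients, here is a minimal sketch of how to detect and normalize protocol-relative URLs coming out of prop=extlinks. It assumes the usual JSON output of that module, where each link's URL sits under the "*" key, and uses test.wikipedia.org purely as an example endpoint.
<?php
// Sketch: fetch external links for a page and normalize any
// protocol-relative URLs ("//example.com") to an explicit scheme.
// Assumes the standard prop=extlinks JSON output, with each URL under "*".
$url = "http://test.wikipedia.org/w/api.php?action=query&prop=extlinks"
     . "&titles=" . rawurlencode("Main Page") . "&format=json";
$data = json_decode(file_get_contents($url), true);
foreach ($data['query']['pages'] as $page) {
    if (empty($page['extlinks'])) {
        continue;
    }
    foreach ($page['extlinks'] as $link) {
        $extUrl = $link['*'];
        if (substr($extUrl, 0, 2) === '//') {
            $extUrl = 'http:' . $extUrl;  // pick a scheme explicitly
        }
        echo $extUrl . "\n";
    }
}
?>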
If you are getting protocol-relative URLs in some other place, that's
probably a bug (or maybe it's intentional and I forgot to list it
here), so please let me know, e-mail this list, or file a bug if you
see that happening.
Roan Kattouw (Catrope)
---------- Forwarded message ----------
From: Ryan Lane <rlane32(a)gmail.com>
Date: Thu, Jul 14, 2011 at 8:55 PM
Subject: [Wikitech-l] Protocol-relative URLs enabled on test.wikipedia.org
To: Wikimedia developers <wikitech-l(a)lists.wikimedia.org>
Over the past couple of days Roan Kattouw and I have been pushing out
changes to enable protocol-relative URL support. We've gotten to a
point where we think it is stable and working.
We've enabled this on test.wikipedia.org, and plan on running it for
two weeks before enabling it elsewhere. Please test whether everything is
working properly, especially with regard to the API and bots. Report
any bugs you find in Bugzilla.
- Ryan
Hello, I'm trying hard to get the main content of a page through the API,
printed as a single page.
The point is that even though I'm not intentionally printing the output, I get
the whole array with the title etc., but I only want the HTML, nothing more.
This is the script I wrote:
<?php
$ch = curl_init();
$url = "http://www.wecowi.de/api.php?action=parse&page=Hilarie_Burton&prop=text&format=txt";
curl_setopt($ch, CURLOPT_URL, $url);
curl_exec($ch);
$textarray = explode( $ch , "=>" );
curl_close($ch);
?>
This prints the whole array even though there is no print statement.
Is there any option to get only the HTML, or which of the output formats is
best for getting a result that can be separated out and used for such a task?
Thanks
DaSch
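A minimal sketch of one way to get only the HTML: request format=json (the parsed HTML is under parse.text["*"]) and set CURLOPT_RETURNTRANSFER so curl_exec() returns the response instead of printing it.
<?php
// Sketch: fetch only the parsed HTML of a page via action=parse.
$ch = curl_init();
$url = "http://www.wecowi.de/api.php?action=parse&page=Hilarie_Burton"
     . "&prop=text&format=json";
curl_setopt($ch, CURLOPT_URL, $url);
// Return the response as a string instead of printing it directly.
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);

$data = json_decode($response, true);
echo $data['parse']['text']['*'];  // only the rendered HTML, nothing else
?>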
Hi All,
We are trying to get image attribution from MediaWiki. Is there a way to get image attribution without doing page scraping (URL: http://commons.wikimedia.org/wiki/File:Madonna_by_David_Shankbone.jpg )?
In case page scraping is the only way and someone has already written code to scrape attribution, that would be very helpful.
Thanks in advance!
Param
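One possible no-scraping approach, sketched below, is prop=imageinfo. Whether the extmetadata property (which carries Artist, Credit and LicenseShortName fields) is available depends on the wiki and its MediaWiki version, so treat that part as an assumption; iiprop=user|url is more widely supported.
<?php
// Sketch: query attribution-related data for a file via prop=imageinfo.
// iiprop=extmetadata is assumed to be supported by the target wiki.
$title = "File:Madonna_by_David_Shankbone.jpg";
$url = "http://commons.wikimedia.org/w/api.php?action=query&prop=imageinfo"
     . "&iiprop=user|url|extmetadata&titles=" . rawurlencode($title)
     . "&format=json";
$data = json_decode(file_get_contents($url), true);
foreach ($data['query']['pages'] as $page) {
    $info = $page['imageinfo'][0];
    echo "Uploader: " . $info['user'] . "\n";
    if (isset($info['extmetadata']['Artist']['value'])) {
        echo "Artist: " . $info['extmetadata']['Artist']['value'] . "\n";
    }
}
?>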
Hi everybody,
I'm asking here for help; I hope this is the right place. I searched the web
for my problem but without success so far.
I have installed MediaWiki and use api.php from a ksh script with curl to
update my wiki.
It works fine except for accents.
I have run many tests but none of them succeeded.
An example of the problem: I get this character.
"Cette page est mise � jour automatiquement"
I have no problem with accents when I update the page normally.
Do you have an idea about my problem? Charset?
Here is the curl request I send:
curl --dump-header headers.txt --progress-bar -g \
  --header "Accept-Language: fr;Content-Type: application/x-www-form-urlencoded; charset=utf-8;Connection: Keep-Alive" \
  --user-agent "test/upload" -b ${COOKIES_FILE} -c ${COOKIES_FILE} \
  -d "format=xml&action=edit&title=${PAGE2EDIT}&text=$(cat ${WIKITABLE2IMPORT})&token=${EDIT_TOKEN}" \
  ${WIKI_URL}/api.php
The wiki is not visible on the Internet; it's intranet-only. ;o(
Sorry for my poor English. I am French and have a lot of problems with these
accents!
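A likely cause of this symptom is that the page text is POSTed without percent-encoding (or that the file isn't actually UTF-8), so the multibyte accented characters get mangled in transit; on the curl side, --data-urlencode (instead of -d with raw text) is the usual fix. Below is a minimal PHP sketch of the same idea, where every parameter is percent-encoded before being sent; the wiki URL, page title, token and file name are placeholders, and login/cookie handling is omitted.
<?php
// Sketch: POST an edit with properly percent-encoded UTF-8 text.
// $wikiUrl, $editToken, the page title and the file name are placeholders.
$wikiUrl   = "http://intranet.example/wiki";     // placeholder
$editToken = "PLACEHOLDER_TOKEN+\\";
$text      = file_get_contents("wikitable.txt"); // must be UTF-8 encoded
// If the file is ISO-8859-1, convert it first:
// $text = iconv("ISO-8859-1", "UTF-8", $text);

$postFields = http_build_query(array(            // percent-encodes each value
    'format' => 'xml',
    'action' => 'edit',
    'title'  => 'Ma page',
    'text'   => $text,
    'token'  => $editToken,
));

$ch = curl_init($wikiUrl . "/api.php");
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $postFields);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
echo curl_exec($ch);  // cookie handling (login session) omitted in this sketch
curl_close($ch);
?>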
Hi All,
Is there a wiki API call with which I can download an image and its corresponding attribution in a single API call?
For example, for this image http://commons.wikimedia.org/wiki/File:Madonna_by_David_Shankbone.jpg, I need to fetch the actual image and its metadata.
Thanks
Param
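As far as I know there is no single call that returns both the raw image bytes and the metadata, but one API call with prop=imageinfo can return the file URL plus metadata, and a second plain HTTP request then downloads the file. A sketch follows; the iiprop values, especially extmetadata, are assumptions about what the target wiki supports.
<?php
// Sketch: one API call for the file URL + metadata, then a plain HTTP
// download of the image bytes themselves.
$title = "File:Madonna_by_David_Shankbone.jpg";
$api = "http://commons.wikimedia.org/w/api.php?action=query&prop=imageinfo"
     . "&iiprop=url|size|mime|extmetadata&titles=" . rawurlencode($title)
     . "&format=json";
$data = json_decode(file_get_contents($api), true);
foreach ($data['query']['pages'] as $page) {
    $info = $page['imageinfo'][0];
    echo "MIME: " . $info['mime'] . ", size: " . $info['size'] . " bytes\n";
    // Second request: fetch the actual image and save it locally.
    file_put_contents(basename($info['url']), file_get_contents($info['url']));
}
?>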
Hi,
I want to search Wikipedia and get the first search result (in fact, only the title of the result). Could someone tell me which call I need to use in the MediaWiki API?
I tried the "query" and "opensearch" calls, but they didn't return the results I expected; they didn't return any search results at all. The search term I used was "Anatomical structure". The requirement is that I need to get the title of the first search result.
Kalpa Gunaratna
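A minimal sketch using list=search with srlimit=1; the field names assume the standard JSON output of the search module.
<?php
// Sketch: get the title of the first full-text search result.
$term = "Anatomical structure";
$url = "http://en.wikipedia.org/w/api.php?action=query&list=search"
     . "&srsearch=" . rawurlencode($term) . "&srlimit=1&format=json";
$data = json_decode(file_get_contents($url), true);
if (!empty($data['query']['search'])) {
    echo $data['query']['search'][0]['title'] . "\n";  // first result's title
} else {
    echo "No results\n";
}
?>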
Hi,
I want to know how I can get all the titles in a disambiguation page. I use the following API call to get the category list for a word in Wikipedia, and if it returns a disambiguation page, I want to get the list of titles on that disambiguation page and search for each of them using the same command. For example, the word I search for here is "Settlement", which has several meanings.
http://en.wikipedia.org/w/api.php?action=query&prop=categories&titles=Settl…
If anybody knows how to do this, please let me know. Thank you.
Kalpa Gunaratna
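One way to get the titles listed on a disambiguation page is prop=links, sketched below. Note that prop=links returns every wiki link on the page, so some extra filtering beyond the namespace check may still be needed.
<?php
// Sketch: list the article titles linked from a (disambiguation) page,
// keeping only main-namespace links; query categories for each afterwards.
$url = "http://en.wikipedia.org/w/api.php?action=query&prop=links"
     . "&titles=" . rawurlencode("Settlement") . "&pllimit=500&format=json";
$data = json_decode(file_get_contents($url), true);
foreach ($data['query']['pages'] as $page) {
    if (empty($page['links'])) {
        continue;
    }
    foreach ($page['links'] as $link) {
        if ($link['ns'] == 0) {          // main (article) namespace only
            echo $link['title'] . "\n";
        }
    }
}
?>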
Hi,
I am querying MediaWiki to get category information from Wikipedia for certain titles. I have written a Java program to get the list, but some titles do not return any category results.
For example, "Populated places" does not return any category listing for the query, but if you search Wikipedia it shows categories. Below is the generated query for this particular title. I am trying to get results in JSON format; the query doesn't work for XML format either.
http://en.wikipedia.org/w/api.php?action=query&prop=categories&titles=Popul…
What is the problem here?
Kalpa Gunaratna
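One likely explanation (worth verifying) is that "Populated places" is a redirect, and redirect pages themselves carry no categories; adding redirects=1 makes the API resolve the redirect first. Also make sure the space in the title is URL-encoded rather than sent raw. A sketch:
<?php
// Sketch: URL-encode the title and follow redirects so the categories come
// from the target article rather than from the (empty) redirect page.
$title = "Populated places";
$url = "http://en.wikipedia.org/w/api.php?action=query&prop=categories"
     . "&titles=" . urlencode($title) . "&redirects=1&cllimit=500&format=json";
$data = json_decode(file_get_contents($url), true);
foreach ($data['query']['pages'] as $page) {
    if (isset($page['categories'])) {
        foreach ($page['categories'] as $cat) {
            echo $cat['title'] . "\n";
        }
    }
}
?>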