I have a question about the image table from the Wikipedia DB.
Currently this is what is available from the wiki dumps
regarding populating the image table content.
When imported, it only populates roughly 800,000 records, which is clearly
not the info for all English Wikipedia images.
Is it possible to get metadata for all the images on English Wikipedia?
I hope I have found the right channel for our questions on using
Wikipedia content properly.
My team and I are building a mobile and web application to browse
Wikipedia content in a different fashion.
We are also caching content on our own servers so as to avoid
stressing the Wikipedia servers as much as possible.
Still, we have to make an important decision.
We want to do the following:
1. to "style" Wikipedia pages so that we can apply a different look & feel;
2. to obtain images from Wikipedia and resize them to fit our layout.
About 1., I'd like to know the best practice for doing this.
Should we interface with the MediaWiki API,
or include CSS within the mobile clients and override the page content?
Is there a daily limit on querying the API?
Or any benchmark to refer to?
About 2., the only possible way is probably to use the API.
Again, is there a daily limit on querying the API?
Is there any way to query a batch of images in a single request, so as
to reduce the number of requests to the wiki servers?
Finally, is there any case study to use as a benchmark?
Thank you so much for helping us!
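Regarding the batching question above: the standard MediaWiki API does let a client ask about several images in one request by pipe-separating titles, and iiurlwidth asks the server for a thumbnail URL at a given width so the client never resizes originals itself. A minimal sketch (URL building only, no network I/O):

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def batched_imageinfo_url(file_titles, width=None):
    """Build one API request covering many files at once.

    File pages live in the File: namespace; anonymous clients may
    pass up to 50 titles per request.
    """
    params = {
        "action": "query",
        "format": "json",
        "prop": "imageinfo",
        "iiprop": "url|size",
        "titles": "|".join("File:" + t for t in file_titles),
    }
    if width is not None:
        # Ask the server for a scaled thumbnail URL at this width.
        params["iiurlwidth"] = width
    return API + "?" + urlencode(params)

url = batched_imageinfo_url(["Example.jpg", "Example2.png"], width=320)
```

One such request replaces dozens of individual imageinfo calls, which is exactly the kind of load reduction asked about here.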
Not sure if this is the right place, but I can't seem to find the answer to
this question. Do all MediaWiki pages require an h2 header
in the wikitext? I'm aware that h1 headers are article titles, but there
doesn't seem to be any information on whether or not all pages contain at
least one h2 header.
Thanks Marius and Mark.
Will the API remain unaffected after interlanguage links are moved to wikidata.org?
On Fri, Feb 22, 2013 at 10:44 AM, Mark A. Hershberger <mah(a)everybody.org> wrote:
> On 02/22/2013 01:25 PM, Jiang BIAN wrote:
> > Our project is still working with old interlanguage link (parsing
> > "[[en:Proxy server]]"). Is there other way to get the interlanguage link
> > besides talking with www.wikidata.org <http://www.wikidata.org>?
> > The example in the API doc seems to return a wrong result.
> The API document is correct, but you will need to use the llcontinue
> parameter to get the next batch of results.
> You can use the lllimit parameter to get more results in any one response.
> There is no path to peace. Peace is the path.
> -- Mahatma Gandhi, "Non-Violence in Peace and War"
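To illustrate the lllimit/llcontinue pattern Mark describes, here is a rough sketch of the continuation loop. The parameter names are the real ones for prop=langlinks (with the query-continue style of continuation used in this era); the fetch function is left pluggable so the example stays self-contained:

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def langlinks_pages(title, fetch_json, lllimit=500):
    """Yield langlink result batches for `title`, following llcontinue.

    fetch_json(url) -> dict must perform the HTTP GET and decode
    the JSON response (e.g. with urllib.request + json).
    """
    params = {
        "action": "query",
        "format": "json",
        "prop": "langlinks",
        "titles": title,
        "lllimit": lllimit,  # ask for bigger batches per response
    }
    while True:
        data = fetch_json(API + "?" + urlencode(params))
        yield data
        cont = data.get("query-continue", {}).get("langlinks")
        if not cont:
            break
        # Feed the continuation value back into the next request.
        params["llcontinue"] = cont["llcontinue"]
```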
On 21.02.2013, 17:54 Ewa wrote:
> In what table is the PageImages information stored?
page_props from MediaWiki proper.
> 2013/2/20 Max Semenik <maxsem.wiki(a)gmail.com>
> On 20.02.2013, 21:46 Ewa wrote:
>> Thank you very much for your answer.
>> I have triggered maintenance/refreshLinks.php and it is still running.
>> For the extensions/PageImages/initImageData.php
>> I got the following error:
>> The last attempted database query was:
>> "DELETE FROM `geo_tags` WHERE gt_page_id = '12'"
>> from within function "GeoDataHooks::doDumbUpdate".
>> Database returned error "1146: Table 'wikimirror.geo_tags' doesn't exist (localhost)"
> You didn't install GeoData properly - run update.php
>> I have also noticed that the page_images table is not created unless I run the script manually.
>> It is not being populated at all right now, even while
>> maintenance/refreshLinks.php is also running. Is this table a source
>> for the PageImages API?
> Current design for PageImages (the one deployed on Wikimedia
> wikis) doesn't require a separate table.
> Best regards,
> Max Semenik ([[User:MaxSem]])
Max Semenik ([[User:MaxSem]])
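For reference, reading the PageImages data straight out of page_props can be sketched as below. The property name 'page_image' is an assumption based on the PageImages releases of this era, and `cursor` stands for any DB-API cursor on the wiki database (e.g. from MySQLdb):

```python
def page_images(cursor, page_ids):
    """Look up the selected page image for several pages in page_props.

    Assumes pp_propname = 'page_image' (PageImages of this era);
    returns {page_id: image_name}.
    """
    placeholders = ", ".join(["%s"] * len(page_ids))
    cursor.execute(
        "SELECT pp_page, pp_value FROM page_props "
        "WHERE pp_propname = 'page_image' "
        "AND pp_page IN (" + placeholders + ")",
        tuple(page_ids),
    )
    return dict(cursor.fetchall())
```

This is consistent with Max's point: no separate table is needed, because the deployed design keeps the data in page_props.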
I'm trying to find out whether it is possible to post an API request to
all Wiktionaries with a single query.
As interwikis on Wiktionary link homonyms across the different
Wiktionaries, I thought it would be useful to present users who end up
at 404 Not Found pages with interwiki links to articles that may exist
on other Wiktionaries.
These interwikis could be dynamically added to the 404 page.
I wrote a little PHP script which recursively queries all Wiktionaries
to see if an article exists, but it takes too long to query every
Wiktionary to verify the existence of a given article.
See the (draft) script at http://pastebin.com/mLAdDpwA
More on the application of this you can read at
I'd be happy to know if there's a smoother way to approach this,
or otherwise a batch-scripting option?
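As far as I know there is no single cross-wiki query in the API, but the sequential loop can be made much faster by issuing the per-wiki existence checks concurrently. A rough sketch under that assumption; the fetch function is pluggable so the example stays self-contained (a missing page comes back with a "missing" key in the pages map):

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.parse import quote

def exists_url(lang, title):
    """API existence check for one Wiktionary."""
    return ("https://%s.wiktionary.org/w/api.php"
            "?action=query&format=json&titles=%s" % (lang, quote(title)))

def check_everywhere(title, langs, fetch_json, workers=10):
    """Probe many Wiktionaries in parallel instead of one by one.

    fetch_json(url) -> dict does the HTTP GET + JSON decode.
    Returns {lang: True/False} for article existence.
    """
    def probe(lang):
        pages = (fetch_json(exists_url(lang, title))
                 .get("query", {}).get("pages", {}))
        # A page that exists has no "missing" marker in its entry.
        found = any("missing" not in p for p in pages.values())
        return lang, found
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(probe, langs))
```

With ten workers, checking all Wiktionaries takes roughly a tenth of the sequential time, at the cost of more simultaneous connections.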
I have a question about the PageImages MediaWiki extension. I have
installed it on my own MediaWiki instance. It looks like it is working,
because with the following hit:
I am seeing the following response:
<page pageid="50116" ns="0" title="Lille"/>
<page pageid="22989" ns="0" title="Paris"/>
<page pageid="46132" ns="0" title="Rennes"/>
I am not able to see any image info in the response; I was expecting to
see something similar to:
In the MySQL Wikipedia DB I have populated the *image* and *imagelinks*
tables using dumps from Wikipedia.
I am assuming that this extension uses data from the *page_images* table,
which is empty on my instance. How do I populate this data (and where can
I get the dump for it)?
I would appreciate any answer.
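For comparison, the request that should return thumbnail info once the data is populated uses the extension's pageimages module; a sketch of building it (the endpoint below is a placeholder for your own wiki, and piprop/pithumbsize are the extension's query parameters):

```python
from urllib.parse import urlencode

API = "http://localhost/w/api.php"  # placeholder: your wiki's endpoint

def pageimages_url(titles, thumbsize=100):
    """Query the PageImages extension for several pages at once."""
    return API + "?" + urlencode({
        "action": "query",
        "format": "xml",
        "prop": "pageimages",
        "piprop": "thumbnail|name",   # return thumb URL and image name
        "pithumbsize": thumbsize,     # requested thumbnail width
        "titles": "|".join(titles),
    })

url = pageimages_url(["Lille", "Paris", "Rennes"])
```

If the response still shows bare <page .../> elements with this request, the underlying page_props data has not been generated yet.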